To conduct a pilot study implementing combined genomic and epidemiologic surveillance for hospital-acquired multidrug-resistant organisms (MDROs) to predict transmission between patients and to estimate the local burden of MDRO transmission.
Pilot prospective multicenter surveillance study.
The study was conducted in 8 university hospitals (2,800 beds total) in Melbourne, Australia (population 4.8 million), including 4 acute-care, 1 specialist cancer care, and 3 subacute-care hospitals.
All clinical and screening isolates from hospital inpatients (April 24 to June 18, 2017) were collected for 6 MDROs: vanA vancomycin-resistant enterococci (vanA VRE), methicillin-resistant Staphylococcus aureus (MRSA), extended-spectrum β-lactamase-producing Escherichia coli (ESBL-Ec) and Klebsiella pneumoniae (ESBL-Kp), and carbapenem-resistant Pseudomonas aeruginosa (CRPa) and Acinetobacter baumannii (CRAb). Isolates were analyzed and reported routinely by hospital laboratories, underwent whole-genome sequencing at the central laboratory, and were analyzed using open-source bioinformatic tools. MDRO burden and transmission were assessed using combined genomic and epidemiologic data.
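To make the combined genomic-epidemiologic step concrete, the following minimal Python sketch flags putative transmission pairs by joining pairwise SNP distances with overlapping ward stays. The SNP threshold, field names, and overlap rule are illustrative assumptions for this sketch, not the study's published pipeline.

```python
from datetime import date
from itertools import combinations

# Illustrative patient records: ward plus admission/discharge dates.
# Field names and the 15-SNP cut-off are assumptions for this sketch,
# not values taken from the study.
patients = {
    "P1": {"ward": "ICU", "admit": date(2017, 5, 1), "discharge": date(2017, 5, 20)},
    "P2": {"ward": "ICU", "admit": date(2017, 5, 10), "discharge": date(2017, 5, 25)},
    "P3": {"ward": "3B",  "admit": date(2017, 5, 2), "discharge": date(2017, 5, 8)},
}

# Pairwise core-genome SNP distances between each patient's isolate.
snp_dist = {("P1", "P2"): 3, ("P1", "P3"): 412, ("P2", "P3"): 409}

SNP_THRESHOLD = 15  # assumed cut-off for "genomically linked"

def overlapped(a, b):
    """True if two inpatient stays overlapped on the same ward."""
    return (a["ward"] == b["ward"]
            and a["admit"] <= b["discharge"]
            and b["admit"] <= a["discharge"])

for p, q in combinations(patients, 2):
    dist = snp_dist[tuple(sorted((p, q)))]
    if dist <= SNP_THRESHOLD and overlapped(patients[p], patients[q]):
        print(f"Putative in-hospital transmission: {p} <-> {q} ({dist} SNPs)")
```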
In total, 408 isolates were collected from 358 patients; 47.5% were screening isolates. ESBL-Ec was the most common organism (52.5%), followed by MRSA (21.6%), vanA VRE (15.7%), and ESBL-Kp (7.6%). Most MDROs (88.3%) were isolated from patients with recent healthcare exposure.
Combined genomic and epidemiologic analysis indicated that at least 27.1% of MDROs were likely acquired in a hospital; most of these transmission events would not have been detected without genomics. The highest proportion of transmission occurred with vanA VRE (88.4% of patients).
Genomic and epidemiologic data from multiple institutions can feasibly be combined prospectively, providing substantial insights into the burden and distribution of MDROs, including in-hospital transmission. This analysis enables infection control teams to target interventions more effectively.
Existing peer-reviewed literature describing emergency medical technician (EMT) acquisition and transmission of 12-lead electrocardiograms (12L-ECGs), in the absence of a paramedic, is largely limited to feasibility studies.
The objective of this retrospective observational study was to describe the impact of EMT-acquired 12L-ECGs in Suffolk County, New York (USA), both in terms of the diagnostic quality of the transmitted 12L-ECGs and the number of prehospital percutaneous coronary intervention (PCI)-center notifications made as a result of transmitted 12L-ECGs demonstrating an ST-elevation myocardial infarction (STEMI).
A pre-existing database was queried for Emergency Medical Services (EMS) calls on which an EMT acquired a 12L-ECG, from program initiation (January 2017) through December 31, 2019. Scanned copies of the 12L-ECGs were requested for review by a blinded emergency physician.
Of the 665 calls, 99 had no 12L-ECG available within the database. For 543 (96%) of the available 12L-ECGs, the quality was sufficient to diagnose the presence or absence of a STEMI. Eighteen notifications were made to PCI-centers about a concern for STEMI. The median times spent on scene and transporting to the hospital were 18 and 11 minutes, respectively. The median time from PCI-center notification to EMS arrival at the emergency department (ED) was seven minutes (IQR: 5-14).
When a cardiac monitor is available, EMTs who have completed a limited educational intervention are capable of acquiring a diagnostically useful 12L-ECG and transmitting it to a remote medical control physician for interpretation. This allows prehospital PCI-center activation for a 12L-ECG concerning for STEMI when a paramedic is not available to care for the patient.
Prehospital use of lung ultrasound (LUS) by paramedics to guide the diagnosis and treatment of patients has expanded over the past several years. However, almost all of this education has occurred in a classroom or hospital setting, and no published reports describe the use of LUS simulation software within an ambulance.
The objective of this study was to determine whether various ambulance driving conditions (stationary, constant acceleration, serpentine, and start-stop) would impact paramedics’ abilities to perform LUS on a standardized patient (SP) using breath-holding to simulate lung pathology, or to perform LUS using ultrasound (US) simulation software. Primary endpoints were the participating paramedics’: (1) time to acquire a satisfactory simulated LUS image; and (2) accuracy of image recognition and interpretation. Secondary endpoints for the breath-holding portion were: (1) the agreement between image interpretation by paramedics versus blinded expert reviewers; and (2) the quality of captured LUS images as determined by two blinded expert reviewers. Finally, a paramedic LUS training session was evaluated by comparing pre-test to post-test scores on a 25-item assessment requiring recognition and clinical interpretation of prerecorded LUS images.
Seventeen paramedics received a 45-minute LUS lecture. They then performed 25 LUS exams on SPs and using simulation software, in each case looking for lung sliding, A-lines and B-lines, and seashore or barcode signs. Pre- and post-training, they completed a 25-question test consisting of still images and videos requiring pathology recognition and formulation of a clinical diagnosis. Sixteen paramedics performed the same exams in an ambulance during different driving conditions (stationary, constant acceleration, serpentines, and abrupt start-stops). Lung pathology was block randomized by driving condition.
Paramedics demonstrated improved post-test scores compared to pre-test scores (P < .001). No significant difference existed across driving conditions for time needed to obtain a simulated image, clinical interpretation of simulated LUS images, quality of saved images, or agreement of image interpretation between paramedics and blinded emergency physicians (EPs). However, image acquisition time while parked was significantly greater than while the ambulance was driven in serpentines (Z = -2.898; P = .008). Technical challenges were noted for both simulation techniques.
Paramedics can correctly acquire and interpret simulated LUS images during different ambulance driving conditions. However, simulation techniques better adapted to this unique work environment are needed.
Alzheimer’s disease (AD) studies are increasingly targeting earlier (pre)clinical populations, in which the expected degree of observable cognitive decline over a certain time interval is reduced as compared to the dementia stage. Consequently, endpoints to capture early cognitive changes require refinement. We aimed to determine the sensitivity to decline of widely applied neuropsychological tests at different clinical stages of AD as outlined in the National Institute on Aging – Alzheimer’s Association (NIA-AA) research framework.
Amyloid-positive individuals (as determined by positron emission tomography or cerebrospinal fluid) with longitudinal neuropsychological assessments available were included from four well-defined study cohorts and subsequently classified among the NIA-AA stages. For each stage, we investigated the sensitivity to decline of 17 individual neuropsychological tests using linear mixed models.
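As a concrete illustration of the modeling step, the sketch below fits a random-intercept linear mixed model of test score on time; its fixed-effect slope corresponds to the kind of per-stage annual decline (β) reported here. The data are synthetic and the model specification is an assumption, since the abstract does not give the exact covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic longitudinal data for one NIA-AA stage: 50 participants,
# 3 annual visits each, with a true decline of -0.5 points/year.
n, visits = 50, 3
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), visits),
    "years": np.tile(np.arange(visits, dtype=float), n),
})
subject_intercept = rng.normal(0, 2, n)[df["id"].to_numpy()]
df["score"] = 25 - 0.5 * df["years"] + subject_intercept + rng.normal(0, 1, len(df))

# Random-intercept model: the fixed-effect slope on `years` estimates
# annual decline, analogous to the per-stage betas in the results.
model = smf.mixedlm("score ~ years", df, groups=df["id"]).fit()
print(model.summary())
```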
In total, 1103 participants (mean age = 70.54 ± 8.7 years; 47% female) were included: n = 120 in Stage 1, n = 206 in Stage 2, n = 467 in Stage 3, and n = 309 in Stage 4. Neuropsychological tests were differentially sensitive to decline across stages. For example, Category Fluency captured significant 1-year decline as early as Stage 1 (β = −.58, p < .001). Word List Delayed Recall (β = −.22, p < .05) and the Trail Making Test (β = 6.2, p < .05) became sensitive to 1-year decline in Stage 2, whereas the Mini-Mental State Examination did not capture 1-year decline until Stages 3 (β = −1.13, p < .001) and 4 (β = −2.23, p < .001).
We demonstrated that commonly used neuropsychological tests differ in their ability to capture decline depending on clinical stage within the AD continuum (preclinical to dementia). This implies that stage-specific cognitive endpoints are needed to accurately assess disease progression and increase the chance of successful treatment evaluation in AD.
Hydrogen lithography has been used to template phosphine-based surface chemistry to fabricate atomic-scale devices, a process we abbreviate as atomic precision advanced manufacturing (APAM). Here, we use mid-infrared variable angle spectroscopic ellipsometry (IR-VASE) to characterize single-nanometer thickness phosphorus dopant layers (δ-layers) in silicon made using APAM compatible processes. A large Drude response is directly attributable to the δ-layer and can be used for nondestructive monitoring of the condition of the APAM layer when integrating additional processing steps. The carrier density and mobility extracted from our room temperature IR-VASE measurements are consistent with cryogenic magneto-transport measurements, showing that APAM δ-layers function at room temperature. Finally, the permittivity extracted from these measurements shows that the doping in the APAM δ-layers is so large that their low-frequency in-plane response is reminiscent of a silicide. However, there is no indication of a plasma resonance, likely due to reduced dimensionality and/or low scattering lifetime.
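For readers unfamiliar with the Drude analysis used to interpret such measurements, the sketch below computes the standard Drude permittivity from an assumed carrier density and mobility. The parameter values are illustrative placeholders, not the fitted values from this work.

```python
import numpy as np

# Physical constants (SI)
e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
m_e = 9.1093837015e-31   # electron mass, kg

# Illustrative delta-layer parameters (NOT the paper's fitted values):
# a degenerate phosphorus density and a low mobility.
n3d = 2.0e21 * 1e6       # carrier density, m^-3 (2e21 cm^-3)
mu = 40e-4               # mobility, m^2/(V*s) (40 cm^2/Vs)
m_eff = 0.26 * m_e       # conductivity effective mass of electrons in Si
eps_inf = 11.7           # high-frequency permittivity of Si

# Drude parameters: plasma frequency and scattering rate
omega_p = np.sqrt(n3d * e**2 / (eps0 * m_eff))
gamma = e / (m_eff * mu)

# Complex permittivity over the mid-IR range probed by IR-VASE
omega = 2 * np.pi * np.linspace(5e12, 150e12, 500)  # rad/s
eps = eps_inf - omega_p**2 / (omega**2 + 1j * gamma * omega)

print(f"plasma frequency: {omega_p / (2 * np.pi * 1e12):.0f} THz, "
      f"scattering rate: {gamma:.2e} s^-1")
```

With a scattering rate this large relative to the probed frequencies, the response is heavily damped, consistent with the absence of a sharp plasma resonance noted above.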
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
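The core cost comparison can be illustrated with a small Monte Carlo sketch in the spirit of a probabilistic sensitivity analysis. All parameter ranges and the residual-abstraction assumption below are placeholders, not the published model inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims = 10_000

# Illustrative parameter ranges, drawn uniformly as in a probabilistic
# sensitivity analysis; these are placeholders, not the model's inputs.
n_patients = rng.uniform(100, 5000, n_sims)          # enrolled patients
fields_per_patient = rng.uniform(20, 500, n_sims)    # data elements/patient
abstraction_sec = rng.uniform(3, 60, n_sims)         # manual seconds/field
coordinator_rate = rng.uniform(0.01, 0.03, n_sims)   # $ per second of effort
registry_fee = rng.uniform(5_000, 100_000, n_sims)   # access/linkage cost

# Standard trial: every field is manually abstracted.
standard_cost = (n_patients * fields_per_patient
                 * abstraction_sec * coordinator_rate)

# Registry trial: flat fee plus a residual of manual abstraction.
residual_fraction = 0.1  # assumed share of fields still abstracted by hand
registry_cost = registry_fee + residual_fraction * standard_cost

savings = standard_cost - registry_cost
print(f"registry cheaper in {np.mean(savings > 0):.1%} of simulations")
print(f"median savings: ${np.median(savings):,.0f}")
```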
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential cost savings.
Heat stress is a global issue constraining pig productivity, and it is likely to intensify under future climate change. Technological advances in earth observation have made tools available that enable the identification and mapping of livestock species at risk of exposure to heat stress due to climate change. Here, we present a methodology to map current and likely future heat stress risk in pigs using R software by combining the effects of temperature and relative humidity. We applied the method to growing-finishing pigs in Uganda. We mapped monthly heat stress risk and quantified the number of pigs exposed to heat stress using 18 global circulation models, and projected impacts in the 2050s. Results show that more than 800 000 pigs in Uganda will be affected by heat stress in the future. The results can feed into evidence-based policy, planning, and targeted resource allocation in the livestock sector.
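Although the authors implement their workflow in R, the logic of combining temperature and relative humidity into a heat stress indicator can be sketched briefly in Python. The NRC-style temperature-humidity index (THI) and the alert threshold below are common in the livestock heat-stress literature but are assumptions here, since the abstract does not reproduce the paper's exact equations.

```python
import numpy as np

def thi(temp_c, rh_pct):
    """Temperature-humidity index (an NRC-style formulation).

    This formula and the threshold below are common in livestock
    heat-stress work but are assumptions for this sketch.
    """
    return 0.8 * temp_c + (rh_pct / 100.0) * (temp_c - 14.4) + 46.4

# Illustrative monthly grids: temperature (deg C), relative humidity (%),
# and pig density (head per cell) over a tiny raster.
temp = np.array([[28.0, 31.5], [26.0, 33.0]])
rh = np.array([[70.0, 80.0], [60.0, 85.0]])
pigs = np.array([[12_000, 30_000], [8_000, 25_000]])

THI_ALERT = 75  # assumed onset of heat stress for growing-finishing pigs
stressed = thi(temp, rh) >= THI_ALERT
print(f"pigs exposed to heat stress: {pigs[stressed].sum():,}")
```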
This guidance paper from the European Psychiatric Association (EPA) aims to provide evidence-based recommendations on early intervention in clinical high risk (CHR) states of psychosis, assessed according to the EPA guidance on early detection. The recommendations were derived from a meta-analysis of current empirical evidence on the efficacy of psychological and pharmacological interventions in CHR samples. Eligible studies had to investigate conversion rate and/or functioning as a treatment outcome in CHR patients defined by the ultra-high risk and/or basic symptom criteria. Besides analyses of treatment effects on conversion rate and functional outcome, age and type of intervention were examined as potential moderators. Based on data from 15 studies (n = 1394), early intervention generally produced significantly reduced conversion rates at 6- to 48-month follow-up compared to control conditions. However, early intervention failed to achieve significantly greater functional improvements because both early intervention and control conditions produced similar positive effects. With regard to the type of intervention, both psychological and pharmacological interventions produced significant effects on conversion rates, but not on functional outcome, relative to the control conditions. Early intervention in youth samples was generally less effective than in predominantly adult samples. Seven evidence-based recommendations for early intervention in CHR samples could be formulated, although more studies are needed to investigate the specificity of treatment effects and potential age effects in order to tailor interventions to individual treatment needs and risk status.
The aim of this guidance paper of the European Psychiatric Association is to provide evidence-based recommendations on the early detection of a clinical high risk (CHR) for psychosis in patients with mental problems. To this aim, we conducted a meta-analysis of studies reporting on conversion rates to psychosis in non-overlapping samples meeting at least one of the main CHR criteria: ultra-high risk (UHR) and/or basic symptom criteria. Further, the effects of potential moderators (different UHR criteria definitions, single UHR criteria, and age) on conversion rates were examined. Conversion rates in the 42 identified samples, with altogether more than 4000 CHR patients who had mainly been identified by UHR criteria and/or the basic symptom criterion ‘cognitive disturbances’ (COGDIS), showed considerable heterogeneity. While UHR criteria and COGDIS were related to similar conversion rates until 2-year follow-up, conversion rates for COGDIS were significantly higher thereafter. Differences in the onset and frequency requirements of symptomatic UHR criteria, or in their different consideration of functional decline, substance use, and co-morbidity, did not seem to affect conversion rates. The ‘genetic risk and functional decline’ UHR criterion was rarely met and showed only a nonsignificant pooled effect. However, age significantly affected UHR conversion rates, with lower rates in children and adolescents. Although more research into potential sources of heterogeneity in conversion rates is needed to facilitate improvement of CHR criteria, six evidence-based recommendations for the early detection of psychosis were developed as a basis for the EPA guidance on early intervention in CHR states.
Neurocognitive and functional neuroimaging studies point to frontal lobe abnormalities in schizophrenia. Molecular and behavioural genetic studies suggest that the frontal lobe is under significant genetic influence. We carried out structural magnetic resonance imaging (MRI) of the frontal lobe in monozygotic (MZ) twins concordant or discordant for schizophrenia and healthy MZ control twins.
The sample comprised 21 concordant pairs, 17 discordant affected and 18 discordant unaffected twins from 19 discordant pairs, and 27 control pairs. Groups were matched on sociodemographic variables. Patient groups (concordant, discordant affected) did not differ on clinical variables. Volumes of superior, middle, inferior and orbital frontal gyri were calculated using the Cavalieri principle on the basis of manual tracing of anatomic boundaries. Group differences were investigated covarying for whole-brain volume, gender and age.
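The Cavalieri estimate itself is simple to state: with parallel slices of known thickness, volume is the slice thickness multiplied by the summed traced cross-sectional areas. The sketch below uses illustrative numbers, not measurements from this study.

```python
import numpy as np

# Cavalieri estimator: with parallel MRI slices of thickness t, the
# volume of a traced structure is t times the sum of the traced
# cross-sectional areas. Thickness and areas are illustrative only.
t = 0.15  # slice thickness, cm
areas = np.array([0.0, 1.2, 2.8, 3.5, 3.1, 2.2, 0.9, 0.0])  # cm^2 per slice
volume = t * areas.sum()
print(f"estimated gyrus volume: {volume:.2f} cm^3")
```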
Results for superior frontal gyrus showed that twins with schizophrenia (i.e. concordant twins and discordant affected twins) had reduced volume compared to twins without schizophrenia (i.e. discordant unaffected and control twins), indicating an effect of illness. For middle and orbital frontal gyrus, concordant (but not discordant affected) twins differed from non-schizophrenic twins. There were no group differences in inferior frontal gyrus volume.
These findings suggest that volume reductions in the superior frontal gyrus are associated with a diagnosis of schizophrenia (in the presence or absence of a co-twin with schizophrenia). On the other hand, volume reductions in middle and orbital frontal gyri are seen only in concordant pairs, perhaps reflecting the increased genetic vulnerability in this group.
Major depression is a significant problem for people with a traumatic brain injury (TBI), and its treatment remains difficult. A promising approach is mindfulness-based cognitive therapy (MBCT), a relatively new therapy rooted in mindfulness-based stress reduction (MBSR) and cognitive behavioral therapy (CBT). We conducted this study to examine the effectiveness of MBCT in reducing depression symptoms among people who have a TBI.
Twenty individuals diagnosed with major depression were recruited from a rehabilitation clinic and completed the 8-week MBCT intervention. Instruments used to measure depression symptoms included the BDI-II, PHQ-9, HADS, SF-36 (Mental Health subscale), and SCL-90 (Depression subscale), completed at baseline and post-intervention.
All instruments indicated a statistically significant reduction in depression symptoms post-intervention (p < .05). For example, the total mean score on the BDI-II decreased from 25.2 (9.8) at baseline to 18.2 (11.7) post-intervention (p = .001). Using a PHQ-9 threshold of 10, the proportion of participants meeting criteria for major depression was reduced by 59% at follow-up (p = .012).
Most participants reported reductions in depression symptoms after the intervention such that many would not meet the criteria for a diagnosis of major depression. This intervention may provide an opportunity to address a debilitating aspect of TBI and could be implemented concurrently with more traditional forms of treatment, possibly enhancing their success. The next step will involve the execution of multi-site, randomized controlled trials to fully demonstrate the value of the intervention.
Schizophrenia is a devastating mental disorder with diverse symptom dimensions, including delusions, hallucinations, affective symptoms, and alterations in cognition. Declarative memory deficits are among the most important factors leading to poor functional outcomes in this disorder. It has recently been proposed that sleep disturbances in patients with schizophrenia contribute to these memory impairments (Manoach et al. 2009; Ferrarelli et al. 2010; Lu and Göder 2012). In young healthy subjects, declarative memory consolidation was shown to be enhanced by inducing slow oscillation-like potential fields during sleep (Marshall et al. 2006). In the present study, slow oscillatory transcranial direct current stimulation (so-tDCS) was applied to 14 patients with schizophrenia on stable medication, with a mean age of 33 years. The main effects of so-tDCS, compared with sham stimulation, were an enhancement of declarative memory retention and an increase in mood after sleep. In conclusion, so-tDCS offers an interesting approach for studying the relationship of sleep and memory in psychiatric disorders and could possibly improve disturbed memory processing in patients with schizophrenia.
Traumatic brain injury (TBI) may lead to persistent depression symptoms. We conducted several pilot studies to examine the efficacy of mindfulness-based interventions to address this issue; all showed large effect sizes. The logical next step was to conduct a randomized controlled trial (RCT).
We sought to determine the efficacy of mindfulness-based cognitive therapy for people with depression symptoms post-TBI (MBCT-TBI).
Using a multi-site RCT design, participants (mean age = 47) were randomized to intervention or control arms. Treatment participants received a group-based, 10-week intervention; control participants were wait-listed. Outcome measures, administered pre- and post-intervention and after three months, included the Beck Depression Inventory-II (BDI-II), Patient Health Questionnaire-9 (PHQ-9), and Symptom Checklist-90-Revised (SCL-90-R). The Philadelphia Mindfulness Scale (PHLMS) captured present-moment awareness and acceptance.
BDI-II scores decreased from 25.47 to 18.84 in the treatment group, while they stayed relatively stable in the control group (27.13 to 25.00, respectively; p = .029). We did not find statistically significant differences on the PHQ-9 and SCL-90-R post-treatment. However, after three months, all scores were statistically significantly lower than at baseline (ps < .01). Increases in mindfulness were associated with decreases in BDI-II scores (r = -.401, p = .025).
MBCT-TBI may alleviate depression symptoms up to three months post-intervention. Greater mindfulness may have contributed to the reduction in depression symptoms although the association does not confirm causality. More work is required to replicate these findings, identify subgroups that may better respond to the intervention, and refine the intervention to maximize its effectiveness.
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water only (WAT) or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % of ten-repetition maximum. Blood samples were taken pre-meal and at 45 and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between the CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
We study decision procedures for two knowledge problems critical to the verification of security protocols, namely the intruder deduction and the static equivalence problems. These problems can be related to particular forms of context matching and context unification. Both problems are defined with respect to an equational theory and are known to be decidable when the equational theory is given by a subterm convergent term rewrite system (TRS). In this work, we extend this to consider a subterm convergent TRS defined modulo an equational theory, such as commutativity. We present two pairs of solutions for these important problems. The first solves the deduction and static equivalence problems in rewrite systems modulo shallow theories such as commutativity. The second provides a general procedure that solves the deduction and static equivalence problems in subterm convergent systems modulo syntactic permutative theories, provided a finite measure is ensured. Several examples of such theories are also given.
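To ground the terminology, intruder deduction asks whether a secret can be derived from a set of observed messages by rewriting with rules such as the subterm convergent rule dec(enc(x, k), k) -> x. The toy saturation sketch below illustrates that idea only; it is not the paper's procedure, which additionally handles equational theories such as commutativity.

```python
# Terms are nested tuples: ("enc", m, k), ("pair", a, b), or atoms (str).
# The rules dec(enc(x, k), k) -> x and the pair projections form a
# subterm convergent TRS; deduction closes the knowledge set under them.

def deducible(knowledge, goal):
    """Naive saturation: close `knowledge` under analysis rules until
    the goal appears or a fixed point is reached. A toy sketch only."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for t in list(known):
            parts = []
            if isinstance(t, tuple) and t[0] == "pair":
                parts = [t[1], t[2]]           # fst/snd projections
            elif isinstance(t, tuple) and t[0] == "enc" and t[2] in known:
                parts = [t[1]]                 # dec(enc(x, k), k) -> x
            for p in parts:
                if p not in known:
                    known.add(p)
                    changed = True
    return goal in known

# The intruder saw a pair of a ciphertext and its key: the secret leaks.
msg = ("pair", ("enc", "secret", "k"), "k")
print(deducible({msg}, "secret"))  # True
```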
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU, who were required to measure their temperatures and complete a symptom questionnaire twice daily, were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3-vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupationally acquired symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients infected with a special pathogen.
Pigweed is difficult to manage in grain sorghum because of widespread herbicide resistance, a limited number of registered effective herbicides, and the synchronous emergence of pigweed with grain sorghum in Kansas. The combination of cultural and mechanical control tactics with an herbicide program is commonly recognized as a best management strategy; however, limited information is available to adapt these strategies to dryland systems. Our objective for this research was to assess the influence of four components (a winter wheat cover crop [CC], row-crop cultivation, three row widths, and the presence or absence of an herbicide program) on pigweed control in a dryland system. Field trials were implemented during 2017 and 2018 at three locations for a total of 6 site-years. The herbicide program component resulted in excellent control (>97%) in all treatments at 3 and 8 weeks after planting (WAP). The CC provided approximately 50% reductions in pigweed density and biomass at both timings in half of the site-years; however, mixed results were observed in the remaining site-years, ranging from no attributable difference to a 170% increase in weed density at 8 WAP in 1 site-year. Treatments including row-crop cultivation reduced pigweed biomass and density in most site-years at 3 and 8 WAP. An herbicide program is required to achieve pigweed control and should be integrated with row-crop cultivation or narrow row widths to reduce the risk of herbicide resistance. Additional research is required to optimize the use of a CC as an integrated pigweed management strategy in dryland grain sorghum.
Successful pigweed management requires an integrated strategy to delay the development of resistance to any single control tactic. Field trials were implemented during 2017 and 2018 in three counties in Kansas on dryland (limited rainfall, nonirrigated), glufosinate-resistant soybean. The objective was to assess pigweed control with combinations of a winter wheat cover crop (CC), three soybean row widths (76, 38, and 19 cm), row-crop cultivation 2.5 weeks after planting (WAP), and an herbicide program, in order to develop integrated pigweed management recommendations. All combinations of the four components were assessed across 16 treatments. All treatments with the herbicide program resulted in excellent (>97%) pigweed control and were analyzed separately from the other components. Treatments containing row-crop cultivation reduced pigweed density and biomass at 3 and 8 WAP in all locations compared with the 76-cm row width plus no-CC treatment. CC impacts were mixed: in Riley County, Palmer amaranth density and biomass were reduced; in Reno County, no additional Palmer amaranth control was observed; and in Franklin County, treatments containing the CC had greater waterhemp density and biomass than treatments without it. Narrow row widths achieved the most consistent results of all cultural components when data were pooled across locations: decreasing row width from 76 to 38 cm resulted in a 23% reduction in pigweed biomass at 8 WAP, and decreasing row width from 38 to 19 cm achieved a further 15% reduction. Row-crop cultivation should be incorporated where possible as a mechanical option to manage pigweed, and narrow row widths should be used to suppress late-season pigweed growth when feasible. Pigweed control from the CC was inconsistent, so a CC should be given special consideration before implementation. These components should be used together with an herbicide program as an integrated system to achieve the best pigweed control and reduce the risk of developing herbicide resistance.
Despite United States national learning objectives referencing research fundamentals and the critical appraisal of medical literature, many paramedic programs are not meeting these objectives with substantive content.
The objective was to develop and implement an all-inclusive journal club educational module for paramedic training programs that could be distributed to Emergency Medical Services (EMS) educators and EMS medical directors as a framework to adapt to their own programs.
Four two-hour journal club sessions were designed. First, the educator provided students with four types of articles on a student-chosen topic and discussed differences in methodology and structure. Next, after a lecture about peer review, students used search engines to verify the references of a trade magazine article. Third, the educator gave a statistics lecture and critiqued the results sections of several articles found by students on a topic. Finally, students found an article on a topic of personal interest and presented it to their classmates, as if telling their paramedic partner about it at work. Before and after the series, students from two cohorts (2017, 2018) completed a survey with questions about demographics and perceptions of research. Students from one cohort (2017) received a follow-up survey one year later.
For the 2016 cohort, 13 students participated and provided qualitative feedback. For the 2017 and 2018 cohorts, 33 students participated. After the series, students self-reported an increased ability to find, evaluate, and apply medical research articles, as well as overall more positive opinions of participating in prehospital research and of its importance. This ability was demonstrated by every student during the final journal club session. McNemar’s and Related-Samples Cochran’s Q testing of questionnaire responses suggested a statistically significant improvement in student approval of exceptions from informed consent.
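For readers unfamiliar with McNemar’s test, the sketch below applies it to paired pre/post binary survey responses using statsmodels. The 2x2 counts are illustrative, not the study’s data.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Illustrative 2x2 table of paired pre/post responses ("approve of
# exception from informed consent"), NOT the study's counts:
# rows = pre (no, yes), columns = post (no, yes).
table = np.array([[5, 12],
                  [1, 15]])

# McNemar's test uses only the discordant pairs (here, 12 students who
# switched to approval vs 1 who switched away), matching the paired
# pre/post design of the questionnaire.
result = mcnemar(table, exact=True)
print(f"statistic = {result.statistic}, p = {result.pvalue:.4f}")
```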
The framework for this paramedic journal club series could be adapted by EMS educators and medical directors to enable paramedics to search for, critically appraise, and discuss the findings of medical literature.