Bipolar disorder (BD) is a familial psychiatric disorder associated with frontotemporal and subcortical brain abnormalities. It is unclear whether such abnormalities are present in relatives without BD, and little is known about structural brain trajectories in those at risk.
Neuroimaging was conducted at baseline and at 2-year follow-up in 90 high-risk individuals with a first-degree relative with BD (HR) and 56 participants with no family history of mental illness (controls), who could nonetheless have non-BD diagnoses. All 146 subjects were aged 12–30 years at baseline. We examined longitudinal change in gray and white matter volume, cortical thickness, and surface area in the frontotemporal cortex and subcortical regions.
Compared with controls, HR participants showed accelerated cortical thinning and volume reduction in right-lateralised frontal regions, including the inferior frontal gyrus, lateral orbitofrontal cortex, frontal pole and rostral middle frontal gyrus. Independent of time, the HR group had greater cortical thickness in the left caudal anterior cingulate cortex, larger volume in the right medial orbitofrontal cortex and greater area of the right accumbens, compared with controls. This pattern was evident even in those without new onset of psychopathology during the inter-scan interval.
This study suggests that differences previously observed in BD develop prior to the onset of the disorder. The pattern of pathologically accelerated cortical thinning is likely consistent with a disturbance of the molecular mechanisms responsible for normal cortical thinning. We also demonstrate that neuroanatomical differences in HR individuals may be progressive in some regions and stable in others.
Spinal muscular atrophy (SMA) is a devastating rare disease that affects individuals regardless of ethnicity, gender, or age. The first approved disease-modifying therapy for SMA, nusinersen, was approved by Health Canada, as well as by the American and European regulatory agencies, following positive clinical trial outcomes. The trials were conducted in a narrow pediatric population defined by age, severity, and genotype. Broad approval of the therapy necessitates close follow-up of potential rare adverse events and of effectiveness in the larger real-world population.
The Canadian Neuromuscular Disease Registry (CNDR) undertook an iterative multi-stakeholder process to expand the existing SMA dataset to capture items relevant to patient outcomes in a post-marketing environment. The CNDR SMA expanded registry is a longitudinal, prospective, observational study of patients with SMA in Canada designed to evaluate the safety and effectiveness of novel therapies and provide practical information unattainable in trials.
The consensus expanded dataset includes items that address therapy effectiveness and safety and is collected in a multicenter, prospective, observational study, including SMA patients regardless of therapeutic status. The expanded dataset is aligned with global datasets to facilitate collaboration. Additionally, consensus dataset development aimed to standardize appropriate outcome measures across the network and broader Canadian community. Prospective outcome studies, data use, and analyses are independent of the funding partner.
Prospective outcome data collected will provide results on safety and effectiveness in a post-therapy approval era. These data are essential to inform improvements in care and access to therapy for all SMA patients.
Innovation Concept: Research training programs for students, especially in emergency medicine (EM), may be difficult to initiate due to lack of protected time, resources, and mentors (Chang Y, Ramnanan CJ. Academic Medicine 2015). We developed a ten-week summer program for medical students aimed at cultivating research skills through mentorship, clinical enrichment, and immersion in EM research culture through shadowing and project support. Methods: Five second-year Ontario medical students were recruited to participate in the Summer Training and Research in Emergency Medicine (STAR-EM) program at University Health Network, Toronto, from June to August 2019. Program design followed a review of existing summer research programs and the literature on challenges to EM research (McRae, Perry, Brehaut et al. CJEM 2018). The program had broad emergency physician (EP) engagement, with five EP research project mentors and over ten EPs delivering academic sessions. Curriculum development was collaborative and iterative. All projects were approved by the hospital Research Ethics Board (REB). Curriculum, Tool or Material: Each weekly academic morning comprised small-group teaching (topics including research methodology, manuscript preparation, health equity, quality improvement, and wellness), followed by EP-led group review of each student's project progress. Each student spent one half-day per week in the emergency department (ED), shadowing an EP and identifying patients for recruitment to ongoing mentor-initiated ED research projects. Remaining time was spent on independent student project work. Presentation to faculty and program evaluation occurred in week 10. Scholarly output included one abstract submitted for publication per student. Program evaluation by students reflected a uniform impression that course material and mentorship were excellent (100%, n = 5). All students identified an interest in pursuing academic EM as a career.
Faculty researchers rated the program as very effective (80%, n = 4) or somewhat effective (20%, n = 1) in enhancing productivity and scholarly output. Conclusion: The STAR-EM program provides a transferable model for other academic departments seeking to foster the development of future clinician investigators and enhance ED research culture. Program challenges included delays in REB approval for student projects and engaging reluctant staff in research participation.
To describe the infection control preparedness measures undertaken in Hong Kong for coronavirus disease (COVID-19) due to SARS-CoV-2 (previously known as 2019 novel coronavirus) in the first 42 days after the announcement of a cluster of pneumonia cases in China on December 31, 2019 (day 1).
A bundled approach of active and enhanced laboratory surveillance, early airborne infection isolation, rapid molecular diagnostic testing, and contact tracing for healthcare workers (HCWs) with unprotected exposure in the hospitals was implemented. Epidemiological characteristics of confirmed cases, environmental samples, and air samples were collected and analyzed.
From day 1 to day 42, 42 of 1,275 patients (3.3%) fulfilling active (n = 29) or enhanced laboratory surveillance (n = 13) criteria were confirmed to have SARS-CoV-2 infection. The number of locally acquired cases increased significantly, from 1 of 13 confirmed cases (7.7%, day 22 to day 32) to 27 of 29 confirmed cases (93.1%, day 33 to day 42; P < .001). Among them, 28 patients (66.6%) came from 8 family clusters. Of 413 HCWs caring for these confirmed cases, 11 (2.7%) had unprotected exposure requiring quarantine for 14 days. None of these was infected, and nosocomial transmission of SARS-CoV-2 was not observed. Environmental surveillance was performed in the room of a patient with viral loads of 3.3 × 10⁶ copies/mL (pooled nasopharyngeal and throat swabs) and 5.9 × 10⁶ copies/mL (saliva). SARS-CoV-2 was identified in 1 of 13 environmental samples (7.7%) but in none of 8 air samples collected at a distance of 10 cm from the patient’s chin, with or without a surgical mask.
Appropriate hospital infection control measures were able to prevent nosocomial transmission of SARS-CoV-2.
Up to 20% of the population in industrialised countries are employed as shift workers. Shift work is an independent risk factor for metabolic diseases such as type 2 diabetes, cardiovascular disease (CVD) and obesity. This may be associated with shift workers’ typical habit of eating during the night, which forces the body to process nutrients when it is expecting a period of fasting. This study aimed to examine whether redistributing meal times to create a defined overnight fast period can improve CVD risk factors in night shift workers.
Eligible participants were permanent or rotating night shift workers who habitually ate on night shift between 1am and 6am and had abdominal obesity as assessed by waist circumference, but were otherwise healthy. This randomised crossover trial comprised a four-week control period and a four-week intervention period separated by a minimum two-week washout period. During the intervention period, participants were advised to rearrange meal and snack times to create a five-hour nightly fast between 1am and 6am. Up to four random 24-hour food recalls per participant were performed during both periods of the study to check compliance and to assess energy intake. All recall periods included a night shift. Participants attended the research facility at the end of each period to be weighed (Seca GmbH & Co. KG, Hamburg, Germany). Work schedule and meals were standardised for the 24 hours prior to attending the research facility. Data were analysed using paired t-tests and reported as mean (SD).
Participants (n = 19) were aged 41 (10) years. Daily energy intake was not markedly different between the two study periods: 10633 (3591) kJ/day in the intervention period vs. 10919 (4276) kJ/day in the control period (n = 60 recalls in each period, p = 0.670). Body weight was significantly lower at the end of the intervention period than at the end of the control period (86.2 (17) vs. 87.1 (18) kg, p = 0.001). Similarly, BMI was lower at the end of the intervention period than at the end of the control period (30.7 (6) vs. 31.1 (6) kg/m², p = 0.001).
Increasing evidence indicates that working night shifts potentiates weight gain. We show that advising shift workers to avoid eating between 1am and 6am for a four-week period had a positive impact on body weight. Manipulating meal and snack times for shift workers may be a simple strategy to assist in weight management.
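The weight comparison above was analysed with a paired t-test. As a minimal stdlib sketch of the statistic being computed (using made-up weights, not the study's data):

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """Paired t statistic: mean within-pair difference over its standard error."""
    d = [xi - yi for xi, yi in zip(x, y)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

# Hypothetical end-of-period weights (kg) for four participants
control = [88.0, 92.5, 81.2, 95.0]
intervention = [87.1, 91.8, 80.9, 93.6]
t_stat = paired_t(control, intervention)  # positive: weight lower after intervention
```

The resulting t value would be compared against a t distribution with n−1 degrees of freedom to obtain the p-value the study reports.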
The Atacama Large Millimeter/submillimeter Array (ALMA) has provided a glimpse of the interstellar medium (ISM) properties of galaxies at the Epoch of Reionization (EoR); however, a detailed understanding of their internal structure is still lacking. We present properties of molecular cloud complexes (MCCs) in Althæa, a prototypical galaxy at this epoch studied in cosmological zoom-in simulations (Leung et al. 2019c). Typical MCC masses and sizes are comparable to those of nearby spiral and starburst galaxies (M_gas ∼ 10^6.5 M_⊙ and R ≃ 45–100 pc). MCCs are highly supersonic, with velocity dispersions of σ_gas ≃ 20–100 km s^−1 and pressures of P/k_B ≃ 10^7.6 K cm^−3, comparable to gas-rich starburst galaxies. In addition, we perform a stability analysis to understand the origin and dynamical properties of MCCs. We find that MCCs are globally stable in the main disk of Althæa. The densest regions, where star formation is expected to take place in clumps and cores on even smaller scales, instead have lower virial parameters and Toomre-Q values. Detailed studies of star-forming gas dynamics at the EoR thus require a spatial resolution of <40 pc (≃0.01″), which is within reach of ALMA, to complement studies of stellar populations at the EoR using the James Webb Space Telescope (JWST).
Adverse childhood experiences (ACEs) of parents are associated with a variety of negative health outcomes in offspring. Little is known about the mechanisms by which ACEs are transmitted to the next generation. Given that maternal depression and anxiety are related to ACEs and negatively affect children’s behaviour, these exposures may be pathways between maternal ACEs and child psychopathology. Child sex may modify these associations. Our objectives were to determine: (1) the association between ACEs and children’s behaviour, (2) whether maternal symptoms of prenatal and postnatal depression and anxiety mediate the relationship between maternal ACEs and children’s behaviour, and (3) whether these relationships are moderated by child sex. Pearson correlations and latent path analyses were undertaken using data from 907 children and their mothers enrolled in the Alberta Pregnancy Outcomes and Nutrition study. Overall, maternal ACEs were associated with symptoms of anxiety and depression during the perinatal period, and externalizing problems in children. Furthermore, we observed indirect associations between maternal ACEs and children’s internalizing and externalizing problems via maternal anxiety and depression. Sex differences were observed, with boys demonstrating greater vulnerability to the indirect effects of maternal ACEs via both anxiety and depression. Findings suggest that maternal mental health may be a mechanism by which maternal early life adversity is transmitted to children, especially boys. Further research is needed to determine if targeted interventions with women who have both high ACEs and mental health problems can prevent or ameliorate the effects of ACEs on children’s behavioural psychopathology.
Vertically aligned nitrogen-doped nanocrystalline diamond nanorods are fabricated from nitrogen-doped nanocrystalline diamond films using reactive ion etching in oxygen plasma. These nanorods show enhanced thermionic electron emission (TEE) characteristics, viz., a high current density of 12.0 mA/cm² and a work function value of 4.5 eV with an applied voltage of 3 V at 923 K. The enhanced TEE characteristics of these nanorods are ascribed to the induction of nanographitic phases at the grain boundaries and the field penetration effect through the local field enhancement from nanorods owing to a high aspect ratio and an excellent field enhancement factor.
Section 1 of FM14 focuses on bridging the astronomy research and outreach communities: recent highlights, emerging collaborations, best practices, and support structures. This paper also contains supplementary materials that point to contributed talks and poster presentations available online.
Introduction: Diagnosing pulmonary embolism (PE) can be challenging because the signs and symptoms are often non-specific. Studies have shown that evidence-based algorithms are not always adhered to in the Emergency Department (ED) and are often not used correctly, which leads to unnecessary CT scanning. The YEARS diagnostic algorithm, consisting of three items (clinical signs of deep vein thrombosis, hemoptysis, and whether pulmonary embolism is the most likely diagnosis) and D-dimer, is a novel and simplified approach to suspected acute PE. The purpose of this study was to (1) evaluate the use of the YEARS algorithm in the ED and (2) compare the rates of testing for PE had the YEARS algorithm been used. Methods: This was a health records review of ED patients investigated for PE at two emergency departments over a two-year period (April 2013-March 2015). Inclusion criteria were ED physician-ordered CT pulmonary angiogram, ventilation-perfusion (VQ) scan, or D-dimer for investigation of PE. Patients under the age of 18 and those without a D-dimer test were excluded. PE was considered present during the ED visit if PE was diagnosed on CT or VQ scan (subsegmental level or above), or if the patient was subsequently found to have PE or deep vein thrombosis during the next 30 days. Trained researchers extracted anonymized data. The rate of CT/VQ imaging and the false negative rate were calculated. Results: A total of 1,163 patients were tested for PE, of whom 1,083 were eligible for our analysis. Of these, 317/1,083 (29.3%; 95%CI 26.6-32.1%) had CT/VQ imaging for PE, and 41/1,083 (3.8%; 95%CI 2.8-5.1%) were diagnosed with PE at baseline. Three patients had a missed PE, resulting in a false negative rate of 0.4% (95%CI 0.1-1.2%). Had the YEARS algorithm been used, 211/1,083 (19.5%; 95%CI 17.2-22.0%) would have required imaging for PE.
Of the patients who would not have required imaging according to the YEARS algorithm, 8/872 (0.9%; 95%CI 0.5-1.8%) would have had a missed PE. Conclusion: Had the YEARS algorithm been used in all patients with suspected PE, fewer patients would have required imaging, with a small increase in the false negative rate.
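The proportions and 95% confidence intervals in this abstract can be reproduced with a Wilson score interval. This is an assumption — the authors do not state their interval method — but it reproduces the reported imaging-rate figures:

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half, centre + half

# Imaging rate from the abstract: 317/1,083 = 29.3% (95% CI 26.6-32.1%)
lo, hi = wilson_ci(317, 1083)
```

Other interval choices (e.g. exact Clopper-Pearson) give slightly different bounds.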
Introduction: Diagnosing pulmonary embolism (PE) can be challenging because the signs and symptoms are often non-specific. Studies have shown that evidence-based algorithms are not always adhered to in the Emergency Department (ED), which leads to unnecessary CT scanning. The pulmonary embolism rule-out criteria (PERC) can identify patients who can be safely discharged from the ED without further investigation for PE. The purpose of this study was to evaluate the use of the PERC rule in the ED and to compare the rates of testing for PE had the PERC rule been used. Methods: This was a health records review of ED patients investigated for PE at two emergency departments over a two-year period (April 2013-March 2015). Inclusion criteria were ED physician-ordered CT pulmonary angiogram, ventilation-perfusion (VQ) scan, or D-dimer for investigation of PE. Patients under the age of 18 were excluded. PE was considered present during the ED visit if PE was diagnosed on CT or VQ scan (subsegmental level or above), or if the patient was subsequently found to have PE or deep vein thrombosis during the next 30 days. Trained researchers extracted anonymized data. The rate of CT/VQ imaging and the negative predictive value were calculated. Results: A total of 1,163 patients were tested for PE, of whom 1,097 were eligible for our analysis. Of these, 330/1,097 (30.1%; 95%CI 27.4-32.3%) had CT/VQ imaging for PE, and 48/1,097 (4.4%; 95%CI 3.3-5.8%) were diagnosed with PE. 806/1,097 (73.5%; 95%CI 70.8-76.0%) were PERC positive, and of these, 44 patients had a PE (5.5%; 95%CI 4.1-7.3%). Conversely, 291/1,097 (26.5%; 95%CI 24.0-29.2%) patients were PERC negative, and of these, 4 patients had a PE (1.4%; 95%CI 0.5-3.5%). Of the PERC negative patients, 291/291 (100.0%; 95%CI 98.7-100.0%) had a D-dimer test done, and 33/291 (11.3%; 95%CI 8.2-15.5%) had a CT angiogram.
Had PERC been used, CT/VQ imaging would have been avoided in 33/1,097 (3%; 95%CI 2.2-4.2%) patients and the D-dimer would have been avoided in 291/1,097 (26.5%; 95%CI 24.0-29.2%) patients. Conclusion: Had the PERC rule been used in all patients with suspected PE, fewer patients would have had further testing. The false negative rate for the PERC rule was low.
The correlation between objective and subjective nasal obstruction is poor, and dissatisfaction rates after surgery for nasal obstruction are high. Accordingly, novel assessment techniques may be required. This survey aimed to determine patient experience and preferences for the measurement of nasal obstruction.
Prospective survey of rhinology patients.
Of 72 questionnaires distributed, 60 were completed (response rate of 83 per cent). Obstruction duration (more than one year) (χ² = 13.5, p = 0.00024), but not obstruction severity, affected willingness to spend more time being assessed. Questionnaires (48 per cent) and nasal inspiratory peak flow measurement (53 per cent) are the most commonly used assessment techniques. Forty-nine per cent of participants found their assessment unhelpful in understanding their obstruction. Eighty-two per cent agreed or strongly agreed that a visual and numerical aid would help them understand their blockage.
Many patients are dissatisfied with current assessment techniques; a novel device with visual or numerical results may help. Obstruction duration determines willingness to undergo longer assessment.
Multidrug-resistant organisms (MDROs) are increasingly reported in residential care homes for the elderly (RCHEs). We assessed whether implementation of directly observed hand hygiene (DOHH) by hand hygiene ambassadors can reduce environmental contamination with MDROs.
From July to August 2017, a cluster-randomized controlled study was conducted at 10 RCHEs (5 intervention versus 5 nonintervention controls), where DOHH was performed at two-hourly intervals during daytime, before meals and medication rounds, by one trained nurse in each intervention RCHE. Environmental contamination by MDROs, such as methicillin-resistant Staphylococcus aureus (MRSA), carbapenem-resistant Acinetobacter species (CRA), and extended-spectrum β-lactamase (ESBL)–producing Enterobacteriaceae, was evaluated using specimens collected from communal areas at baseline and then twice weekly. The volume of alcohol-based hand rub (ABHR) consumed per resident per week was measured.
At baseline, environmental specimens from communal areas across intervention and nonintervention RCHEs were culture-positive for MRSA in 33 of 100 specimens (33%), CRA in 26 of 100 specimens (26%), and ESBL-producing Enterobacteriaceae in 3 of 100 specimens (3%). Serial monitoring of environmental specimens revealed a significant reduction in MRSA (79 of 600 [13.2%] vs 197 of 600 [32.8%]; P<.001) and CRA (56 of 600 [9.3%] vs 94 of 600 [15.7%]; P=.001) contamination in the intervention arm compared with the nonintervention arm during the study period. The volume of ABHR consumed per resident per week was 3 times higher in the intervention arm than at baseline (59.3±12.9 mL vs 19.7±12.6 mL; P<.001) and significantly higher than in the nonintervention arm (59.3±12.9 mL vs 23.3±17.2 mL; P=.006).
Directly observed hand hygiene for residents could reduce environmental contamination by MDROs in RCHEs.
Despite the growing interest in the phenomenon of learning without intention, the incidental learning of phonological features, especially prosodic features, has received relatively little attention. This paper reports an experiment on incidental learning of lexical stress rules, and investigates whether the resultant knowledge can be unconscious, abstract, and rule based. Participants were incidentally exposed to a lexical stress system where the stress location of a word is mainly determined by the final phoneme, syllable type, and syllable weight. Learning was assessed by a pronunciation judgment task. Results indicate that participants were able to transfer their knowledge of stress patterns to novel words whose final phoneme was not previously encountered, suggesting that participants had acquired abstract and potentially rule-based knowledge. The combined use of subjective and objective measures of awareness in the present study provides strong evidence for the acquisition of implicit knowledge.
To assess the level of all-hazards disaster preparedness and training needs of emergency department (ED) doctors and nurses in Hong Kong from their perspective, and identify factors associated with high perceived personal preparedness.
This study was a cross-sectional territory-wide online survey conducted from 9 September to 26 October 2015.
The participants were doctors from the Hong Kong College of Emergency Medicine and nurses from the Hong Kong College of Emergency Nursing.
We assessed various components of all-hazards preparedness using a 25-item questionnaire. Backward logistic regression was used to identify factors associated with perceived preparedness.
A total of 107 responses were analyzed. Respondents lacked training in disaster management, emergency communication, psychological first aid, public health interventions, disaster law and ethics, media handling, and humanitarian response in an overseas setting. High perceived workplace preparedness, length of practice, and willingness to respond were associated with high perceived personal preparedness.
Given the current gaps in and needs for increased disaster preparedness training, ED doctors and nurses in Hong Kong may benefit from the development of core-competency-based training targeting the under-trained areas, measures to improve staff confidence in their workplaces, and efforts to remove barriers to staff willingness to respond. (Disaster Med Public Health Preparedness. 2018; 12: 329–336)
Mycobacterium marinum, a bacterium found in freshwater and saltwater, can infect persons with direct exposure to fish or aquariums. During December 2013, the New York City Department of Health and Mental Hygiene learned of four suspected or confirmed M. marinum skin or soft tissue infections (SSTIs) among persons who purchased whole fish from Chinese markets. Ninety-eight case-patients with non-tuberculous mycobacteria (NTM) SSTIs were identified with onset June 2013–March 2014. Of these, 77 (79%) were female. The median age was 62 years (range 30–91). Whole genome sequencing of clinical isolates revealed two main clusters and marked genetic diversity. Environmental samples from distributors yielded NTM though not M. marinum. We compared 56 case-patients with 185 control subjects who shopped in Chinese markets, frequency-matched by age group and sex. Risk factors for infection included skin injury to the finger or hand (odds ratio [OR]: 15·5; 95% confidence interval [CI]: 6·9–37·3), hand injury while preparing fish or seafood (OR 8·3; 95% CI 3·8–19·1), and purchasing tilapia (OR 3·6; 95% CI 1·1–13·9) or whiting (OR 2·7; 95% CI 1·1–6·6). A definitive environmental outbreak source was not identified.
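The risk factors above are reported as odds ratios from 2×2 exposure tables. A sketch with hypothetical counts (not the study's data), using the standard Woolf log-interval for the 95% CI:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a/b = exposed/unexposed cases,
    c/d = exposed/unexposed controls) with a Woolf (log-normal) 95% CI."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: 30 injured cases, 26 uninjured cases,
# 20 injured controls, 165 uninjured controls
or_, lo, hi = odds_ratio_ci(30, 26, 20, 165)
```

A CI excluding 1, as in the abstract's injury and fish-purchase exposures, indicates a statistically significant association at the 5% level.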
Introduction: The Surviving Sepsis Campaign (SSC) suggests that hypovolemic patients, in the setting of hypoperfusion, be administered 30 mL/kg of crystalloid fluid within the first 3 hours of presentation to hospital. More recent evidence suggests that fluid resuscitation within 30 minutes of sepsis identification is associated with reduced mortality, hospital length of stay, and ICU days. This study describes Emergency Department (ED) fluid resuscitation of patients with septic shock and/or sepsis-related in-hospital mortality, prior to implementation of a sepsis medical directive. Methods: Retrospective chart review of adult patients (18+ years) presenting to two tertiary care EDs between 01 Nov 2014 and 31 Oct 2015 with ≥2 SIRS criteria and/or ED suspicion of infection and/or an ED or hospital discharge sepsis diagnosis. Data were abstracted from electronic health records. Patients with septic shock, or who expired in the ED/hospital, were selected for manual chart review of clinical variables including time, type, and volume of ED IV fluid administration. Results: A total of 13,506 patient encounters met inclusion criteria. In-hospital mortality rates were 2% (sepsis), 11.5% (severe sepsis), and 24.1% (septic shock). Of patients hypotensive at triage, fluids were administered to 33/50 (66.00%) septic shock patients and 22/43 (51.16%) patients who eventually expired. For all septic shock and expired patients (n = 943), median time to IV fluid initiation was 60.50 minutes [29.75 to 101.25] for septic shock and 77.00 minutes [36.00 to 127.00] for expired patients. Median volume of fluid administered was 1.50 L [1.00 to 2.00] for septic shock and 1.00 L [1.00 to 2.00] for expired patients. Of septic shock and expired patients, IV fluid administration and body weight data were available for 148 encounters (15.6%). Within this group, 19 (12.8%) received no IV fluid and 90 (60.8%) received 0.1-75% of their recommended IV fluid volume.
25 (16.9%) received 75.1-125%, and 14 (9.4%) received >125% of their recommended fluid volume. Conclusion: In this study, severe forms of sepsis were often treated with <30 mL/kg of crystalloid fluid. Fluids were typically administered outside of the recommended 30-minute window but within the 3-hour window. In-hospital mortality was consistent with published data. Future research will examine a broader data set for IV fluid resuscitation in sepsis and will measure the impact of a sepsis fluid resuscitation medical directive.
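The percentages of recommended volume above follow from the SSC's weight-based 30 mL/kg target; a minimal sketch with a hypothetical patient (the weight and volume below are illustrative, not study data):

```python
def fluid_target(weight_kg, given_ml, target_ml_per_kg=30):
    """SSC-recommended crystalloid volume (mL) and percentage actually given."""
    recommended = target_ml_per_kg * weight_kg
    return recommended, 100 * given_ml / recommended

# Hypothetical 80 kg patient who received the 1.50 L median volume reported above
rec_ml, pct = fluid_target(80, 1500)  # 2400 mL recommended, 62.5% received
```

Such a patient would fall into the study's largest group, those receiving 0.1-75% of the recommended volume.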
Introduction: When ventricular fibrillation (VF) cannot be terminated with conventional external defibrillation, it is classified as refractory VF (RVF). There is a paucity of information regarding prehospital or patient factors that may be associated with RVF. The objectives of this study were to determine factors that may be associated with RVF, the initial ED rhythm for patients with prehospital RVF, and the incidence of survival in patients who had RVF and were transported to hospital. Methods: Ambulance Call Records (ACRs) of patients with out-of-hospital cardiac arrest between Mar. 1, 2012 and Apr. 1, 2016 were reviewed. Cases of RVF (≥5 consecutive shocks delivered) were determined by manual review of the ACR. ED and hospital records were analyzed to determine outcomes of patients who were in RVF and transported to hospital. Descriptive statistics were calculated and all variables were tested for an association with initial ED rhythm, survival to admission, and survival to discharge. Results: Eighty-five cases of RVF were identified. A history of coronary artery disease (47.10%) and hypertension (50.60%) were the most common comorbidities in patients transported to the ED with RVF. Upon arrival to the ED, 24 (28.2%) remained in RVF, 38 (44.7%) had a non-shockable rhythm, and 23 (27.1%) had return of spontaneous circulation. Thirty-four (40%) survived to admission, while only 18 (21.2%) survived to discharge. Pre-existing comorbidities, time to first shock, time on scene, and transport time were not statistically associated with initial ED rhythm, survival to admission, or survival to discharge. Patient age was statistically associated with improved rhythm on ED arrival (p=0.013) and survival to discharge (58.24 yrs vs 67.40 yrs, Δ9.17, 95% CI 1.82 to 16.52, p=0.015). Conclusion: The majority of patients with prehospital RVF have a rhythm deterioration by the time care is transferred to the ED.
Of these patients with a rhythm deterioration, few survive to hospital discharge. Younger patients are more likely to remain in RVF and survive to discharge. Further research is required to determine prehospital treatment strategies for RVF, as well as patient populations that may benefit from those treatments.