Lipid-based nutrient supplements (LNS) may be beneficial for malnourished HIV-infected patients starting antiretroviral therapy (ART). We assessed the effect of adding vitamins and minerals to LNS on body composition and handgrip strength during ART initiation. ART-eligible HIV-infected patients with BMI <18·5 kg/m2 were randomised to LNS or LNS with added high-dose vitamins and minerals (LNS-VM) from referral for ART to 6 weeks post-ART and followed up until 12 weeks. Body composition by bioelectrical impedance analysis (BIA), deuterium (2H) diluted water (D2O) and air displacement plethysmography (ADP), and handgrip strength were determined at baseline and at 6 and 12 weeks post-ART, and the effects of LNS-VM v. LNS at 6 and 12 weeks were investigated. BIA data were available for 1461, D2O data for 479, ADP data for 498 and handgrip strength data for 1752 patients. Fat mass tended to be lower, and fat-free mass correspondingly higher, by BIA than by ADP or D2O. At 6 weeks post-ART, LNS-VM led to a higher regain of BIA-assessed fat mass (0·4 (95 % CI 0·05, 0·8) kg), but not fat-free mass, and a borderline significant increase in handgrip strength (0·72 (95 % CI −0·03, 1·5) kg). These effects were not sustained at 12 weeks. Similar effects as for BIA were seen using ADP or D2O, but no differences reached statistical significance. In conclusion, LNS-VM led to a higher regain of fat mass at 6 weeks and to a borderline significant beneficial effect on handgrip strength. Further research is needed to determine the appropriate timing and supplement composition to optimise nutritional interventions in malnourished HIV patients.
Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized-piloted data extraction by trained investigators. Phase I (Jan. 2012–Aug. 2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept. 2013–Aug. 2016) using usual implementation strategies. In Phase III (Sept. 2016–Dec. 2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), 15 times (8.1%); time to 1st rhythm analysis was 6min, 3min, 1min; and time to 1st shock was 10min, 10min and 7min. Comparing Phases I and III: time to 1st shock decreased by 3min (95%CI -7; 1), sustained ROSC increased from 29.7% to 33.3% (AD 3.6%; 95%CI -10.8; 17.8), and survival to discharge increased from 24.6% to 25.8% (AD 1.2%; 95%CI -7.5; 9.9).
In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD -6 min; 95%CI -12; 0) and survival increased from 23.1% to 38.7% (AD 15.6%; 95%CI -4.3; 35.4). Conclusion: The implementation of a medical directive allowing for AED use by RNs and RTs successfully improved key outcomes for IHCA victims, particularly following the Theory-Based education video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
Introduction: Guidelines recommend serial conventional cardiac troponin (cTn) measurements 6-9 hours apart for non-ST-elevation myocardial infarction (NSTEMI) diagnosis. We sought to develop a pathway based on absolute/relative changes between two serial conventional cardiac troponin I (cTnI) values 3 hours apart for 15-day MACE identification. Methods: This was a prospective cohort study conducted in the two large EDs of the Ottawa Hospital. Adults with NSTEMI symptoms were enrolled over 32 months. Patients with STEMI, hospitalized for unstable angina, or with only one cTnI were excluded. We collected baseline characteristics, Siemens Vista cTnI at 0 and 3 hours after ED presentation, disposition, and ED length of stay (LOS). The adjudicated primary outcome was 15-day MACE (AMI, revascularization, or death due to cardiac ischemia/unknown cause). We analysed cTnI values by 99th percentile cut-off multiples (45, 100 and 250ng/L). Results: 1,683 patients (mean age 64.7 years; 55.3% female; median ED LOS 7 hours; 88 patients with 15-day MACE) were included. 1,346 (80.0%) patients with both cTnI ≤45ng/L, and 58 (3.4%) of the 213 patients with one value ≥100ng/L but both <250ng/L or ≤20% change, did not suffer MACE. Among 124 patients (7.4%) with one value >45ng/L but both <100ng/L based on the 3- or 6-hour cTnI, one patient with Δ<10ng/L and 6 of 19 patients with Δ≥20ng/L were diagnosed with NSTEMI (patients with Δ10-19ng/L between the first and second cTnI had a third one at 6 hours). Based on the results, we developed the Ottawa Troponin Pathway (OTP) with a 98.9% sensitivity (95%CI 96.7-100%) and 94.6% specificity (95%CI 93.4-95.7%). Conclusion: The OTP, using two conventional cTnI measurements performed 3 hours apart, should lead to better identification of NSTEMI, particularly in those with values >99th percentile cut-off, standardize management and reduce the ED LOS.
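The pathway's operating characteristics come from a standard 2×2 diagnostic table. The sketch below shows how sensitivity and specificity with normal-approximation 95% CIs are computed; the cell counts are back-calculated for illustration from the reported totals (88 MACE among 1,683 patients), not taken from the study's raw data.

```python
import math

def diagnostic_accuracy(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table, each returned as
    (point estimate, lower 95% CI, upper 95% CI) via the normal approximation."""
    def prop_ci(k, n):
        p = k / n
        se = math.sqrt(p * (1 - p) / n)
        return p, max(0.0, p - 1.96 * se), min(1.0, p + 1.96 * se)
    return prop_ci(tp, tp + fn), prop_ci(tn, tn + fp)

# Illustrative counts back-calculated from the reported totals (not raw study data):
sens, spec = diagnostic_accuracy(tp=87, fn=1, tn=1509, fp=86)
```

With these illustrative counts, sensitivity is 87/88 ≈ 98.9% and specificity 1509/1595 ≈ 94.6%, matching the reported figures.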
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04).
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
The study aimed to assess stunting, wasting and breast-feeding as correlates of body composition in Cambodian children. As part of a nutrition trial (ISRCTN19918531), fat mass (FM) and fat-free mass (FFM) were measured using 2H dilution at 6 and 15 months of age. Of 419 infants enrolled, 98 % were breastfed, 15 % stunted and 4 % wasted at 6 months. At 15 months, 78 % were breastfed, 24 % stunted and 11 % wasted. Those not breastfed had a lower fat mass index (FMI) at 6 months but not at 15 months. Stunted children had lower FM at 6 months and lower FFM at 6 and 15 months compared with children with length-for-age z ≥0. Stunting was not associated with the height-adjusted indices FMI or fat-free mass index (FFMI). Wasted children had lower FM, FFM, FMI and FFMI at 6 and 15 months compared with children with weight-for-length z (WLZ) ≥0. Generally, FFM and FFMI deficits increased with age, whereas FM and FMI deficits decreased, reflecting interactions between age and WLZ. For example, the FFM deficits were –0·99 (95 % CI –1·26, –0·72) kg at 6 months and –1·44 (95 % CI –1·69, –1·19) kg at 15 months (interaction, P<0·05), while the FMI deficits were –2·12 (95 % CI –2·53, –1·72) kg/m2 at 6 months and –1·32 (95 % CI –1·77, –0·87) kg/m2 at 15 months (interaction, P<0·05). This indicates that undernourished children preserve body fat to the detriment of fat-free tissue, which may have long-term consequences for health and working capacity.
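The height-adjusted indices used above are simple ratios. Assuming the standard definitions (FMI = FM/height², FFMI = FFM/height², both in kg/m²), a minimal sketch:

```python
def fat_mass_index(fat_mass_kg, height_m):
    """FMI = fat mass / height^2, in kg/m^2."""
    return fat_mass_kg / height_m ** 2

def fat_free_mass_index(fat_free_mass_kg, height_m):
    """FFMI = fat-free mass / height^2, in kg/m^2."""
    return fat_free_mass_kg / height_m ** 2

# Illustrative values only (not study data): 2.4 kg fat mass, 0.8 m length
fmi = fat_mass_index(2.4, 0.8)  # 3.75 kg/m^2
```

Dividing by height squared is what lets the study separate tissue deficits from the children simply being shorter.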
Salmonella spp. continue to be a leading cause of foodborne morbidity worldwide. To assess the risk of foodborne disease, current national regulatory schemes focus on prevalence estimates of Salmonella and other pathogens. The role of pathogen quantification as a risk management measure and its impact on public health is not well understood. To address this information gap, a quantitative risk assessment model was developed to evaluate the impact of pathogen enumeration strategies on public health after consumption of contaminated ground turkey in the USA. Public health impact was evaluated by using several dose–response models for high- and low-virulence strains to account for potential under- or overestimation of human health impacts. The model predicted 2705–21 099 illnesses that would result in 93–727 reported cases of salmonellosis. Sensitivity analysis identified cooking an unthawed product at home as the riskiest consumption scenario and microbial concentration as the most influential input on the incidence of human illness. Model results indicated that removing ground turkey lots exceeding contamination levels of 1 MPN/g and 1 MPN in 25 g would decrease the median number of illnesses by 86–94% and 99%, respectively. For a single production lot, contamination levels higher than 1 MPN/g would be needed to result in a reported case to public health officials. At contamination levels of 10 MPN/g, there would be a 13% chance of detecting an outbreak, and at 100 MPN/g, the likelihood of detecting an outbreak increases to 41%. Based on these model predictions, risk management strategies should incorporate pathogen enumeration. This would have a direct impact on illness incidence, linking public health outcomes with measurable food safety objectives.
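Dose–response models of the kind used in such risk assessments map an ingested dose to a probability of illness. Below is a minimal sketch of the widely used beta-Poisson approximation for Salmonella; the parameter values are illustrative placeholders, not the values fitted in this study's model.

```python
def beta_poisson_p_ill(dose_cfu, alpha=0.13, beta=51.0):
    """Approximate beta-Poisson dose-response: P(illness) = 1 - (1 + d/beta)^(-alpha).
    alpha and beta here are illustrative placeholders, not the study's fitted values."""
    return 1.0 - (1.0 + dose_cfu / beta) ** (-alpha)

# Illness probability rises monotonically with dose:
low, high = beta_poisson_p_ill(10), beta_poisson_p_ill(1000)
```

This monotonicity is why removing lots above an enumeration threshold (e.g. 1 MPN/g) translates directly into fewer predicted illnesses.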
To assess differences in cognitive function and gross brain structure in children seven years after an episode of severe acute malnutrition (SAM), compared with other Malawian children.
Prospective longitudinal cohort assessing school grade achieved and results of five computer-based (CANTAB) tests, covering three cognitive domains. A subset underwent brain MRI scans which were reviewed using a standardized checklist of gross abnormalities and compared with a reference population of Malawian children.
Children discharged from SAM treatment in 2006 and 2007 (n 320; median age 9·3 years) were compared with controls: siblings closest in age to the SAM survivors and age/sex-matched community children.
SAM survivors were significantly more likely to be in a lower grade at school than controls (adjusted OR = 0·4; 95 % CI 0·3, 0·6; P < 0·0001) and had consistently poorer scores in all CANTAB cognitive tests. Adjusting for HIV and socio-economic status diminished the statistically significant differences. There were no significant differences in the odds of brain abnormalities and sinusitis between SAM survivors (n 49) and reference children (OR = 1·11; 95 % CI 0·61, 2·03; P = 0·73).
Despite apparent preservation in gross brain structure, persistent impaired school achievement is likely to be detrimental to individual attainment and economic well-being. Understanding the multifactorial causes of lower school achievement is therefore needed to design interventions for SAM survivors to thrive in adulthood. The cognitive and potential economic implications of SAM need further emphasis to better advocate for SAM prevention and early treatment.
Predictive analytics in health is a complex, transdisciplinary field requiring collaboration across diverse scientific and stakeholder groups. We piloted a participatory research approach to foster team science in predictive analytics through a partnered symposium and funding competition. In total, 85 stakeholders were engaged across diverse translational domains, with a significant increase in the perceived importance of early inclusion of patients and communities in research. Participatory research approaches may be an effective model for engaging broad stakeholders in predictive analytics.
Early nutrition and growth have been found to be important early exposures for later development. Studies of crude growth in terms of weight and length/height, however, cannot elucidate how body composition (BC) might mediate associations between nutrition and later development. In this study, we aimed to examine the relation between fat mass (FM) or fat-free mass (FFM) at birth, their accretion during early infancy, and later developmental progression. In a birth cohort from Ethiopia, 455 children who had BC measured at birth and 416 who had a standardised rate of BC growth during infancy were followed up for the outcome variable and included in the statistical analysis. The study sample was restricted to mothers living in Jimma town who gave birth to a term baby with a birth weight ≥1500 g and no evident congenital anomalies. The relationship between the exposure and outcome variables was examined using a linear mixed regression model. The findings revealed that FFM at birth was positively associated with global developmental progression from 1 to 5 years (β=1·75; 95 % CI 0·11, 3·39) and from 4 to 5 years (β=1·34; 95 % CI 0·23, 2·44) in the adjusted model. Furthermore, the rate of postnatal FFM tissue accretion was positively associated with development at 1 year of age (β=0·50; 95 % CI 0·01, 0·99). Neither fetal nor postnatal FM showed a significant association. In conclusion, fetal, rather than postnatal, FFM tissue accretion was associated with developmental progression. Intervention studies are needed to assess whether nutrition interventions increasing FFM also increase cognitive development.
Introduction: The Ottawa SAH Rule was developed to identify patients at high risk for subarachnoid hemorrhage (SAH) who require investigations, and the 6-Hour CT Rule found that computed tomography (CT) was 100% sensitive for SAH within 6 hours of headache onset. Together, they form the Ottawa SAH Strategy. Our objectives were to assess: 1) the safety of the Ottawa SAH Strategy and 2) its impact on: a) CTs, b) LPs, c) ED length of stay, and d) CT angiography (CTA). Methods: We conducted a multicentre prospective before/after study at 6 tertiary-care EDs from January 2010 to December 2016 (implementation July 2013). Consecutive alert, neurologically intact adults with a headache peaking within one hour were included. SAH was defined by subarachnoid blood on head CT (radiologist's final report); xanthochromia in the cerebrospinal fluid (CSF); or >1×10^6/L red blood cells in the final tube of CSF with an aneurysm on CTA. Results: We enrolled 3,669 patients, 1,743 before and 1,926 after implementation, including 185 with SAH. The investigation rate before implementation was 89.0% (range 82.9 to 95.6%) versus 88.4% (range 85.2 to 92.3%) after implementation. The proportion who had CT remained stable (88.0% versus 87.4%; p=0.60), while the proportion who had LP decreased from 38.9% to 25.9% (p<0.001), and the proportion investigated with CTA increased from 18.8% to 21.6% (p=0.036). The additional testing rate (i.e. LP or CTA) diminished from 50.1% to 40.8% (p<0.001). The proportion admitted declined from 9.8% to 7.3% (p=0.008), while the mean length of ED stay was stable (6.2 ± 4.0 to 6.4 ± 4.1 hours; p=0.45). For the 1,201 patients with CT within 6 hours, there was an absolute decrease in additional testing (i.e. LP or CTA) of 15.0% (46.6% versus 31.6%; p<0.001). The sensitivity of the Ottawa SAH Rule was 100% (95%CI: 98-100%), and that of the 6-Hour CT Rule was 95.3% (95%CI: 88.9-98.3%) for SAH.
Five patients with early CT had SAH despite CT reported as normal: 2 unruptured aneurysms on CTA and presumed traumatic LP (determined by treating neurosurgeon); 1 missed by the radiologist on the initial interpretation; 1 dural vein fistula (i.e. non-aneurysmal); and 1 profoundly anemic (Hgb 63g/L). Conclusion: The Ottawa SAH Strategy is highly sensitive and can be used routinely when SAH is being considered in alert and neurologically intact headache patients. Its implementation was associated with a decrease in LPs and admissions to hospital.
The ‘Digital Index of North American Archaeology’ (DINAA) project demonstrates how the aggregation and publication of government-held archaeological data can help to document human activity over millennia and at a continental scale. These data can provide a valuable link between specific categories of information available from publications, museum collections and online databases. Integration improves the discovery and retrieval of records of archaeological research currently held by multiple institutions within different information systems. It also aids in the preservation of those data and makes efforts to archive these research results more resilient to political turmoil. While DINAA focuses on North America, its methods have global applicability.
Because of Comet Kohoutek's anticipated large gas production, which seemed to offer a unique chance to reveal parent molecules, two Fabry-Perot Tilting Filter Photometers were designed to detect and study the behaviour of CH4 and its photolysis product H2. The importance of these two molecules is well known, and their detection would have given valuable indications about the structure of the nucleus, its thermal history and conditions of formation.
Similar to CH4, H2 has no dipole moment and cannot be detected by radioastronomy. The most obvious way of measuring H2 in extended cometary comae is certainly on the basis of fluorescence from the Lyman bands around 1000 Å; there are, however, vibrational quadrupole transitions within the overtone bands of the ground electronic state which give rise to emissions in the near infrared, accessible by means of ground-based telescopes. Three of the stronger lines are λ = 0.8748 μ, 0.8560 μ and 0.8497 μ. Methane is more readily detectable in the infrared, since it has strong fundamental (1-0) infrared vibration-rotation bands at 3.3 μ (ν3). In order to measure both the CH4 concentration and its rotational temperature, a very high resolution (~3.7 Å), high-throughput instrument was designed which could isolate several individual vibration-rotation lines in the ν3 band, namely the P2, P3 and P9 lines. The instrument, consisting of a Fabry-Perot Tilting Filter Photometer with an InSb detector interfaced with the 30 cm f/30 Dahl-Kirkham Telescope, is described in detail elsewhere (1). The observations were made in January from the NASA Convair 990 (Galileo II) at an altitude of 13 km, where atmospheric methane absorption can be minimized but not avoided. The Doppler shift of cometary and atmospheric lines with respect to one another by at least a few Å, caused by the orbital velocity of the comet, would be sufficient to allow for high-transmission measurements. Though long-integration-time measurements with the lock-in amplifier technique were carried out, no signals from the CH4 rotational lines of the comet coma could be detected. Using the planet Venus as a calibration source for the photon flux, and as a result of delicate laboratory measurements, an upper limit of
could be derived. This value is several orders of magnitude less than the original predictions for Kohoutek during close approach. Therefore, one could conclude that volatile components like CH4 boiled off the comet well before perihelion, at large (~4 AU) distances from the Sun, and were responsible for the high brightness of the comet at that time. Such a fractionation is only possible if the nucleus was composed of relatively loose, porous ice rather than compact ice. This hypothesis was strongly supported by the second experiment, a search for H2 in the near infrared at the 182 cm telescope at Asiago. In this case, too, a Fabry-Perot tilting filter photometer was designed to match the f/9 optics of the telescope. The instrument (2) consists of a high-resolution (~0.7 Å) tilting filter system with a photon-counting technique which allows phase-sensitive background subtraction. On the basis of the best data obtained between January 10 and 15, the occurrence of H2 lines with an intensity larger than 2% of the continuum could be excluded, viz. the flux averaged over the field of view was less than 4×10^5 photons/cm² s sr Å. Since the pre- and post-perihelion measurements were not affected by molecular fluorescence, they represent only the light-scattering flux from dust particles. The data show that the comet's dust coma was definitely brighter during approach than during recession from the Sun. However, the quantity of more fundamental interest is the difference in dust production rates, and a derivation of the mass-production rate of dust could be obtained. The study shows that both the dust and gas production rates differ greatly in the pre-perihelion period as compared to the post-perihelion period, as conjectured previously for "virgin" comets (dust production rate/gas production rate: pre-perihelion 0.1, post-perihelion 1).
The pronounced asymmetry in the production rates strongly suggests that fractionation and dust entrainment effects have to be considered in brightness predictions of young comets, the nucleus of which will generally consist of a multi-component mixture of parent molecules.
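The Doppler-separation argument above (cometary vs. telluric CH4 lines shifted by a few Å relative to one another) follows from the standard non-relativistic formula Δλ = λ·v/c. A quick sketch with an assumed, purely illustrative line-of-sight velocity:

```python
def doppler_shift_angstrom(wavelength_angstrom, v_km_s, c_km_s=2.998e5):
    """Non-relativistic Doppler shift: delta_lambda = lambda * v / c."""
    return wavelength_angstrom * v_km_s / c_km_s

# CH4 nu3 band at 3.3 um = 33,000 A; assumed relative velocity of 50 km/s (illustrative):
shift = doppler_shift_angstrom(33000.0, 50.0)  # ~5.5 A, i.e. "a few angstroms"
```

A shift of this size is larger than the instrument's ~3.7 Å resolution element, which is why cometary lines could in principle be measured between the telluric absorption lines.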
Bioelectrical impedance analysis (BIA) is an inexpensive, quick and non-invasive method to determine body composition. Equations used in BIA are typically derived in healthy individuals of European descent. BIA equations are specific to health status and ethnicity and may therefore provide inaccurate results in populations of different ethnic origin and health status. The aim of the present study was to test the validity of BIA in Ethiopian antiretroviral-naive HIV patients.
BIA was validated against the 2H dilution technique by comparing fat-free mass (FFM) measured by the two methods using paired t tests and Bland–Altman plots. BIA was based on single-frequency (50 kHz) whole-body measurements. Data were obtained at three health facilities in Jimma Zone, Oromia Region, South-West Ethiopia. Data from 281 HIV-infected participants were available. Two-thirds were female and the mean age was 32·7 (sd 8·6) years. Also, 46 % were underweight with a BMI below 18·5 kg/m2. There were no differences in FFM between the methods. Overall, BIA slightly underestimated FFM by 0·1 kg (95 % CI −0·3, 0·2 kg). The Bland–Altman plot indicated acceptable agreement with an upper limit of agreement of 4·5 kg and a lower limit of agreement of −4·6 kg, but with a small correlation between the mean difference and the average FFM. BIA slightly overestimated FFM at low values compared with the 2H dilution technique, while it slightly underestimated FFM at high values. In conclusion, BIA proved to be valid in this population and may therefore be useful for measuring body composition in routine practice in HIV-infected African individuals.
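The Bland–Altman agreement analysis described above reduces to the mean of the paired differences (the bias) and bias ± 1.96 SD (the limits of agreement). A minimal sketch, using made-up FFM pairs rather than study data:

```python
import statistics

def bland_altman_limits(method_a, method_b):
    """Bias (mean paired difference) and 95% limits of agreement between two methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired FFM values (kg): BIA vs. 2H dilution
bia = [42.1, 38.5, 45.0, 40.2]
dilution = [42.3, 38.0, 45.4, 40.1]
bias, lower, upper = bland_altman_limits(bia, dilution)
```

The study's reported values (bias −0.1 kg, limits −4.6 to 4.5 kg) are exactly these three quantities computed over all 281 participant pairs.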
Among African Americans, spirituality is meaning or purpose in life and a faith in God who is in control of health and there to provide support and guidance in illness situations. Using qualitative methods, we explored the use of spirituality to make sense of the end-of-life and bereavement experiences among family members of a deceased cancer patient.
Data in this report come from 19 African Americans who experienced the loss of a family member to cancer. A qualitative descriptive design was used with criterion sampling, open-ended semistructured interviews, and qualitative content analysis.
Participants made sense of the death of their loved one using the following five themes: Ready for life after death; I was there; I live to honor their memory; God's wisdom is infinite; and God prepares you and brings you through. These five themes are grounded in conceptualizations of spirituality as connectedness to God, self, and others.
Significance of results
Our findings support the results that even during bereavement, spirituality is important in the lives of African Americans. African American family members might struggle with issues related to life after death, their ability to be physically present during end-of-life care, and disentangling beliefs around God's control over the beginning and ending of life. The findings in this report can be used to inform healthcare providers to better support and address the needs for support of African American family members during end-of-life and bereavement experiences.
To assess the role of methodological differences on measured trace-element concentrations in ice cores, we developed an experiment to test the effects of acidification strength and time on dust dissolution using snow samples collected in West Antarctica and Alaska. We leached Antarctic samples for 3 months at room temperature using nitric acid at concentrations of 0.1, 1.0 and 10.0% (v/v). At selected intervals (20 min, 24 hours, 5 days, 14 days, 28 days, 56 days, 91 days) we analyzed 23 trace elements using inductively coupled plasma mass spectrometry. Concentrations of lithogenic elements scaled with acid strength and increased by 100–1380% in 3 months. Incongruent elemental dissolution caused significant variability in calculated crustal enrichment factors through time (factor of 1.3 (Pb) to 8.0 (Cs)). Using snow samples collected in Alaska and acidified at 1% (v/v) for 383 days, we found that the increase in lithogenic element concentration with time depends strongly on initial concentration, and varies by element (e.g. Fe linear regression slope = 1.66; r = 0.98). Our results demonstrate that relative trace-element concentrations measured in ice cores depend on the acidification method used.
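The crustal enrichment factors discussed above are ratios of an element's concentration to that of a lithogenic reference element, normalised by the same ratio in average upper crust. A minimal sketch; the choice of reference element and the crustal abundances are assumptions for illustration, not the study's values:

```python
def crustal_enrichment_factor(sample_x, sample_ref, crust_x, crust_ref):
    """EF_c(X) = (X/ref)_sample / (X/ref)_crust.
    'ref' is a conservative lithogenic element (often Al or Ce); the crustal
    abundances would come from a published upper-crust composition."""
    return (sample_x / sample_ref) / (crust_x / crust_ref)

# If a sample's Pb/Al ratio is twice the crustal Pb/Al ratio, EF = 2 (enriched):
ef = crustal_enrichment_factor(sample_x=0.02, sample_ref=100.0,
                               crust_x=0.01, crust_ref=100.0)
```

Because EF is a ratio of two leached concentrations, incongruent dissolution of the numerator and denominator elements over time changes EF even when the true composition is fixed, which is the 1.3× to 8.0× variability the study reports.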
Introduction: Concern for occult serious conditions leads to variations in ED syncope management [hospitalization, duration of ED/inpatient monitoring including Syncope Observation Units (SOU) for prolonged monitoring]. We sought to develop evidence-based recommendations for the duration of ED/post-ED ECG monitoring using the Canadian Syncope Risk Score (CSRS) by assessing the time to serious adverse event (SAE) occurrence. Methods: We enrolled adults with syncope at 6 EDs and collected demographics, time of syncope and ED arrival, CSRS predictors and time of SAE. We stratified patients as per the CSRS (low, medium and high risk as ≤0, 1-3 and ≥4 respectively). 30-day adjudicated SAEs included death, myocardial infarction, arrhythmia, structural heart disease, pulmonary embolism or serious hemorrhage. We categorized arrhythmias, interventions for arrhythmias and death from unknown cause as arrhythmic SAE and the rest as non-arrhythmic SAE. We performed Kaplan-Meier analysis using time of ED registration for primary and time of syncope for secondary analyses. Results: 5,372 patients (mean age 54.3 years, 54% females, and 13.7% hospitalized) were enrolled with 538 (10%) patients suffering SAE (0.3% died due to an unknown cause and 0.5% suffered ventricular arrhythmia). 64.8% of SAEs occurred within 6 hours of ED arrival. The probability for any SAE or arrhythmia was highest within 2 hours of ED arrival for low-risk patients (0.65% and 0.31%; dropped to 0.54% and 0.06% after 2 hours) and within 6 hours for the medium and high-risk patients (any SAE 6.9% and 17.4%; arrhythmia 6.5% and 18.9% respectively), which also dropped after 6 hours (any SAE 0.99% and 2.92%; arrhythmia 0.78% and 3.07% respectively). For any CSRS threshold, the risk of arrhythmia was highest within the first 15 days (for CSRS ≥2 patients, 15.6% vs. 0.006%).
ED monitoring for 2 hours (low-risk) and 6 hours (medium and high-risk), with a CSRS ≥2 cut-off for outpatient 15-day ECG monitoring, would lead to a 52% increase in arrhythmia detection. The majority (82.2%) arrived at the ED within 2 hours of syncope (median time 1.1 hours), and the secondary analysis yielded similar results. Conclusion: Our study found that 2 and 6 hours of ED monitoring for low-risk and medium/high-risk CSRS patients respectively, with 15-day outpatient ECG monitoring for CSRS ≥2 patients, would improve arrhythmia detection without the need for hospitalization or observation units.
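The risk strata used in this analysis follow directly from the stated CSRS thresholds (≤0 low, 1-3 medium, ≥4 high); a minimal sketch:

```python
def csrs_category(score):
    """Risk stratum from a Canadian Syncope Risk Score, per the study's thresholds."""
    if score <= 0:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

# e.g. a patient scoring 2 is medium risk and, per the study, would also meet
# the CSRS >=2 cut-off for 15-day outpatient ECG monitoring
category = csrs_category(2)
```

Encoding the thresholds once like this avoids inconsistent stratification when the same cut-offs feed both the monitoring-duration recommendation and the outpatient-ECG cut-off.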
Introduction: The Canadian C-Spine Rule (CCR) was validated by emergency physicians and triage nurses to determine the need for radiography in alert and stable Emergency Department trauma patients. It was modified and validated for use by paramedics in 1,949 patients. The prehospital CCR calls for evaluation of active neck rotation if patients have none of 3 high-risk criteria and at least 1 of 4 low-risk criteria. This study evaluated the impact and safety of the implementation of the CCR by paramedics. Methods: This single-centre prospective cohort implementation study took place in Ottawa, Canada. Advanced and primary care paramedics received on-line and in-person training on the CCR, allowing them to use the CCR to evaluate eligible patients and selectively transport them without immobilization. We evaluated all consecutive eligible adult patients (GCS 15, stable vital signs) at risk for neck injury. Paramedics were required to complete a standardized study data form for each eligible patient evaluated. Study staff reviewed paramedic documentation and corresponding hospital records and diagnostic imaging reports. We followed all patients without initial radiologic evaluation for 30 days for referral to our spine service, or subsequent visit with radiologic evaluation. Analyses included sensitivity, specificity, kappa coefficient, t-test, and descriptive statistics with 95% CIs. Results: The 4,034 patients enrolled between Jan. 2011 and Aug. 2015 were: mean age 43 (range 16-99), female 53.3%, motor vehicle collision 51.9%, fall 23.8%, admitted to hospital 7.0%, acute c-spine injury 0.8%, and clinically important c-spine injury (0.3%). The CCR classified patients for 11 important injuries with sensitivity 91% (95% CI 58-100%), and specificity 67% (95% CI 65-68%). Kappa agreement for interpretation of the CCR between paramedics and study investigators was 0.94 (95% CI 0.92-0.95). Paramedics were comfortable or very comfortable using the CCR in 89.8% of cases. 
Mean scene time was 3 min (15.6%) shorter for those not immobilized (17 min vs. 20 min; p=0.0001). A total of 2,569 (63.7%) immobilizations were safely avoided using the CCR. Conclusion: Paramedics could safely and accurately apply the CCR to low-risk trauma patients. This had a significant impact on scene times and the number of prehospital immobilizations.