Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none were lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion. 
There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist into large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care), and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF, where symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The multifaceted intervention included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was length of stay in ED in minutes from time of arrival to time of disposition, and this was analyzed at the individual patient level using linear mixed effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods respectively were 413.0 vs. 354.0 minutes (P < 0.001). Comparing control to intervention, there was an increase in: use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs. 86.7%; P < 0.001). 
There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), and a small, nonsignificant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
Little is known about the experiences of people living alone with dementia in the community and their non-resident relatives and friends who support them. In this paper, we explore their respective attitudes and approaches to the future, particularly regarding the future care and living arrangements of those living with dementia. The study is based on a qualitative secondary analysis of interviews with 24 people living alone with early-stage dementia in North Wales, United Kingdom, and one of their relatives or friends who supported them. All but four of the dyads were interviewed twice over 12 months (a total of 88 interviews). In the analysis, it was observed that several people with dementia expressed the desire to continue living at home for ‘as long as possible’. A framework approach was used to investigate this theme in more depth, drawing on concepts from existing studies of people living with dementia and from other disciplines. Similarities and differences in the future outlook and temporal orientation of the participants were identified. The results support previous research suggesting that the future outlook of people living with early-stage dementia can be interpreted in part as a response to their situation and a way of coping with the threats that it is perceived to present, and not just an impaired view of time. Priorities for future research are highlighted in the discussion.
Trinucleotide repeats have been associated with schizophrenia, but the evidence, based on cross-sectional clinical information, is equivocal.
To examine the relationship between genomic CAG/CTG repeat size and premorbid development in schizophrenia.
Early development and premorbid functioning of 22 patients with a DSM-IV diagnosis of schizophrenia were assessed by parental interviews. The repeat expansion detection (RED) technique was used to measure genomic CAG/CTG repeat size, and PCR was used to measure CAG repeat size at the ERDA-1 and CTG 18.1 loci.
There was an inverse association between CAG/CTG size and perinatal complications. Patients with speech and motor developmental delay had larger repeats. The results were not due to expansion in the ERDA-1 and CTG 18.1 genes.
CAG/CTG repeat expansion is associated with speech and motor developmental delay in schizophrenia. We propose that the developmental model may be useful for research into the genetics of schizophrenia.
The advent of genome-wide association studies has resulted in the identification of a number of novel genetic loci for schizophrenia and related disorders. Understanding the functional impact of these variants on brain structure and function is crucial to understanding their role in disease pathology. We present data based on our genetic and neuropsychological assessment of almost 700 patients and healthy participants for a number of these variants, and replication of our findings in independent samples of almost 1500 cases and controls. Specifically, we will use these data to suggest that the risk associated with some genetic variants (e.g. NOS1) is mediated by an influence on variation in intelligence and other cognitive phenotypes, while other risk variants (e.g. ZNF804A) delineate illness subtypes in which cognitive deficits are a less prominent feature.
Brain-derived neurotrophic factor (BDNF) gene variants may potentially influence behaviour. In order to test this hypothesis, we investigated the relationship between the BDNF Val66Met polymorphism and aggressive behaviour in a population of patients with schizophrenia. Our results showed that an increased number of BDNF Met alleles was associated with increased aggressive behaviour.
In the 1960s African hunter-gatherers in the Kalahari Desert were described as gentle people who used dispute resolution to prevent violence between band members. This ideal was a good fit for those anthropologists on one side of a debate on the nature of human behaviour. The Kalahari San played a role in the debate not only because anthropologists had categorised them as ‘gentle’, but also because they were seen as frozen remnants of our prehistoric ancestors. More recently, researchers have realised that the San of prehistory had very different lives from the ones anthropologists encountered in the ‘ethnographic present’. Evidence from archaeological skeletons from the middle and late Holocene suggests that interpersonal violence was a regular occurrence among the prehistoric foragers of the southern African Later Stone Age. Research has documented a number of antemortem and perimortem injuries on skeletons that can only be signs of interpersonal violence. The injuries have been found on women and children as well as adult males, and evidence suggests that inter-band violence was common in prehistoric times and that forager competition for resources may have been the cause of conflict.
Nearly half of care home residents with advanced dementia have clinically significant agitation. Little is known about costs associated with these symptoms toward the end of life. We calculated monetary costs associated with agitation from UK National Health Service, personal social services, and societal perspectives.
Prospective cohort study.
Thirteen nursing homes in London and the southeast of England.
Seventy-nine people with advanced dementia (Functional Assessment Staging Tool grade 6e and above) residing in nursing homes, and thirty-five of their informal carers.
Data collected at study entry and monthly for up to 9 months, extrapolated for expression per annum. Agitation was assessed using the Cohen-Mansfield Agitation Inventory (CMAI). Health and social care costs of residing in care homes, and costs of contacts with health and social care services were calculated from national unit costs; for a societal perspective, costs of providing informal care were estimated using the Resource Utilization in Dementia-Lite (RUD-Lite) scale.
After adjustment, health and social care costs, and costs of providing informal care varied significantly by level of agitation as death approached, from £23,000 over a 1-year period with no agitation symptoms (CMAI agitation score 0–10) to £45,000 at the most severe level (CMAI agitation score >100). On average, agitation accounted for 30% of health and social care costs. Informal care costs were substantial, constituting 29% of total costs.
With the increasing prevalence of dementia, costs of care will impact on healthcare and social services systems, as well as informal carers. Agitation is a key driver of these costs in people with advanced dementia presenting complex challenges for symptom management, service planners, and providers.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus bacteremia (MRSA) and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements have significantly lower MRSA and CDI rates.
All US states.
Observational longitudinal study.
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were the MRSA standardized infection ratio (SIR) and the CDI SIR. The key explanatory variable was the percentage of hospitals that meet the Core Elements in each state. We estimated state and time fixed-effects models with time-varying controls, and we weighted our analyses by the number of hospitals in the state.
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported antimicrobial stewardship program (ASP) compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short length of the study and variety of stewardship strategies that ASPs may encompass.
Mössbauer instruments were included on the Mars Exploration Rover (MER) Mission to determine the mineralogic composition, diversity, and oxidation state of Fe-bearing igneous materials and alteration products. A total of 16 Fe-bearing phases (consistent with bulk-sample chemistry) were identified, including Fe associated with rock-forming minerals (olivine, pyroxene, magnetite, ilmenite, and chromite), Fe3+-bearing oxyhydroxides (nanophase ferric oxide, hematite, and goethite), sulfates (jarosite and an unassigned Fe3+ sulfate phase), and Fe2+ carbonate. Igneous rock types ranged from basalts to ultramafic rocks at Gusev crater. Jarosite-hematite bedrock was pervasive at Meridiani Planum, and concretions winnowed from the outcrop were mineralogically hematite. Because their structures contain hydroxyl, goethite and jarosite provide mineralogic evidence for aqueous processes on Mars, and jarosite and Fe3+ sulfate are evidence for acid-sulfate processes at both Gusev crater and Meridiani Planum. A population of rocks on the Meridiani Planum outcrop was identified as iron and stony meteorites by the presence of Fe metal (kamacite) and the sulfide troilite. The MER mission demonstrates that Mössbauer spectrometers landed on any Fe-bearing planetary surface provide first-order information on igneous provinces, alteration state, and alteration style and provide well-constrained criteria for sample selection on planetary sample-return missions including planets, moons, and asteroids.
To investigate the nature of the relationship between cognitive function, mood state, and functionality in predicting awareness in a non-clinically depressed sample of participants with mild to moderate Alzheimer’s disease (AD) in Brazil.
People with AD (PwAD) aged 60 years or older were recruited from an outpatient unit at the Center of AD of the Federal University of Rio de Janeiro, Brazil. Measures of awareness of condition (Assessment Scale of the Psychosocial Impact of the Diagnosis of Dementia), cognitive function (Mini-Mental State Examination), mood state (Cornell Scale for Depression in Dementia), and functionality (Pfeffer Functional Activities Questionnaire) were applied to 264 people with mild to moderate AD and their caregivers. Hypotheses were tested statistically using a structural equation modeling (SEM) approach. Three competing models were compared.
The first model, in which the influence of mood state and cognitive function on awareness was mediated by functionality, showed a very good fit to the data and a medium effect size. The competing models, in which the mediating variables were mood state and cognitive function, respectively, only showed poor model fit.
Our model supports the notion that the relationship between different factors and awareness in AD is mediated by functionality and not by depressive mood state or cognitive level. The proposed direct and indirect effects on awareness are discussed, as well as the missing direct influence of mood state on awareness. The understanding of awareness in dementia is crucial and our model gives one possible explanation of its underlying structure in AD.
The impact of hurricanes on emergency services is well-known. Recent history demonstrates the need for prehospital and emergency department coordination to serve communities during evacuation, storm duration, and cleanup. The use of telehealth applications may enhance this coordination while lessening the impact on health-care systems. These applications can address triage, stabilization, and diversion and may be provided in collaboration with state and local emergency management operations through various shelters, as well as during other emergency medical responses.
Background: Observational studies have reported an association between childhood obesity and a higher risk of multiple sclerosis (MS). However, the difficulty of fully accounting for confounding, together with long recall periods, makes causal inference from these studies challenging. The objective of this study was to assess the contribution of childhood obesity to the development of MS through Mendelian randomization, which uses genetic associations to minimize the risk of confounding. Methods: We selected 23 independent genetic variants strongly associated with childhood body mass index (BMI) in a genome-wide association study (GWAS) which included 47,541 children. The corresponding effects of these variants on risk of MS were obtained from a GWAS of 14,802 MS cases and 26,703 controls. Standard two-sample Mendelian randomization methods were performed, with additional sensitivity analyses to assess the likelihood of bias from genetic pleiotropy. Results: The inverse-variance weighted MR analysis revealed that one standard deviation increase in childhood BMI increased odds of MS by 26% (odds ratio=1.26, 95% confidence interval 1.10-1.45, p=0.001). There was no significant heterogeneity across the individual estimates. Sensitivity analyses were consistent with the main findings and provided no evidence of pleiotropy. Conclusions: This study provides genetic support for a role of increased childhood BMI in the development of MS.
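The inverse-variance weighted (IVW) estimate reported above is computed from per-variant summary statistics. A minimal Python sketch of the standard fixed-effect, first-order IVW approximation, using made-up SNP effect sizes (not the study's data):

```python
import math

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse-variance weighted (IVW) MR estimate.

    Each variant's Wald ratio (beta_out / beta_exp) is weighted by the
    inverse variance of its ratio, using the first-order approximation
    var(ratio) ~ (se_out / beta_exp)^2.
    """
    ratios = [bo / be for be, bo in zip(beta_exp, beta_out)]
    weights = [(be / so) ** 2 for be, so in zip(beta_exp, se_out)]
    est = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return est, se

# Hypothetical per-variant summary statistics (illustrative only):
beta_bmi = [0.10, 0.08, 0.12]     # SNP -> childhood BMI (SD units)
beta_ms  = [0.025, 0.018, 0.030]  # SNP -> log-odds of MS
se_ms    = [0.010, 0.011, 0.012]

logor, se = ivw_estimate(beta_bmi, beta_ms, se_ms)
or_per_sd = math.exp(logor)                      # odds ratio per SD of BMI
ci = (math.exp(logor - 1.96 * se), math.exp(logor + 1.96 * se))
```

With real data, the 23 variant-specific associations from the two GWASs would replace the illustrative lists, and pleiotropy-robust estimators (e.g. MR-Egger, weighted median) would typically be run alongside as the sensitivity analyses the abstract mentions.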
Introduction: Opioid side effects are common when treating chronic pain. However, the rate of opioid side effects for acute pain has rarely been examined, particularly in the post emergency department (ED) setting. The objective of this study was to evaluate the short-term incidence of opioid-induced side effects (constipation, nausea/vomiting, dizziness, and drowsiness) in patients discharged from the ED with an opioid prescription. Methods: This was a prospective cohort study of patients aged ≥18 years who visited the ED for an acute pain condition (≤2 weeks) and were discharged with an opioid prescription. Patients completed a 14-day diary assessing daily pain medication use and side effects. Results: The mean age of the 386 patients included was 55 ± 16 years; 50% were women. During the 2-week follow-up, 80% of patients consumed at least one dose of opioids. Among the patients who used opioids, 38% (95% CI: 33-48) reported constipation, 27% (95% CI: 22-32) nausea/vomiting, 30% (95% CI: 25-35) dizziness, 51% (95% CI: 45-57) drowsiness, and 77% (95% CI: 72-82) any side effect. Adjusting for age, sex, and pain condition, patients who used opioids were more likely to report any side effect (OR 7.5, 95% CI: 4.3-13.3) and constipation (OR 7.5, 95% CI: 3.1-17.9). A significant dose-response effect was observed for constipation but not for the other side effects. Nausea/vomiting (OR 2.0, 95% CI: 1.1-3.6) and dizziness (OR 1.9, 95% CI: 1.1-3.4) were associated with oxycodone compared to morphine. Conclusion: As in chronic pain, opioid side effects are highly prevalent during short-term treatment for acute pain. Physicians should be aware of these side effects and inform patients about them.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on a prespecified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04). 
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
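The headline comparison in this trial reduces to an absolute difference between two conversion proportions. A minimal sketch of that arithmetic, using counts back-calculated from the reported 97.0% (N = 198) and 92.2% (N = 180) and a simple Wald interval (the trial's exact CI method is not stated and may differ slightly from this approximation):

```python
import math

def risk_difference(events1, n1, events2, n2, z=1.96):
    """Absolute risk difference p1 - p2 with a Wald (normal-approximation) CI."""
    p1, p2 = events1 / n1, events2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Counts back-calculated from the reported percentages (illustrative only):
# Drug-Shock 192/198 converted (97.0%), Shock Only 166/180 (92.2%).
diff, (lo, hi) = risk_difference(192, 198, 166, 180)
```

Because the lower bound of the interval excludes zero, the difference is statistically significant at the 5% level, matching the reported P = 0.04.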
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies the risk of subsequent stroke into low, moderate, and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of ±10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLRs) with 95% CIs. Results: We prospectively enrolled 7,569 patients with mean age 68.4 ± 14.7 years (52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68-0.92]); High (probability 2.6%, iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke alone ≤7 days yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4]. 
Conclusion: The Canadian TIA Score accurately stratifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
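The interval likelihood ratios (iLRs) quoted for each risk stratum have a direct definition: the proportion of outcome-positive patients who fall in a stratum divided by the proportion of outcome-negative patients who fall in the same stratum. A minimal sketch, with hypothetical stratum counts (the totals match the abstract, but the per-stratum split is invented for illustration):

```python
def interval_lr(events_in_stratum, total_events,
                nonevents_in_stratum, total_nonevents):
    """Interval likelihood ratio for one stratum of a risk score:
    P(stratum | outcome) / P(stratum | no outcome)."""
    return ((events_in_stratum / total_events)
            / (nonevents_in_stratum / total_nonevents))

# Hypothetical low-risk stratum: 4 of the 181 outcomes vs 800 of the
# 7,388 non-outcomes (7,569 - 181) fall in the stratum.
low_ilr = interval_lr(4, 181, 800, 7388)
```

An iLR below 1 (as here) means patients in that stratum are under-represented among those who went on to have the outcome, which is what makes the low-risk category useful for safe discharge decisions.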
Introduction: Studies suggest that the evolution of acute pain after an emergency department (ED) visit is associated with the development of chronic pain. Using group-based trajectory modeling (GBTM), we aimed to evaluate whether distinct profiles of pain intensity change over 14 days in ED discharged patients are associated with chronic pain at 3 months. Methods: This is a prospective cohort study of patients aged 18 years or older who visited the ED for an acute pain condition (≤2 weeks) and were discharged with an opioid prescription. Patients completed a 14-day diary in which they listed their daily pain intensity level (0-10 numeric rating scale). Three months post-ED visit, participants were interviewed by phone to report their pain intensity related to the initial pain. Results: A total of 305 patients were retained at 3 months (mean age ± SD: 55 ± 15 years, 49% women). Using GBTM, six distinct pain intensity trajectories were identified during the first 14 days of the acute pain period: two linear trajectories with moderate or severe pain throughout follow-up (together representing almost 40% of the patients) and four cubic polynomial trajectories with mild or no pain at the end of the 14 days (low final pain). Twelve percent (11.9%; 95% CI: 8.2-15.4) of the patients had chronic pain at 3 months. Controlling for age, sex, and type of pain condition, patients with trajectories of moderate or severe pain and those with only severe pain were 5.1 (95% CI: 2.2-11.8) and 8.2 (95% CI: 3.4-20.0) times more likely to develop chronic pain at 3 months, respectively, compared to the low final pain group. Conclusion: Trajectories could be useful for early identification of patients at risk of chronic pain.
Introduction: The objective of the study was to evaluate the evolution of acute pain intensity in ED discharged patients using group-based trajectory modeling (GBTM). This method identifies patient groups with similar profiles of change over time without assuming the existence of a particular pattern or number of groups. Methods: This was a prospective cohort study of ED patients aged ≥18 years with an acute pain condition (≤2 weeks) who were discharged with an opioid prescription. Patients completed a 14-day diary assessing daily pain intensity level (0-10 numeric rating scale) and pain medication use. Results: Among the 372 included patients, six distinct post-ED pain intensity trajectories were identified: two started with severe levels of pain, one remaining at severe pain intensity (12.6% of the sample) and the other ending at a moderate pain intensity level (26.3%). Two other trajectories had severe initial pain; one decreased to mild pain (21.7%) and the other to no pain (13.8%). Another trajectory had moderate initial pain which decreased to a mild level (15.9%), and the last one started with mild pain intensity and ended with no pain at the end of the 14 days (9.7%). The pain trajectory patterns were significantly associated with age, type of painful condition, pain intensity at ED discharge, and opioid consumption. Conclusion: Acute pain resolution following an ED visit seems to progress through six different trajectory patterns that are more informative than simple linear models and could be useful for adapting acute pain management in future research.
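Group-based trajectory modeling jointly estimates polynomial group trajectories and membership probabilities by maximum likelihood. As an illustration of just the assignment step, the sketch below scores a 14-day pain diary against several hypothetical cubic mean trajectories (invented for illustration, not the trajectories estimated in these studies) and assigns it to the best-fitting group:

```python
import math

# Hypothetical cubic mean trajectories (day -> expected 0-10 pain score),
# as tuples of coefficients (b0, b1, b2, b3): mu(t) = b0 + b1*t + b2*t^2 + b3*t^3.
# Invented for illustration; NOT the study's estimates.
GROUPS = {
    "severe-persistent": (8.0, 0.0, 0.0, 0.0),
    "severe-to-mild":    (8.0, -0.9, 0.04, -0.0005),
    "mild-to-none":      (3.0, -0.25, 0.0, 0.0),
}

def log_likelihood(diary, coefs, sigma=1.5):
    """Gaussian log-likelihood of a 14-day diary under one group's polynomial
    mean trajectory -- the quantity compared when assigning group membership."""
    ll = 0.0
    for t, y in enumerate(diary, start=1):
        mu = sum(b * t ** k for k, b in enumerate(coefs))
        ll += -0.5 * ((y - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
    return ll

def assign(diary):
    """Assign a diary to the trajectory group with the highest likelihood."""
    return max(GROUPS, key=lambda g: log_likelihood(diary, GROUPS[g]))

persistent = [8, 8, 7, 8, 7, 7, 8, 7, 8, 7, 8, 8, 7, 8]  # stays severe
resolving  = [8, 7, 6, 5, 4, 3, 3, 2, 2, 2, 1, 1, 1, 1]  # severe -> mild
```

Full GBTM implementations (e.g. SAS PROC TRAJ or the Stata traj plugin) estimate the group coefficients, membership proportions, and number of groups jointly; this sketch shows only the classification idea behind the trajectory labels reported above.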