This study aimed to investigate general factors associated with prognosis, independent of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12 January 2020) for primary care depression RCTs that included the Revised Clinical Interview Schedule (CIS-R), the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses in such trials. Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression, clinicians should routinely assess the duration of anxiety, duration of depression, comorbid panic disorder and history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
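As context for the two-stage random-effects approach reported above, here is a minimal sketch of stage-two pooling via the DerSimonian-Laird estimator, the conventional method for this design. The per-study effect sizes and variances below are invented for illustration; this is not the study's code or data.

```python
import math

def random_effects_pool(estimates, variances):
    """Stage-two random-effects pooling (DerSimonian-Laird).

    Stage one (estimating an effect per study) is assumed done;
    this combines the study-level estimates. Illustrative sketch only.
    """
    w = [1.0 / v for v in variances]          # fixed-effect (inverse-variance) weights
    sw = sum(w)
    pooled_fe = sum(wi * yi for wi, yi in zip(w, estimates)) / sw
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (yi - pooled_fe) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# hypothetical per-study effects (e.g. standardized differences) and their variances
est, ci = random_effects_pool([0.28, 0.35, 0.31, 0.25],
                              [0.004, 0.006, 0.003, 0.005])
```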
The National Neuropsychology Network (NNN) is a multicenter clinical research initiative funded by the National Institute of Mental Health (NIMH; R01 MH118514) to facilitate neuropsychology’s transition to contemporary psychometric assessment methods with resultant improvement in test validation and assessment efficiency.
The NNN includes four clinical research sites (Emory University; Medical College of Wisconsin; University of California, Los Angeles (UCLA); University of Florida) and Pearson Clinical Assessment. Pearson Q-interactive (Q-i) is used for data capture for Pearson published tests; web-based data capture tools programmed by UCLA, which serves as the Coordinating Center, are employed for remaining measures.
NNN is acquiring item-level data from 500–10,000 patients across 47 widely used neuropsychological (NP) tests and sharing these data via the NIMH Data Archive. Modern psychometric methods (e.g., item response theory) will specify the constructs measured by different tests and determine their positive/negative predictive power regarding diagnostic outcomes and relationships to other clinical, historical, and demographic factors. The Structured History Protocol for NP (SHiP-NP) helps standardize acquisition of relevant history and self-report data.
NNN is a proof-of-principle collaboration: by addressing logistical challenges, NNN aims to engage other clinics to create a national and ultimately an international network. The mature NNN will provide mechanisms for data aggregation enabling shared analysis and collaborative research. NNN promises ultimately to enable robust diagnostic inferences about neuropsychological test patterns and to promote the validation of novel adaptive assessment strategies that will be more efficient, more precise, and more sensitive to clinical contexts and individual/cultural differences.
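As a minimal illustration of the item response theory methods mentioned above (a sketch with invented parameters, not NNN's actual models), the two-parameter logistic (2PL) model gives the probability of passing an item as a function of a person's latent ability:

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a person
    with ability theta passes an item with discrimination a and
    difficulty b. Illustrative sketch; parameters are hypothetical."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# hypothetical item: discrimination 1.5, difficulty 0.0
p = p_correct_2pl(theta=1.0, a=1.5, b=0.0)
```

Fitting such models to item-level data is what lets tests be compared on a common latent scale and administered adaptively.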
Mass asymptomatic SARS-CoV-2 nucleic acid amplified testing of healthcare personnel (HCP) was performed at a large tertiary health system. A low period-prevalence of positive HCP was observed. Of those who tested positive, half had mild symptoms in retrospect. HCP with even mild symptoms should be isolated and tested.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of Earth's sensitivity to carbon dioxide, found that permafrost thaw could release more carbon emissions than expected, and shown that the uptake of carbon in tropical ecosystems is weakening. Adverse impacts on human society include increasing water shortages and impacts on mental health. Options for solutions emerge from rethinking economic models, rights-based litigation, strengthened governance systems and a new social contract. The disruption caused by COVID-19 could be seized as an opportunity for positive change, directing economic stimulus towards sustainable investments.
A synthesis is made of ten fields within climate science where there have been significant advances since mid-2019, through an expert elicitation process with broad disciplinary scope. Findings include: (1) a better understanding of equilibrium climate sensitivity; (2) abrupt thaw as an accelerator of carbon release from permafrost; (3) changes to global and regional land carbon sinks; (4) impacts of climate change on water crises, including equity perspectives; (5) adverse effects on mental health from climate change; (6) immediate effects on climate of the COVID-19 pandemic and requirements for recovery packages to deliver on the Paris Agreement; (7) suggested long-term changes to governance and a social contract to address climate change, learning from the current pandemic; (8) an updated positive cost–benefit ratio and new perspectives on the potential for green growth in the short and long term; (9) urban electrification as a strategy to move towards low-carbon energy systems; and (10) rights-based litigation as an increasingly important method to address climate change, with recent clarifications on the legal standing and representation of future generations.
Social media summary
Stronger permafrost thaw, COVID-19 effects and growing mental health impacts among highlights of latest climate science.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NHs and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member-quarter), hospital bed days per member-quarter, and expenditures per member-quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident.
Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
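The segmented regression design used above can be sketched as an interrupted time-series fit with level-change and slope-change terms. The quarterly rates below are invented for illustration, not the SHIELD OC data:

```python
import numpy as np

quarters = np.arange(15)            # e.g. 2015Q4 .. 2019Q2
intervention_start = 6              # index of the first intervention quarter

# hypothetical infection-related hospitalization rates per member-quarter:
# flat at baseline, then a downward trend after the intervention begins
rate = np.array([5.0, 5.1, 4.9, 5.2, 5.0, 5.1,
                 4.8, 4.5, 4.3, 4.0, 3.8, 3.5, 3.3, 3.1, 2.9])

post = (quarters >= intervention_start).astype(float)
time_after = np.where(post == 1, quarters - intervention_start, 0.0)

# Design matrix: intercept, baseline trend, level change, slope change
X = np.column_stack([np.ones(len(quarters)), quarters, post, time_after])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
baseline_slope, slope_change = beta[1], beta[3]
```

A significant negative `slope_change` is the segmented-regression analogue of the per-quarter reductions reported in the Results.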
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
The emphasis on team science in clinical and translational research increases the importance of collaborative biostatisticians (CBs) in healthcare. Adequate training and development of CBs ensure appropriate conduct of robust and meaningful research and, therefore, should be considered as a high-priority focus for biostatistics groups. Comprehensive training enhances clinical and translational research by facilitating more productive and efficient collaborations. While many graduate programs in Biostatistics and Epidemiology include training in research collaboration, it is often limited in scope and duration. Therefore, additional training is often required once a CB is hired into a full-time position. This article presents a comprehensive CB training strategy that can be adapted to any collaborative biostatistics group. This strategy follows a roadmap of the biostatistics collaboration process, which is also presented. A TIE approach (Teach the necessary skills, monitor the Implementation of these skills, and Evaluate the proficiency of these skills) was developed to support the adoption of key principles. The training strategy also incorporates a “train the trainer” approach to enable CBs who have successfully completed training to train new staff or faculty.
The science of studying diamond inclusions for understanding Earth history has developed significantly over the past decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories contained within diamond inclusions. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth’s history. It reviews how the geochemistry of diamonds and their inclusions inform us about the deep carbon cycle, the origin of the diamonds in Earth’s mantle, and the evolution of diamonds through time.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
The Hamilton Depression Rating Scale (HAMD) and the Beck Depression Inventory (BDI) are the most frequently used observer-rated and self-report scales of depression, respectively. It is important to know what a given total score or a change score from baseline on one scale means in relation to the other scale.
We obtained individual participant data from the randomised controlled trials of psychological and pharmacological treatments for major depressive disorders. We then identified corresponding scores of the HAMD and the BDI (369 patients from seven trials) or the BDI-II (683 patients from another seven trials) using the equipercentile linking method.
The HAMD total scores of 10, 20 and 30 corresponded approximately with BDI scores of 10, 27 and 42, and with BDI-II scores of 13, 32 and 50. The HAMD change scores of −20 and −10 corresponded approximately with BDI change scores of −29 and −15, and with BDI-II change scores of −35 and −16.
The results can help clinicians interpret the HAMD or BDI scores of their patients in a more versatile manner and also help clinicians and researchers evaluate such scores reported in the literature or the database, when scores on only one of these scales are provided. We present a conversion table for future research.
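Equipercentile linking, the method used to build the conversion table, maps a score on one scale to the score with the same percentile rank on the other. A minimal sketch follows; the paired HAMD and BDI totals below are invented, not the trial data:

```python
import numpy as np

def equipercentile_link(scores_a, scores_b, value_a):
    """Map value_a on scale A to the score on scale B that has the
    same percentile rank. Illustrative sketch of equipercentile linking."""
    scores_a = np.sort(np.asarray(scores_a, dtype=float))
    scores_b = np.asarray(scores_b, dtype=float)
    # percentile rank of value_a within the scale-A sample
    rank = np.searchsorted(scores_a, value_a, side="right") / len(scores_a)
    # corresponding quantile of the scale-B sample
    return float(np.quantile(scores_b, min(rank, 1.0)))

# hypothetical observed totals on each scale from the same patients
hamd = [8, 10, 12, 15, 18, 20, 22, 25, 28, 30]
bdi  = [9, 11, 14, 19, 24, 27, 30, 35, 40, 43]
linked = equipercentile_link(hamd, bdi, 20)   # BDI score matching HAMD = 20
```

In practice, smoothing of the score distributions is typically applied before linking; this sketch omits that step.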
A converging literature has revealed the existence of a set of largely consistent, hierarchically organized personality traits (that is, broader traits can be differentiated into more fine-grained traits) in both humans and chimpanzees. Despite recent work suggesting a neural basis to personality in chimpanzees, little is known with regard to the involvement of limbic structures (i.e., the amygdala and hippocampus), which are thought to play important roles in emotion. Using saved maximum likelihood estimated exploratory factor scores (two to five factors) in the context of a series of path analyses, the current study examined associations among personality dimensions across various levels of the personality hierarchy and individual variability in amygdala and hippocampal grey matter (GM) volume in a sample of captive chimpanzees (N=191). Whereas results revealed no association between personality dimensions and amygdala volume, a more nuanced series of associations emerged between hippocampal GM volume and personality dimensions at various levels of the hierarchy. Hippocampal GM volume was associated most notably with Alpha (a dimension reflecting a tendency to behave in an undercontrolled and agonistic way) at the most basic two-factor level of the hierarchy; associated positively with Disinhibition at the next level of the hierarchy (the “Big Three”); and, finally, associated positively with Impulsivity at the most fine-grained level (the “five-factor model”) of the hierarchy. Findings underscore the importance of the hippocampus in the neurobiological foundation of personality, with support for its role in regulating emotion. Further, results suggest the importance of the distinction between structure and function, particularly with regard to the amygdala.
OBJECTIVES/SPECIFIC AIMS: In patients with recurrent glioblastoma (GBM) who undergo a second surgery following standard chemoradiotherapy, histopathologic examination of the resected tissue often reveals a combination of viable tumor and treatment-related inflammatory changes. However, it remains unclear whether the degree of viable tumor versus “treatment effect” in these specimens impacts prognosis. We sought to determine whether the percentage of viable tumor versus “treatment effect” in recurrent GBM surgical samples, as assessed by a trained neuropathologist and quantified on a continuous scale, is associated with overall survival. METHODS/STUDY POPULATION: We reviewed the records of 47 patients with histopathologically confirmed GBM who underwent surgical resection as the first therapeutic modality for suspected radiographic progression following standard radiation therapy and temozolomide. The percentage of viable tumor versus “treatment effect” in each specimen was estimated by one neuropathologist who was blinded to patient outcomes. RESULTS/ANTICIPATED RESULTS: After adjusting for other known prognostic factors in a multivariate Cox proportional hazards model, there was no association between the degree of viable tumor and overall survival (HR 0.83; 95% CI, 0.20–3.4; p=0.20). DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that, in patients who undergo resection for recurrent GBM following standard first-line chemoradiotherapy, histopathologic quantification of the degree of viable tumor versus “treatment effect” present in the surgical specimen has limited prognostic influence and clinical utility.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
To determine the scope, source, and mode of transmission of a multifacility outbreak of extensively drug-resistant (XDR) Acinetobacter baumannii.
SETTING AND PARTICIPANTS
Residents and patients in skilled nursing facilities, long-term acute-care hospital, and acute-care hospitals.
A case was defined as the incident isolate from clinical or surveillance cultures of XDR Acinetobacter baumannii resistant to imipenem or meropenem and nonsusceptible to all but 1 or 2 antibiotic classes in a patient in an Oregon healthcare facility during January 2012–December 2014. We queried clinical laboratories, reviewed medical records, oversaw patient and environmental surveillance surveys at 2 facilities, and recommended interventions. Pulsed-field gel electrophoresis (PFGE) and molecular analysis were performed.
We identified 21 cases, highly related by PFGE or healthcare facility exposure. Overall, 17 patients (81%) were admitted to long-term acute-care hospital A (n=8), skilled nursing facility A (n=8), or both (n=1) prior to XDR A. baumannii isolation. Interfacility communication of patient or resident XDR status was not performed during transfer between facilities. The rare plasmid-encoded carbapenemase gene blaOXA-237 was present in 16 outbreak isolates. Contact precautions, chlorhexidine baths, enhanced environmental cleaning, and interfacility communication were implemented for cases to halt transmission.
Interfacility transmission of XDR A. baumannii carrying the rare blaOXA-237 was facilitated by transfer of affected patients without communication to receiving facilities.
We describe the design and performance of the Engineering Development Array, which is a low-frequency radio telescope comprising 256 dual-polarisation dipole antennas working as a phased array. The Engineering Development Array was conceived of, developed, and deployed in just 18 months via re-use of Square Kilometre Array precursor technology and expertise, specifically from the Murchison Widefield Array radio telescope. Using drift scans and a model for the sky brightness temperature at low frequencies, we have derived the Engineering Development Array’s receiver temperature as a function of frequency. The Engineering Development Array is shown to be sky-noise limited over most of the frequency range measured between 60 and 240 MHz. By using the Engineering Development Array in interferometric mode with the Murchison Widefield Array, we used calibrated visibilities to measure the absolute sensitivity of the array. The measured array sensitivity matches very well with a model based on the array layout and measured receiver temperature. The results demonstrate the practicality and feasibility of using Murchison Widefield Array-style precursor technology for Square Kilometre Array-scale stations. The modular architecture of the Engineering Development Array allows upgrades to the array to be rolled out in a staged approach. Future improvements to the Engineering Development Array include replacing the second-stage beamformer with a fully digital system and transitioning to RF-over-fibre for the signal output from the first-stage beamformers.
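The sky-noise-limited behaviour described above follows from standard radiometry. As a sketch (these are textbook relations, not necessarily the paper's exact formulation):

```latex
% The system temperature combines receiver and sky contributions:
T_{\mathrm{sys}}(\nu) = T_{\mathrm{rcv}}(\nu) + T_{\mathrm{sky}}(\nu)
% "Sky-noise limited" means the sky term dominates:
T_{\mathrm{sky}}(\nu) \gg T_{\mathrm{rcv}}(\nu)
% The corresponding point-source sensitivity (system equivalent flux density):
\mathrm{SEFD}(\nu) = \frac{2 k_{B}\, T_{\mathrm{sys}}(\nu)}{A_{\mathrm{eff}}(\nu)}
```

Because the diffuse Galactic sky brightness rises steeply towards low frequencies, low-frequency dipole arrays like this one are generally sky-noise limited across much of their band.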
Select units in the military have improved combat medic training by integrating their functions into routine clinical care activities with measurable improvements in battlefield care. This level of integration is currently limited to special operations units. It is unknown if regular Army units and combat medics can emulate these successes. The goal of this project was to determine whether US Army combat medics can be integrated into routine emergency department (ED) clinical care, specifically medication administration.
This was a quality assurance project that monitored training of combat medics to administer parenteral medications and to ensure patient safety. Combat medics were provided training that included direct supervision during medication administration. Once proficiency was demonstrated, combat medics would prepare the medications under direct supervision, followed by indirect supervision during administration. As part of the quality assurance and safety processes, combat medics were required to document all medication administrations, supervising provider, and unexpected adverse events. Additional quality assurance follow-up occurred via complete chart review by the project lead.
During the project period, the combat medics administered the following medications: ketamine (n=13), morphine (n=8), ketorolac (n=7), fentanyl (n=5), ondansetron (n=4), and other (n=6). No adverse events or patient safety events were reported by the combat medics or discovered during the quality assurance process.
In this limited case series, combat medics safely administered parenteral medications under indirect provider supervision. Future research is needed to further develop this training model for both the military and civilian setting.
Schauer SG, Cunningham CW, Fisher AD, DeLorenzo RA. A Pilot Project Demonstrating that Combat Medics Can Safely Administer Parenteral Medications in the Emergency Department. Prehosp Disaster Med. 2017;32(6):679–681.
Mass-casualty (MASCAL) events are known to occur in the combat setting. There are very limited data at this time from the Joint Theater (Iraq and Afghanistan) wars specific to MASCAL events. The purpose of this report was to provide preliminary data for the development of prehospital planning and guidelines.
Cases were identified using the Department of Defense (DoD; Virginia USA) Trauma Registry (DoDTR) and the Prehospital Trauma Registry (PHTR). These cases were identified as part of a research study evaluating Tactical Combat Casualty Care (TCCC) guidelines. Cases that were designated as or associated with denoted MASCAL events were included.
Fifty subjects were identified during the course of this project. Explosives were the most common cause of injuries. There was a wide range of vital signs. Tourniquet placement and pressure dressings were the most common interventions, followed by analgesia administration. Oral transmucosal fentanyl citrate (OTFC) was the most common parenteral analgesic drug administered. Most were evacuated as “routine.” Follow-up data were available for 36 of the subjects and 97% were discharged alive.
The most common prehospital interventions were tourniquet and pressure dressing hemorrhage control, along with pain medication administration. Larger data sets are needed to guide development of MASCAL in-theater clinical practice guidelines.
Schauer SG, April MD, Simon E, Maddry JK, Carter R III, DeLorenzo RA. Prehospital Interventions During Mass-Casualty Events in Afghanistan: A Case Analysis. Prehosp Disaster Med. 2017;32(4):465–468.