Fluoroquinolones (FQs) and extended-spectrum cephalosporins (ESCs) are associated with a higher risk of Clostridioides difficile infection (CDI). Decreasing unnecessary use of FQs and ESCs is a goal of antimicrobial stewardship. A better understanding of how prescribers perceive the risks and benefits of FQs and ESCs is needed.
We conducted interviews with clinicians from 4 hospitals. Interviews elicited respondent perceptions about the risk of ESCs, FQs, and CDI. Interviews were audio recorded, transcribed, and analyzed using a flexible coding approach.
Interviews were conducted with 64 respondents (38 physicians, 7 nurses, 6 advanced practice providers, and 13 pharmacists). ESCs and FQs were perceived to have many benefits, including infrequent dosing, breadth of coverage, and greater patient adherence after hospital discharge. Prescribers stated that it was easy to make decisions about these drugs, so they were especially appealing to use under time pressure. They described having difficulty discontinuing these drugs when prescribed by others, due to inertia and fear. Prescribers were skeptical about targeting specific drugs as a stewardship approach and felt that the risk of a negative outcome from undertreatment of a suspected bacterial infection was a higher priority than the prevention of CDI.
Prescribers in this study perceived many advantages to using ESCs and FQs, especially under conditions of time pressure and uncertainty. In making decisions about these drugs, prescribers balance risk and benefit, and they believed that the risk of CDI was acceptable compared with the risk of undertreatment.
Background: Trauma care represents a complex patient journey requiring multidisciplinary, coordinated care. Team members are human, and how they feel about their colleagues and their work affects performance. The challenge for health service leaders is enabling a culture that supports high levels of collaboration, cooperation, and coordination across diverse groups.
Aim Statement: We aimed to define and set the agenda for improvement of the relational aspects of trauma care at a large tertiary care hospital.
Measures & Design: We conducted a mixed-methods collaborative ethnography using the Relational Coordination survey (an established tool for analyzing the relational dimensions of multidisciplinary teamwork), participant observation, interviews, and narrative surveys. Findings were presented to clinicians in working groups for further interpretation and to facilitate co-creation of targeted interventions designed to improve team relationships and performance.
Evaluation/Results: We engaged a complex multidisciplinary network of ~500 care providers dispersed across seven core interdependent clinical disciplines. Initial findings highlighted the importance of relationships in trauma care and opportunities for improvement. Narrative survey and ethnographic findings further highlighted the centrality of a translational simulation program in contributing positively to team culture and relational ties. Sixteen interventions, focusing on structural, process, and relational dimensions, were co-created with participants and are now being implemented and evaluated by various trauma care providers.
Discussion/Impact: Through engagement of clinicians spanning organizational boundaries, relational aspects of care can be measured and directly targeted in a collaborative quality improvement process. We encourage health care leaders to consider relationship-based quality improvement strategies, including translational simulation and relational coordination processes, in their efforts to improve care for patients with complex, interdependent journeys.
Background: Since January 1, 2016, 2,358 people have died from opioid poisoning in Alberta. Buprenorphine/naloxone (bup/nal) is the recommended first-line treatment for opioid use disorder (OUD), and this treatment can be initiated in emergency departments and urgent care centres (EDs).
Aim Statement: This project aims to spread a quality improvement intervention to all 107 adult EDs in Alberta by March 31, 2020. The intervention supports clinicians to initiate bup/nal for eligible individuals and provide rapid referrals to OUD treatment clinics.
Measures & Design: Local ED teams were identified (administrators, clinical nurse educators, physicians and, where available, pharmacists and social workers). Local teams were supported by a provincial project team (project manager, consultant, and five physician leads) through a multi-faceted implementation process using provincial order sets, clinician education products, and patient-facing information. We used administrative ED and pharmacy data to track the number of visits at which bup/nal was given in the ED and whether discharged patients continued to fill any opioid agonist treatment (OAT) prescription 30 days after their index ED visit. OUD clinics reported the number of referrals received from EDs and the number of patients attending their first appointment. Patient safety event reports were tracked to identify any unintended negative impacts.
Evaluation/Results: We report data from May 15, 2018 (program start) to September 30, 2019. Forty-nine EDs (46% of 107) implemented the program, and 22 (45% of 49) reported evaluation data. There were 5,385 opioid-related visits to reporting ED sites after program adoption. Bup/nal was given during 832 ED visits (663 unique patients): 7 visits in the 1st quarter the program operated, 55 in the 2nd, 74 in the 3rd, 143 in the 4th, 294 in the 5th, and 255 in the 6th. Among 505 unique discharged patients with 30-day follow-up data available, 319 (63%) continued to fill an OAT prescription after receiving bup/nal in the ED. Sixteen (70%) of 23 community clinics provided data. EDs referred patients to these clinics 440 times, and 236 referrals (54%) attended their first follow-up appointment. Available data may under-report program impact. Five patient safety events were reported, with no harm or minimal harm to the patient.
Discussion/Impact: Results demonstrate effective spread and uptake of a standardized, provincial, ED-based early medical intervention program for patients who live with OUD.
The curves recommended for calibrating radiocarbon (14C) dates into absolute dates have been updated. For calibrating atmospheric samples from the Northern Hemisphere, the new curve is called IntCal20. It is accompanied by the associated curves SHCal20 for the Southern Hemisphere and Marine20 for marine samples. In this “companion article” we discuss the advances and developments that have led to improvements in the updated curves and highlight some issues of relevance for the general readership. In particular, the dendrochronologically dated part of the curve has seen a significant increase in data, with single-year resolution for certain time ranges, and now extends back to 13,910 calBP. Beyond the tree rings, the new curve is based upon an updated combination of marine corals, speleothems, macrofossils, and varved sediments, and now reaches back to 55,000 calBP. Alongside these data advances, we have developed a new, bespoke statistical curve-construction methodology to better incorporate the diverse constituent records and produce a more robust curve with quantified uncertainties. Combined, these data and methodological advances offer the potential for significant new insight into our past. We discuss some implications for the user, such as the dating of the Santorini eruption, as well as some consequences of the new curve for Paleolithic archaeology.
Clostridioides difficile infection (CDI) can be prevented through infection prevention practices and antibiotic stewardship. Diagnostic stewardship (ie, strategies to improve use of microbiological testing) can also improve antibiotic use. However, little is known about the use of such practices in US hospitals, especially after multidisciplinary stewardship programs became a requirement for US hospital accreditation in 2017. Thus, we surveyed US hospitals to assess antibiotic stewardship program composition, practices related to CDI, and diagnostic stewardship.
Surveys were mailed to infection preventionists at 900 randomly sampled US hospitals between May and October 2017. Hospitals were surveyed on antibiotic stewardship programs; CDI prevention, treatment, and testing practices; and diagnostic stewardship strategies. Responses were compared by hospital bed size using weighted logistic regression.
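To make the comparison concrete, here is a minimal sketch of the weighted logistic regression described above, written in Python with statsmodels. The file and column names (survey_responses.csv, bed_size_category, uses_practice, survey_weight) are hypothetical stand-ins, not the study's actual data dictionary.

```python
# Hypothetical sketch: comparing survey responses by hospital bed size
# with a weighted logistic regression (statsmodels GLM).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey_responses.csv")  # hypothetical input file

# Outcome: whether the hospital reports a given stewardship practice (0/1).
# Predictor: hospital bed-size category, dummy-coded against a reference.
X = pd.get_dummies(df["bed_size_category"], drop_first=True).astype(float)
X = sm.add_constant(X)

model = sm.GLM(
    df["uses_practice"],               # hypothetical 0/1 outcome column
    X,
    family=sm.families.Binomial(),     # logistic regression
    freq_weights=df["survey_weight"],  # hypothetical sampling weights
)
print(model.fit().summary())           # coefficients on the log-odds scale
```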
Overall, 528 surveys were completed (59% response rate). Almost all (95%) responding hospitals had an antibiotic stewardship program. Smaller hospitals were less likely to have stewardship team members with infectious diseases (ID) training, and only 41% of hospitals met The Joint Commission accreditation standards for multidisciplinary teams. Guideline-recommended CDI prevention practices were common. Smaller hospitals were less likely to use high-tech disinfection devices, fecal microbiota transplantation, or diagnostic stewardship strategies.
Following changes in accreditation standards, nearly all US hospitals now have an antibiotic stewardship program. However, many hospitals, especially smaller hospitals, appear to struggle with access to ID expertise and with deploying diagnostic stewardship strategies. CDI prevention could be enhanced through diagnostic stewardship and by emphasizing the role of non–ID-trained pharmacists and clinicians in antibiotic stewardship.
Disturbed sleep and activity are prominent features of bipolar disorder type I (BP-I). However, the relationship of sleep and activity characteristics to brain structure and behavior in euthymic BP-I patients and their non-BP-I relatives is unknown. Additionally, underlying genetic relationships between these traits have not been investigated.
Relationships between sleep and activity phenotypes, assessed using actigraphy, with structural neuroimaging (brain) and cognitive and temperament (behavior) phenotypes were investigated in 558 euthymic individuals from multi-generational pedigrees including at least one member with BP-I. Genetic correlations between actigraphy-brain and actigraphy-behavior associations were assessed, and bivariate linkage analysis was conducted for trait pairs with evidence of shared genetic influences.
More physical activity and longer awake time were significantly associated with increased brain volumes and cortical thickness, better performance on neurocognitive measures of long-term memory and executive function, and less extreme scores on measures of temperament (impulsivity, cyclothymia). These associations did not differ between BP-I patients and their non-BP-I relatives. Nine activity-brain or activity-behavior pairs showed evidence of shared genetic influence (significant genetic correlations); among these, a suggestive bivariate quantitative trait locus on chromosome 7 was identified for wake duration and verbal working memory.
Our findings indicate that increased physical activity and more adequate sleep are associated with increased brain size, better cognitive function and more stable temperament in BP-I patients and their non-BP-I relatives. Additionally, we found evidence for pleiotropy of several actigraphy-behavior and actigraphy-brain phenotypes, suggesting a shared genetic basis for these traits.
Presenteeism, or working while ill, by healthcare personnel (HCP) experiencing influenza-like illness (ILI) puts patients and coworkers at risk. However, hospital policies and practices may not consistently facilitate HCP staying home when ill.
Objective and methods:
We conducted a mixed-methods survey in March 2018 of Emerging Infections Network infectious diseases physicians, describing institutional experiences with and policies for HCP working with ILI.
Of 715 physicians, 367 (51%) responded. Of 367, 135 (37%) were unaware of institutional policies. Of the remaining 232 respondents, 206 (89%) reported institutional policies regarding work restrictions for HCP with influenza or ILI, but only 145 (63%) said these were communicated at least annually. More than half of respondents (124, 53%) reported that adherence to work restrictions was not monitored or enforced. Work restrictions were most often not perceived to be enforced for physicians-in-training and attending physicians. Nearly all (223, 96%) reported that their facility tracked laboratory-confirmed influenza (LCI) in patients; 85 (37%) reported tracking ILI. For employees, 109 (47%) reported tracking of LCI and 53 (23%) reported tracking ILI. For independent physicians, not employed by the facility, 30 (13%) reported tracking LCI and 11 (5%) ILI.
More than one-third of respondents were unaware of whether their institutions had policies to prevent HCP with ILI from working; among those with knowledge of institutional policies, dissemination, monitoring, and enforcement of these policies was highly variable. Improving communication about work-restriction policies, as well as monitoring and enforcement, may help prevent the spread of infections from HCP to patients.
Viral pneumonia is an important cause of death and morbidity among infants worldwide. Transmission of non-influenza respiratory viruses in households has not been well characterised in South Asia, and a better understanding of it could inform preventive interventions. From April 2011 to April 2012, household members of pregnant women enrolled in a randomised trial of influenza vaccine in rural Nepal were surveyed weekly for respiratory illness until 180 days after birth. Nasal swabs from symptomatic individuals were tested for respiratory viruses by polymerase chain reaction. A transmission event was defined as a secondary case of the same virus within 14 days of initial infection within a household. From 555 households, 825 initial viral illness episodes occurred, resulting in 79 transmission events. The overall incidence of transmission was 1.14 events per 100 person-weeks. Transmission incidence was higher when the index case was aged 1–4 years (incidence rate ratio (IRR) 2.35; 95% confidence interval (CI) 1.40–3.96), when the initial infection was a coinfection (IRR 1.94; 95% CI 1.05–3.61), and when the household had no electricity (IRR 2.70; 95% CI 1.41–5.00). Preventive interventions targeting preschool-age children in households in resource-limited settings may decrease the risk of transmission to vulnerable household members, such as young infants.
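As a hedged illustration of the incidence calculations reported above, the sketch below computes events per 100 person-weeks and incidence rate ratios from a Poisson model with a person-time offset; the file and column names are hypothetical, and this is not the study's actual analysis code.

```python
# Sketch: overall transmission incidence and IRRs via Poisson regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("household_episodes.csv")  # hypothetical input file
# Assumed columns: events (secondary cases within 14 days), person_weeks,
# index_age_1_4, coinfection, no_electricity (all hypothetical names).

rate = 100 * df["events"].sum() / df["person_weeks"].sum()
print(f"Overall incidence: {rate:.2f} events per 100 person-weeks")

# Exponentiated Poisson coefficients are incidence rate ratios (IRRs).
fit = smf.glm(
    "events ~ index_age_1_4 + coinfection + no_electricity",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_weeks"]),  # log person-time offset
).fit()
print(np.exp(fit.params))      # IRR point estimates
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```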
Southern crabgrass [Digitaria ciliaris (Retz.) Koeler] is an annual grass weed that commonly infests turfgrass, roadsides, wastelands, and cropping systems throughout the southeastern United States. Two biotypes of D. ciliaris (R1 and R2) with known resistance to cyclohexanediones (DIMs) and aryloxyphenoxypropionates (FOPs), previously collected from sod production fields in Georgia, were compared with a susceptible biotype (S) collected from Alabama for their responses to pinoxaden and to explore possible mechanisms of resistance. Increasing rates of pinoxaden (0.1 to 23.5 kg ha−1) were evaluated for control of R1, R2, and S. Both R1 and R2 were resistant to pinoxaden relative to S; the S biotype was completely controlled at 11.8 and 23.5 kg ha−1, with no aboveground biomass remaining at 14 d after treatment. Pinoxaden rates reducing tiller length by 50% and 90% (I50 and I90) ranged from 7.2 to 13.2 kg ha−1 for R1, 6.9 to 8.6 kg ha−1 for R2, and 0.7 to 2.1 kg ha−1 for S; for aboveground biomass, the corresponding ranges were 7.7 to 10.2 kg ha−1, 7.2 to 7.9 kg ha−1, and 1.6 to 2.3 kg ha−1. Prior selection pressure from DIM and FOP herbicides could thus result in the evolution of D. ciliaris cross-resistance to pinoxaden. Amplification of the carboxyl-transferase domain of the plastidic ACCase gene by standard PCR identified a point mutation resulting in an Ile-1781-Leu amino acid substitution only in the resistant biotype R1. Cloning of the PCR product surrounding the 1781 region yielded two distinct ACCase gene sequences, Ile-1781 and Leu-1781. Next-generation sequencing of RNA on the Illumina platform, however, revealed the Ile-1781-Leu substitution in both resistant biotypes (R1 and R2). A point mutation in the Ile-1781 codon leading to herbicide insensitivity of the ACCase enzyme has previously been reported in other grass species. Our research confirms that the Ile-1781-Leu substitution is present in pinoxaden-resistant D. ciliaris.
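The I50 and I90 values above come from dose-response modeling. As a rough sketch (with made-up response data, not the study's measurements), a three-parameter log-logistic curve can be fit and the 50% and 90% reduction rates derived as follows:

```python
# Sketch: estimating I50/I90 from a log-logistic dose-response fit.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, d, e, b):
    """Three-parameter log-logistic: upper limit d, inflection e, slope b."""
    return d / (1.0 + (dose / e) ** b)

# Hypothetical aboveground biomass (% of untreated) across pinoxaden rates.
rates = np.array([0.1, 0.4, 1.5, 5.9, 11.8, 23.5])       # kg ha-1
biomass = np.array([98.0, 90.0, 70.0, 45.0, 20.0, 5.0])  # illustrative only

(d, e, b), _ = curve_fit(log_logistic, rates, biomass, p0=[100.0, 2.0, 1.0])

i50 = e                     # rate giving a 50% reduction (by definition)
i90 = e * 9.0 ** (1.0 / b)  # solve d / (1 + (x/e)^b) = 0.1 * d for x
print(f"I50 = {i50:.2f} kg ha-1, I90 = {i90:.2f} kg ha-1")
```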
Weeds can cause significant yield loss in watermelon production systems. Commercially acceptable weed control is difficult to achieve, even with heavy reliance on herbicides. A study was conducted to evaluate a spring-seeded cereal rye cover crop with different herbicide application timings for weed management between row middles in watermelon production systems. Common lambsquarters and pigweed species (namely, Palmer amaranth and smooth pigweed) densities and biomasses were often lower with cereal rye compared with no cereal rye, regardless of herbicide treatment. The presence of cereal rye did not negatively influence the number of marketable watermelon fruit, but average marketable fruit weight in cereal rye versus no cereal rye treatments varied by location. These results demonstrate that a spring-seeded cereal rye cover crop can help reduce weed density and weed biomass, and potentially enhance overall weed control. Cereal rye alone did not provide full-season weed control, so additional research is needed to determine the best methods to integrate spring cover cropping with other weed management tactics in watermelon for effective, full-season control.
Despite United States national learning objectives referencing research fundamentals and the critical appraisal of medical literature, many paramedic programs are not meeting these objectives with substantive content.
The objective was to develop and implement a self-contained journal club educational module for paramedic training programs that could be distributed to Emergency Medical Services (EMS) educators and EMS medical directors as a framework adaptable to their own programs.
Four two-hour journal club sessions were designed. First, the educator provided students with four types of articles on a student-chosen topic and discussed differences in methodology and structure. Next, after a lecture about peer review, students used search engines to verify the references of a trade magazine article. Third, the educator gave a statistics lecture and critiqued the results sections of several articles found by students on a topic. Finally, students found an article on a topic of personal interest and presented it to their classmates, as if telling their paramedic partner about it at work. Before and after the series, students from two cohorts (2017, 2018) completed a survey with questions about demographics and perceptions of research. Students from one cohort (2017) received a follow-up survey one year later.
For the 2016 cohort, 13 students participated and provided qualitative feedback. For the 2017 and 2018 cohorts, 33 students participated. After the series, students reported an increased ability to find, evaluate, and apply medical research articles, and their opinions of participating in prehospital research, and of its importance, trended positive. Every student demonstrated this ability during the final journal club session. McNemar's and Related-Samples Cochran's Q testing of questionnaire responses suggested a statistically significant improvement in student approval of exception from informed consent.
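For readers unfamiliar with the paired testing mentioned above, here is a small, self-contained example of McNemar's test on a 2x2 table of before/after approval responses; the counts are invented for illustration and are not the study's data.

```python
# Sketch: McNemar's test for paired pre/post questionnaire responses.
from statsmodels.stats.contingency_tables import mcnemar

# Rows: before the series (approve, disapprove);
# columns: after the series (approve, disapprove). Hypothetical counts.
table = [[18, 2],
         [9, 4]]  # 33 students in total

result = mcnemar(table, exact=True)  # exact binomial version, small samples
print(f"statistic = {result.statistic}, p-value = {result.pvalue:.3f}")
```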
The framework for this paramedic journal club series could be adapted by EMS educators and medical directors to enable paramedics to search for, critically appraise, and discuss the findings of medical literature.
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) asks about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed the equivalency of PHQ-8 and PHQ-9 total scores and compared their diagnostic accuracy for detecting major depression.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
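The cutoff comparison above can be illustrated with a simplified sketch that scores the sensitivity and specificity of PHQ-8 totals at each cutoff against the diagnostic-interview reference standard. This ignores the bivariate random-effects pooling across studies that the actual analysis used, and the file and column names are hypothetical.

```python
# Sketch: sensitivity/specificity of PHQ-8 scores across candidate cutoffs.
import pandas as pd

df = pd.read_csv("phq_ipd.csv")  # hypothetical pooled participant data
truth = df["major_depression"].astype(bool)  # interview reference standard
scores = df["phq8_total"]                    # hypothetical column name

def sens_spec(scores, truth, cutoff):
    pred = scores >= cutoff
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    return sens, spec

for cutoff in range(5, 16):
    sens, spec = sens_spec(scores, truth, cutoff)
    print(f"cutoff {cutoff:>2}: sens = {sens:.2f}, spec = {spec:.2f}, "
          f"sens + spec = {sens + spec:.2f}")
```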
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.
In 2013, the national surveillance case definition for West Nile virus (WNV) disease was revised to remove fever as a criterion for neuroinvasive disease and to require at most subjective fever for non-neuroinvasive disease. The aims of this project were to determine how often afebrile WNV disease occurs and to assess differences between patients with and without fever. We included cases with laboratory evidence of WNV disease reported from four states in 2014. We compared demographics, clinical symptoms, and laboratory evidence between patients with and without fever, stratifying the analysis by neuroinvasive and non-neuroinvasive presentations. Among 956 included patients, 39 (4%) had no fever; this proportion was similar among patients with and without neuroinvasive disease symptoms. For both neuroinvasive and non-neuroinvasive patients, there were no differences in age, sex, or laboratory evidence between febrile and afebrile patients, but hospitalisations were more common among patients with fever (P < 0.01). The only significant difference in symptoms was for ataxia, which was more common in neuroinvasive patients without fever (P = 0.04). Only 5% of non-neuroinvasive patients failed to meet the WNV case definition for lack of fever. The evidence presented here supports the changes made to the national case definition in 2013.
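As a hedged illustration of the fever comparison above, a chi-square test on a 2x2 table of fever status by hospitalisation might look like the following; the cell counts are invented (though consistent with the reported totals of 956 patients and 39 afebrile cases).

```python
# Sketch: testing whether hospitalisation differs by fever status.
from scipy.stats import chi2_contingency

# Rows: febrile, afebrile; columns: hospitalised, not hospitalised.
# Hypothetical counts summing to 956 patients (39 afebrile).
table = [[500, 417],
         [12, 27]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```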
Background: Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist and the recommended first-line treatment for opioid use disorder (OUD). Emergency departments (EDs) are a key point of contact with the healthcare system for patients living with OUD.
Aim Statement: We implemented a multidisciplinary quality improvement project to screen patients for OUD, initiate bup/nal for eligible individuals, and provide rapid next-business-day walk-in referrals to addiction clinics in the community.
Measures & Design: From May to September 2018, our team worked with three ED sites and three addiction clinics to pilot the program. Implementation involved alignment with regulatory requirements, physician education, coordination with pharmacy to ensure in-ED medication access, and nurse education. The project is supported by a full-time project manager, a data analyst, operations leaders, physician champions, provincial pharmacy, and the Emergency Strategic Clinical Network leadership team. For the pilot, our evaluation objective was to determine the degree to which the initiation and referral pathway was being utilized. We used administrative data to track the number of patients given bup/nal in the ED, their demographics, and whether they continued to fill bup/nal prescriptions 30 days after their ED visit. Addiction clinics reported both the number of patients referred to them and the number attending their referral.
Evaluation/Results: Administrative data showed 568 opioid-related visits to ED pilot sites during the pilot phase. Bup/nal was given to 60 unique patients in the ED during 66 unique visits. There were 32 (53%) male patients and 28 (47%) female patients, and median patient age was 34 (range: 21 to 79). ED visits at which bup/nal was given had a median length of stay of 6 hours 57 minutes (IQR: 6 hours 20 minutes) and Canadian Triage Acuity Scores as follows: Level 1, 1 (2%); Level 2, 21 (32%); Level 3, 32 (48%); Level 4, 11 (17%); Level 5, 1 (2%). Fifty-one (77%) of these visits led to discharge, and 24 (47%) of discharged patients given bup/nal in the ED continued to fill bup/nal prescriptions 30 days after their index ED visit. EDs also referred 37 patients with OUD to the 3 community clinics, and 16 of those individuals (43%) attended their first follow-up appointment.
Discussion/Impact: Our pilot project demonstrates that, with dedicated resources and broad institutional support, ED patients with OUD can be appropriately initiated on bup/nal and referred to community care.
Each of the laboratory intercomparisons (from ICS onwards) has included wood samples, many of them dendrochronologically dated. In the early years, because the majority of laboratories were radiometric, these samples were typically blocks of 20–40 rings; more recently (SIRI), they have been single-ring samples. The sample ages have spanned background through to modern. Some intercomparisons examined the effects of different wood pretreatments; in others the focus was on background samples. In this paper, we illustrate what we have learned from these extensive intercomparisons involving wood samples and how the results contribute to the global IntCal effort.
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Our objective was to measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict probability of CRO and/or CPO carriage.
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, overall, decision tree models poorly predicted CRO and CPO colonization (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, models did accurately identify patients with recent CRO-positive cultures who use proton-pump inhibitors as having a high likelihood of CRO colonization.
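A minimal sketch of the decision tree approach described above follows, using scikit-learn and scoring with the C statistic (equivalently, the area under the ROC curve). The feature file and label column are hypothetical placeholders for the EMR-derived variables.

```python
# Sketch: decision tree prediction of CRO colonization, scored by C statistic.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

df = pd.read_csv("admission_features.csv")  # hypothetical EMR extract
X = df.drop(columns=["cro_positive"])       # preadmission variables
y = df["cro_positive"]                      # perirectal swab result (0/1)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

tree = DecisionTreeClassifier(
    max_depth=4, class_weight="balanced", random_state=0
)
tree.fit(X_train, y_train)

probs = tree.predict_proba(X_test)[:, 1]
print(f"C statistic: {roc_auc_score(y_test, probs):.2f}")
```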
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
Childhood adversity is associated with poor mental and physical health outcomes across the life span. Alterations in the hypothalamic–pituitary–adrenal axis are considered a key mechanism underlying these associations, although findings have been mixed. These inconsistencies suggest that other aspects of stress processing may underlie variations in these associations and that differences in adversity type, sex, and age may be relevant. The current study investigated the relationship between childhood adversity, stress perception, and morning cortisol, and examined whether adversity type (generalized vs. threat and deprivation), sex, and age had distinct effects on these associations. Salivary cortisol samples, daily hassle stress ratings, and retrospective measures of childhood adversity were collected from a large sample of youth at risk for serious mental illness, including psychoses (n = 605, mean age = 19.3). Results indicated that childhood adversity was associated with increased stress perception, which in turn predicted higher morning cortisol levels; however, these associations were specific to threat exposures in females. These findings highlight the role of stress perception in stress vulnerability following childhood adversity and point to potential sex differences in the impact of threat exposures.