Antibiotics are frequently prescribed inappropriately for acute respiratory infections in the outpatient setting. We report the implementation of a multifaceted outpatient antimicrobial stewardship initiative resulting in a 12.3% absolute reduction of antibiotic prescribing for acute bronchitis in primary care clinics receiving active interventions.
Analyses of macroscopic charcoal, sediment geochemistry (%C, %N, C/N, δ13C, δ15N), and fossil pollen were conducted on a sediment core recovered from Stella Lake, Nevada, establishing a 2000 year record of fire history and vegetation change for the Great Basin. Charcoal accumulation rates (CHAR) indicate that fire activity, which was minimal from the beginning of the first millennium to AD 750, increased slightly at the onset of the Medieval Climate Anomaly (MCA). Observed changes in catchment vegetation were driven by hydroclimate variability during the early MCA. Two notable increases in CHAR, which occurred during the Little Ice Age (LIA), were identified as major fire events within the catchment. Increased C/N, enriched δ15N, and depleted δ13C values correspond with these events, providing additional evidence for the occurrence of catchment-scale fire events during the late fifteenth and late sixteenth centuries. Shifts in the vegetation community composition and structure accompanied these fires, with Pinus and Picea decreasing in relative abundance and Poaceae increasing in relative abundance following the fire events. During the LIA, the vegetation change and lacustrine geochemical response was most directly influenced by the occurrence of catchment-scale fires, not regional hydroclimate.
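Charcoal accumulation rate (CHAR) is conventionally derived from charcoal concentration and the deposition time given by the core's age-depth model. A minimal sketch of that calculation, using hypothetical values rather than the study's data:

```python
def char_rate(charcoal_concentration, deposition_time):
    """Charcoal accumulation rate (CHAR).

    charcoal_concentration: charcoal particles per cm^3 of sediment
    deposition_time: years represented by 1 cm of sediment (age-depth model)
    Returns particles per cm^2 per year.
    """
    return charcoal_concentration / deposition_time

# Hypothetical sample: 120 particles/cm^3 in sediment deposited at 15 yr/cm
print(char_rate(120, 15))  # 8.0 particles cm^-2 yr^-1
```

Peaks in this rate, rather than raw concentration, are what allow fire events to be distinguished from changes in sedimentation.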
ABSTRACT IMPACT: This work will help to understand a novel therapeutic approach to a common type of acute myeloid leukemia. OBJECTIVES/GOALS: FMS-like tyrosine kinase 3 (or FLT3) mutations occur in ~30% of acute myeloid leukemia (AML) cases. FLT3 tyrosine kinase domain (TKD) mutations are particularly important in relapsed/refractory FLT3 mutant AML, which portends a poor prognosis. This study describes a therapeutic approach to overcoming resistance conferred by FLT3-TKD mutations. METHODS/STUDY POPULATION: To understand the efficacy of a novel type 1 FLT3 inhibitor (NCGC1481), as a monotherapy and combination therapy, several assays were used to interrogate the functionality of these therapies. Cell lines and patient samples containing aspartate 835 to tyrosine mutations (D835Y, the most common TKD alteration) and phenylalanine 691 to leucine (F691L) were utilized to examine the effects of NCGC1481 with and without other targeted therapies like MEK inhibitors. Specifically, assays measuring viability, cell death using flow cytometry, in vitro clonogenicity, cellular signaling, and xenograft survival were examined in these FLT3-TKD AML models. Synergy was also measured using well-described methods, which also allowed for appropriate dose finding for drug combination experiments. RESULTS/ANTICIPATED RESULTS: Our novel type 1 FLT3 inhibitor (NCGC1481) was particularly effective against the most common FLT3 TKD mutant, D835Y. NCGC1481 reduced viability and cell signaling, while also inducing cell death and prolonging xenograft survival in the FLT3-D835Y model system. In contrast, clinically approved FLT3 inhibitors were less effective at suppressing AML cells expressing FLT3-D835Y. In the case of FLT3-F691L, most of the FLT3 inhibitors tested, including NCGC1481, suppressed canonical FLT3 signaling, but did not significantly reduce viability or leukemic clonogenicity.
However, when NCGC1481 was combined with other targeted agents like MEK inhibitors, at synergistic doses, eradication of the FLT3-F691L AML clone was substantially increased. DISCUSSION/SIGNIFICANCE OF FINDINGS: In AML, response to FLT3 inhibitor therapy is often short-lived, with resistance sometimes occurring via FLT3-TKD mutations. Given the dismal prognosis of relapsed FLT3 mutant AML, novel therapies are necessary. This study describes the efficacy of a novel FLT3 inhibitor, along with its synergistic activity when combined with other targeted agents.
Mass asymptomatic SARS-CoV-2 nucleic acid amplified testing of healthcare personnel (HCP) was performed at a large tertiary health system. A low period-prevalence of positive HCP was observed. Of those who tested positive, half had mild symptoms in retrospect. HCP with even mild symptoms should be isolated and tested.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.1
The National Institute for Health and Care Excellence (NICE) worked with patients and staff from six patient organizations to review existing health technology assessment (HTA) methods and coproduce proposals to improve the following: patient involvement, how patient evidence is identified and considered by committees, and the support offered to patient stakeholders. This engagement identified important factors that HTA bodies need to understand to enable meaningful patient and public involvement (PPI), such as having clearly documented processes, appropriate evidence submission processes, transparent decisions, and suitable support. This work demonstrated the benefits of HTA bodies working collaboratively with patient stakeholders to improve PPI. By doing so, HTA bodies can increase their knowledge and understanding of the barriers faced by patient stakeholders to develop appropriate solutions to remove them. The coproduction approach improved stakeholder engagement methods, provided a better analysis of data, supported the development of meaningful conclusions, and improved stakeholder relationships.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
The CDC recommends that consultant pharmacists support antimicrobial stewardship programs (ASPs) in long-term care facilities (LTCFs). We studied implementation of CDC-recommended ASP core elements and antibiotic use in LTCFs before and after training consultant pharmacists. Methods: Between August 2017 and October 2017, consultant pharmacists from a regional long-term care pharmacy attended 5 didactic sessions preparing them to assist LTCFs in implementation of CDC-recommended ASP core elements. Training also included creating a process for evaluating the appropriateness of all systemic antibiotics and providing prescriber feedback during their monthly mandatory drug-regimen reviews. Monthly “meet-the-expert” sessions were held with consultant pharmacists throughout the project (November 2017 to December 2018). LTCF enrollment began in November 2017, and >90% of facilities joined by January 2018. After enrollment, consultant pharmacists initiated ASP interventions including antibiotic reviews and feedback using standard templates. They also held regular meetings with infection preventionists to discuss Core Elements implementation and provided various ASP resources to LTCFs (eg, antibiotic policy template, guidance documents, and standard assessment and communication tools). Data collection included ASP Core Elements, antibiotic starts, days of therapy (DOT), and resident days (RD). The McNemar test, the Wilcoxon signed-rank test, a generalized estimating equation model, and the classic repeated-measures approach were used to compare the presence of all 7 core elements and antibiotic use during the baseline (2017) and intervention (2018) years. Results: In total, 9 trained consultant pharmacists assisted 32 LTCFs with ASP implementation. When evaluating 27 LTCFs that provided complete data, a significant increase in the presence of all 7 Core Elements was noted after the intervention compared to baseline (67% vs 0; median Core Elements, 7 vs 2; range, 6–7 vs 1–6; P < .001).
Median monthly antibiotic starts per 1,000 RD and DOT per 1,000 RD decreased in 2018 compared to 2017: 8.93 versus 9.91 (P < .01) and 106.47 versus 141.59 (P < .001), respectively. However, variations in antibiotic use were detected among facilities (Table 1). When comparing trends, antibiotic starts and DOT were already trending downward during 2017 (Fig. 1A and 1B). On average, antibiotic starts decreased by 0.27 per 1,000 RD (P < .001) and DOT by 1.92 per 1,000 RD (P < .001) each month during 2017. Although antibiotic starts remained mostly stable in 2018, DOT continued to decline further (average monthly decline, 2.60 per 1,000 RD; P < .001). When analyzing aggregated mean antibiotic use across all sites per month by year, DOT were consistently lower throughout 2018 and antibiotic starts were lower for the first 9 months (Fig. 1C and 1D). Conclusions: Consultant pharmacists can play an important role in strengthening ASPs and in decreasing antibiotic use in LTCFs. Educational programs should be developed nationally to train long-term care consultant pharmacists in ASP implementation.
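The antibiotic-use measures reported above (starts and DOT per 1,000 resident days) are standard stewardship rates: the raw count is divided by resident days and scaled to 1,000. A minimal sketch with hypothetical facility data, not the study's figures:

```python
def rate_per_1000_rd(events, resident_days):
    """Normalize an antibiotic-use count (starts or days of therapy)
    per 1,000 resident days (RD)."""
    return events / resident_days * 1000

# Hypothetical month at one facility: 640 days of therapy over 6,000 RD
print(round(rate_per_1000_rd(640, 6000), 2))  # 106.67 DOT per 1,000 RD
```

Normalizing by resident days lets facilities of different sizes and census levels be compared on the same scale.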
Funding: Merck & Co., Inc, provided funding for this study.
Disclosures: Muhammad Salman Ashraf and Scott Bergman report receipt of a research grant from Merck.
Background: Effective medical device reprocessing (MDR) is essential in preventing the spread of microorganisms and maintaining patient safety. Alberta Health Services (AHS) is an Alberta-wide, integrated health system, responsible for delivering health services to >4.3 million people living in the province. In 2010, periodic province-wide MDR reviews were initiated by the provincial health system to verify that the cleaning, disinfection, and sterilization of reusable critical and semicritical medical devices met established standards. To date, there have been 3 review cycles; in cycle 3, a follow-up process for tracking and reporting corrective actions was initiated. Methods: As in previous MDR review cycles, cycle 3 included the use of a standardized suite of tools to measure compliance with standards set by Accreditation Canada, the Canadian Standards Association, and the Government of Alberta. Each cycle involved a review of MDR areas completed by trained reviewers. Interrater reliability among reviewers was maintained through training and debriefings following reviews to ensure agreement. Following reviews, reports were generated for areas, zones, and AHS. As part of the corrective actions and follow-up process, identified deficiencies were categorized into 5 themes. Corrective actions were tracked and periodic reports were generated showing the progress of deficiency resolution. Resolution rates (number of resolved deficiencies divided by total number of deficiencies) were calculated for each of the identified themes as well as overall for cycle 3. Results: Overall compliance for cycle 3 was 93%. Cycle 3 reviews revealed that more than half of the deficiencies (58%) were identified previously in cycle 2. The resolution rates ranged from 78% to 95% for identified deficiencies for 4 of the 5 themes: documentation, technique, PPE/attire/hand hygiene, and other. The theme related to physical infrastructure showed a considerably lower resolution rate of 49%.
The corrective action follow-up process showed an increase in the overall resolution rate from 59% at the start of the follow-up process to 82% at its completion. When this resolution rate was applied to the initial survey compliance rate for cycle 3, overall compliance increased to 99%. Conclusions: Monitoring the quality of MDR practices is essential in maintaining and improving patient safety. The standardized provincial review process identified common themes and supported a coordinated approach to resolving many identified deficiencies. Most of those deficiencies were resolved; however, deficiencies related to the physical infrastructure of MDR departments continue to be seen across review cycles. This review process, with follow-up of these deficiencies, can help bring them to the attention of organizational leadership and funding authorities during budget cycles.
Background: A penicillin allergy guidance document containing an algorithm for challenging penicillin-allergic patients with β-lactams was developed by the antimicrobial stewardship program (ASP). As part of this algorithm, a “graded challenge” order set was created containing antimicrobial orders and safety medications along with monitoring instructions. The process is designed to challenge patients at low risk of reaction with infusions of 1% of the target dose, then 10%, and finally the full dose, each 30 minutes apart. We evaluated outcomes from the order set. Methods: Orders of the graded challenge over 17 months (March 2018 through July 2019) were reviewed retrospectively. Data were collected on ordering and outcomes of the challenges and allergy documentation. Use was evaluated based on ASP-recommended indications: history of IgE-mediated or unknown reaction plus (1) no previous β-lactam tolerance and the reaction occurred >10 years ago, or (2) previous β-lactam tolerance, now requiring a different β-lactam for treatment. Only administered challenges were included, and descriptive statistics were used. Results: Of 67 orders, 57 graded challenges were administered to 56 patients. The most common allergies were penicillins (87.7%) and cephalosporins (38.6%), with the most common reactions being unknown (41.7%) or hives (22%). The most common antibiotics challenged were ceftriaxone (43.9%), cefepime (21.1%), and cefazolin (5.3%). Antibiotics given prior to challenge included vancomycin (48.2%), fluoroquinolones (35.7%), carbapenems (21.4%), aztreonam (19.6%), and clindamycin (12.5%). The median duration of challenged antibiotic was 6 days. The infectious diseases service was consulted on 59.6% of challenges, and 75.4% of challenges were administered in non-ICU settings. There was 1 reaction (1.8%) involving a rash with the second infusion, which was treated with oral diphenhydramine and had no lasting effects.
Based on indications, 80.7% of challenges were aligned with ASP guidance criteria. The most common use outside of these criteria was in patients without IgE-mediated reactions (10.5%). Most of these had minor rashes and could have received a full dose of a cephalosporin. Allergy information was updated in the electronic health record after 91.2% of challenges. Conclusions: We demonstrated the utility of a graded challenge process at our academic medical center. It was well tolerated, ordered frequently by noninfectious diseases clinicians, administered primarily in non-ICU settings, and regularly resulted in updated allergy information in the medical record. With many patients initially receiving broad-spectrum antibiotics with high costs or increased rates of adverse effects, graded challenges can potentially prevent the use of suboptimal therapies with minimal time and resource investment.
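The dose escalation in the graded challenge described above is simple arithmetic: 1%, then 10%, then 100% of the target dose, 30 minutes apart. A minimal sketch of that schedule (illustrative only, not clinical guidance; the fractions follow the protocol stated in the abstract):

```python
def graded_challenge_doses(target_dose_mg, fractions=(0.01, 0.10, 1.0)):
    """Return the stepped infusion doses (mg) for a graded challenge:
    1% of the target dose, then 10%, then the full dose."""
    return [target_dose_mg * f for f in fractions]

# Hypothetical 1,000 mg ceftriaxone target: 10 mg, then 100 mg, then 1,000 mg,
# with each infusion separated by 30 minutes of monitoring
print(graded_challenge_doses(1000))
```

Starting at 1% keeps any reaction, should one occur, to the smallest exposure possible before escalating.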
Disclosures: Scott Bergman reports a research grant from Merck.
We examined whether intraindividual variability (IIV) across tests of executive functions (EF-IIV) is elevated in Veterans with a history of mild traumatic brain injury (mTBI) relative to military controls (MCs) without a history of mTBI. We also explored relationships among EF-IIV, white matter microstructure, and posttraumatic stress disorder (PTSD) symptoms.
A total of 77 Veterans (mTBI = 43, MCs = 34) completed neuropsychological testing, diffusion tensor imaging (DTI), and PTSD symptom ratings. EF-IIV was calculated as the standard deviation across six tests of EF, along with an EF-Mean composite. DSI Studio connectometry analysis identified white matter tracts significantly associated with EF-IIV according to generalized fractional anisotropy (GFA).
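EF-IIV as defined above is a within-person dispersion score: the standard deviation of one participant's scores across the six EF tests, alongside an EF-Mean composite. A minimal sketch, assuming each test has already been standardized to z-scores (the data here are hypothetical):

```python
from statistics import mean, stdev

def ef_composites(z_scores):
    """Given one participant's z-scores on the six EF tests, return
    (EF-Mean, EF-IIV). EF-IIV is the across-test standard deviation,
    capturing scatter in the profile rather than overall level."""
    return mean(z_scores), stdev(z_scores)

# Hypothetical participant with an uneven profile across the six tests
ef_mean, ef_iiv = ef_composites([0.5, -1.2, 0.8, -0.4, 1.0, -0.7])
print(round(ef_mean, 2), round(ef_iiv, 2))
```

Two participants can share the same EF-Mean yet differ sharply in EF-IIV, which is why the dispersion index can detect disruption that mean-level comparisons miss.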
After adjusting for EF-Mean and PTSD symptoms, the mTBI group showed significantly higher EF-IIV than MCs. Groups did not differ on EF-Mean after adjusting for PTSD symptoms. Across groups, PTSD symptoms significantly negatively correlated with EF-Mean, but not with EF-IIV. EF-IIV significantly negatively correlated with GFA in multiple white matter pathways connecting frontal and more posterior regions.
Veterans with mTBI demonstrated significantly greater IIV across EF tests compared to MCs, even after adjusting for mean group differences on those measures as well as PTSD severity. Findings suggest that, in contrast to analyses that explore effects of mean performance across tests, discrepancy analyses may capture unique variance in neuropsychological performance and more sensitively capture cognitive disruption in Veterans with mTBI histories. Importantly, findings show that EF-IIV is negatively associated with the microstructure of white matter pathways interconnecting cortical regions that mediate executive function and attentional processes.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics: (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
While policy attention is understandably diverted to COVID-19, the end of the UK's post-Brexit ‘transition period’ remains 31 December 2020. All forms of future EU−UK relationship are worse for health than EU membership, but analysis of the negotiating texts shows some forms are better than others. The likely outcomes involve major negative effects for NHS staffing, funding for health and social care, and capital financing for the NHS; and for UK global leadership and influence. We expect minor negative effects for cross border healthcare (except in Northern Ireland); research collaboration; and data sharing, such as the Early Warning and Response System for health threats. Despite political narratives, the legal texts show that the UK seeks de facto continuity in selected key areas for pharmaceuticals, medical devices, and equipment [including personal protective equipment (PPE)], especially clinical trials, pharmacovigilance, and batch-testing. The UK will be excluded from economies of scale of EU membership, e.g. joint procurement programmes as used recently for PPE. Above all, there is a major risk of reaching an agreement with significant adverse effects for health, without meaningful oversight by or input from the UK Parliament, or other health policy stakeholders.
Emergency department (ED) staff need to be prepared for mass casualty incidents (MCIs) at all times. Didactic sessions, drills, and functional exercises have been shown to be effective, but it is challenging to find time and resources for appropriate training. We conducted brief, task-specific drills (deemed “disaster huddles”) in a pediatric ED (PED) to examine whether such an approach could be an alternative or supplement to traditional MCI training paradigms. Over the course of the study, we observed an improving trend in the overall score for administrative disaster preparedness. Disaster huddles may be an effective way to improve administrative disaster preparedness in the PED. Low-effort, low-time-commitment education could be an attractive addition to disaster preparedness efforts. Further studies are indicated to demonstrate an impact on lasting behavior and patient outcomes.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU and were required to measure their temperatures and complete a symptom questionnaire twice daily were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3 vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or a response to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Research participants want to receive results from studies in which they participate. However, health researchers rarely share the results of their studies beyond scientific publication. Little is known about the barriers researchers face in returning study results to participants.
Using a mixed-methods design, health researchers (N = 414) from more than 40 US universities were asked about barriers to providing results to participants. Respondents were recruited from universities with Clinical and Translational Science Award programs and Prevention Research Centers.
Respondents reported the percent of their research where they experienced each of the four barriers to disseminating results to participants: logistical/methodological, financial, systems, and regulatory. A fifth barrier, investigator capacity, emerged from data analysis. Training for research faculty and staff, promotion and tenure incentives, and funding agencies supporting dissemination of results to participants were solutions offered to overcoming barriers.
Study findings add to literature on research dissemination by documenting health researchers’ perceived barriers to sharing study results with participants. Implications for policy and practice suggest that additional resources and training could help reduce dissemination barriers and increase the return of results to participants.
Weeds can cause significant yield loss in watermelon production systems. Commercially acceptable weed control is difficult to achieve, even with heavy reliance on herbicides. A study was conducted to evaluate a spring-seeded cereal rye cover crop with different herbicide application timings for weed management between row middles in watermelon production systems. Common lambsquarters and pigweed species (namely, Palmer amaranth and smooth pigweed) densities and biomasses were often lower with cereal rye compared with no cereal rye, regardless of herbicide treatment. The presence of cereal rye did not negatively influence the number of marketable watermelon fruit, but average marketable fruit weight in cereal rye versus no cereal rye treatments varied by location. These results demonstrate that a spring-seeded cereal rye cover crop can help reduce weed density and weed biomass, and potentially enhance overall weed control. Cereal rye alone did not provide full-season weed control, so additional research is needed to determine the best methods to integrate spring cover cropping with other weed management tactics in watermelon for effective, full-season control.