Persons at clinical high-risk for psychosis (CHR) are characterised by specific neurocognitive deficits. However, the course of neurocognitive performance during the prodromal period and over the onset of psychosis remains unclear. The aim of this meta-analysis was to synthesise results from follow-up studies of CHR individuals to examine longitudinal changes in neurocognitive performance. Three electronic databases were systematically searched to identify articles published up to 31 December 2021. Thirteen studies met inclusion criteria. Study effect sizes (Hedges' g) were calculated and pooled for each neurocognitive task using random-effects meta-analyses. We examined whether changes in performance between baseline and follow-up assessments differed between: (1) CHR and healthy control (HC) individuals, and (2) CHR individuals who did (CHR-T) and did not (CHR-NT) transition to psychosis. Meta-analyses found that HC individuals had greater improvements in performance over time compared to CHR for letter fluency (g = −0.32, p = 0.029) and digit span (g = −0.30, p = 0.011) tasks. In addition, there were differences in the longitudinal performance of CHR-T and CHR-NT in the trail making test A (TMT-A) (g = 0.24, p = 0.014) and symbol coding (g = −0.51, p = 0.011). Whilst CHR-NT improved in performance on both tasks, CHR-T improved to a lesser extent in TMT-A and had worsened performance in symbol coding over time. Overall, neurocognitive performance generally improved in all groups at follow-up. However, evidence suggested that improvements were less pronounced for the overall CHR group, and specifically for CHR-T, in processing-speed tasks, which may be a relevant domain for interventions aimed at enhancing neurocognition in CHR populations.
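As an illustrative sketch (not the authors' code), the two quantities underlying this abstract — Hedges' g for a between-group comparison and a random-effects pooled estimate — can be computed as follows; the DerSimonian–Laird estimator is assumed here, since the abstract does not name the specific random-effects method:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with small-sample (Hedges) correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction factor
    g = j * d
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var_g

def pool_random_effects(effects):
    """DerSimonian-Laird random-effects pooled estimate from (g, variance) pairs."""
    gs = [g for g, _ in effects]
    vs = [v for _, v in effects]
    w = [1 / v for v in vs]
    fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)
    q = sum(wi * (gi - fixed) ** 2 for wi, gi in zip(w, gs))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1 / (v + tau2) for v in vs]
    pooled = sum(wi * gi for wi, gi in zip(w_re, gs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se
```

All inputs here (means, SDs, sample sizes) are placeholders; in the study, g would be computed per task per study from change scores and then pooled.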
To investigate factors that influence antibiotic prescribing decisions, we interviewed 49 antibiotic stewardship champions and stakeholders across 15 hospitals. We conducted thematic analysis and subcoding of decisional factors. We identified 31 factors that influence antibiotic prescribing decisions. These factors may help stewardship programs identify educational targets and design more effective interventions.
The COVID-19 pandemic and global climate change crisis remind us that widespread trust in the products of the scientific enterprise is vital to the health and safety of the global community. Insofar as appropriate responses to these (and other) crises require us to trust that enterprise, cultivating a healthier trust relationship between science and the public may be considered a collective public good. While it might appear that scientists can contribute to this good by taking more initiative to communicate their work to public audiences, we raise a concern about unintended consequences of an individualistic approach to such communication.
Among outpatients with coronavirus disease 2019 (COVID-19) due to the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) δ (delta) variant who had and had not received 2 vaccine doses, there was no difference in viral shedding at 7 days after symptom onset (cycle threshold difference, 0.59; 95% CI, −4.68 to 3.50; P = .77), with SARS-CoV-2 cultured from 2 (7%) of 28 and 1 (4%) of 26 outpatients, respectively.
As clinical trials were rapidly initiated in response to the COVID-19 pandemic, Data and Safety Monitoring Boards (DSMBs) faced unique challenges overseeing trials of therapies never tested in a disease not yet characterized. Traditionally, individual DSMBs do not interact or have the benefit of seeing data from other accruing trials for an aggregated analysis to meaningfully interpret safety signals of similar therapeutics. In response, we developed a compliant DSMB Coordination (DSMBc) framework to allow the DSMB from one study investigating the use of SARS-CoV-2 convalescent plasma to treat COVID-19 to review data from similar ongoing studies for the purpose of safety monitoring.
The DSMBc process included engagement of DSMB chairs and board members, execution of contractual agreements, secure data acquisition, generation of harmonized reports utilizing statistical graphics, and secure report sharing with DSMB members. Detailed process maps, a secure portal for managing DSMB reports, and templates for data sharing and confidentiality agreements were developed.
Four trials participated. Data from one trial were successfully harmonized with those of an ongoing trial. Harmonized reports allowing for visualization and drill-down into the data were presented to the ongoing trial’s DSMB. While DSMB deliberations are confidential, the Chair confirmed successful review of the harmonized report.
It is feasible to coordinate DSMB reviews of multiple independent studies of a similar therapeutic in similar patient cohorts. The materials presented mitigate challenges to DSMBc and will help expand these initiatives so DSMBs may make more informed decisions with all available information.
The timing of pulmonary valve replacement in patients with pulmonary regurgitation following treatment of pulmonary stenosis is undefined. Although cardiac magnetic resonance-based right ventricular volumes in tetralogy of Fallot patients have been used as a guide in pulmonary stenosis patients, anatomic differences between tetralogy of Fallot and pulmonary stenosis patients complicate their application to pulmonary stenosis patients and could result in late referral for pulmonary valve replacement. We sought to determine if pulmonary stenosis patients referred for pulmonary valve replacement were at greater risk for morbidity or need for tricuspid valve intervention at the time of pulmonary valve replacement. A retrospective cohort study was performed on all adult patients with a diagnosis of pulmonary stenosis or tetralogy of Fallot followed at our centre. Clinical and imaging-based exposures were collected. Pre-specified endpoints included need for concomitant tricuspid valve repair or replacement and pre- and post-pulmonary valve replacement cardiac magnetic resonance-based volumetric measurements. Between 1/1999 and 1/2020, 235 patients underwent pulmonary valve replacement for pulmonary regurgitation (52 with pulmonary stenosis, 183 with tetralogy of Fallot). Pulmonary stenosis patients were more likely to have at least moderate tricuspid regurgitation (p = 0.010), undergo concomitant tricuspid valve intervention (p = 0.003), and require tricuspid valve repair or replacement secondary to annular dilation (p = 0.027) compared to tetralogy of Fallot patients. There was no difference in pre-pulmonary valve replacement right ventricular size between pulmonary stenosis and tetralogy of Fallot patients. These findings suggest that referral for pulmonary valve replacement may be occurring later in the disease course for pulmonary stenosis patients.
The spatial and temporal extent of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) environmental contamination has not been precisely defined. We sought to elucidate contamination of different surface types and how contamination changes over time.
We sampled surfaces longitudinally within COVID-19 patient rooms, performed quantitative RT-PCR for the detection of SARS-CoV-2 RNA, and modeled distance, time, and severity of illness on the probability of detecting SARS-CoV-2 using a mixed-effects binomial model.
The probability of detecting SARS-CoV-2 RNA in a patient room did not vary with distance. However, surface type predicted the probability of detection: floors had the highest probability of detection (odds ratio [OR], 67.8; 95% credible interval [CrI], 36.3–131), followed by high-touch elevated surfaces (OR, 7.39; 95% CrI, 4.31–13.1). Increased surface contamination was observed in rooms where patients required high-flow oxygen, positive airway pressure, or mechanical ventilation (OR, 1.6; 95% CrI, 1.03–2.53). The probability of elevated surface contamination decayed with prolonged hospitalization, but the probability of floor detection increased with the duration of the local pandemic wave.
Distance from a patient’s bed did not predict SARS-CoV-2 RNA deposition in patient rooms, but surface type, severity of illness, and time from local pandemic wave predicted surface deposition.
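For readers interpreting the odds ratios above, an OR scales the baseline odds, not the probability directly. A minimal sketch of the conversion — using a made-up 10% baseline detection probability for illustration, not a figure from the study:

```python
def prob_from_or(baseline_prob: float, odds_ratio: float) -> float:
    """Detection probability implied by multiplying baseline odds by an odds ratio."""
    odds = baseline_prob / (1 - baseline_prob) * odds_ratio
    return odds / (1 + odds)

# Illustration only: if a reference surface had a 10% detection probability,
# an OR of 67.8 (floors) would imply roughly an 88% detection probability.
floor_prob = prob_from_or(0.10, 67.8)
```

This is why large ORs on common outcomes translate into smaller multiplicative changes in probability than the OR itself suggests.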
We prospectively surveyed SARS-CoV-2 RNA contamination in staff common areas within an acute-care hospital. An increasing prevalence of surface contamination was detected over time. After adjustment for patient census or community incidence of coronavirus disease 2019 (COVID-19), the proportion of contaminated surfaces did not predict healthcare worker COVID-19 infection on study units.
Healthcare personnel (HCP) with unprotected exposures to aerosol-generating procedures (AGPs) on patients with coronavirus disease 2019 (COVID-19) are at risk of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). A retrospective review at an academic medical center demonstrated an infection rate of <1% among HCP involved in AGPs without a respirator and/or eye protection.
Parkinson's disease (PD) is the second most common neurodegenerative disease after Alzheimer's disease and affects about 1% of the population over the age of 60 years in industrialised countries. The aim of this review is to examine nutrition in PD across three domains: dietary intake and the development of PD; whole-body metabolism in PD; and the effects of PD symptoms and treatment on nutritional status. In most cases, PD is believed to be caused by a combination of genetic and environmental factors and, although there has been much research in the area, evidence suggests that poor dietary intake is not a risk factor for the development of PD. The evidence about body weight changes in both the prodromal and symptomatic phases of PD is inconclusive and is confounded by many factors. Malnutrition in PD has been documented, as has sarcopaenia, although the prevalence of the latter remains uncertain due to a lack of consensus in the definition of sarcopaenia. PD symptoms, both gastrointestinal and non-gastrointestinal, are known to adversely affect nutritional status. Similarly, PD treatments can cause nausea, vomiting and constipation, all of which can adversely affect nutritional status. Given that the prevalence of PD will increase as the population ages, it is important to understand the interplay between PD, comorbidities and nutritional status. Further research may contribute to the development of interventional strategies to improve symptoms, augment care and, importantly, enhance the quality of life for patients living with this complex neurodegenerative disease.
The COVID-19 pandemic has placed significant strain on emergency departments (EDs) that were not designed to care for many patients who may be highly contagious. This report outlines how a busy urban ED was adapted to prepare for COVID-19 via 3 primary interventions: (1) creating an open-air care space in the ambulance bay to cohort, triage, and rapidly test patients with suspected COVID-19, (2) quickly constructing temporary doors on all open treatment rooms, and (3) adapting and expanding the waiting room. This description serves as a model by which other EDs can repurpose their own care spaces to help ensure safety of their patients and health care workers.
Background: NDM/OXA-23 carbapenemase-producing Acinetobacter baumannii isolates have been reported worldwide, but rarely in the United States. California acute-care hospital (ACH) A identified 3 patients with pan-nonsusceptible A. baumannii during May–June 2020, prompting a public health investigation to prevent further transmission within the regional healthcare network. Methods: A clinical isolate was defined as NDM/OXA-23–producing A. baumannii from a patient at ACH A or B, or an epidemiologically linked patient identified through colonization screening, during May 2020–January 2021. ACHs A and B are sentinel sites for carbapenem-resistant A. baumannii surveillance through the Antibiotic Resistance Laboratory Network (AR Lab Network), where isolates are tested for carbapenemase genes. The California Department of Public Health, together with 3 local health departments, conducted an epidemiological investigation, contact tracing, colonization screening, and whole-genome sequencing (WGS). Results: In total, 11 cases were identified during May 2020–January 2021, including 3 cases at ACH A during May–June 2020 and 8 additional cases during November 2020–January 2021: 5 at ACH A, 1 at ACH B, and 2 at skilled nursing facility (SNF) A. Isolates from ACHs A and B were identified through testing at the AR Lab Network. Of the 11 patients (including the index patient), 4 had exposure at SNF A, where 2 cases were identified through colonization screening. Screening conducted at ACH A and 5 other long-term care facilities (LTCFs) identified no additional cases. WGS results for the first 8 cases identified showed 2–13 single-nucleotide polymorphism differences. Antibiotic resistance genes for all isolates sequenced included NDM-1 and OXA-23. On-site assessments related to a COVID-19 outbreak conducted at ACH A identified infection control gaps.
Conclusions: Hospital participation in public health laboratory surveillance allows early detection of novel multidrug-resistant organisms (MDROs), which enabled outbreak identification and public health response. A high COVID-19 burden and related changes in infection control practices have been associated with MDRO transmission elsewhere in California. These factors might have contributed to spread at ACH A and hampered earlier screening efforts at SNF A, likely leading to undetected transmission. Extensive movement of positive patients within a regional healthcare network, including at least 6 ACHs and 7 LTCFs, likely contributed to the prolonged duration of this outbreak. This investigation highlights the importance of enhanced surveillance strategies for novel MDROs, coupled with strong infection prevention and control practices, in identifying outbreaks and preventing further transmission in regional networks.
It is trust policy that the VTE risk assessment should be completed for every patient admitted to wards. The standard for this audit is therefore 100% completion. We completed the audit in October 2018 and closed the loop in September 2019.
This was a cross-sectional study of all patients on all the wards according to the patient list on the electronic system (Paris) on a given date. In the first audit we used an audit tool from a similar audit performed in another area of the trust. For the re-audit we designed an audit tool to reflect the changes made in the electronic form.
In the re-audit, there was a noticeable improvement in the completion rate compared to the initial audit (95% vs. 82%); however, there was still under-performance. An interesting observation from the re-audit is that 74% of admissions had VTE risk assessment forms completed on the day of admission or the next day, compared to only 45% in the previous audit.
Looking at the completion of individual components of the VTE forms, there is still some room for improvement. For example, in 26% of patients there was no documentation of the use of prophylactic anticoagulants before admission, compared to 34% in our previous audit. Also, in 7% of patients there was no documentation of the outcome of the assessment, compared to only 3% in the previous audit.
This was an audit to assess the completion of electronic VTE forms as per trust policy. Following the initial audit we made recommendations to improve the completion rate. In the re-audit there was an improvement in the total completion rate, but we have not yet met the 100% goal.
We report detailed chemical and isotopic data from a subglacial siliceous deposit on andesitic bedrock recently exposed by glacier retreat. Whereas a single, <1 μm, Si-rich layer covers the highly polished bedrock on the up-glacier (stoss) surfaces, distinct, lithified deposits commonly occur at the lee of small bedrock protuberances, on a scale <0.1 meter. The deposit is millimeters in thickness and consists of laminae tens to hundreds of microns thick that differ from one another in color, rock-fragment abundance and chemical composition. Ca-rich laminae that are sufficiently enriched in uranium (~2–50 ppm) to permit U-series isotopic analysis suggest that the subglacial deposit formed 10–20 ka, much earlier than previously assumed. We conclude that (1) the siliceous deposit persisted for at least 10 000 years despite the intervening erosion and weathering, (2) distinct episodes of formation due to significant changes in hydrology and water chemistry are recorded in the deposit, and (3) a siliceous slurry may have existed at the ice-rock interface and influenced the local friction. This work reinforces earlier findings that subglacial chemical deposits can form and persist on geologic time scales and may have implications for the role of the cryosphere in the Earth's geochemical cycles and climate system.
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
A 399-bed, tertiary-care, Veterans’ Affairs medical center.
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
Following the implementation of cascade reporting, mean monthly meropenem (P = .005) and piperacillin/tazobactam (P = .002) consumption decreased, and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not change significantly.
Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
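The NHSN consumption metric used throughout this abstract (DOT per 1,000 days present) is a simple rate; a minimal sketch, with made-up numbers for illustration:

```python
def dot_per_1000_dp(days_of_therapy: float, days_present: float) -> float:
    """NHSN antimicrobial-use rate: days of therapy per 1,000 days present.

    A 'day of therapy' counts one drug administered on one calendar day;
    'days present' counts patient-days on the unit(s) in the same period.
    """
    if days_present <= 0:
        raise ValueError("days_present must be positive")
    return 1000.0 * days_of_therapy / days_present

# Hypothetical month: 50 meropenem DOT over 25,000 days present -> 2.0 per 1,000 DP
rate = dot_per_1000_dp(50, 25000)
```

A monthly decrease of 2.16 in this rate (as reported for ciprofloxacin) therefore corresponds to roughly 54 fewer DOT per month at 25,000 days present — numbers here are illustrative, not from the study.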
ABSTRACT IMPACT: The Indiana Clinical and Translational Sciences Institute K-12 STEM Outreach Program pivoted to a virtual program in summer 2020, which yielded novel approaches that could be retained in future years to extend the reach and impact of our pipeline program. OBJECTIVES/GOALS: Provide students with a meaningful and safe research experience during the COVID-19 pandemic; develop new modules and approaches that could be delivered virtually; and engage students from communities that could not participate in previous years, when in-person meetings were required. METHODS/STUDY POPULATION: The program has supported over 100 high school students per year in a summer research internship for the last 5 years. Students are placed with academic research mentors in various schools and departments across the IUPUI campus, and also with industry laboratories. COVID-related restrictions required development of a 100% virtual program. Key aspects of the virtual program included cohort-based research mentor assignments, with 1–4 mentees matched per research mentor; research projects that could be conducted virtually; heavy engagement of high-school teachers to facilitate the research experience with cohorts of mentees; and a more rigorous virtual seminar series that included new modules, such as COVID-specific programming, thereby enhancing public education about COVID. RESULTS/ANTICIPATED RESULTS: The program served 130 students in summer 2020. We were able to recruit new faculty and industry mentors involved in data science research and have thereby increased our mentor pool to serve more students in the future. Because student participation was virtual, we were able to accept students from further distances (up to 120 miles away) across the state, as well as local economically disadvantaged students who might not otherwise have been able to participate because of a lack of reliable transportation. A positive unanticipated outcome was that mentees' relationships with their mentors were established virtually, increasing the potential for students to remain engaged in their research. DISCUSSION/SIGNIFICANCE OF FINDINGS: Adapting to a virtual platform provided research experience to high school students during a time when traditional approaches were not possible. Given that some research experiences do not require in-person activities, this newly established model could be used moving forward to allow more statewide engagement in research experiences.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19) with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights 3 critical healthcare epidemiology research questions, with detailed descriptions provided in supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
Background: Candida dubliniensis is a worldwide fungal opportunistic pathogen, closely related to C. albicans. Originally identified in patients infected with HIV in Dublin, Ireland, C. dubliniensis has emerged as a pathogen in other immunocompromised individuals, including patients receiving chemotherapy and transplant recipients. Pediatric epidemiological data for this organism are limited. Methods: We report a descriptive review of C. dubliniensis isolates recovered between January 2018 and June 2019 at a large tertiary-care pediatric institution in Columbus, Ohio. Results: C. dubliniensis was identified in 48 patients in the 18-month review period. In total, 67 positive cultures were collected in these patients, with the following distribution of sources: 44 sputum (66%), 11 bronchoalveolar lavage fluid (16%), 4 blood (6%), 3 wounds (4%), 2 esophageal (3%), 2 peritoneal fluid (3%), and 1 vaginal (1%). Of the 48 patients in whom C. dubliniensis was identified, 35 (73%) were patients with cystic fibrosis. Also, 8 patients (17%) were considered to have clinical infections and received antifungal therapy: 3 patients with pneumonia, 2 patients with esophagitis, 1 patient with peritonitis, 1 patient with catheter-related bloodstream infection, and 1 patient with disseminated candidiasis. The remaining 40 patients (83%) were considered colonized. Conclusions: We report a descriptive series over 18 months of clinical isolates with C. dubliniensis recovery at a pediatric institution. Most isolates were identified as colonizing strains in patients with cystic fibrosis. C. dubliniensis was a rare cause of invasive disease in our institution, with only 8 cases identified.
Background: Audit-and-feedback interventions track clinician practice patterns for a targeted practice behavior. Audit and feedback of antibiotic prescribing data for acute respiratory infections (ARI) is an effective stewardship strategy that relies on administrative coding to identify eligible visits for audit. Diagnostic shifting is the misclassification of a patient’s diagnosis in response to audit and feedback and is a potential unintended consequence of such interventions. Objective: To develop a method to identify patterns consistent with diagnostic shifting, including both positive shifting (improved diagnosis and documentation) and negative shifting (intentionally altering the documented diagnosis to justify antibiotic prescribing), after implementation of an audit-and-feedback intervention to improve ARI management. Methods: We evaluated the intervention effect on diagnostic shifting within 12 University of Utah pediatric clinics (293 providers). Data included 66,827 ARI diagnoses: pneumonia, sinusitis, bronchitis, pharyngitis, upper respiratory infection (URI), acute otitis media (AOM), or serous otitis with effusion (OME). To determine whether rates of ARI diagnoses changed after the intervention, we developed logistic generalized estimating equation (GEE) models with robust sandwich standard error estimates to account for clustering by clinic. Outcomes included the change in each ARI diagnosis relative to the competing 6 diagnoses included in audit-and-feedback reports before and after intervention implementation. Models tested for a change in outcomes after the intervention (ie, diagnostic shift) after adjustment for month of diagnosis. For each diagnosis, we estimated the population attributable fraction (PAF) for antibiotic prescriptions due to combined shifts in diagnostic frequencies and prescription rates for each diagnosis.
The PAF is the estimated fraction of antibiotic prescriptions that would have changed under a population-level intervention. Results: In month-adjusted analyses, diagnoses of pneumonia and OME decreased after the intervention: odds ratio (OR), 0.46 (95% CI, 0.31–0.68) and OR, 0.81 (95% CI, 0.67–0.99), respectively. In addition, URI diagnoses increased: OR, 1.05 (95% CI, 1.00–1.11). We did not detect changes in the diagnosis rates of sinusitis, AOM, bronchitis, or pharyngitis after the intervention. The intervention effect on the PAF for antibiotic prescriptions was consistently positive but relatively small in magnitude. The PAF was highest for URIs (8.87%), followed by AOM (3.56%) and sinusitis (2.76%), and was lowest for pneumonia and bronchitis (0.41% for both). Conclusions: Our analysis found minimal evidence overall of diagnostic shifting after a stewardship intervention using audit and feedback in these pediatric clinics. Small changes in diagnostic coding may reflect more appropriate diagnosis and coding — a positive effect of audit and feedback — rather than intentional negative diagnostic shift.
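The study estimated the PAF from model-based shifts in diagnostic frequencies and prescription rates; the classic closed form for an attributable fraction, shown here purely as an assumed illustration of the concept, is Levin's formula:

```python
def levin_paf(p_exposed: float, relative_risk: float) -> float:
    """Levin's population attributable fraction (illustrative; the study's
    PAF was derived from GEE model output, not this closed form):

        PAF = p(RR - 1) / (1 + p(RR - 1))

    where p is the prevalence of exposure and RR the relative risk.
    """
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical example: 50% exposure prevalence with RR = 2 gives PAF = 1/3,
# i.e., a third of outcomes are attributable to the exposure.
example = levin_paf(0.5, 2.0)
```

The interpretation carries over to the abstract: a PAF of 8.87% for URIs means that at most that fraction of antibiotic prescriptions is attributable to the combined diagnostic and prescribing shifts for that diagnosis.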