Over a 2-year period, we identified Transmission from Room Environment Events (TREE) across the Johns Hopkins Health System, defined as events in which the subsequent room occupant acquired the same organism, with the same antimicrobial susceptibilities, as the patient who had previously occupied that room. Overall, the TREE rate was 50 per 100,000 inpatient days.
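The rate arithmetic behind this figure is standard surveillance math: events divided by the inpatient-day denominator, scaled to 100,000. A minimal sketch in Python, using illustrative counts rather than the study's actual totals:

```python
# Surveillance rates are expressed as events per 100,000 inpatient days.
# Illustrative counts only; these are not the study's actual totals.
tree_events = 12          # hypothetical number of TREE events observed
inpatient_days = 24_000   # hypothetical inpatient-day denominator

rate_per_100k = tree_events / inpatient_days * 100_000
print(f"TREE rate: {rate_per_100k:.0f} per 100,000 inpatient days")  # 50
```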
Clostridioides difficile infection (CDI) may be misdiagnosed if testing is performed in the absence of signs or symptoms of disease. This study sought to support appropriate testing by estimating the impact of signs, symptoms, and healthcare exposures on pre-test likelihood of CDI.
Methods:
A panel of 15 experts in infectious diseases participated in a modified UCLA/RAND Delphi study to estimate likelihood of CDI. Consensus, defined as agreement by >70% of panelists, was assessed via a REDCap survey. Items without consensus were discussed in a virtual meeting followed by a second survey.
Results:
All 15 panelists completed both surveys (100% response rate). In the initial survey, consensus was present on 6 of 15 (40%) items related to risk of CDI. After panel discussion and clarification of questions, consensus (>70% agreement) was reached on all remaining items in the second survey. Antibiotics were identified as the primary risk factor for CDI and grouped into three categories: high-risk (likelihood ratio [LR] 7, 93% agreement among panelists in first survey), low-risk (LR 3, 87% agreement in first survey), and minimal-risk (LR 1, 71% agreement in first survey). Other major factors included new or unexplained severe diarrhea (e.g., ≥10 liquid bowel movements per day; LR 5, 100% agreement in second survey) and severe immunosuppression (LR 5, 87% agreement in second survey).
Conclusion:
Infectious disease experts concurred on the importance of signs, symptoms, and healthcare exposures for diagnosing CDI. The resulting risk estimates can be used by clinicians to optimize CDI testing and treatment.
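Likelihood ratios such as these combine with a clinician's pre-test probability through the odds form of Bayes' theorem: post-test odds = pre-test odds × LR. A minimal sketch (the 10% pre-test probability is an assumed value for illustration, not a figure from the study):

```python
def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Convert a pre-test probability into a post-test probability via
    the odds form of Bayes' theorem: post-odds = pre-odds * LR."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Illustrative only: a 10% pre-test probability of CDI combined with
# the panel's high-risk antibiotic likelihood ratio of 7.
print(f"{post_test_probability(0.10, 7):.2f}")  # ~0.44
```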
Hypothetical model structures for magadiite and sodium octosilicate, based on the structure of the zeolite dachiardite, are proposed that consist of layers of 6-membered rings of tetrahedra and blocks containing 5-membered rings attached to both sides of the layers. The infrared (IR) and nuclear magnetic resonance spectra of magadiite and sodium octosilicate have features in common with the spectra of zeolites in the ZSM-5 and mordenite groups. A peak at 1225 cm⁻¹ in the IR spectra of magadiite and sodium octosilicate is characteristic of zeolites containing 5-membered rings, such as ZSM-5- and mordenite-type zeolites. The defect structures of pentasil zeolites may therefore be akin to layered alkali metal silicates containing zeolite-like domains, in which some of the silanol groups on adjacent silicate layers are condensed (cross-linked), forming siloxane linkages.
Exposure investigations are labor intensive and vulnerable to recall bias. We developed an algorithm to identify healthcare personnel (HCP) interactions from the electronic health record (EHR), and we evaluated its accuracy against conventional exposure investigations. The EHR algorithm identified every known transmission and used ranking to produce a manageable contact list.
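The abstract does not describe the algorithm's internals. As a rough sketch of the general approach, assuming the EHR yields interaction records of the form (HCP, patient, start, end), counting interactions that overlap the exposure window and ranking HCP by count could look like this (all identifiers and records below are hypothetical):

```python
from collections import Counter
from datetime import datetime

def rank_contacts(records, index_patient, window_start, window_end):
    """Rank healthcare personnel by the number of documented
    interactions with the index patient during the exposure window."""
    counts = Counter()
    for hcp, patient, start, end in records:
        if patient != index_patient:
            continue
        # Keep only interactions that overlap the exposure window.
        if start <= window_end and end >= window_start:
            counts[hcp] += 1
    return counts.most_common()  # highest-contact HCP first

# Hypothetical EHR-derived records: (hcp_id, patient_id, start, end).
records = [
    ("rn_01", "pt_A", datetime(2021, 3, 1, 8), datetime(2021, 3, 1, 9)),
    ("rn_01", "pt_A", datetime(2021, 3, 1, 14), datetime(2021, 3, 1, 15)),
    ("md_02", "pt_A", datetime(2021, 3, 1, 10), datetime(2021, 3, 1, 11)),
]
print(rank_contacts(records, "pt_A",
                    datetime(2021, 3, 1), datetime(2021, 3, 2)))
```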
Central-line–associated bloodstream infection (CLABSI) surveillance in home infusion therapy is necessary to track efforts to reduce infections, but a standardized, validated, and feasible definition is lacking. We tested the validity of a home-infusion CLABSI surveillance definition and the feasibility and acceptability of its implementation.
Design:
Mixed-methods study including validation of CLABSI cases and semistructured interviews with staff applying these approaches.
Setting:
This study was conducted in 5 large home-infusion agencies in a CLABSI prevention collaborative across 14 states and the District of Columbia.
From May 2021 to May 2022, agencies implemented a home-infusion CLABSI surveillance definition, using 3 approaches to secondary bloodstream infections (BSIs): National Healthcare Safety Network (NHSN) criteria, modified NHSN criteria (applying only the 4 most common NHSN-defined secondary BSIs), and all home-infusion–onset bacteremia (HiOB). Data on all positive blood cultures were sent to an infection preventionist for validation. Surveillance staff underwent semistructured interviews focused on their perceptions of the definition at 1 month and at 3–4 months after implementation.
Results:
Interrater reliability was κ = 0.65 for the modified NHSN criteria, κ = 0.68 for the NHSN criteria, and κ = 0.72 for the HiOB criteria. For the NHSN criteria, the agency-determined rate was 0.21 per 1,000 central-line (CL) days, and the validator-determined rate was 0.20 per 1,000 CL days. Overall, implementing a standardized definition was seen as a positive change that would be generalizable and feasible, though time-consuming and labor-intensive.
Conclusions:
The home-infusion CLABSI surveillance definition was valid and feasible to implement.
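The interrater reliability statistic used here is Cohen's κ, which adjusts the raw agreement between agency and validator determinations for the agreement expected by chance: κ = (p_observed − p_expected) / (1 − p_expected). A minimal sketch with made-up determinations (1 = CLABSI, 0 = not a CLABSI):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items:
    kappa = (p_obs - p_exp) / (1 - p_exp)."""
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum(
        (rater_a.count(label) / n) * (rater_b.count(label) / n)
        for label in labels
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Made-up agency vs validator calls on 10 positive blood cultures.
agency    = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
validator = [1, 0, 0, 0, 0, 0, 1, 0, 1, 0]
print(f"kappa = {cohens_kappa(agency, validator):.2f}")  # 0.52
```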
Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmissions among healthcare workers and hospitalized patients are challenging to confirm. Investigation of infected persons often reveals multiple potential risk factors for viral acquisition. We combined exposure investigation with genomic analysis, confirming 2 hospital-based clusters. Prolonged close contact with unmasked, unrecognized infectious individuals was a common risk.
The aim of this study was to develop a novel live Delphi method to obtain a consensus on the skills and competencies that a new ENT registrar (specialty trainee level 3) should possess. Developing a clear outcome set for core surgical trainees is important so that this phase of training can be directed at specific aims.
Method
Attendees at the North of England meeting participated in this Delphi exercise. Participants comprised a range of ENT professionals, from medical students to consultant surgeons. Consensus, the main outcome measure, was defined prior to the study by the median response value: 'strongly agree' or higher for positive consensus and 'strongly disagree' or lower for negative consensus.
Results
This study identified multiple areas that reached consensus relating to elective and operative skills and demonstrated agreement in areas relating to ENT specific and allied specialty experience.
Conclusion
This study has highlighted a novel method for shaping surgical curricula.
We analyzed the impact of a 7-day recurring asymptomatic SARS-CoV-2 testing protocol for all patients hospitalized at a large academic center. Overall, 40 new cases were identified, and 1 in 3 occurred after 14 days of hospitalization. Recurring testing can identify unrecognized infections, especially during periods of elevated community transmission.
Physical distancing among healthcare workers (HCWs) is an essential strategy for preventing HCW-to-HCW transmission of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2).
Objective:
To understand barriers to physical distancing among HCWs on an inpatient unit and identify strategies for improvement.
Design:
Qualitative study including observations and semistructured interviews conducted over 3 months.
Setting:
A non–COVID-19 adult general medical unit in an academic tertiary-care hospital.
Participants:
HCWs based on the unit.
Methods:
We performed a qualitative study in which we (1) observed HCW activities and proximity to each other on the unit during weekday shifts from July to October 2020 and (2) conducted semistructured interviews of HCWs to understand their experiences with and perspectives of physical distancing in the hospital. Qualitative data were coded based on a human-factors engineering model.
Results:
We completed 25 hours of observations and 20 HCW interviews. High-risk interactions often occurred during handoffs of care at shift changes and patient rounds, when HCWs gathered regularly in close proximity for at least 15 minutes. Identified barriers included spacing and availability of computers, the need to communicate confidential patient information, and the desire to maintain relationships at work.
Conclusions:
Physical distancing can be improved in hospitals by restructuring computer workstations, work rooms, and break rooms; applying visible cognitive aids; adapting shift times; and supporting rounds and meetings with virtual conferencing. Additional strategies to promote staff adherence include rewarding positive behaviors, having peer leaders model physical distancing, and encouraging additional avenues for social connection at a safe distance.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
To define a generic diet to protect human health and food system sustainability based on three dimensions: animal:plant ratio, degree of food processing and food diversity.
Design/setting:
The percentages of maximum animal and ultra-processed energy content were evaluated from scientific papers (Web of Science database) and reports from international scientific institutions. A standard weekly French diet incorporating these percentages and food diversity (≥42 different foods) was then designed to test its adequacy to nutritional needs.
Results:
Based on traditional and scientifically based healthy diets, and on foresight scenarios for sustainable diets at the 2050 horizon, a median daily animal energy intake of 15% was found to be protective of both human health and the environment. Based on epidemiological studies associating ultra-processed energy consumption with increased risk of overweight and obesity, a precautionary threshold of approximately 15% ultra-processed energy content was derived. The French diet meets all nutritional needs and satisfies other nutritional indicators, such as maximum salt and simple sugar intake, the α-linolenic acid:linoleic acid ratio, and essential amino acid requirements. This diet was named the '3V rule' for Végétal (plant), Vrai (real), and Varié (varied; if possible organic, local, and seasonal). This generic diet can be adapted to regional traditions and environmental characteristics. Excluding even one of its dimensions would threaten both health and food system sustainability.
Conclusions:
Tending towards a 3V-based diet, while respecting local constraints, should help preserve human health, the environment (greenhouse gas emissions, pollution, deforestation, etc.), small farms, animal welfare, biodiversity, culinary traditions, and socioeconomic factors (including an alleviation of public health costs).
No standardized surveillance criteria exist for surgical site infection after breast tissue expander (BTE) access. This report provides a framework for defining postaccess BTE infections and identifies factors that contribute to infection during the expansion period. Implementing infection prevention guidelines for BTE access may reduce postaccess BTE infections.
Using 2,942 high-touch surfaces (HTSs) in 228 rooms on 13 units, we compared fluorescent gel removal rates calculated from fewer HTSs and rooms and determined the optimum numbers of HTSs and rooms needed to ensure accuracy. Randomly selecting 3 HTSs in 2 rooms predicted the overall removal rate.
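The underlying question is how small a random sample of surfaces and rooms can still track the unit-wide removal rate. A minimal resampling sketch, with synthetic outcomes standing in for the study's observations (the 60% per-surface removal probability and uniform room size are assumptions):

```python
import random

random.seed(42)

# Synthetic stand-in data: 228 rooms, 13 HTS outcomes each (True = removed).
# An assumed 60% removal probability; not the study's actual data.
rooms = [[random.random() < 0.60 for _ in range(13)] for _ in range(228)]
overall = sum(map(sum, rooms)) / sum(len(r) for r in rooms)

def sampled_rate(rooms, n_rooms, n_hts):
    """Estimate the removal rate from n_hts surfaces in each of n_rooms rooms."""
    picked = random.sample(rooms, n_rooms)
    outcomes = [o for room in picked for o in random.sample(room, n_hts)]
    return sum(outcomes) / len(outcomes)

estimates = [sampled_rate(rooms, n_rooms=2, n_hts=3) for _ in range(1_000)]
mean_est = sum(estimates) / len(estimates)
print(f"overall {overall:.2f}, mean of sampled estimates {mean_est:.2f}")
```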
Targeted screening for carbapenem-resistant organisms (CROs), including carbapenem-resistant Enterobacteriaceae (CRE) and carbapenemase-producing organisms (CPOs), remains limited; recent data suggest that existing policies miss many carriers.
Objective:
Our objective was to measure the prevalence of CRO and CPO perirectal colonization at hospital unit admission and to use machine learning methods to predict probability of CRO and/or CPO carriage.
Methods:
We performed an observational cohort study of all patients admitted to the medical intensive care unit (MICU) or solid organ transplant (SOT) unit at The Johns Hopkins Hospital between July 1, 2016 and July 1, 2017. Admission perirectal swabs were screened for CROs and CPOs. More than 125 variables capturing preadmission clinical and demographic characteristics were collected from the electronic medical record (EMR) system. We developed models to predict colonization probabilities using decision tree learning.
Results:
Evaluating 2,878 admission swabs from 2,165 patients, we found that 7.5% and 1.3% of swabs were CRO and CPO positive, respectively. Organism and carbapenemase diversity among CPO isolates was high. Despite including many characteristics commonly associated with CRO/CPO carriage or infection, the decision tree models predicted CRO and CPO colonization poorly overall (C statistics, 0.57 and 0.58, respectively). In subgroup analyses, however, the models did accurately identify patients with recent CRO-positive cultures who used proton-pump inhibitors as having a high likelihood of CRO colonization.
Conclusions:
In this inpatient population, CRO carriage was infrequent but was higher than previously published estimates. Despite including many variables associated with CRO/CPO carriage, models poorly predicted colonization status, likely due to significant host and organism heterogeneity.
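As a rough illustration of the modeling approach (not the authors' actual pipeline), a decision tree trained on admission features and evaluated by C statistic, i.e., the area under the ROC curve, might look like this in scikit-learn. The features and outcomes below are synthetic assumptions, not the study's EMR variables:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-ins for preadmission EMR features and CRO colonization;
# ~7.5% positive, matching the prevalence reported above.
X = rng.normal(size=(2878, 10))
y = rng.binomial(1, 0.075, size=2878)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)

# The C statistic for a binary outcome is the ROC AUC of the predicted
# probabilities; pure-noise features give ~0.5 here.
auc = roc_auc_score(y_test, tree.predict_proba(X_test)[:, 1])
print(f"C statistic: {auc:.2f}")
```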
In this systematic evaluation of fluorescent gel markers (FGMs) applied to high-touch surfaces with either a purpose-made metered applicator (MA) or a generic cotton swab (CS), removal rates were 60.5% (476 of 787) for the MA and 64.3% (506 of 787) for the CS. Interpretation of MA-FGM removal was more consistent (83% vs 50% not removed), possibly due to less varied application and a more adhesive gel.
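The abstract reports no significance test, but the difference between the two applicators' removal rates could be checked with a standard two-proportion (chi-square) test on the counts above; this sketch is purely illustrative:

```python
from scipy.stats import chi2_contingency

# Counts from the abstract: removed vs not removed, per applicator.
table = [[476, 787 - 476],   # metered applicator
         [506, 787 - 506]]   # cotton swab
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```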
Using samples collected for VRE surveillance, we evaluated unit admission prevalence of carbapenem-resistant Enterobacteriaceae (CRE) perirectal colonization and whether CRE carriers (unknown to staff) were on contact precautions for other indications. CRE colonization at unit admission was infrequent (3.9%). Most CRE carriers were not on contact precautions, representing a reservoir for healthcare-associated CRE transmission.