Testing of asymptomatic patients for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) (ie, “asymptomatic screening”) to attempt to reduce the risk of nosocomial transmission has been extensive and resource intensive, and such testing is of unclear benefit when added to other layers of infection prevention controls. In addition, the logistical challenges and costs of implementing a screening program, data showing the lack of substantial aerosol generation during elective controlled intubation, extubation, and other procedures, and the adverse patient and facility consequences of asymptomatic screening call into question the utility of this infection prevention intervention. Consequently, the Society for Healthcare Epidemiology of America (SHEA) recommends against routine universal asymptomatic screening for SARS-CoV-2 in healthcare facilities. Specifically, preprocedure asymptomatic screening is unlikely to provide incremental benefit in preventing SARS-CoV-2 transmission in the procedural and perioperative environment when other infection prevention strategies are in place, and it should not be considered a requirement for all patients. Admission screening may be beneficial during times of increased virus transmission in some settings where other layers of controls are limited (eg, behavioral health, congregate care, or shared patient rooms), but widespread routine use of admission asymptomatic screening is not recommended over strengthening other infection prevention controls. In this commentary, we outline the challenges surrounding the use of asymptomatic screening, including the logistics and costs of implementing a screening program and the adverse patient and facility consequences. We review data pertaining to the lack of substantial aerosol generation during elective controlled intubation, extubation, and other procedures, and we provide guidance for the limited circumstances in which asymptomatic screening for SARS-CoV-2 may be considered.
To determine the impact of a documented penicillin or cephalosporin allergy on the development of surgical site infections (SSIs).
Background:
Appropriate preoperative antibiotic prophylaxis reduces SSI risk, but documented antibiotic allergies influence the choice of prophylactic agents. Few studies have examined the relationship between a reported antibiotic allergy and the risk of SSI, or the extent to which this relationship is modified by the antibiotic class given for prophylaxis.
Methods:
We conducted a retrospective cohort study of adult patients undergoing coronary artery bypass, craniotomy, spinal fusion, laminectomy, hip arthroplasty, and knee arthroplasty at 3 hospitals from July 1, 2013, to December 31, 2017. We built a multivariable logistic regression model to calculate the adjusted odds ratio (aOR) of developing an SSI among patients with and without patient-reported penicillin or cephalosporin allergies. We also examined effect measure modification (EMM) to determine whether the choice of surgical prophylaxis affected the association between reported allergy and SSI.
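As a rough illustration of this modeling approach, the sketch below fits a multivariable logistic regression and exponentiates the allergy coefficient to obtain an aOR. It is a minimal sketch in Python (statsmodels); the file and column names (procedures.csv, ssi, reported_allergy, asa_score, procedure) are hypothetical stand-ins, and the study's actual covariate set was certainly richer.

```python
# Hypothetical sketch of a multivariable logistic regression for an adjusted OR.
# Column names are assumptions, not the study's actual variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("procedures.csv")  # one row per procedure (hypothetical file)

# SSI (0/1) is the outcome; reported allergy (0/1) is the exposure;
# the remaining terms adjust the odds ratio for confounding.
model = smf.logit(
    "ssi ~ reported_allergy + age + C(asa_score) + C(procedure)", data=df
).fit()

params = model.params
conf = model.conf_int()
print("aOR:", np.exp(params["reported_allergy"]))
print("95% CI:", np.exp(conf.loc["reported_allergy"]).values)
```

EMM by prophylactic agent could then be assessed by adding an interaction term such as reported_allergy:prophylaxis_class (also hypothetical) and testing its coefficient.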
Results:
We analyzed 39,972 procedures; 1,689 (4.2%) involved patients with a documented penicillin or cephalosporin allergy, and 374 procedures (0.9%) resulted in an SSI. Patients with a reported penicillin or cephalosporin allergy were more likely to develop an SSI than patients who did not report such an allergy (adjusted odds ratio, 3.26; 95% confidence interval, 2.71–3.93). Surgical prophylaxis did not significantly modify this association.
Conclusions:
Patients who reported a penicillin or cephalosporin allergy had higher odds of developing an SSI than nonallergic patients. However, this increase in odds was not modified by the type of surgical prophylaxis. Instead, a reported allergy may be a surrogate marker for a more complicated patient population.
To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Population:
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
Data Sources:
2013–2017 data from CMS Hospital Compare, Provider of Service files, and Medicare cost reports.
Methods:
We used a difference-in-differences model with hospital fixed effects to compare California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
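The sketch below illustrates one common way to estimate such a difference-in-differences model with hospital and year fixed effects. All variable and file names are hypothetical; the study's year-specific estimates correspond to interacting the California indicator with each post-mandate year rather than the single post indicator used here for brevity.

```python
# Illustrative difference-in-differences with hospital and year fixed effects.
# Variables (sir, california, post_mandate, hospital_id, year, ...) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("hospital_sirs.csv")  # hospital-year panel (hypothetical)

# Hospital fixed effects absorb time-invariant hospital differences; year fixed
# effects absorb national trends; the interaction is the DiD estimate of the mandate.
did = smf.ols(
    "sir ~ california:post_mandate + C(hospital_id) + C(year)"
    " + teaching + beds + icu_share",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["hospital_id"]})

print(did.params["california:post_mandate"])
```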
Results:
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. The mandate was associated with a 20% decrease (P < .001) in the CDI SIR in 2017 only.
Conclusions:
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
In alcoholism, one relevant mechanism contributing to relapse is exposure to stimuli associated with alcohol intake. Such conditioned cues can elicit conditioned responses such as alcohol craving and consumption. In the last decade, considerable progress has been made in identifying the basic neuronal mechanisms that underlie cue-induced alcohol craving.
Objectives/aims
We explored whether functional brain activation during exposure to alcohol-associated stimuli is related to the prospective relapse risk in detoxified alcohol-dependent patients.
Methods
In total, 46 alcohol-dependent patients and 46 healthy volunteers participated in an fMRI study using a cue-reactivity paradigm in which visual alcohol-related and control stimuli were presented. Patients were followed up for 3 months, and the data were then analysed according to subsequent relapse status, yielding 16 abstainers and 30 relapsers.
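A first-level contrast of alcohol-related versus neutral stimuli in such a paradigm is typically computed with a GLM; a minimal sketch using nilearn follows. The file names, TR, event timings, and labels are assumptions for illustration, not details from this study.

```python
# Hypothetical first-level GLM for a cue-reactivity contrast (alcohol > neutral).
from nilearn.glm.first_level import FirstLevelModel
import pandas as pd

# Toy event table; real paradigms have many more trials.
events = pd.DataFrame({
    "onset":      [0.0, 12.0, 24.0, 36.0],
    "duration":   [6.0, 6.0, 6.0, 6.0],
    "trial_type": ["alcohol", "neutral", "alcohol", "neutral"],
})

model = FirstLevelModel(t_r=2.0, hrf_model="spm", smoothing_fwhm=8.0)
model = model.fit("subject01_bold.nii.gz", events=events)  # hypothetical file

# z-map for alcohol-related vs. neutral stimuli; group comparisons (patients vs.
# controls, abstainers vs. relapsers) would feed these maps to a second-level model.
zmap = model.compute_contrast("alcohol - neutral", output_type="z_score")
zmap.to_filename("alcohol_minus_neutral_z.nii.gz")
```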
Results
Alcohol-related versus neutral stimuli activated a frontocortical-limbic network, including the inferior, medial, and middle frontal gyri as well as the putamen, in patients relative to healthy controls. Moreover, abstainers showed stronger activation in the orbitofrontal cortex and midbrain during presentation of alcohol-related cues, whereas relapsers showed stronger activation of the cingulate gyrus.
Conclusions
This study suggests that cue-induced activation of the orbitofrontal cortex and the dopaminergically innervated midbrain is negatively associated with prospective relapse risk in alcohol-dependent patients. This could indicate more pronounced and conscious processing of alcohol cues, which might serve as a warning signal and support behavioural control. In contrast, prospective relapsers showed stronger activation of the cingulate gyrus, a region involved in the attribution of motivational value.
Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina state health department since 1992. The state-coordinated review of these HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and rates of hospital-associated methicillin-resistant Staphylococcus aureus (MRSA) bacteremia and Clostridioides difficile infection (CDI) in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements would have significantly lower MRSA and CDI rates.
Participants:
All US states.
Design:
Observational longitudinal study.
Methods:
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were the MRSA standardized infection ratio (SIR) and the CDI SIR. The key explanatory variable was the percentage of hospitals in each state that met the Core Elements. We estimated state and time fixed-effects models with time-variant controls, and we weighted our analyses by the number of hospitals in each state.
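A hedged sketch of such a weighted state-and-time fixed-effects regression follows (Python, statsmodels). Unlike the hospital-level difference-in-differences sketch above, the unit here is the state-year, the exposure is continuous, and observations are weighted by hospital count; the data file and column names are hypothetical.

```python
# Illustrative state-year fixed-effects model, weighted by hospitals per state.
# Variable names (cdi_sir, pct_core_elements, n_hospitals, ...) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

states = pd.read_csv("state_panel.csv")  # state-year panel (hypothetical)

fe = smf.wls(
    "cdi_sir ~ pct_core_elements + C(state) + C(year)"
    " + teaching_share + avg_beds",
    data=states,
    weights=states["n_hospitals"],  # states with more hospitals count more
).fit()

# Change in CDI SIR per 1-point increase in the share of hospitals
# reporting compliance with the Core Elements.
print(fe.params["pct_core_elements"])
```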
Results:
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported ASP compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Conclusions:
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short study period and the variety of stewardship strategies that ASPs may encompass.
To update current estimates of non–device-associated pneumonia (ND pneumonia) rates and their frequency relative to ventilator-associated pneumonia (VAP), and to identify risk factors for ND pneumonia.
Design:
Cohort study.
Setting:
Academic teaching hospital.
Patients:
All adult hospitalizations between 2013 and 2017 were included. Pneumonia cases (device associated and non–device associated) were captured through comprehensive, hospital-wide active surveillance using CDC definitions and methodology.
Results:
From 2013 to 2017, there were 163,386 hospitalizations (97,485 unique patients) and 771 pneumonia cases (520 ND pneumonia and 191 VAP). The rate of ND pneumonia remained stable, with 4.15 and 4.54 ND pneumonia cases per 10,000 hospitalization days in 2013 and 2017, respectively (P = .65). In 2017, 74% of pneumonia cases were ND pneumonia. Male sex and increasing age were both associated with increased risk of ND pneumonia. Additionally, patients with chronic bronchitis or emphysema (hazard ratio [HR], 2.07; 95% confidence interval [CI], 1.40–3.06), congestive heart failure (HR, 1.48; 95% CI, 1.07–2.05), or paralysis (HR, 1.72; 95% CI, 1.09–2.73) were at increased risk, as were patients who were immunosuppressed (HR, 1.54; 95% CI, 1.18–2.00) or in the ICU (HR, 1.49; 95% CI, 1.06–2.09). We did not detect a change in ND pneumonia risk with use of chlorhexidine mouthwash, total parenteral nutrition, any of the medications of interest, or prior ventilation.
Conclusion:
The incidence rate of ND pneumonia did not change from 2013 to 2017, and 3 of 4 nosocomial pneumonia cases were non–device associated. Hospital infection prevention programs should consider expanding the scope of surveillance to include non-ventilated patients. Future research should continue to look for modifiable risk factors and should assess potential prevention strategies.
We propose a new theoretical model for the metal pad roll instability in idealized cylindrical reduction cells. In addition to the usual destabilizing effects, we model viscous and Joule dissipation and some capillary effects. The resulting explicit formulas are used as theoretical benchmarks for two multiphase magnetohydrodynamic solvers, OpenFOAM and SFEMaNS. Our explicit formula for the viscous damping rate of gravity waves in cylinders with two fluid layers agrees closely with experimental measurements. We use our model to locate the viscously controlled instability threshold in shallow cylindrical reduction cells and also in Mg–Sb liquid metal batteries with decoupled interfaces.
To update current estimates of non–device-associated urinary tract infection (ND-UTI) rates and their frequency relative to catheter-associated UTIs (CA-UTIs) and to identify risk factors for ND-UTIs.
Design:
Cohort study.
Setting:
Academic teaching hospital.
Patients:
All adult hospitalizations between 2013 and 2017 were included. UTIs (device and non-device associated) were captured through comprehensive, hospital-wide active surveillance using Centers for Disease Control and Prevention case definitions and methodology.
Results:
From 2013 to 2017, there were 163,386 hospitalizations (97,485 unique patients) and 1,273 UTIs (715 ND-UTIs and 558 CA-UTIs). The rate of ND-UTIs remained stable, decreasing slightly from 6.14 to 5.57 ND-UTIs per 10,000 hospitalization days during the study period (P = .15). However, the proportion of UTIs that were non–device related increased from 52% to 72% (P < .0001). Female sex (hazard ratio [HR], 1.94; 95% confidence interval [CI], 1.50–2.50) and increasing age were associated with increased ND-UTI risk. Additionally, the following conditions were associated with increased risk: peptic ulcer disease (HR, 2.25; 95% CI, 1.04–4.86), immunosuppression (HR, 1.48; 95% CI, 1.15–1.91), trauma admissions (HR, 1.36; 95% CI, 1.02–1.81), total parenteral nutrition (HR, 1.99; 95% CI, 1.35–2.94), and opioid use (HR, 1.62; 95% CI, 1.10–2.32). Urinary retention (HR, 1.41; 95% CI, 0.96–2.07), suprapubic catheterization (HR, 2.28; 95% CI, 0.88–5.91), and nephrostomy tubes (HR, 2.02; 95% CI, 0.83–4.93) may also increase risk, but these estimates were imprecise.
Conclusion:
More than 70% of UTIs are now non–device associated. Current targeted surveillance practices should be reconsidered in light of this changing landscape. We identified several modifiable risk factors for ND-UTIs, and future research should explore the impact of prevention strategies that target these factors.
A lasting legacy of the International Polar Year (IPY) 2007–2008 was the promotion of the Permafrost Young Researchers Network (PYRN), initially an IPY outreach and education activity by the International Permafrost Association (IPA). With the momentum of IPY, PYRN developed into a thriving network that still connects young permafrost scientists, engineers, and researchers from other disciplines. This research note summarises (1) PYRN’s development since 2005 and the IPY’s role, (2) the first 2015 PYRN census and survey results, and (3) PYRN’s future plans to improve international and interdisciplinary exchange between young researchers. The review concludes that PYRN is an established network within the polar research community that has continually developed since 2005. PYRN’s successful activities were largely fostered by IPY. With >200 of the 1200 registered members active and engaged, PYRN is capitalising on the availability of social media tools and rising to meet environmental challenges while maintaining its role as a successful network honouring the legacy of IPY.
Hospital environmental surfaces are frequently contaminated by microorganisms. However, the causal mechanism of bacterial contamination of the environment as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Setting
Two academic medical centers.
Design
A prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each week the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environment samples and patient sources.
Results
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Bacterial transfer events to the patient, the environment, or both occurred in 12 patient encounters (18.5%) in the microbiologically evaluable cohort.
Conclusions
Bacterial transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
We have developed a system using ‘forescatter detectors’ for backscattered imaging of specimen surfaces inclined at 50–80° to the incident beam (inclined-scanning) in the SEM. These detectors comprise semiconductor chips placed below the tilted specimen. Forescatter detectors provide an orientation contrast (OC) image to complement quantitative crystallographic data from electron backscatter patterns (EBSP). Specimens were imaged using two detector geometries and these images were compared to those collected with the specimen surface normal to the incident beam (normal-scanning) using conventional backscattered electron detector geometries and also to an automated technique, orientation imaging microscopy (OIM). When normal-scanning, the component of the BSE signal relating to the mean atomic number (z) of the material is an order of magnitude greater than any OC component, making OC imaging in polyphase specimens almost impossible. Images formed in inclined-scanning, using forescatter detectors, have OC and z-contrast signals of similar magnitude, allowing OC imaging in polyphase specimens.
OC imaging is purely qualitative; by repeatedly imaging the same area using different specimen–beam geometries, we found that a single image captures less than 60% of the total microstructural information and that as many as 6 combined images are required to give the full data set. The OIM technique is limited by the angular resolution of EBSPs (1–2°) and consequently misses considerable microstructural information. The use of forescatter detectors is the most practical means of imaging OC in tilted specimens, but it is also a powerful tool in its own right for imaging microstructures in polyphase specimens, an essential asset for geological work.
To summarize and discuss the logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study, and the lessons learned that are pertinent to future use of ultraviolet (UV) disinfection devices in other hospitals.
DESIGN
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
METHODS
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
RESULTS
Implementation of enhanced terminal disinfection with UV disinfection devices presents unique challenges, including time pressure from bed control personnel, efficient room identification, negative perceptions among nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we used several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; interquartile range [IQR], 86%–92%).
CONCLUSIONS
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
Burn patients are particularly vulnerable to infection, and an estimated half of all burn deaths are due to infections. This study explored risk factors for healthcare-associated infections (HAIs) in adult burn patients.
DESIGN
Retrospective cohort study.
SETTING
Tertiary-care burn center.
PATIENTS
Adults (≥18 years old) admitted with burn injury for at least 2 days between 2004 and 2013.
METHODS
HAIs were determined in real time by infection preventionists using Centers for Disease Control and Prevention criteria. Multivariable Cox proportional hazards regression was used to estimate the direct effect of each risk factor on time to HAI, with inverse probability of censoring weights to address potentially informative censoring. Effect measure modification by burn size was also assessed.
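A simplified sketch of a weighted Cox model of this kind is shown below (Python, lifelines). The column names are hypothetical, and the censoring-weight step is deliberately reduced to a single time-fixed logistic model; the study's actual IPCW construction was likely more elaborate.

```python
# Hedged sketch: Cox proportional hazards with inverse probability of
# censoring weights (IPCW). All column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from lifelines import CoxPHFitter

burns = pd.read_csv("burn_admissions.csv")  # one row per admission (hypothetical)

# Step 1: model the probability of remaining uncensored, then weight each
# admission by the inverse of that probability (a simplified, time-fixed IPCW).
censor_model = smf.logit(
    "uncensored ~ tbsa_pct + age + inhalation", data=burns
).fit()
burns["ipcw"] = 1.0 / censor_model.predict(burns)

# Step 2: weighted Cox model for time to first HAI within 60 days.
cph = CoxPHFitter()
cph.fit(
    burns[["time_to_hai", "hai", "tbsa_pct", "inhalation", "age", "ipcw"]],
    duration_col="time_to_hai",
    event_col="hai",
    weights_col="ipcw",
    robust=True,  # robust variance is advisable when weights are used
)
cph.print_summary()  # hazard ratios and 95% CIs per covariate
```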
RESULTS
Overall, 4,426 patients met inclusion criteria, and 349 (7.9%) patients had at least 1 HAI within 60 days of admission. Compared to patients with <5% total body surface area (TBSA) burned, patients with 5%–10% TBSA were almost 3 times as likely to acquire an HAI (hazard ratio [HR], 2.92; 95% CI, 1.63–5.23); patients with 10%–20% TBSA were >6 times as likely to acquire an HAI (HR, 6.38; 95% CI, 3.64–11.17); and patients with >20% TBSA were >10 times as likely to acquire an HAI (HR, 10.33; 95% CI, 5.74–18.60). Patients with inhalational injury were 1.6 times as likely to acquire an HAI (HR, 1.61; 95% CI, 1.17–2.22). The effect of inhalational injury appeared to be larger among patients with ≤20% TBSA (P = .09).
CONCLUSIONS
Larger burns and inhalational injury were associated with increased incidence of HAIs. Future research should use these risk factors to identify potential interventions.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
DESIGN
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
INTERVENTION
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
RESULTS
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
CONCLUSIONS
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
A Shubata soil was modified by the addition of organic matter or montmorillonite clay. Three annual applications of fluometuron [1,1-dimethyl-3-(α,α,α-trifluoro-m-tolyl)urea], prometryne [2,4-bis(isopropylamino)-6-(methylthio)-s-triazine], and trifluralin (α,α,α-trifluoro-2,6-dinitro-N,N-dipropyl-p-toluidine) were made, and soil samples were analyzed at the end of the third year. Less than 0.05 ppm prometryne was found in the subsoils. Prometryne present (by chemical analysis) in the surface soils ranged from 2% (of that originally applied) in the soil without additives to 20% in the soils modified with organic matter and clay. Approximately 12% of the prometryne present in the soil modified with organic matter and 32% of that in the soil modified with clay was detectable by bioassay. Trifluralin concentration (by chemical analysis) ranged from 2% in the soil without additives and the soil modified with clay to 11% in the soil modified with organic matter. Approximately 100% of the trifluralin in the soil without additives and the soil modified with clay was detectable by bioassay, whereas only 15% of that present in the soil modified with organic matter was biologically detectable. Fluometuron was present in the soils in amounts ranging from 0.6% to 1.9% of that applied. Prometryne and fluometuron greatly decreased the large crabgrass [Digitaria sanguinalis (L.) Scop.] population and increased the populations of yellow nutsedge (Cyperus esculentus L.) and crowfootgrass [Dactyloctenium aegyptium (L.) Richter]. Trifluralin greatly decreased large crabgrass and crowfootgrass populations, but the plots became completely infested with yellow nutsedge. Horseweed [Conyza canadensis (L.) Cronq.] became the overall dominant species the first year after the field was taken out of production. Yellow nutsedge, bermudagrass [Cynodon dactylon (L.) Pers.], and other grasses were also prevalent.