We identified a pseudo-outbreak of Mycobacterium avium in an outpatient bronchoscopy clinic following an increase in clinic procedure volume. We terminated the pseudo-outbreak by increasing the frequency of automated endoscope reprocessor (AER) filter changes from quarterly to monthly. Filter-changing schedules should depend on use rather than fixed time intervals.
Hospital environmental surfaces are frequently contaminated by microorganisms. However, the role of contaminated environmental surfaces as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Prospective cohort study at 2 academic medical centers.
A prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each week the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environment samples and patient sources.
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Microbiological Bacterial Transfer events either to the patient, the environment, or both occurred in 12 patient encounters (18.5%) from the microbiologically evaluable cohort.
Microbiological Bacterial Transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
In this prospective study, we monitored 4 epidemiologically important pathogens (EIPs): methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), Clostridium difficile, and multidrug-resistant (MDR) Acinetobacter, to assess the effectiveness of 3 enhanced disinfection strategies for terminal room disinfection against standard practice. Our data demonstrated that a 94% decrease in room contamination with EIPs was associated with a 35% decrease in subsequent patient colonization and/or infection.
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends of SSI prevalence rates over time for MRSA and other common pathogens
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.
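The headline figures above can be reproduced with simple rate arithmetic. A minimal Python sketch using only numbers reported in the abstract (the function name is illustrative; note the published PRR of 0.90 comes from a log-binomial regression model, so the raw ratio of the 2012 and 2008 rates only approximates it):

```python
# Sketch of the rate arithmetic behind the reported SSI figures.
# All inputs are taken from the abstract; prevalence_per_100 is an illustrative name.

def prevalence_per_100(events: int, procedures: int) -> float:
    """Infections per 100 procedures."""
    return 100 * events / procedures

# 3,988 complex SSIs after 532,694 procedures
overall = prevalence_per_100(3988, 532694)
print(round(overall, 2))  # 0.75 -> reported as ~0.7 per 100 procedures

# Raw ratio of the 2012 and 2008 rates (per 100 procedures);
# the published, model-based PRR (0.90) differs slightly.
print(round(0.69 / 0.76, 2))  # 0.91
```

The small gap between the raw ratio (0.91) and the reported PRR (0.90) is expected: the published estimate is adjusted through the regression model rather than computed directly from the two crude rates.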
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
Retrospective cohort study
A total of 43 community hospitals located in the southeastern United States.
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008 and December 31, 2012.
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiology score and patient age.
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentile (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
Infect. Control Hosp. Epidemiol. 2015;36(12):1431–1436
The Hawthorne Effect is a prevalent observer effect that causes behavioral changes among participants of epidemiological studies or infection control interventions. The purpose of the review is to describe the origins of the Hawthorne Effect, to understand the term in relation to current scientific literature, to describe characteristics of the Hawthorne effect, and to discuss methods to quantify and overcome limitations associated with the Hawthorne Effect.
Infect. Control Hosp. Epidemiol. 2015;36(12):1444–1450
To evaluate seasonal variation in the rate of surgical site infections (SSI) following commonly performed surgical procedures.
Retrospective cohort study.
We analyzed 6 years (January 1, 2007, through December 31, 2012) of data from the 15 most commonly performed procedures in 20 hospitals in the Duke Infection Control Outreach Network. We defined summer as July through September. First, we performed 3 separate Poisson regression analyses (unadjusted, multivariable, and polynomial) to estimate prevalence rates and prevalence rate ratios of SSI following procedures performed in summer versus nonsummer months. Then, we stratified our results to obtain estimates based on procedure type and organism type. Finally, we performed a sensitivity analysis to test the robustness of our findings.
We identified 4,543 SSI following 441,428 surgical procedures (overall prevalence rate, 1.03/100 procedures). The rate of SSI was significantly higher during the summer compared with the remainder of the year (1.11/100 procedures vs 1.00/100 procedures; prevalence rate ratio, 1.11 [95% CI, 1.04–1.19]; P=.002). Stratum-specific SSI calculations revealed higher SSI rates during the summer for both spinal (P=.03) and nonspinal (P=.004) procedures and revealed higher rates during the summer for SSI due to either gram-positive cocci (P=.006) or gram-negative bacilli (P=.004). Multivariable regression analysis and sensitivity analyses confirmed our findings.
The rate of SSI following commonly performed surgical procedures was higher during the summer compared with the remainder of the year. Summer SSI rates remained elevated after stratification by organism and spinal versus nonspinal surgery, and rates did not change after controlling for other known SSI risk factors.
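The seasonal comparison above rests on a prevalence rate ratio with a confidence interval. A hedged sketch of that calculation (the overall rate is reproducible from the reported counts; per-season event counts are not reported here, so the CI helper, `rr_wald_ci`, is an illustrative implementation of the standard log-normal approximation, not the authors' exact model):

```python
import math

def rate_per_100(events: int, procedures: int) -> float:
    """SSIs per 100 procedures."""
    return 100 * events / procedures

# Overall prevalence: 4,543 SSIs after 441,428 procedures
print(round(rate_per_100(4543, 441428), 2))  # 1.03, as reported

def rr_wald_ci(rate1: float, rate2: float, events1: int, events2: int, z: float = 1.96):
    """Rate ratio with a log-normal (Wald) confidence interval:
    exp(ln(RR) +/- z * sqrt(1/events1 + 1/events2))."""
    rr = rate1 / rate2
    se = math.sqrt(1 / events1 + 1 / events2)
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)
```

The point estimate of 1.11 is simply the summer rate (1.11/100 procedures) divided by the nonsummer rate (1.00/100 procedures); the paper's CI of 1.04–1.19 comes from the Poisson regression described in the Methods.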
Infect. Control Hosp. Epidemiol. 2015;36(9):1011–1016
Funguria rarely represents true infection in the urinary tract. Excluding yeast from the catheter-associated urinary tract infection (CAUTI) surveillance definition reduced CAUTI rates by nearly 25% in community hospitals and at an academic, tertiary-care medical center.
Hospitals in the National Healthcare Safety Network began reporting laboratory-identified (LabID) Clostridium difficile infection (CDI) events in January 2013. Our study quantified the differences between the LabID and traditional surveillance methods.
A cohort of 29 community hospitals in the southeastern United States.
A period of 6 months (January 1, 2013, to June 30, 2013) of prospectively collected data using both LabID and traditional surveillance definitions were analyzed. CDI events with mismatched surveillance categories between LabID and traditional definitions were identified and characterized further. Hospital-onset CDI (HO-CDI) rates for the entire cohort of hospitals were calculated using each method, then hospital-specific HO-CDI rates and standardized infection ratios (SIRs) were calculated. Hospital rankings based on each CDI surveillance measure were compared.
A total of 1,252 incident LabID CDI events were identified during 708,551 patient-days; 286 (23%) mismatched CDI events were detected. The overall HO-CDI rate was 6.0 vs 4.4 per 10,000 patient-days for LabID and traditional surveillance, respectively (P<.001); of 29 hospitals, 25 (86%) detected a higher CDI rate using LabID compared with the traditional method. Hospital rank within the cohort differed greatly between surveillance measures: a rank change of at least 5 places occurred in 9 of 28 hospitals (32%) between the LabID and traditional CDI surveillance methods, and similar rank changes occurred when hospitals were compared by SIR.
LabID surveillance resulted in a higher hospital-onset CDI incidence rate than did traditional surveillance. Hospital-specific rankings varied based on the HO-CDI surveillance measure used. A clear understanding of differences in CDI surveillance measures is important when interpreting national and local CDI data.
Hospitals must report cases of methicillin-resistant Staphylococcus aureus bloodstream infection (BSI) using a new laboratory-identified (LabID) event reporting module. BSI rates obtained using LabID differ from rates of BSI obtained from traditional surveillance (concordance of healthcare facility–onset cases, 61%–76%) because definitions used to report LabID events are inconsistent with traditional BSI definitions.
Infect Control Hosp Epidemiol 2014;35(10):1286–1289
(See the commentary by Pfeiffer and Beldavs, on pages 984–986.)
To describe the epidemiology of carbapenem-resistant Enterobacteriaceae (CRE) and to examine the effect of lower carbapenem breakpoints on CRE detection.
Inpatient care at community hospitals.
All patients with CRE-positive cultures were included.
CRE isolated from 25 community hospitals were prospectively entered into a centralized database from January 2008 through December 2012. Microbiology laboratory practices were assessed using questionnaires.
A total of 305 CRE isolates were detected at 16 hospitals (64%). Patients with CRE had symptomatic infection in 180 cases (59%) and asymptomatic colonization in the remainder (125 cases; 41%). Klebsiella pneumoniae (277 isolates; 91%) was the most prevalent species. The majority of cases were healthcare associated (288 cases; 94%). The rate of CRE detection increased more than fivefold from 2008 (0.26 cases per 100,000 patient-days) to 2012 (1.4 cases per 100,000 patient-days; incidence rate ratio (IRR), 5.3 [95% confidence interval (CI), 1.22–22.7]; P = .01). Only 5 hospitals (20%) had adopted the 2010 Clinical and Laboratory Standards Institute (CLSI) carbapenem breakpoints. The 5 hospitals that adopted the lower carbapenem breakpoints were more likely to detect CRE after implementation of breakpoints than before (4.1 vs 0.5 cases per 100,000 patient-days; P < .001; IRR, 8.1 [95% CI, 2.7–24.6]). Hospitals that implemented the lower carbapenem breakpoints were more likely to detect CRE than were hospitals that did not (3.3 vs 1.1 cases per 100,000 patient-days; P = .01).
The rate of CRE detection increased fivefold in community hospitals in the southeastern United States from 2008 to 2012. Even so, these figures likely underestimate the true rate of CRE detection, given the low adoption of the carbapenem breakpoints recommended in the 2010 CLSI guidelines.
The updated 2013 Centers for Disease Control and Prevention/National Healthcare Safety Network definitions for surgical site infections (SSIs) reduced the duration of prolonged surveillance from 1 year to 90 days and defined which procedure types require prolonged surveillance. Applying the updated 2013 SSI definitions to cases analyzed using the pre-2013 surveillance definitions excluded 10% of previously identified SSIs.
Hospital-acquired infections (HAIs) occur commonly, cause significant harm to patients, and result in excess healthcare expenditures. The urinary tract is frequently cited as the most common site of HAI, but these estimates were extrapolated from National Nosocomial Infection Surveillance (NNIS) data from the 1990s. Updated information regarding the relative burden of specific types of HAIs would help governmental agencies and other stakeholders within the field of infection prevention to prioritize areas for research and innovation. The objective of our study was to assess the relative proportion of HAIs attributed to each of the following 5 types of infection in a network of community hospitals: catheter-associated urinary tract infection (CAUTI), surgical site infection (SSI), ventilator-associated pneumonia (VAP), central line–associated bloodstream infection (CLABSI), and Clostridium difficile infection (CDI).
We performed a retrospective cohort study using prospectively collected HAI surveillance data from hospitals participating in the Duke Infection Control Outreach Network (DICON). DICON hospital epidemiologists and liaison infection preventionists work directly with local hospital infection preventionists to provide surveillance data validation, benchmarking, and infection prevention consultation services to participating hospitals.
We describe and compare the epidemiology of catheter-associated urinary tract infection (CAUTI) occurring in non-intensive care unit (ICU) versus ICU wards in a network of community hospitals over a 2-year period. Overall, 72% of cases of CAUTI occurred in non-ICU patients, which indicates that this population is an important target for dedicated surveillance and prevention efforts.
To describe the epidemiology of ventilator-associated pneumonia (VAP) in community hospitals.
Design and Setting.
Prospective study in 31 community hospitals from 2007 to 2011.
VAP surveillance was performed by infection preventionists using the National Healthcare Safety Network protocol. VAP incidence was reported as number of events per 1,000 ventilator-days. We categorized hospitals into small (<30,000 patient-days/year), medium (30,000–60,000 patient-days/year), and large (>60,000 patient-days/year) groups and compared VAP incidence by hospital size.
The median VAP incidence was 1.4 (interquartile range, 0.4–2.4), and the ventilator utilization ratio (VUR) was 0.33 (0.25–0.47). VAP incidence was higher in small hospitals (2.1) than in medium (0.85) or large (0.69) hospitals (P = .03) despite a lower VUR in small hospitals (0.29 vs 0.31 vs 0.44, respectively; P = .01). The median age of the 247 VAP cases was 64 years (53–73); 136 (55.1%) were female; 142 (57.5%) were Caucasian; 170 (68.8%) were admitted from home. The median length of stay and duration of ventilation were 26 (14–42) and 12 (4–21) days, respectively. The pre- and postinfection hospital stays were 8 (3–13) days and 14 (8–30) days, respectively. Data on outcomes were available in 214 cases (86.6%), and 75 (35.0%) cases died during hospitalization. The top 3 pathogens were methicillin-resistant Staphylococcus aureus (MRSA; n = 70, 27.9%), Pseudomonas species (n = 40, 16.3%), and Klebsiella species (n = 34, 13.3%).
VAP incidence was inversely associated with size of hospital. VAP in community hospitals was frequently caused by MRSA. Importantly, predictors of VAP incidence in tertiary care hospitals such as VUR may not be predictive in community hospitals with few ventilated patients.
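The two denominators used in this abstract can be made concrete. A small sketch assuming only the conventions stated above (VAP events per 1,000 ventilator-days; VUR as ventilator-days divided by patient-days, per NHSN practice); the example counts are hypothetical:

```python
def vap_incidence(events: int, ventilator_days: int) -> float:
    """VAP events per 1,000 ventilator-days (the NHSN reporting convention)."""
    return 1000 * events / ventilator_days

def ventilator_utilization_ratio(ventilator_days: int, patient_days: int) -> float:
    """VUR: the share of patient-days spent on a ventilator."""
    return ventilator_days / patient_days

# Hypothetical counts for illustration only:
# 3 VAP events over 2,000 ventilator-days accrued during 6,000 patient-days.
print(vap_incidence(3, 2000))                              # 1.5 per 1,000 ventilator-days
print(round(ventilator_utilization_ratio(2000, 6000), 2))  # 0.33
```

Separating the two denominators is what lets the abstract report the seemingly paradoxical finding that small hospitals had higher VAP incidence despite lower ventilator utilization.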
To determine the effectiveness of an automated ultraviolet-C (UV-C) emitter against vancomycin-resistant enterococci (VRE), Clostridium difficile, and Acinetobacter spp. in patient rooms.
Prospective cohort study.
Two tertiary care hospitals.
Convenience sample of 39 patient rooms from which a patient infected or colonized with 1 of the 3 targeted pathogens had been discharged.
Environmental sites were cultured before and after use of an automated UV-C-emitting device in targeted rooms but before standard terminal room disinfection by environmental services.
In total, 142 samples were obtained from 27 rooms of patients who were colonized or infected with VRE, 77 samples were obtained from 10 rooms of patients with C. difficile infection, and 10 samples were obtained from 2 rooms of patients with infections due to Acinetobacter. Use of an automated UV-C-emitting device led to a significant decrease in the total number of colony-forming units (CFUs) of any type of organism (1.07 log10 reduction; P < .0001), CFUs of target pathogens (1.35 log10 reduction; P < .0001), VRE CFUs (1.68 log10 reduction; P < .0001), and C. difficile CFUs (1.16 log10 reduction; P < .0001). CFUs of Acinetobacter also decreased (1.71 log10 reduction), but the trend was not statistically significant (P = .25). CFUs were reduced at all 9 of the environmental sites tested. Reductions similarly occurred in direct and indirect line of sight.
Our data confirm that automated UV-C-emitting devices can decrease the bioburden of important pathogens in real-world settings such as hospital rooms.
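The log10 reductions reported above translate directly into percentage decreases in bioburden. A brief sketch of that conversion (the function name is illustrative):

```python
import math

def log10_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction: 1.0 corresponds to a 10-fold (90%) decrease in CFUs."""
    return math.log10(cfu_before / cfu_after)

# The reported 1.07 log10 reduction in total CFUs is roughly a 91% decrease:
fraction_removed = 1 - 10 ** -1.07
print(round(fraction_removed, 2))  # 0.91
```

On this scale, the 1.68 log10 reduction reported for VRE corresponds to roughly a 98% decrease, which is why log10 units are preferred for reporting disinfection performance: they stay readable even when the percentage decrease is close to 100.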
We implemented a direct-observer hand hygiene audit program that used trained observers, wireless data entry devices, and an intranet portal. We improved the reliability and utility of the data by standardizing audit processes, regularly retraining auditors, developing an audit guidance tool, and reporting weighted composite hand hygiene compliance scores.