To evaluate the impact and burden of the new National Healthcare Safety Network surveillance definition, mucosal barrier injury laboratory-confirmed bloodstream infection (MBI-LCBI), in hematology, oncology, and stem cell transplant populations.
Retrospective cohort study.
Two hematology, oncology, and stem cell transplant units at a large academic medical center.
Central line–associated bloodstream infections (CLABSIs) identified during a 14-month period were reviewed and classified as MBI-LCBI or non-MBI-LCBI (MBI-LCBI criteria not met). During this period, interventions to improve central line maintenance were implemented. Characteristics of patients with MBI-LCBI and non-MBI-LCBI were compared. Total CLABSI, MBI-LCBI, and non-MBI-LCBI rates were compared between baseline and postintervention phases of the study period.
Among 66 total CLABSI cases, 47 (71%) met MBI-LCBI criteria. Patients with MBI-LCBI and non-MBI-LCBI were similar in regard to most clinical and demographic characteristics. Between the baseline and postintervention study periods, the overall CLABSI rate decreased from 3.37 to 3.21 infections per 1,000 line-days (incidence rate ratio, 0.95; 4.7% reduction, P=.84), the MBI-LCBI rate increased from 2.08 to 2.61 infections per 1,000 line-days (incidence rate ratio, 1.25; 25.3% increase, P=.44), and the non-MBI-LCBI rate decreased from 1.29 to 0.60 infections per 1,000 line-days (incidence rate ratio, 0.47; 53.3% reduction, P=.12).
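The incidence rate ratios reported above follow directly from the baseline and postintervention rates. A minimal sketch of that arithmetic (the function and variable names are illustrative, not the study's analysis code):

```python
# Hedged sketch: how rates per 1,000 central-line days and incidence rate
# ratios (IRRs) are conventionally computed. The rate values below are
# taken from the abstract; everything else is illustrative.

def rate_per_1000(events, line_days):
    """Infections per 1,000 central-line days."""
    return 1000 * events / line_days

def irr(rate_post, rate_base):
    """Incidence rate ratio: postintervention rate over baseline rate."""
    return rate_post / rate_base

baseline, post = 3.37, 3.21  # overall CLABSI rates per 1,000 line-days
print(round(irr(post, baseline), 2))          # 0.95
print(round(100 * (1 - post / baseline), 1))  # 4.7 (% reduction)
```

The same two lines reproduce the MBI-LCBI increase (2.61/2.08 ≈ 1.25) and the non-MBI-LCBI reduction (0.60/1.29 ≈ 0.47).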
Most CLABSIs identified among hematology, oncology, and stem cell transplant patients met MBI-LCBI criteria, and CLABSI prevention efforts did not reduce these infections. Further review of the MBI-LCBI definition and impact is necessary to direct future definition changes and reporting mandates.
Hospitals in the National Healthcare Safety Network began reporting laboratory-identified (LabID) Clostridium difficile infection (CDI) events in January 2013. Our study quantified the differences between the LabID and traditional surveillance methods.
A cohort of 29 community hospitals in the southeastern United States.
A period of 6 months (January 1, 2013, to June 30, 2013) of prospectively collected data using both LabID and traditional surveillance definitions were analyzed. CDI events with mismatched surveillance categories between LabID and traditional definitions were identified and characterized further. Hospital-onset CDI (HO-CDI) rates for the entire cohort of hospitals were calculated using each method, then hospital-specific HO-CDI rates and standardized infection ratios (SIRs) were calculated. Hospital rankings based on each CDI surveillance measure were compared.
A total of 1,252 incident LabID CDI events were identified during 708,551 patient-days; 286 (23%) mismatched CDI events were detected. The overall HO-CDI rate was 6.0 vs 4.4 per 10,000 patient-days for LabID and traditional surveillance, respectively (P<.001); of 29 hospitals, 25 (86%) detected a higher CDI rate using LabID than using the traditional method. Hospital rank within the cohort differed greatly between surveillance measures: a rank change of at least 5 places occurred in 9 of 28 hospitals (32%) between the LabID and traditional CDI surveillance methods, as well as for the SIR.
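The rate comparison is simple arithmetic over the patient-day denominator. A minimal sketch in which the hospital-onset event counts are hypothetical, chosen only so the output matches the reported rates of 6.0 and 4.4 per 10,000 patient-days:

```python
# Hedged sketch of an HO-CDI rate calculation. The patient-day total is
# from the abstract; the event counts are illustrative assumptions.

def ho_cdi_rate(events, patient_days):
    """Hospital-onset CDI events per 10,000 patient-days."""
    return 10_000 * events / patient_days

patient_days = 708_551
labid_events, traditional_events = 425, 312  # hypothetical counts
print(round(ho_cdi_rate(labid_events, patient_days), 1))        # 6.0
print(round(ho_cdi_rate(traditional_events, patient_days), 1))  # 4.4
```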
LabID surveillance resulted in a higher hospital-onset CDI incidence rate than did traditional surveillance. Hospital-specific rankings varied based on the HO-CDI surveillance measure used. A clear understanding of differences in CDI surveillance measures is important when interpreting national and local CDI data.
Clostridium difficile infection (CDI) has been extensively described in healthcare settings; however, risk factors associated with community-acquired (CA) CDI remain uncertain. This study aimed to synthesize the current evidence for an association between commonly prescribed medications and comorbidities with CA-CDI.
A systematic search was conducted in 5 electronic databases for epidemiologic studies that examined the association between the presence of comorbidities and exposure to medications with the risk of CA-CDI. Pooled odds ratios were estimated using 3 meta-analytic methods. Subgroup analyses by location of studies and by life stages were conducted.
Twelve publications (n=56,776 patients) met inclusion criteria. Antimicrobial (odds ratio, 6.18; 95% CI, 3.80–10.04) and corticosteroid (1.81; 1.15–2.84) exposure were associated with increased risk of CA-CDI. Among the comorbidities, inflammatory bowel disease (odds ratio, 3.72; 95% CI, 1.52–9.12), renal failure (2.64; 1.23–5.68), hematologic cancer (1.75; 1.02–5.68), and diabetes mellitus (1.15; 1.05–1.27) were associated with CA-CDI. By location, antimicrobial exposure was associated with a higher risk of CA-CDI in the United States, whereas proton-pump inhibitor exposure was associated with a higher risk in Europe. By life stages, the risk of CA-CDI associated with antimicrobial exposure greatly increased in adults older than 65 years.
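The pooled odds ratios above combine per-study estimates by weighting. A simplified inverse-variance (fixed-effect) sketch of that pooling step, with hypothetical per-study ORs and CIs (the study itself applied 3 meta-analytic methods, which this does not reproduce):

```python
import math

# Hedged sketch of inverse-variance pooling of log odds ratios.
# Input tuples are (OR, CI_lower, CI_upper); all values are hypothetical.

def pool_log_or(ors_with_ci):
    """Fixed-effect pooled OR from per-study ORs with 95% CIs."""
    num = den = 0.0
    for or_, lo, hi in ors_with_ci:
        log_or = math.log(or_)
        # back out the SE of log(OR) from the 95% CI width
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1 / se**2
        num += w * log_or
        den += w
    return math.exp(num / den)

studies = [(5.2, 2.9, 9.3), (7.1, 3.6, 14.0), (6.0, 2.8, 12.9)]  # hypothetical
print(round(pool_log_or(studies), 2))
```

A random-effects version would add a between-study variance term (e.g., DerSimonian–Laird) to each weight.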
Antimicrobial exposure was the strongest risk factor associated with CA-CDI. Further studies are required to investigate the risk of CA-CDI associated with medications commonly prescribed in the community. Patients with diarrhea who have inflammatory bowel disease, renal failure, hematologic cancer, or diabetes are appropriate populations for interventional studies of screening.
To evaluate the effect of outpatient antimicrobial stewardship programs on prescribing, patient outcomes, microbial outcomes, and costs.
Search of MEDLINE (2000 through November 2013), Cochrane Library, and reference lists of relevant studies. We included English language studies with patient populations relevant to the United States (eg, infectious conditions, prescription services) evaluating stewardship programs in outpatient settings and reporting outcomes of interest. Data regarding study characteristics and outcomes were extracted and organized by intervention type.
We identified 50 studies eligible for inclusion, with most (29 of 50; 58%) reporting on respiratory tract infections, followed by multiple/unspecified infections (17 of 50; 34%). We found medium-strength evidence that stewardship programs incorporating communication skills training and laboratory testing are associated with reductions in antimicrobial use, and low-strength evidence that other stewardship interventions are associated with improved prescribing. Patient-centered outcomes, which were infrequently reported, were not adversely affected. Medication costs were generally lower with stewardship interventions, but overall program costs were rarely reported. No studies reported microbial outcomes, and data regarding outpatient settings other than primary care clinics are limited.
Low- to moderate-strength evidence suggests that antimicrobial stewardship programs in outpatient settings improve antimicrobial prescribing without adversely affecting patient outcomes. Effectiveness depends on program type. Most studies were not designed to measure patient or resistance outcomes. Data regarding sustainability and scalability of interventions are limited.
To examine inappropriate antibiotic prescribing for acute respiratory tract infections (RTIs) in ambulatory care to help target antimicrobial stewardship interventions.
Design and Setting
Retrospective analysis of RTI visits within general internal medicine (GIM) and family medicine (FM) ambulatory practices at an inner-city academic medical center from 2008 to 2010.
Patient, physician, and practice characteristics were analyzed using multivariable logistic regression to determine factors predictive of inappropriate prescribing; physicians in the highest and lowest antibiotic-prescribing quartiles were compared using χ2 analysis.
Visits with FM providers, female patient gender, and self-reported race/ethnicity of white or Hispanic were significantly associated with inappropriate antibiotic prescribing. Physicians in the lowest quartile prescribed antibiotics for 5%–28% (mean, 21%) of RTI visits; physicians in the highest quartile prescribed antibiotics for 54%–85% (mean, 65%) of RTI visits. High prescribers had fewer African-American patients and more patients who were younger and privately insured. High prescribers also had more patients with chronic lung disease. A GIM practice pod with a low prescriber was 3.0 times more likely than other practice pods to have a second low prescriber, whereas pods with a high prescriber were 1.3 times more likely to have a second high prescriber.
Medical specialty was the only physician factor predictive of inappropriate prescribing when patient gender, race, and comorbidities were taken into account. Possible disparities in care need further study. Stewardship education in medical school, enlisting low prescribers as physician leaders, and targeting interventions to the highest prescribers might be more effective approaches to antimicrobial stewardship.
No previous studies of methicillin-resistant Staphylococcus aureus (MRSA) epidemiology in adult intensive care units (ICUs) have assessed the utility of rapid, highly discriminatory strain typing in the investigation of transmission events.
A 22-bed medical-surgical adult ICU.
Admissions that were MRSA-positive on initial screening and admissions <48 hours in duration were excluded, leaving a cohort of 653 patients (median age, 61 years; APACHE-II score, 19).
We conducted this study of MRSA transmission over 1 year (August 1, 2011 to July 31, 2012) using a multiplex PCR-based reverse line blot (mPCR/RLB) assay to genotype isolates from surveillance swabs obtained at admission and twice weekly during ICU stays. MRSA prevalence and incidence rates were calculated and transmission events were identified using strain matching. Colonization pressure was calculated daily by summation of all MRSA cases.
Of 1,030 admissions to the ICU during the study period, 349 were excluded. MRSA acquisition occurred during 31 of 681 (4.6%) remaining admissions; 19 of 31 (61%) acquisitions were genotype-confirmed, including 7 (37%) due to the most commonly transmitted strain. Moving averages of the number of MRSA-positive patients on the days prior to a documented event were used in a Poisson regression model. A significant association was found between transmission and colonization pressure when the average absolute colonization pressure on the previous day was ≥3 (χ2=7.41, P=.01).
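The colonization-pressure metric described above (daily summation of MRSA cases, with risk assessed against the previous day) can be sketched as follows; the patient data and the helper names are hypothetical:

```python
# Hedged sketch of the colonization-pressure idea: pressure on a given day
# is the count of MRSA-positive patients present, and a day is "high risk"
# when the *previous* day's pressure meets the >=3 threshold reported above.

def daily_pressure(patients_by_day):
    """patients_by_day: list of sets of MRSA-positive patient IDs per day."""
    return [len(day) for day in patients_by_day]

def high_pressure_days(pressure, threshold=3):
    """Indices of days whose previous day's pressure met the threshold."""
    return [i for i in range(1, len(pressure)) if pressure[i - 1] >= threshold]

days = [{"p1"}, {"p1", "p2", "p3"}, {"p2", "p3", "p4"}, {"p4"}]  # hypothetical
p = daily_pressure(days)
print(p)                      # [1, 3, 3, 1]
print(high_pressure_days(p))  # [2, 3]
```

The study additionally smoothed these counts with moving averages before fitting the Poisson model, a step this sketch omits.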
mPCR/RLB characterizes MRSA isolates within a clinically useful time frame for identification of single-source clusters within the ICU. High MRSA colonization pressure (≥3 MRSA-positive patients) on a given day is associated with an increased likelihood of a transmission event.
To analyze available evidence on the effectiveness of triclosan-coated sutures (TCSs) in reducing the risk of surgical site infection (SSI).
Systematic review and meta-analysis.
A systematic search of both randomized (RCTs) and nonrandomized (non-RCT) studies was performed on PubMed Medline, OVID, EMBASE, and SCOPUS, without restrictions in language and publication type. Random-effects models were utilized and pooled estimates were reported as the relative risk (RR) ratio with 95% confidence interval (CI). Tests for heterogeneity as well as meta-regression, subgroup, and sensitivity analyses were performed.
A total of 29 studies (22 RCTs, 7 non-RCTs) were included in the meta-analysis. The overall RR of acquiring an SSI was 0.65 (95% CI: 0.55–0.77; I2=42.4%, P=.01) in favor of TCS use. The pooled RR was particularly low for the abdominal surgery group (RR: 0.56; 95% CI: 0.41–0.77) and was robust to sensitivity analysis. Meta-regression analysis revealed that study design, in part, may explain heterogeneity (P=.03). The pooled RRs from subgroup meta-analyses of RCTs and non-RCTs were 0.74 (95% CI: 0.61–0.89) and 0.53 (95% CI: 0.42–0.66), respectively, both of which favored the use of TCSs.
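Each pooled estimate above is built from per-study relative risks with 95% confidence intervals. A minimal sketch of that single-study building block, using a log-scale standard error; the 2×2 counts are hypothetical, not taken from any included trial:

```python
import math

# Hedged sketch: relative risk (RR) with a 95% CI from one 2x2 table,
# the per-study input that a meta-analysis like this one pools.

def relative_risk(events_tx, n_tx, events_ctrl, n_ctrl):
    """RR and 95% CI (Wald interval on the log scale)."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    se = math.sqrt(1/events_tx - 1/n_tx + 1/events_ctrl - 1/n_ctrl)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(20, 400, 31, 410)  # hypothetical SSI counts
print(round(rr, 2), round(lo, 2), round(hi, 2))
```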
The random-effects meta-analysis based on RCTs suggests that TCSs reduced the risk of SSI by 26% among patients undergoing surgery. This effect was particularly evident among those who underwent abdominal surgery.
High-level disinfectants (HLDs) are used throughout the healthcare industry to chemically disinfect reusable, semicritical medical and dental devices to control and prevent healthcare-associated infections among patient populations. Workers who use HLDs are at risk of exposure to these chemicals, some of which are respiratory and skin irritants and sensitizers.
To evaluate exposure controls used and to better understand impediments to healthcare workers using personal protective equipment while handling HLDs.
A targeted sample of members of professional practice organizations representing nurses, technologists/technicians, dental professionals, respiratory therapists, and others who reported handling HLDs in the previous 7 calendar days. Participating organizations invited either all or a random sample of members via email, which included a hyperlink to the survey.
Descriptive analyses were conducted including simple frequencies and prevalences.
A total of 4,657 respondents completed the survey. The HLDs used most often were glutaraldehyde (59%), peracetic acid (16%), and ortho-phthalaldehyde (15%). Examples of work practices or events that could increase exposure risk included failure to wear water-resistant gowns (44%); absence of standard procedures for minimizing exposure (19%); lack of safe handling training (17%); failure to wear protective gloves (9%); and a spill/leak of HLD during handling (5%). Among all respondents, 12% reported skin contact with HLDs, and 33% of these respondents reported that they did not always wear gloves.
Findings indicated that precautionary practices were not always used, underscoring the importance of improved employer and worker training and education regarding HLD hazards.
To characterize health professional schools by their vaccination policies, including the forms of evidence of immunity accepted and the exemptions permitted.
Data were collected between September 2011 and April 2012 using an Internet-based survey e-mailed to selected types of accredited health professional programs. Schools were identified through accrediting associations for each type of health professional program. Analysis was limited to schools requiring ≥1 vaccine recommended by the Advisory Committee on Immunization Practices (ACIP): measles, mumps, rubella, hepatitis B, varicella, pertussis, and influenza. Weighted bivariate frequencies were generated using SAS 9.3.
Of 2,775 schools surveyed, 75% (n=2,077) responded; of responding schools, 93% (n=1,947) required ≥1 ACIP-recommended vaccination. The proportion of schools accepting ≥1 non–ACIP-recommended form of evidence of immunity varied by vaccine: 42% for pertussis, 37% for influenza, 30% for rubella, 22% for hepatitis B, 18% for varicella, and 9% for measles and mumps. Among schools with ≥1 vaccination requirement, medical exemptions were permitted for ≥1 vaccine by 75% of schools; 54% permitted religious exemptions; 35% permitted personal belief exemptions; 58% permitted any nonmedical exemption.
Many schools accept non–ACIP-recommended forms of evidence of immunity, which could lead some students to believe they are protected from vaccine-preventable diseases when they may be susceptible. Additional efforts are needed to better educate school officials about current ACIP recommendations for acceptable forms of evidence of immunity so that school policies can be revised as needed.
To determine the effectiveness of a pulsed xenon ultraviolet (PX-UV) disinfection device for reduction in recovery of healthcare-associated pathogens.
Two acute-care hospitals.
We examined the effectiveness of PX-UV for killing of Clostridium difficile spores, methicillin-resistant Staphylococcus aureus (MRSA), and vancomycin-resistant Enterococcus (VRE) on glass carriers and evaluated the impact of pathogen concentration, distance from the device, organic load, and shading from the direct field of radiation on killing efficacy. We compared the effectiveness of PX-UV and ultraviolet-C (UV-C) irradiation, each delivered for 10 minutes at 4 feet. In hospital rooms, the frequency of native pathogen contamination on high-touch surfaces was assessed before and after 10 minutes of PX-UV irradiation.
On carriers, irradiation delivered for 10 minutes at 4 feet from the PX-UV device reduced recovery of C. difficile spores, MRSA, and VRE by 0.55±0.34, 1.85±0.49, and 0.6±0.25 log10 colony-forming units (CFU)/cm2, respectively. Increasing distance from the PX-UV device dramatically reduced killing efficacy, whereas pathogen concentration, organic load, and shading did not. Continuous UV-C achieved significantly greater log10CFU reductions than PX-UV irradiation on glass carriers. On frequently touched surfaces, PX-UV significantly reduced the frequency of positive C. difficile, VRE, and MRSA culture results.
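The log10 reductions reported above are differences of base-10 logarithms of recovered colony counts. A minimal sketch of the metric; the CFU counts are hypothetical, chosen only to illustrate a 1.85-log reduction like the MRSA result:

```python
import math

# Hedged sketch of the log10 CFU reduction metric used in carrier tests:
# reduction = log10(CFU recovered without UV) - log10(CFU recovered after UV).

def log10_reduction(cfu_control, cfu_treated):
    """Log10 reduction in colony-forming units after treatment."""
    return math.log10(cfu_control) - math.log10(cfu_treated)

print(round(log10_reduction(10_000, 140), 2))  # 1.85 log10 CFU
```

Note that a 1.85-log reduction corresponds to killing roughly 98.6% of organisms, whereas a 0.55-log reduction (the C. difficile result) kills only about 72%.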
The PX-UV device reduced recovery of MRSA, C. difficile, and VRE on glass carriers and on frequently touched surfaces in hospital rooms with a 10-minute UV exposure time. PX-UV was not more effective than continuous UV-C in reducing pathogen recovery on glass slides, suggesting that both forms of UV have some effectiveness at relatively short exposure times.
To identify the source of a pseudo-outbreak of Mycobacterium gordonae.
University Hospital in Chicago, Illinois.
Hospital patients with M. gordonae-positive clinical cultures.
An increase in isolation of M. gordonae from clinical cultures was noted immediately following the opening of a newly constructed hospital in January 2012. We reviewed medical records of patients with M. gordonae-positive cultures collected between January and December 2012 and cultured potable water specimens in new and old hospitals quantitatively for mycobacteria.
Of 30 patients with M. gordonae-positive clinical cultures, 25 (83.3%) were housed in the new hospital; of 35 positive specimens (sputum, bronchoalveolar lavage, gastric aspirate), 32 (91.4%) had potential for water contamination. M. gordonae was more common in water collected from the new vs. the old hospital [147 of 157 (93.6%) vs. 91 of 113 (80.5%), P=.001]. Median concentration of M. gordonae was higher in the samples from the new vs. the old hospital (208 vs. 48 colony-forming units (CFU)/mL; P<.001). Prevalence and concentration of M. gordonae were lower in water samples from ice and water dispensers [13 of 28 (46.4%) and 0 CFU/mL] compared with water samples from patient rooms and common areas [225 of 242 (93%) and 146 CFU/mL, P<.001].
M. gordonae was common in potable water. The pseudo-outbreak of M. gordonae was likely due to increased concentrations of M. gordonae in the potable water supply of the new hospital. A silver ion-impregnated 0.5-μm filter may have been responsible for lower concentrations of M. gordonae identified in ice/water dispenser samples. Hospitals should anticipate that construction activities may amplify the presence of waterborne nontuberculous mycobacterial contaminants.
To systematically review studies evaluating clinical prediction rules (CPRs) for adult inpatients suspected to have pulmonary tuberculosis.
Systematic review with meta-analyses.
Inpatients at least 15 years of age admitted to acute care.
A search was conducted in 5 indexed electronic databases with no language or year of publication restrictions. We performed a meta-analysis for those CPRs with at least 2 validation studies. Results were reported according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
Of the 461 abstracts selected, 36 articles were fully analyzed and 11 articles were included, yielding 8 CPRs derived in 4 countries. Broad validation studies were identified for 2 CPRs. The most frequent clinical predictors were fever and weight loss. All CPRs included chest imaging signs. Most CPRs were derived in countries with a low prevalence of pulmonary tuberculosis and included homeless persons, immigrants, and those who reacted to the purified protein derivative test. Both of the CPRs derived in countries with a high prevalence of pulmonary tuberculosis relied strongly on chest radiograph predictors. Accuracy of the different CPRs was high (area under the receiver operating characteristic curve, 0.79–0.91). Meta-analysis of 4 validation studies for Wisnivesky's CPR indicates optimistic pooled results: sensitivity, 94.1% (95% CI, 89.7%–96.7%); negative likelihood ratio, 0.22 (95% CI, 0.12–0.40).
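The two validation metrics reported for Wisnivesky's CPR relate to a 2×2 table of rule results against confirmed tuberculosis; the negative likelihood ratio is LR− = (1 − sensitivity) / specificity. A minimal sketch with hypothetical counts (chosen only to land near the pooled point estimates, not the actual validation data):

```python
# Hedged sketch of sensitivity and negative likelihood ratio (LR-)
# from a 2x2 table: tp/fn = diseased, tn/fp = non-diseased.

def sens_spec_nlr(tp, fn, tn, fp):
    """Sensitivity, specificity, and LR- = (1 - sens) / spec."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec, (1 - sens) / spec

sens, spec, nlr = sens_spec_nlr(tp=94, fn=6, tn=27, fp=73)  # hypothetical
print(round(sens, 3), round(nlr, 2))  # 0.94 0.22
```

A low LR− is what makes such a rule useful for ruling out isolation: a negative result substantially lowers the post-test probability of tuberculosis.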
On the basis of a critical appraisal of the 2 best validated CPRs, the presence of weight loss and/or fever in inpatients warrants obtaining a chest radiograph, regardless of the presence of productive cough. If the chest radiograph is abnormal, the patient should be placed in isolation until more specific test results are available. Validation in different settings is required to maximize external generalization of existing CPRs.
Little is known about central line–associated bloodstream infection risk factors in the bundle era. In our case-control investigation, we found that independent risk factors for central line–associated bloodstream infection at our center included the number of recent lab tests, catheter duration, and lack of hemodynamic monitoring as the insertion indication.
We evaluated the impact of nursing education and stewardship interventions on Clostridium difficile testing and treatment appropriateness. Diarrhea documentation increased for those with positive tests (45% to 70%); pretreatment laxative use decreased (50% to 19%). Appropriate treatment increased for severe infection (57% to 93%), but all asymptomatically colonized patients were treated.
We reviewed patient discharges with outpatient parenteral antimicrobial therapy (OPAT) at a large tertiary care children's hospital to determine whether OPAT was modifiable or unnecessary. At least one modification definitely or possibly would have been recommended for 78% of episodes. For more than 40% of episodes, OPAT was potentially not indicated.
We assessed 4,045 ambulatory surgery patients for surgical site infection (SSI) using claims-based triggers for medical chart review. Of 98 patients flagged by codes suggestive of SSI, 35 had confirmed SSIs. SSI rates ranged from 0% to 3.2% for common procedures. Claims may be useful for SSI surveillance following ambulatory surgery.
Of 82 patients with methicillin-resistant Staphylococcus aureus (MRSA) colonization, 67 (82%) had positive hand cultures for MRSA. A single application of alcohol gel (2 mL) consistently reduced the burden of MRSA on hands. However, incomplete removal of MRSA was common, particularly in those with a high baseline level of recovery.