Background: The Centers for Disease Control and Prevention’s Emerging Infections Program conducts active laboratory- and population-based surveillance for carbapenem-resistant Enterobacterales (CRE) and extended-spectrum beta-lactamase-producing Enterobacterales (ESBL-E). To better understand the U.S. epidemiology of these organisms among children, we determined the incidence of pediatric CRE and ESBL-E cases and described their clinical characteristics. Methods: Surveillance was conducted among children <18 years of age for CRE from 2016–2020 in 10 sites, and for ESBL-E from 2019–2020 in 6 sites. Among catchment-area residents, an incident CRE case was defined as the first isolation of Escherichia coli, Enterobacter cloacae complex, Klebsiella aerogenes, K. oxytoca, or K. pneumoniae in a 30-day period resistant to ≥1 carbapenem from a normally sterile site or urine. An incident ESBL-E case was defined as the first isolation of E. coli, K. pneumoniae, or K. oxytoca in a 30-day period resistant to any third-generation cephalosporin and non-resistant to all carbapenems from a normally sterile site or urine. Case records were reviewed. Results: Among 159 CRE cases, 131 (82.9%) were isolated from urine and 19 (12.0%) from blood; median age was 5 years (IQR 1–10) and 94 (59.1%) were female. Combined CRE incidence rate per 100,000 population by year ranged from 0.47 to 0.87. Among 207 ESBL-E cases, 160 (94.7%) were isolated from urine and 6 (3.6%) from blood; median age was 6 years (IQR 2–15) and 165 (79.7%) were female. Annual ESBL-E incidence rate per 100,000 population was 26.5 in 2019 and 19.63 in 2020. Incidence rates of CRE and ESBL-E were >2-fold higher in infants (children <1 year) than in other age groups.
Among those with data available, CRE cases were more likely than ESBL-E cases to have underlying conditions (99/158 [62.7%] versus 59/169 [34.9%], P<0.0001), to have prior healthcare exposures (74/158 [46.8%] versus 38/169 [22.5%], P<0.0001), and to be hospitalized for any reason around the time of culture collection (75/158 [47.5%] versus 38/169 [22.5%], P<0.0001); median duration of admission was 18 days (IQR 3–103) for CRE versus 10 days (IQR 4–43) for ESBL-E. Urinary tract infection was the most frequent infection type among CRE (89/158 [56.3%]) and ESBL-E (125/169 [74.0%]) cases. Conclusion: CRE infections occurred less frequently than ESBL-E infections in U.S. children but were more often associated with healthcare risk factors and hospitalization. Infants had the highest incidence of CRE and ESBL-E. Continued surveillance, infection prevention and control efforts, and antibiotic stewardship outside and within pediatric care settings are needed.
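The incidence rates reported above reduce to a simple per-100,000 calculation. A minimal sketch follows; the case count and catchment population are invented for illustration and are not the surveillance figures reported in this abstract:

```python
def incidence_per_100k(cases: int, population: int) -> float:
    """Incidence rate per 100,000 catchment-area population."""
    return cases / population * 100_000

# Hypothetical figures for illustration only:
rate = incidence_per_100k(cases=53, population=11_000_000)
print(round(rate, 2))  # 0.48
```

The same calculation with patient-days or discharges as the denominator yields the per-10,000 rates used in the CDI abstracts below.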
The coronavirus disease 2019 (COVID-19) pandemic caused substantial changes to healthcare delivery and antibiotic prescribing beginning in March 2020. To assess pandemic impact on Clostridioides difficile infection (CDI) rates, we described patients and trends in facility-level incidence, testing rates, and percent positivity during 2019–2020 in a large cohort of US hospitals.
We estimated and compared rates of community-onset CDI (CO-CDI) per 10,000 discharges, hospital-onset CDI (HO-CDI) per 10,000 patient days, and C. difficile testing rates per 10,000 discharges in 2019 and 2020. We calculated percent positivity as the number of inpatients diagnosed with CDI over the total number of discharges with a test for C. difficile. We used an interrupted time series (ITS) design with negative binomial and logistic regression models to describe level and trend changes in rates and percent positivity before and after March 2020.
In pairwise comparisons, overall CO-CDI rates decreased from 20.0 to 15.8 between 2019 and 2020 (P < .0001). HO-CDI rates did not change. Using ITS, we detected decreasing monthly trends in CO-CDI (−1% per month, P = .0036) and HO-CDI incidence (−1% per month, P < .0001) during the baseline period, prior to the COVID-19 pandemic declaration. We detected no change in monthly trends for CO-CDI or HO-CDI incidence or percent positivity after March 2020 compared with the baseline period.
While there was a slight downward trajectory in CDI trends prior to March 2020, no significant change in CDI trends occurred during the COVID-19 pandemic despite changes in infection control practices, antibiotic use, and healthcare delivery.
Previously reported associations between hospital-level antibiotic use and hospital-onset Clostridioides difficile infection (HO-CDI) were reexamined using 2012–2018 data from a new cohort of US acute-care hospitals. This analysis revealed significant positive associations between total, third-generation, and fourth-generation cephalosporin, fluoroquinolone, carbapenem, and piperacillin-tazobactam use and HO-CDI rates, confirming previous findings.
Background: The epidemic NAP1/027 Clostridioides difficile strain (MLST1, ST1) that emerged in the mid-2000s is on the decline. The current distribution of C. difficile strain types and their transmission dynamics are poorly defined. We performed whole-genome sequencing (WGS) of C. difficile isolates in 2 regions to identify the predominant multilocus sequence types (MLSTs) in community- and healthcare-associated cases and potential transmission between cases using whole-genome single-nucleotide polymorphism (SNP) analysis. Methods: Isolates were collected through the CDC Emerging Infections Program population-based surveillance for C. difficile infections (CDI) for 3 months between 2016 and 2017 in 5 Minnesota counties and 1 New York county. Isolates were limited to incident cases (CDI in a county resident with no positive C. difficile test in the preceding 8 weeks). Cases were classified as healthcare associated (HA-CDI) or community associated (CA-CDI) based on healthcare exposures as previously described. WGS was performed on an Illumina MiSeq. The CFSAN (FDA) pipeline was used to compute whole-genome SNPs, SPAdes was used for assembly, and MLST was assigned according to www.pubmlst.org. Results: Of 431 isolates, 269 originated from New York and 162 from Minnesota; 203 cases were classified as CA-CDI and 221 as HA-CDI. The proportion of CA-CDI cases was higher in Minnesota than in New York: 62% vs 38%. The predominant MLSTs across both sites were ST42 (9%), ST8 (8%), and ST2 (8%). MLSTs more frequently encountered in HA-CDI than CA-CDI included ST1 (note that this ST includes PCR Ribotype 027; 76% HA-CDI), ST53 (84% HA-CDI), and ST43 (80% HA-CDI). In contrast, ST110 (63% CA-CDI) and ST3 (67% CA-CDI) were more commonly isolated from CA-CDI cases. ST1 accounted for 7.6% of circulating strains, was more common in New York than Minnesota (10% vs 3%), and was concentrated among New York HA-CDI cases.
Also, 412 isolates (1 per patient) were included in the final whole-genome SNP analysis. Of these, only 12 pairs were separated by 0–3 SNPs, indicating potential transmission, and most involved HA-CDI cases. ST1, ST17, and ST46 accounted for 8 of 12 pairs, with ST17 and ST46 potentially forming small clusters. Conclusions: This analysis provides a snapshot of the current genomic epidemiology of C. difficile across 2 geographically and epidemiologically distinct regions of the United States and supports other studies suggesting that the role of direct transmission in the spread of CDI may be limited.
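The 0–3 SNP threshold for flagging potential transmission amounts to a pairwise distance comparison across isolates. A toy sketch follows; the sequences and isolate names are invented, and real analyses operate on whole-genome SNP matrices from pipelines such as CFSAN rather than short toy strings:

```python
from itertools import combinations

def snp_distance(a: str, b: str) -> int:
    """Count differing positions between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b))

# Toy aligned core-genome fragments (hypothetical isolates)
isolates = {"A": "ACGTACGT", "B": "ACGTACGA", "C": "TTGACCGA"}
THRESHOLD = 3  # 0-3 SNPs flagged as potential transmission, per the abstract
pairs = [(i, j) for (i, a), (j, b) in combinations(isolates.items(), 2)
         if snp_distance(a, b) <= THRESHOLD]
print(pairs)  # [('A', 'B')]
```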
Background: The NHSN has used positive laboratory tests for surveillance of Clostridioides difficile infection (CDI) LabID events since 2009. Typically, CDIs are detected using enzyme immunoassays (EIAs), nucleic acid amplification tests (NAATs), or various test combinations. The NHSN uses a risk-adjusted, standardized infection ratio (SIR) to assess healthcare facility-onset (HO) CDI. Despite including test type in the risk adjustment, some hospital personnel and other stakeholders are concerned that NAAT use is associated with higher SIRs than EIA use. To investigate this issue, we analyzed NHSN data from acute-care hospitals for July 1, 2017 through June 30, 2018. Methods: Calendar quarters for which CDI test type was reported as NAAT (includes NAAT, glutamate dehydrogenase (GDH)+NAAT and GDH+EIA followed by NAAT if discrepant) or EIA (includes EIA and GDH+EIA) were selected. HO CDI SIRs were calculated for facility-wide inpatient locations. We conducted the following analyses: (1) Among hospitals that did not switch their test type, we compared the distribution of HO incident rates and SIRs by those reporting NAAT vs EIA. (2) Among hospitals that switched their test type, we selected quarters with a stable switch pattern of 2 consecutive quarters of each of EIA and NAAT (categorized as pattern EIA-to-NAAT or NAAT-to-EIA). Pooled semiannual SIRs for EIA and NAAT were calculated, and a paired t test was used to evaluate the difference of SIRs by switch pattern. Results: Most hospitals did not switch test types (3,242, 89%), and 2,872 (89%) reported sufficient data to calculate SIRs, with 2,444 (85%) using NAAT. The crude pooled HO CDI incidence rates for hospitals using EIA clustered at the lower end of the histogram versus rates for NAAT (Fig. 1). The SIR distributions of both NAAT and EIA overlapped substantially and covered a similar range of SIR values (Fig. 1).
Among hospitals with a switch pattern, hospitals were equally likely to have an increase or decrease in their SIR (Fig. 2). The mean SIR difference for the 42 hospitals switching from EIA to NAAT was 0.048 (95% CI, −0.189 to 0.284; P = .688). The mean SIR difference for the 26 hospitals switching from NAAT to EIA was 0.162 (95% CI, −0.048 to 0.371; P = .124). Conclusions: The pattern of SIR distributions of both NAAT and EIA substantiate the soundness of NHSN risk adjustment for CDI test types. Switching test type did not produce a consistent directional pattern in SIR that was statistically significant.
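The paired t test used above compares each switching hospital's pooled semiannual SIR under one test type against its own SIR under the other. A minimal sketch with scipy follows; the SIR values are hypothetical, not the NHSN data reported in this abstract:

```python
from scipy import stats

# Hypothetical pooled semiannual SIRs for 5 hospitals that switched EIA -> NAAT
# (illustrative values only)
sir_eia = [0.80, 1.10, 0.95, 1.30, 0.70]
sir_naat = [0.85, 1.05, 1.10, 1.25, 0.75]

# Paired t test on within-hospital SIR differences
t_stat, p_value = stats.ttest_rel(sir_naat, sir_eia)
print(t_stat, p_value)
```

A non-significant p value here, as in the abstract's findings, would indicate no consistent within-hospital shift in SIR attributable to the test-type switch.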
Background: Chlorhexidine bathing reduces bacterial skin colonization and prevents infections in specific patient populations. As chlorhexidine use becomes more widespread, concerns about bacterial tolerance to chlorhexidine have increased; however, testing for chlorhexidine minimum inhibitory concentrations (MICs) is challenging. We adapted a broth microdilution (BMD) method to determine whether chlorhexidine MICs changed over time among 4 important healthcare-associated pathogens. Methods: Antibiotic-resistant bacterial isolates (Staphylococcus aureus from 2005 to 2019 and Escherichia coli, Klebsiella pneumoniae, and Enterobacter cloacae complex from 2011 to 2019) were collected through Emerging Infections Program surveillance in 2 sites (Georgia and Tennessee) or through public health reporting in 1 site (Orange County, California). A convenience sample of isolates was collected from facilities with varying amounts of chlorhexidine use. We performed BMD testing using laboratory-developed panels with chlorhexidine digluconate concentrations ranging from 0.125 to 64 μg/mL. After successfully establishing reproducibility with quality control organisms, 3 laboratories performed MIC testing. For each organism, epidemiological cutoff values (ECVs) were established using ECOFFinder. Results: Among 538 isolates tested (129 S. aureus, 158 E. coli, 142 K. pneumoniae, and 109 E. cloacae complex), S. aureus, E. coli, K. pneumoniae, and E. cloacae complex ECVs were 8, 4, 64, and 64 µg/mL, respectively (Table 1). Moreover, 14 isolates had an MIC above the ECV (12 E. coli and 2 E. cloacae complex). The MIC50 of each species is reported over time (Table 2). Conclusions: Using an adapted BMD method, we found that chlorhexidine MICs did not increase over time among a limited sample of S. aureus, E. coli, K. pneumoniae, and E. cloacae complex isolates.
Although these results are reassuring, continued surveillance for elevated chlorhexidine MICs in isolates from patients with well-characterized chlorhexidine exposure is needed as chlorhexidine use increases.
Background: There is great enthusiasm for the potential of decision support tools embedded in the electronic medical record to improve antimicrobial use in hospitals. Yet they are often limited in their ability to change prescriber behavior. Analyzing these tools using an interactive sociotechnical approach (ISTA) can identify barriers and facilitators to the implementation of electronic decision support (EDS) in antimicrobial stewardship. Objective: To examine prescriber and antimicrobial steward perceptions of EDS using an ISTA approach in the preimplementation phase of an antimicrobial stewardship intervention. Methods: We conducted semistructured interviews with prescribers and stewards from 4 hospitals in 2 health systems in the context of a multicomponent intervention to improve the use of fluoroquinolones and extended-spectrum cephalosporins. Sites planned to implement various EDS elements including order sets, antimicrobial time outs, and audit with feedback stewardship notes in the medical record. Interviews elicited respondent perceptions about the planned intervention. Two analysts systematically coded transcripts using an ISTA framework in NVivo12 software. Results: Interviews with 64 respondents were conducted: 38 physicians, 7 nurses, 6 advanced practice providers, and 13 pharmacists. We identified 4 key sociotechnical interaction types likely to influence stewardship EDS implementation. First, EDS changes the communication patterns and practices of antimicrobial stewards in a way that improves efficiency but decreases vital social interaction with prescribers to facilitate behavior change. Second, there is a gap between what stewards envision for EDS and that which is possible to build in a timely manner by hospital information technology specialists. As a result, there is often a months- to years-long delay from proposal to implementation, which negatively affects intervention acceptance. 
Third, prescribers expressed great enthusiasm for stewardship EDS that would simplify their workload, allow them to complete important work tasks, and save time. They strongly objected to stewardship EDS that was disruptive without a compelling purpose or did not integrate smoothly with pre-existing technology infrastructure. Fourth, physician prescribers attributed social and emotional meaning to stewardship EDS, suggesting that these tools can undermine professional authority, autonomy, and confidence. Conclusions: Implementing stewardship EDS in a way that improves the use of antimicrobials while minimizing unintended negative consequences requires attention to the interplay between new EDS and an organization’s existing workflow, culture, social interactions, and technologies; attending to these domains is necessary to realize the full potential of these tools.
Fluoroquinolones (FQs) and extended-spectrum cephalosporins (ESCs) are associated with higher risk of Clostridioides difficile infection (CDI). Decreasing the unnecessary use of FQs and ESCs is a goal of antimicrobial stewardship. Understanding how prescribers perceive the risks and benefits of FQs and ESCs is needed.
We conducted interviews with clinicians from 4 hospitals. Interviews elicited respondent perceptions about the risk of ESCs, FQs, and CDI. Interviews were audio recorded, transcribed, and analyzed using a flexible coding approach.
Interviews were conducted with 64 respondents (38 physicians, 7 nurses, 6 advanced practice providers, and 13 pharmacists). ESCs and FQs were perceived to have many benefits, including infrequent dosing, breadth of coverage, and greater patient adherence after hospital discharge. Prescribers stated that it was easy to make decisions about these drugs, so they were especially appealing to use in the context of time pressures. They described having difficulty discontinuing these drugs when prescribed by others due to inertia and fear. Prescribers were skeptical about targeting specific drugs as a stewardship approach and felt that the risk of a negative outcome from undertreatment of a suspected bacterial infection was a higher priority than the prevention of CDI.
Prescribers in this study perceived many advantages to using ESCs and FQs, especially under conditions of time pressure and uncertainty. In making decisions about these drugs, prescribers balance risk and benefit, and they believed that the risk of CDI was acceptable compared with the risk of undertreatment.
To determine the source of a healthcare-associated outbreak of Pantoea agglomerans bloodstream infections.
Epidemiologic investigation of the outbreak.
Oncology clinic (clinic A).
Cases were defined as Pantoea isolation from blood or catheter tip cultures of clinic A patients during July 2012–May 2013. Clinic A medical charts and laboratory records were reviewed; infection prevention practices and the facility’s water system were evaluated. Environmental samples were collected for culture. Clinical and environmental P. agglomerans isolates were compared using pulsed-field gel electrophoresis.
Twelve cases were identified; median (range) age was 65 (41–78) years. All patients had malignant tumors and had received infusions at clinic A. Deficiencies in parenteral medication preparation and handling were identified (eg, placing infusates near sinks with potential for splash-back contamination). Facility inspection revealed substantial dead-end water piping and inadequate chlorine residual in tap water from multiple sinks, including the pharmacy clean room sink. P. agglomerans was isolated from composite surface swabs of 7 sinks and an ice machine; the pharmacy clean room sink isolate was indistinguishable by pulsed-field gel electrophoresis from 7 of 9 available patient isolates.
Exposure of locally prepared infusates to a contaminated pharmacy sink caused the outbreak. Improvements in parenteral medication preparation, including moving chemotherapy preparation offsite, along with terminal sink cleaning and water system remediation ended the outbreak. Greater awareness of recommended medication preparation and handling practices as well as further efforts to better define the contribution of contaminated sinks and plumbing deficiencies to healthcare-associated infections are needed.
We assessed for vancomycin-resistant Staphylococcus aureus (VRSA) precursor organisms in southeastern Michigan, an area known to have VRSA. The prevalence was 2.5% (pSK41-positive methicillin-resistant S. aureus, 2009–2011) and 1.5% (Inc18-positive vancomycin-resistant Enterococcus, 2006–2013); Inc18 prevalence significantly decreased after 2009 (3.7% to 0.82%). Risk factors for pSK41 included intravenous vancomycin exposure.
Infect Control Hosp Epidemiol 2014;35(12):1531–1534
To determine the source and identify control measures of an outbreak of Tsukamurella species bloodstream infections at an outpatient oncology facility.
Epidemiologic investigation of the outbreak with a case-control study.
A case was an infection in which Tsukamurella species was isolated from a blood or catheter tip culture during the period January 2011 through June 2012 from a patient of the oncology clinic. Laboratory records of area hospitals and patient charts were reviewed. A case-control study was conducted among clinic patients to identify risk factors for Tsukamurella species bloodstream infection. Clinic staff were interviewed, and infection control practices were assessed.
Fifteen cases of Tsukamurella (Tsukamurella pulmonis or Tsukamurella tyrosinosolvens) bloodstream infection were identified, all in patients with underlying malignancy and indwelling central lines. The median age of case patients was 68 years; 47% were male. The only significant risk factor for infection was receipt of saline flush from the clinic during the period September–October 2011 (P = .03), when the clinic had been preparing saline flush from a common-source bag of saline. Other infection control deficiencies that were identified at the clinic included suboptimal procedures for central line access and preparation of chemotherapy.
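A risk-factor comparison like the saline-flush finding above is typically tested with Fisher's exact test on a 2×2 exposure-by-case-status table. The counts below are invented for illustration; the abstract reports only the P value:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: receipt of clinic-prepared saline flush vs case status
#                 cases  controls
table = [[10, 8],    # received flush during Sep-Oct 2011
         [5, 22]]    # did not receive flush

odds_ratio, p_value = fisher_exact(table)
print(odds_ratio, p_value)
```

The odds ratio here is the sample odds ratio (10×22)/(8×5) = 5.5; a small p value would support the exposure as a risk factor, as the investigators found for the common-source saline flush.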
Although multiple infection control lapses were identified, the outbreak was likely caused by improper preparation of saline flush syringes by the clinic. The outbreak demonstrates that bloodstream infections among oncology patients can result from improper infection control practices and highlights the critical need for increased attention to and oversight of infection control in outpatient oncology settings.
This article is an executive summary of a report from the Centers for Disease Control and Prevention Ventilator-Associated Pneumonia Surveillance Definition Working Group, entitled “Developing a new, national approach to surveillance for ventilator-associated events” and published in Critical Care Medicine. The full report provides a comprehensive description of the Working Group process and outcome.
In September 2011, the Centers for Disease Control and Prevention (CDC) convened a Ventilator-Associated Pneumonia (VAP) Surveillance Definition Working Group to organize a formal process for leaders and experts of key stakeholder organizations to discuss the challenges of VAP surveillance definitions and to propose new approaches to VAP surveillance in adult patients (Table 1).
Of the 13 US vancomycin-resistant Staphylococcus aureus (VRSA) cases, 8 were identified in southeastern Michigan, primarily in patients with chronic lower-extremity wounds. VRSA infections develop when the vanA gene from vancomycin-resistant enterococcus (VRE) transfers to S. aureus. Inc18-like plasmids in VRE and pSK41-like plasmids in S. aureus appear to be important precursors to this transfer.
Identify the prevalence of VRSA precursor organisms.
Prospective cohort with embedded case-control study.
Southeastern Michigan adults with chronic lower-extremity wounds.
Adults presenting to 3 southeastern Michigan medical centers during the period February 15 through March 4, 2011, with chronic lower-extremity wounds had wound, nares, and perirectal swab specimens cultured for S. aureus and VRE, which were tested for pSK41-like and Inc18-like plasmids by polymerase chain reaction. We interviewed participants and reviewed clinical records. Risk factors for pSK41-positive S. aureus were assessed among all study participants (cohort analysis) and among only S. aureus-colonized participants (case-control analysis).
Of 179 participants with wound cultures, 26% were colonized with methicillin-susceptible S. aureus, 27% were colonized with methicillin-resistant S. aureus, and 4% were colonized with VRE, although only 17% consented to perirectal culture. Six participants (3%) had pSK41-positive S. aureus, and none had Inc18-positive VRE. Having chronic wounds for over 2 years was associated with pSK41-positive S. aureus colonization in both analyses.
Colonization with VRSA precursor organisms was rare. Having long-standing chronic wounds was a risk factor for pSK41-positive S. aureus colonization. Additional investigation into the prevalence of VRSA precursors among a larger cohort of patients is warranted.
To describe a Klebsiella pneumoniae carbapenemase (KPC)–producing carbapenem-resistant Enterobacteriaceae (CRE) outbreak and interventions to prevent transmission.
Design, Setting, and Patients.
Epidemiologic investigation of a CRE outbreak among patients at a long-term acute care hospital (LTACH).
Microbiology records at LTACH A from March 2009 through February 2011 were reviewed to identify CRE transmission cases and cases admitted with CRE. CRE bacteremia episodes were identified during March 2009–July 2011. Biweekly CRE prevalence surveys were conducted during July 2010–July 2011, and interventions to prevent transmission were implemented, including education and auditing of staff and isolation and cohorting of CRE patients with dedicated nursing staff and shared medical equipment. Trends were evaluated using weighted linear or Poisson regression. CRE transmission cases were included in a case-control study to evaluate risk factors for acquisition. A real-time polymerase chain reaction assay was used to detect the blaKPC gene, and pulsed-field gel electrophoresis was performed to assess the genetic relatedness of isolates.
Ninety-nine CRE transmission cases, 16 admission cases (from 7 acute care hospitals), and 29 CRE bacteremia episodes were identified. Significant reductions were observed in CRE prevalence (49% vs 8%), percentage of patients screened with newly detected CRE (44% vs 0%), and CRE bacteremia episodes (2.5 vs 0.0 per 1,000 patient-days). Cases were more likely to have received β-lactams, have diabetes, and require mechanical ventilation. All tested isolates were KPC-producing K. pneumoniae, and nearly all isolates were genetically related.
CRE transmission can be reduced in LTACHs through surveillance testing and targeted interventions. Sustainable reductions within and across healthcare facilities may require a regional public health approach.
Infect Control Hosp Epidemiol 2012;33(10):984-992