To identify important risk factors for carbapenem-resistant Enterobacterales (CRE) infections among hospitalized patients.
We utilized a case–case–control design that compared patients with CRE infections to patients with carbapenem-susceptible Enterobacterales (CSE) infections and randomly selected controls during the period from January 2011 through December 2016.
The study population was selected from patients at a large metropolitan tertiary-care teaching medical center.
Cases of CRE were defined as the initial admission of adults diagnosed, 48 hours or more after admission, with a bacterial infection caused by an Enterobacterales species identified as resistant to carbapenems clinically or by susceptibility testing. Cases of CSE were selected from the same patient population as the CRE cases within a 30-day admission window, with diagnostic pathogens identified as susceptible to carbapenems. Controls were defined as adult patients admitted to any service for >48 hours within a 30-day window of a CRE case who did not meet either of the above case definitions during that admission.
Antibiotic exposure within 90 days prior to admission and length of hospital stay were both associated with increased odds of CRE and CSE infections compared to controls. Patients with CRE infections had >18 times greater odds of prior antibiotic exposure compared to patients with CSE infections.
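The case–case–control comparisons above rest on odds ratios computed from 2×2 exposure tables. A minimal sketch of the calculation follows; the counts are hypothetical and chosen only for illustration, not taken from the study's data.

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio from a 2x2 table: (a/c) / (b/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical counts for illustration only (not the study's data):
# 45 of 50 CRE cases and 15 of 50 CSE cases had prior antibiotic exposure.
or_cre_vs_cse = odds_ratio(45, 5, 15, 35)  # (45*35)/(5*15) = 21.0
```

An odds ratio of 1.0 indicates no association; values well above 1.0, as reported for prior antibiotic exposure, indicate the exposure is far more common among cases.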
Antibiotic exposure and increased length of hospital stay may result in increased patient risk of developing an infection resistant to carbapenems and other β-lactams.
Central line–associated bloodstream infection (BSI) rates are a key quality metric for comparing hospital quality and safety. Traditional BSI surveillance may be limited by interrater variability. We assessed whether a computer-automated method of central line–associated BSI detection can improve the validity of surveillance.
Retrospective cohort study.
Eight medical and surgical intensive care units (ICUs) in 4 academic medical centers.
Traditional surveillance (by hospital staff) and computer algorithm surveillance were each compared against a retrospective audit review using a random sample of blood culture episodes during the period 2004–2007 from which an organism was recovered. Episode-level agreement with audit review was measured with κ statistics, and differences were assessed using the test of equal κ coefficients. Linear regression was used to assess the relationship between surveillance performance (κ) and surveillance-reported BSI rates (BSIs per 1,000 central line–days).
We evaluated 664 blood culture episodes. Agreement with audit review was significantly lower for traditional surveillance (κ [95% confidence interval (CI)] = 0.44 [0.37–0.51]) than computer algorithm surveillance (κ [95% CI] [0.52–0.64]; P = .001). Agreement between traditional surveillance and audit review was heterogeneous across ICUs (P = .001); furthermore, traditional surveillance performed worse among ICUs reporting lower (better) BSI rates (P = .001). In contrast, computer algorithm performance was consistent across ICUs and across the range of computer-reported central line–associated BSI rates.
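Episode-level agreement with audit review was summarized with Cohen's κ. The statistic for two raters assigning binary BSI/no-BSI labels can be sketched as follows; this is a generic implementation, not the study's code.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning binary labels (0/1) to the same episodes."""
    n = len(rater_a)
    # observed proportion of agreement
    p_o = sum(1 for a, b in zip(rater_a, rater_b) if a == b) / n
    # expected agreement if the two raters labeled episodes independently
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    p_e = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement gives κ = 1, while κ near 0 means the raters agree no more often than chance, which is why κ rather than raw percent agreement is used to compare surveillance methods.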
Compared with traditional surveillance of bloodstream infections, computer-automated surveillance improves accuracy and reliability, making interfacility performance comparisons more valid.
Infect Control Hosp Epidemiol 2014;35(12):1483–1490
Infection surveillance definitions for long-term care facilities (ie, the McGeer Criteria) have not been updated since 1991. An expert consensus panel modified these definitions on the basis of a structured review of the literature. Significant changes were made to the criteria defining urinary tract and respiratory tract infections. New definitions were added for norovirus gastroenteritis and Clostridium difficile infections.
To evaluate the use of inpatient pharmacy and administrative data to detect surgical site infections (SSIs) following hysterectomy and colorectal and vascular surgery.
Retrospective cohort study.
Five hospitals affiliated with academic medical centers.
Adults who underwent abdominal or vaginal hysterectomy, colorectal surgery, or vascular surgery procedures between July 1, 2003, and June 30, 2005.
We reviewed the medical records of weighted, random samples drawn from 3,079 abdominal and vaginal hysterectomy, 4,748 colorectal surgery, and 3,332 vascular surgery procedures. We compared routine surveillance with screening of inpatient pharmacy data and diagnosis codes and then performed medical record review to confirm SSI status.
Medical records from 823 hysterectomy, 736 colorectal surgery, and 680 vascular surgery procedures were reviewed. SSI rates determined by antimicrobial- and/or diagnosis code-based screening followed by medical record review (enhanced surveillance) were substantially higher than rates determined by routine surveillance (4.3% [95% confidence interval, 3.6%–5.1%] vs 2.7% for hysterectomies, 7.1% [95% confidence interval, 6.7%–8.2%] vs 2.0% for colorectal procedures, and 2.3% [95% confidence interval, 1.9%–2.9%] vs 1.4% for vascular procedures). Enhanced surveillance had substantially higher sensitivity than did routine surveillance to detect SSI (92% vs 59% for hysterectomies, 88% vs 22% for colorectal procedures, and 72% vs 43% for vascular procedures). A review of medical records confirmed SSI for 31% of hysterectomies, 20% of colorectal procedures, and 31% of vascular procedures that met the enhanced screening criteria.
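The screening methods above are judged by sensitivity (the share of confirmed SSIs that screening detected) and, implicitly, positive predictive value (the share of screen-positive procedures confirmed on record review). Both measures can be sketched as below; the counts are hypothetical illustrations, not the study's figures.

```python
def sensitivity(true_pos, false_neg):
    """Fraction of confirmed infections that the screening method detected."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Fraction of screen-positive procedures confirmed as infections on review."""
    return true_pos / (true_pos + false_pos)

# Hypothetical example: screening flags 100 procedures, of which 31 are
# confirmed SSIs on review, and 4 additional confirmed SSIs were missed.
ppv = positive_predictive_value(31, 69)   # 0.31
sens = sensitivity(31, 4)                 # ~0.886
```

A screen can have high sensitivity but modest PPV, which is exactly the trade-off that makes a confirmatory record review step necessary after screening.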
Antimicrobial- and diagnosis code-based screening may be a useful method for enhancing and streamlining SSI surveillance for a variety of surgical procedures, including those procedures targeted by the Centers for Medicare and Medicaid Services.
Automated surveillance using electronically available data has been found to be accurate and save time. An automated Clostridium difficile infection (CDI) surveillance algorithm was validated at 4 Centers for Disease Control and Prevention Epicenter hospitals. Electronic surveillance was highly sensitive, specific, and showed good to excellent agreement for hospital-onset; community-onset, study facility-associated; indeterminate; and recurrent CDI.
To compare incidence of hospital-onset Clostridium difficile infection (CDI) measured by the use of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge diagnosis codes with rates measured by the use of electronically available C. difficile toxin assay results.
Cases of hospital-onset CDI were identified at 5 US hospitals during the period from July 2000 through June 2006 with the use of 2 surveillance definitions: positive toxin assay results (gold standard) and secondary ICD-9-CM discharge diagnosis codes for CDI. The χ2 test was used to compare incidence, linear regression models were used to analyze trends, and the test of equality was used to compare slopes.
Of 8,670 cases of hospital-onset CDI, 38% were identified by the use of both toxin assay results and the ICD-9-CM code, 16% by the use of toxin assay results alone, and 45% by the use of the ICD-9-CM code alone. Nearly half (47%) of cases of CDI identified by the use of a secondary diagnosis code alone were community-onset CDI according to the results of the toxin assay. The rate of hospital-onset CDI found by use of ICD-9-CM codes was significantly higher than the rate found by use of toxin assay results overall (P<.001), as well as individually at 3 of the 5 hospitals (P<.001 for all). The agreement between toxin assay results and the presence of a secondary ICD-9-CM diagnosis code for CDI was moderate, with an overall κ value of 0.509 and hospital-specific κ values of 0.489–0.570. Overall, the annual increase in CDI incidence was significantly greater for rates determined by the use of ICD-9-CM codes than for rates determined by the use of toxin assay results (P = .006).
Although the ICD-9-CM code for CDI seems to be adequate for measuring the overall CDI burden, use of the ICD-9-CM discharge diagnosis code for CDI, without present-on-admission code assignment, is not an acceptable surrogate for surveillance for hospital-onset CDI.
The incidence of surgical site infection (SSI) after hysterectomy ranges widely from 2% to 21%. A specific risk stratification index could help to predict more accurately the risk of incisional SSI following abdominal hysterectomy and would help determine the reasons for the wide range of reported SSI rates in individual studies. To increase our understanding of the risk factors needed to build a specific risk stratification index, we performed a retrospective multihospital analysis of risk factors for SSI after abdominal hysterectomy.
Retrospective case-control study of 545 abdominal and 275 vaginal hysterectomies from July 1, 2003, to June 30, 2005, at 4 institutions. SSIs were defined by using Centers for Disease Control and Prevention/National Nosocomial Infections Surveillance criteria. Independent risk factors for abdominal hysterectomy were identified by using logistic regression.
There were 13 deep incisional, 53 superficial incisional, and 18 organ-space SSIs after abdominal hysterectomy and 14 organ-space SSIs after vaginal hysterectomy. Because risk factors for organ-space SSI were different according to univariate analysis, we focused further analyses on incisional SSI after abdominal hysterectomy. The maximum serum glucose level within 5 days after operation was highest in patients with deep incisional SSI, lower in patients with superficial incisional SSI, and lowest in uninfected patients (median, 189, 156, and 141 mg/dL, respectively; P = .005). Independent risk factors for incisional SSI included blood transfusion (odds ratio [OR], 2.4) and morbid obesity (body mass index [BMI], >35; OR, 5.7). Duration of operation greater than the 75th percentile (OR, 1.7), obesity (BMI, 30–35; OR, 3.0), and lack of private health insurance (OR, 1.7) were marginally associated with increased odds of SSI.
Incisional SSI after abdominal hysterectomy was associated with increased BMI and blood transfusion. Longer duration of operation and lack of private health insurance were marginally associated with SSI.
To evaluate the impact of cases of community-onset, healthcare facility (HCF)-associated Clostridium difficile infection (CDI) on the incidence and outbreak detection of CDI.
A retrospective multicenter cohort study.
Five university-affiliated, acute care HCFs in the United States.
We collected data (including results of C. difficile toxin assays of stool samples) on all of the adult patients admitted to the 5 hospitals during the period from July 1, 2000, through June 30, 2006. CDI cases were classified as HCF-onset if they were diagnosed more than 48 hours after admission or as community-onset, HCF-associated if they were diagnosed within 48 hours after admission and if the patient had recently been discharged from the HCF. Four surveillance definitions were compared: cases of HCF-onset CDI only (hereafter referred to as HCF-onset CDI) and cases of HCF-onset and community-onset, HCF-associated CDI diagnosed within 30, 60, and 90 days after the last discharge from the study hospital (hereafter referred to as 30-day, 60-day, and 90-day CDI, respectively). Monthly CDI rates were compared. Control charts were used to identify potential CDI outbreaks.
The rate of 30-day CDI was significantly higher than the rate of HCF-onset CDI at 2 HCFs (P < .01). The rates of 30-day CDI were not statistically significantly different from the rates of 60-day or 90-day CDI at any HCF. The correlations between each HCF's monthly rates of HCF-onset CDI and 30-day CDI were almost perfect (ρ range, 0.94–0.99; P < .001). Overall, 12 time points had a CDI rate that was more than 3 standard deviations above the mean, including 11 time points identified using the definition for HCF-onset CDI and 9 time points identified using the definition for 30-day CDI, with discordant results at 4 time points (κ = 0.794; P < .001).
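The control-chart flagging described above, marking months whose rate exceeds the mean by more than 3 standard deviations, can be sketched as a u-chart over monthly counts and patient-days. This is a generic sketch of the technique, not the study's implementation.

```python
import math

def u_chart_flags(monthly_cases, monthly_days):
    """Flag months whose infection rate exceeds the u-chart upper control
    limit (pooled mean + 3 SD); limits widen for months with fewer days."""
    u_bar = sum(monthly_cases) / sum(monthly_days)  # pooled mean rate (center line)
    flags = []
    for cases, days in zip(monthly_cases, monthly_days):
        ucl = u_bar + 3 * math.sqrt(u_bar / days)   # upper control limit
        flags.append(cases / days > ucl)
    return flags
```

Because the control limit depends on each month's denominator, a small facility needs a proportionally larger rate excursion before a month is flagged as a potential outbreak.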
Tracking cases of both community-onset and HCF-onset, HCF-associated CDI captures significantly more CDI cases, but surveillance of HCF-onset, HCF-associated CDI alone is sufficient to detect an outbreak.
To measure infection rates in a regional cohort of long-term-care facilities (LTCFs) using standard surveillance methods and to analyze different methods for interfacility comparisons.
Seventeen LTCFs in Idaho.
Prospective, active surveillance for LTCF-acquired infections using standard definitions and case-finding methods was conducted from July 2001 to June 2002. All surveillance data were combined and individual facility performance was compared with the aggregate employing a variety of statistical and graphic methods.
The surveillance data set consisted of 472,019 resident-days of care with 1,717 total infections for a pooled mean rate of 3.64 infections per 1,000 resident-days. Specific infections included respiratory (828; rate, 1.75), skin and soft tissue (520; rate, 1.10), urinary tract (282; rate, 0.60), gastrointestinal (77; rate, 0.16), unexplained febrile illnesses (6; rate, 0.01), and bloodstream (4; rate, 0.01). Initially, methods adopted from the National Nosocomial Infections Surveillance System were used comparing individual rates with pooled means and percentiles of distribution. A more sensitive method appeared to be detecting statistically significant deviations (based on chi-square analysis) of the individual facility rates from the aggregate of all other facilities. One promising method employed statistical process control charts (U charts) adjusted to compare individual rates with aggregate monthly rates, providing simultaneous visual and statistical comparisons. Small multiples graphs were useful in providing images valid for rapid concurrent comparison of all facilities.
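The pooled mean rate reported above follows directly from total infections over total resident-days, scaled per 1,000. A minimal sketch using the figures quoted in the abstract:

```python
def rate_per_1000(infections, resident_days):
    """Infections per 1,000 resident-days of care."""
    return 1000 * infections / resident_days

# Figures from the abstract: 1,717 infections over 472,019 resident-days.
pooled_rate = rate_per_1000(1717, 472019)  # ~3.64 per 1,000 resident-days
```

The same formula applied to each infection category (eg, 828 respiratory infections) reproduces the category-specific rates listed in the abstract.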
Interfacility comparisons have been demonstrated to be valuable for hospital infection control programs, but have not been studied extensively in LTCFs.
To compare the efficacy of the polysaccharide pneumococcal vaccine in older adults between clinical trial and observational studies and to discuss the implications for long-term–care facilities (LTCFs).
A Medline search (to April 2003).
All meta-analyses of randomized and quasi-randomized trials of pneumococcal vaccines with placebo or no treatment were sought. All cohort or case–control studies were sought.
Of the 16 individual randomized clinical trials included in the reviews, 8 evaluated pneumococcal vaccine in individuals 55 years and older. Only one study specifically addressed LTCF residents. Although no significant protective effect of the vaccine in elderly subpopulations was found, beneficial effects, particularly for pneumococcal bacteremia, could not be ruled out given the wide confidence intervals and small subpopulation sample sizes. Of the individual observational studies, 11 specifically evaluated vaccine efficacy in older adults. Vaccine efficacy was demonstrated in 9 of the 11 studies; no protective effect was shown in the other 2.
Although the pooling of clinical trial data does not demonstrate significant efficacy of the pneumococcal polysaccharide vaccine in subgroups of older adults, these subgroup studies lacked power to show significant differences. Observational studies repeatedly demonstrate efficacy in older adults, and the vaccine has been demonstrated to be cost-effective and safe. It is strongly promoted by U.S. and Canadian advisory committees. On the basis of this available evidence, the pneumococcal polysaccharide vaccine should currently be recommended for older adults, especially those who are residents of LTCFs.
Infection control programs were among the first organized efforts to improve the quality of healthcare delivered to patients and are an excellent model for the development of other healthcare performance improvement activities. Whether labeled as infection control, quality improvement, or patient safety, performance improvement initiatives share similar methods and principles. The quality of care in long-term–care facilities (LTCFs) has been scrutinized for years and has received renewed attention with the recent initiation of public reporting of quality measures by Medicare. This article reviews the principles of performance improvement, discusses the importance of employing evidence-based interventions, and emphasizes the value of local performance improvement in LTCFs. Residents of LTCFs remain at high risk for the development of nosocomial infections, and among performance improvement initiatives, infection control is recommended as a high priority for all LTCFs. Fortunately, infection control contains the essential elements for performance improvement, and a successful infection control program can provide the foundation for expanding performance improvement throughout the LTCF. There is still much that needs to be done to determine the best clinical practices for LTCFs, and this should remain a priority for future research. Furthermore, efforts should continue to apply these principles at the local level to ensure that all residents of LTCFs receive the best care possible.
To describe an outbreak of infections with permanent cuffed hemodialysis catheters recognized through ongoing surveillance and related to a specific malfunctioning permanent catheter.
The outbreak was suspected from the results of prospective infection surveillance and confirmed by a retrospective cohort study using medical records for patients receiving dialysis between April 1, 1999, and March 31, 2000.
Integrated network of six outpatient hemodialysis facilities in southern Idaho and eastern Oregon.
Outpatients receiving long-term hemodialysis.
During the 18 months prior to the outbreak, the overall infection rate was 4.1 infections per 1,000 dialysis sessions with a catheter rate of 8.9 per 1,000 dialysis sessions. During the 7 months of the outbreak, the overall rate increased to 5.8 per 1,000 dialysis sessions, whereas the catheter rate increased to 18.1 per 1,000 dialysis sessions. Reports of malfunctioning “Brand A” catheters prompted discontinuation of their placement. A manufacturer recall occurred in April 2000. During the 14 months after the outbreak, the overall infection rate decreased to 3.3 per 1,000 dialysis sessions and the catheter rate to 10.8 per 1,000 dialysis sessions. A 12-month retrospective cohort study recognized 96 patients with an identifiable catheter brand and 48 infections. Of these, 27 (56%) occurred in patients with Brand A catheters. The relative risk for infection when compared with other catheter brands was 1.96 (95% confidence interval, 1.32 to 2.92; P < .001).
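The relative risk and confidence interval cited above come from a standard 2×2 comparison with a log-normal approximation for the CI. The sketch below is generic; the denominators are hypothetical, since the abstract reports only that 27 of the 48 infections occurred in Brand A patients among 96 patients with an identifiable catheter brand.

```python
import math

def relative_risk_ci(a, n_exposed, c, n_unexposed, z=1.96):
    """Relative risk with a 95% CI via the standard error of log(RR).
    a = infections among exposed (e.g., Brand A catheters); n_exposed = exposed total;
    c = infections among unexposed; n_unexposed = unexposed total."""
    rr = (a / n_exposed) / (c / n_unexposed)
    se = math.sqrt(1 / a - 1 / n_exposed + 1 / c - 1 / n_unexposed)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical denominators for illustration only:
rr, lo, hi = relative_risk_ci(27, 45, 21, 70)
```

A 95% CI excluding 1.0, as in the study's 1.32 to 2.92, is what supports the conclusion that the Brand A catheters carried a genuinely elevated infection risk.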
Ongoing infection surveillance in hemodialysis facilities can identify specific device-related outbreaks of infections and promote interventions to reduce infectious complications and promote patient safety. Surveillance for vascular access site infections is recommended as a routine activity in hemodialysis facilities.
To evaluate collaborative efforts and intervention strategies by peer-review organizations (PROs) and long-term-care facilities (LTCFs) for improving pneumococcal vaccination rates among residents of LTCFs.
Baseline pneumococcal vaccination rates were determined by medical-record review, self-reporting by patient or family members, and review of Medicare claims information. Remeasurement of vaccination rates was accomplished from documentation of vaccination of eligible residents by each LTCF.
133 LTCFs with 7,623 residents from Alaska, Idaho, Montana, and Wyoming participated in this quality-improvement project. This accounted for 41% (133/321) of the potential nursing homes and resident population in the participating states. Baseline overall vaccination rates were 40% (3,050/7,589). The overall vaccination rate improved to 75% (5,720/7,623, P<.001). The number of facilities meeting the Healthy People 2000 vaccination goal of 80% improved from 18% (24/133) to 62% (83/133, P<.001). Initial use of chart stickers and implementation of standing orders led to similar increases in vaccination rates, but the standing-order strategy required commitment of fewer PRO resources at a statewide level. Remeasurement of vaccination rates in a subset of participating Idaho LTCFs 1 year after initial vaccination efforts demonstrated a sustained vaccination rate of 70% in facilities enforcing a standing-order policy.
Simple and straightforward vaccination strategies implemented in LTCFs over a short period of time can have a significant impact on vaccination rates. Collaborative efforts between state PROs and LTCFs enhance implementation of these strategies and can result in the achievement of national vaccination objectives. Standing orders appear to be one intervention effective in sustaining successful vaccination efforts. Regardless of the specific interventions employed, PROs played a significant role in facilitating vaccination program development and intervention implementation.
To develop a standardized surveillance system for monitoring hemodialysis vascular-access infections in order to compare infection rates between outpatient sites and to assess the effectiveness of infection control interventions.
Prospective descriptive analysis of incidence infection rates.
An outpatient hemodialysis center with facilities in Idaho and Oregon.
All outpatients receiving chronic outpatient hemodialysis.
There were 38,096 hemodialysis sessions (31,603 via permanent fistulae or grafts, 5,060 via permanent tunneled central catheters, and 1,433 via temporary catheters) during an 18-month study period in 1997 to 1998. We identified 176 total infections, for a rate of 4.62/1,000 dialysis sessions (ds). Of the 176, 80 involved permanent fistulae or grafts (2.53/1,000 ds), 69 involved permanent tunneled central catheter infections (13.64/1,000 ds), and 27 involved temporary catheter infections (18.84/1,000 ds). There were 35 bloodstream infections (0.92/1,000 ds) and 10 episodes of clinical sepsis (0.26/1,000 ds). One hundred thirty-one vascular-site infections without bacteremia were identified (3.44/1,000 ds), including 65 permanent fistulae or graft infections (2.06/1,000 ds), 42 permanent tunneled central catheter infections (8.3/1,000 ds), and 24 temporary catheter infections (16.75/1,000 ds).
Infection rates were highest among temporary catheters and lowest among permanent native arteriovenous fistulae or synthetic grafts. This represents the first report of extensive incidence data on hemodialysis vascular access infections and represents a standardized surveillance and data-collection system that could be implemented in hemodialysis facilities to allow for reliable data comparison and benchmarking.