To determine the effectiveness of an ultraviolet (UV) environmental disinfection system on rates of hospital-acquired vancomycin-resistant Enterococcus (VRE) and Clostridium difficile.
Using active surveillance and an interrupted time-series design, hospital-based acquisition of VRE and C. difficile on a bone marrow transplant (BMT) unit was examined before and after implementation of terminal UV disinfection of all rooms, regardless of patients' isolation status. The main outcomes were hospital-based acquisition measured through (1) active surveillance: admission, weekly, and discharge screening for VRE and toxigenic C. difficile (TCD); and (2) clinical surveillance: incidence of VRE and C. difficile infection (CDI) on the unit.
Bone marrow transplant unit at a tertiary-care cancer center.
Stem cell transplant (SCT) recipients.
Terminal disinfection of all rooms with UV regardless of isolation status of patients.
During the 20-month study period, 579 patients had 704 admissions to the BMT unit, and 2,160 surveillance tests were performed. No change in level or trend in the incidence of VRE (trend incidence rate ratio [IRR], 0.96; 95% confidence interval [CI], 0.81–1.14; level IRR, 1.34; 95% CI, 0.37–1.18) or C. difficile (trend IRR, 1.08; 95% CI, 0.89–1.31; level IRR, 0.51; 95% CI, 0.13–2.11) was observed after the intervention.
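The level and trend incidence rate ratios (IRRs) reported above are the standard outputs of a segmented Poisson regression on an interrupted time series. The sketch below illustrates that model on hypothetical monthly data; the counts, column names, and change point are all invented for illustration, and the study's actual model specification is not given here.

```python
# Minimal sketch of a segmented Poisson regression for an interrupted time
# series, assuming hypothetical monthly counts and a change point at month 10.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
months = np.arange(20)                                  # 20-month study period
df = pd.DataFrame({
    "time": months,                                     # secular trend
    "post": (months >= 10).astype(int),                 # 1 after UV rollout (assumed month)
    "cases": rng.poisson(3, size=20),                   # monthly acquisitions (fake)
    "patient_days": rng.integers(500, 700, size=20),    # unit census (fake)
})
df["time_since"] = np.maximum(0, df["time"] - 10)       # post-intervention trend term

fit = smf.glm(
    "cases ~ time + post + time_since",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),                  # models a rate per patient-day
).fit()

irr = np.exp(fit.params)            # exponentiated coefficients are IRRs
ci = np.exp(fit.conf_int())         # 95% CIs on the IRR scale
print(irr[["post", "time_since"]])  # level change and trend change, as IRRs
```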
Utilization of UV disinfection to supplement routine terminal cleaning of rooms was not effective in reducing hospital-acquired VRE and C. difficile among SCT recipients.
To describe the utilization of electronic medical data resources, including health records and nursing scheduling resources, to conduct a tuberculosis (TB) exposure investigation in a high-risk oncology unit.
A 42-bed inpatient unit with a mix of surgical and medical patients at a large tertiary-care cancer center in New York City.
High-risk subjects and coworkers exposed to a healthcare worker (HCW) with cavitary, smear-positive pulmonary TB.
During the 3-month exposure period, 270 patients were admitted to the unit; 137 of these (50.7%) received direct care from the index case HCW. Host immune status and intensity of exposure were used to establish criteria for postexposure testing, and 63 patients (45%) met these criteria for first-tier postexposure testing. No cases of active TB occurred. Among coworkers, 146 had significant exposure (ie, >8 hours cumulative). In the 22-month follow-up period after the exposure, no purified protein derivative or interferon gamma release assay conversions or active cases of TB occurred among exposed HCWs or patients.
Electronic medical records and employee scheduling systems are useful resources to conduct otherwise labor-intensive contact investigations. Despite the high-risk features of our index case, a highly vulnerable immunocompromised patient population, and extended proximity to coworkers, we did not find any evidence of transmission of active or latent tuberculosis infection among exposed individuals.
In this study, we examined the impact of routine use of a passive disinfection cap for catheter hub decontamination in hematology–oncology patients.
A tertiary-care cancer center in New York City.
In this multiphase prospective study, we used 2 preintervention phases (P1 and P2) to establish surveillance and baseline rates, followed by sequential introduction of disinfection caps on high-risk units (HRUs: hematologic malignancy wards, hematopoietic stem cell transplant units, and intensive care units) (P3) and then on general oncology units (P4). Unit-specific and hospital-wide hospital-acquired central-line–associated bloodstream infection (HA-CLABSI) rates and blood culture contamination (BCC) with coagulase-negative staphylococci (CONS) were measured.
Implementation of a passive disinfection cap resulted in a 34% decrease in hospital-wide HA-CLABSI rates, from a combined P1 and P2 baseline rate of 2.66 to 1.75 per 1,000 catheter-days at the end of the study period. This reduction occurred only among high-risk patients and not among general oncology patients. In addition, use of the passive disinfection cap resulted in decreases of 63% (HRUs) and 51% (general oncology units) in blood culture contamination, with an estimated reduction of 242 BCCs with CONS. The reductions in HA-CLABSI and BCC correspond to an estimated annual savings of $3.2 million in direct medical costs.
Routine use of disinfection caps is associated with decreased HA-CLABSI rates among high-risk hematology oncology patients and a reduction in blood culture contamination among all oncology patients.
Infect. Control Hosp. Epidemiol. 2015;36(12):1401–1408
A multicenter survey of 11 cancer centers was performed to determine the rate of hospital-onset Clostridium difficile infection (HO-CDI) and surveillance practices. Pooled rates of HO-CDI in patients with cancer were twice the rates reported for all US patients (15.8 vs 7.4 per 10,000 patient-days). Rates were elevated regardless of diagnostic test used.
The success of central line-associated bloodstream infection (CLABSI) prevention programs in intensive care units (ICUs) has led to the expansion of surveillance at many hospitals. We sought to compare non-ICU CLABSI (nCLABSI) rates with national reports and describe methods of surveillance at several participating US institutions.
An electronic survey of several medical centers regarding infection surveillance practices and rate data for non-ICU patients.
Ten tertiary care hospitals.
In March 2011, a survey was sent to 10 medical centers. The survey consisted of 12 questions regarding demographics and CLABSI surveillance methodology for non-ICU patients at each center. Participants were also asked to provide available rate and device utilization data.
Hospitals ranged in size from 238 to 1,400 total beds (median, 815). All hospitals reported using Centers for Disease Control and Prevention (CDC) definitions. Denominators were collected by different means: counting patients with central lines every day (n = 5), indirectly estimating on the basis of electronic orders (n = 4), or another automated method (n = 1). Rates of nCLABSI ranged from 0.2 to 4.2 infections per 1,000 catheter-days (median, 2.5). The national rate reported by the CDC using 2009 data from the National Healthcare Safety Network was 1.14 infections per 1,000 catheter-days.
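The device-day rates quoted above follow the standard CDC formula: infections divided by central-line days, scaled to 1,000. A minimal sketch of that arithmetic, with invented counts (only the formula reflects the text):

```python
# CDC-style device-day rate arithmetic behind the nCLABSI figures above.
# The counts below are hypothetical, for illustration only.
def clabsi_rate(infections: int, catheter_days: int) -> float:
    """Infections per 1,000 central-line days."""
    return infections / catheter_days * 1000

print(clabsi_rate(12, 4_800))   # 2.5, matching the median rate reported above

def utilization_ratio(catheter_days: int, patient_days: int) -> float:
    """Device utilization: central-line days per patient-day."""
    return catheter_days / patient_days
```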
Only 2 hospitals were below the pooled CLABSI rate for inpatient wards; all others exceeded this rate. Possible explanations include differences in average central line utilization or hospital size, in the impact of certain clinical risk factors notably absent from the definition, and in interpretation and reporting practices. Further investigation is necessary to determine whether the national benchmarks are low or whether the hospitals surveyed here represent a selection of outliers.
In 2007–2008, several US hospitals reported summertime increases in the number of clinical blood cultures positive for Bacillus species, which are common environmental bacteria.
To investigate increased rates of isolation of Bacillus species from blood cultures, identify risk factors, and recommend control strategies.
Survey and case-control study.
Multiple hospitals, including a cancer center.
We surveyed 24 facilities that reported increases. We also conducted a field investigation at a hospital with a high rate, reviewing charts, collecting clinical and environmental isolates, and observing infection control procedures. A case-control study compared inpatient case patients who had any blood culture positive for Bacillus with unmatched control patients who had a blood culture with no growth during June–August 2008.
Among surveyed facilities, mean monthly rates rose from 25 to a peak of 75 Bacillus-positive blood cultures per 10,000 blood cultures performed during June–August. At the hospital where the case-control investigation was conducted, the Bacillus-positive blood cultures of most case patients (75%) represented contamination or device colonization rather than infection. We enrolled 48 case patients and 48 control patients; in multivariate analysis, only central venous access device use was significantly associated with case status (odds ratio, 14.0; P < .01). Laboratory testing identified at least 12 different Bacillus species (non-anthracis) among the isolates. Observation of infection control procedures revealed variability in central line care and blood sample collection techniques.
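For context on the reported odds ratio: with a 2x2 exposure table, the unadjusted OR is (a*d)/(b*c). The sketch below uses hypothetical cell counts chosen to reproduce an OR of 14.0; the study's estimate came from a multivariate model, so the actual table is not known from the abstract.

```python
# Hypothetical 2x2 table for central venous access device (CVAD) exposure.
# Cell counts are invented for illustration; only the OR formula is standard.
from scipy.stats import fisher_exact

#            CVAD  no CVAD
table = [[42, 6],      # 48 case patients    (hypothetical split)
         [16, 32]]     # 48 control patients (hypothetical split)

a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)       # unadjusted OR = (a*d)/(b*c) = 14.0 here
_, p_value = fisher_exact(table)     # exact test on the same table
print(f"OR = {odds_ratio:.1f}, Fisher exact P = {p_value:.4f}")
```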
Periodic increases in the environmental load of Bacillus species may occur in hospitals. Our investigation indicated that at one facility, these increases likely represented a pseudo-outbreak of Bacillus species colonizing central venous lines or their accessories, such as needleless connector devices. Vigilant attention should be paid to infection control practices when collecting blood samples for culture, to minimize the risk of contamination by environmental microorganisms.
Clostridium difficile-associated diarrhea (CDAD) is an important infection in hospital settings. Its impact on outpatient care has not been well defined.
To examine risk factors for CDAD among ambulatory cancer patients.
Memorial Sloan-Kettering Cancer Center, a tertiary-care hospital.
Cases of CDAD among oncology outpatients from January 1999 through December 2000 were identified via positive C. difficile toxin assay results on stool specimens sent from clinics or the emergency department. A 1:3 matched case-control study examined exposures associated with CDAD.
Forty-eight episodes of CDAD were identified in cancer outpatients. The mean age was 51 years; 44% were female. Forty-one (85%) had received antibiotics within 60 days of diagnosis, completing courses a median of 16.5 days prior to diagnosis. Case-patients received longer courses of first-generation cephalosporins (4.8 vs 3.2 days; P = .03) and fluoroquinolones (23.6 vs 8 days; P < .01) than did control-patients. Those receiving clindamycin were 3.9-fold more likely to develop CDAD (P < .01). For each additional day of clindamycin or third-generation cephalosporin exposure, patients were 1.29- and 1.26-fold more likely to develop CDAD (P < .01 and P = .04, respectively). The 38 CDAD patients hospitalized during the risk period (79.2%) spent more time as inpatients than did control-patients (19.3 vs 9.7 days; P < .001).
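The per-day odds ratios above imply a multiplicative, linear-in-days logistic model: an OR of 1.29 per day compounds over a course of therapy. A small sketch of that arithmetic, with coefficients back-derived from the quoted ORs rather than taken from the study's actual model:

```python
# How a per-day odds ratio compounds over a course of therapy, assuming the
# usual linear-in-days logistic model. Coefficients are back-derived from the
# abstract's quoted ORs for illustration.
import math

beta_clindamycin = math.log(1.29)   # per-day log-odds increment
beta_ceph3 = math.log(1.26)

def course_odds_ratio(beta_per_day: float, days: float) -> float:
    """OR for a course of `days` of exposure vs no exposure."""
    return math.exp(beta_per_day * days)

print(round(course_odds_ratio(beta_clindamycin, 7), 1))  # ~5.9 for a 7-day course
```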
Antibiotic use, especially with cephalosporins and clindamycin, and prolonged hospitalization contributed to the development of CDAD. Outpatient CDAD appears to be most strongly related to inpatient exposures; reasons for the delayed development of symptoms are unknown.
To assess the effect of implementing safety-engineered devices on percutaneous injury epidemiology, specifically on percutaneous injuries associated with a higher risk of blood-borne pathogen exposure.
Before-and-after intervention trial comparing 3-year preintervention (1998–2000) and 1-year postintervention (2001–2002) periods. Percutaneous injury data have been entered prospectively into CDC NaSH software since 1998.
A 427-bed, tertiary-care hospital in Manhattan.
All employees who reported percutaneous injuries during the study period.
A “safer-needle system,” composed of a variety of safety-engineered devices to allow for needle-safe IV delivery, blood collection, IV insertion, and intramuscular and subcutaneous injection, was implemented in February 2001.
The mean annual incidence of percutaneous injuries decreased from 34.08 per 1,000 full-time–equivalent employees preintervention to 14.25 postintervention (P < .001). Reductions in the average monthly number of percutaneous injuries resulting from both low-risk (P < .01) and high-risk (P was not significant) activities were observed. Nurses experienced the greatest decrease (74.5%, P < .001), followed by ancillary staff (61.5%, P = .03). Significant rate reductions were observed for the following activities: manipulating patients or sharps (83.5%, P < .001), collisions or contact with sharps (73.0%, P = .01), disposal-related injuries (21.41%, P = .001), and catheter insertions (88.2%, P < .001). Injury rates involving hollow-bore needles also decreased (70.6%, P < .001).
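The group-specific decreases quoted above are simple relative reductions, (pre - post) / pre. Applying the same arithmetic to the overall incidence figures as a quick check (the derived overall percentage is not stated in the abstract):

```python
# Relative-reduction arithmetic behind the percentage decreases above.
def percent_reduction(pre: float, post: float) -> float:
    return (pre - post) / pre * 100

print(round(percent_reduction(34.08, 14.25), 1))  # 58.2% overall decrease
```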
The implementation of safety-engineered devices reduced percutaneous injury rates across occupations, activities, times of injury, and devices. Moreover, intervention impact was observed when stratified by risk for blood-borne pathogen transmission.
To examine whether implementation of safety-engineered devices in 2001 had an effect on rates of percutaneous injury (PI) reported by healthcare workers (HCWs).
Before-and-after intervention trial comparing 3-year preintervention (1998–2001) and 2-year postintervention (2001–2002) periods. PI data from anonymous, self-administered surveys were prospectively entered into CDC NaSH software.
A 427-bed, tertiary-care hospital in Manhattan.
HCWs who attended state-mandated training sessions and completed the survey (1,132 preintervention; 821 postintervention).
Implementation of a “safer-needle system” composed of various safety-engineered devices for needle-safe IV delivery and insertion, blood collection, and intramuscular and subcutaneous injection.
Preintervention, the overall annual rate of PIs self-reported on the survey was 36.5 per 100 respondents, compared with 13.9 per 100 respondents postintervention (P < .01). The annual rate of formally reported PIs decreased from 8.3 to 3.1 per 100 respondents (P < .01). Report rates varied by occupational group (P ≤ .02). The overall reporting rate did not change between study periods (22.7% vs 22.3%), although reporting improved among nurses (23.6% to 44.4%; P = .03) and worsened among building services staff (90.5% to 50%; P = .03). HCWs with greater numbers of PIs self-reported on the survey were less likely to formally report injuries (P < .01). The two most common reasons for nonreporting (ie, thought the injury was low risk or believed the patient was at low risk for blood-borne disease) did not vary from preintervention to postintervention.
Safety-engineered device implementation decreased rates of PIs both formally reported and self-reported on the survey. However, this intervention, with concomitant intensive education, had varying effects on reporting behavior by occupation and a minimal effect on overall reporting rates.
Recent guidelines for the prevention of opportunistic infections have addressed a variety of issues germane to recipients of hematopoietic stem cell transplant. However, there are several issues regarding postexposure prophylaxis against varicella-zoster virus that remain unresolved. We address these questions and offer several consensus recommendations.
We estimated the impact of vancomycin-resistant Enterococcus (VRE) infection on the outcomes of patients with leukemia in a case-control study. Compared with their matched controls (n = 45), cases (n = 23) had 22% greater total charges and shorter survival (P = .04). These findings substantiate the need for aggressive interventions to prevent VRE transmission.
Despite the 1989 Advisory Committee on Immunization Practices recommendation of a second dose of vaccine, measles seropositivity rates had declined for adult healthcare workers in their 20s hired at a cancer hospital between 1998 and 1999 compared with those of the same age hired between 1983 and 1988. Continued monitoring will be important as individuals born after 1989 enter the workforce.
In January 1998, an outbreak of influenza A occurred on our adult bone marrow transplant unit. Aggressive infection control measures were instituted to halt further nosocomial spread. A new, more rigorous approach was implemented for the 1998/99 influenza season and was extremely effective in preventing nosocomial influenza at our institution.
To determine the seroconversion rate after varicella immunization of healthcare workers (HCWs) and the effect of seroconversion rate on current cost-based recommendations for universal vaccination.
A voluntary vaccination program for HCWs was performed at a tertiary-care cancer center in New York City. A commercial latex agglutination assay was used to test postvaccination antibody response. Costs for vaccination and postvaccination serological testing were compared to potential costs of postexposure employee furloughs.
Of 263 seronegative HCWs, 96 (36.5%) began the vaccine program. Thirty-nine HCWs received only one dose of vaccine. Seven returned for follow-up antibody testing, of whom 4 were seropositive. Of the 57 HCWs who received two doses, 38 returned for follow-up serology. Thirty-one (81.6%) HCWs were seropositive for varicella-zoster virus antibodies, and seven HCWs (18.4%) remained seronegative. Total cost of vaccination for all 263 seronegative HCWs was estimated and compared to the cost of varicella-related furloughs at our institution.
We found a considerably lower rate of vaccine-induced seroconversion at our hospital compared to that of the published literature. Despite this finding, universal varicella vaccination remained an extremely cost-effective alternative to the furloughing of exposed, seronegative HCWs. Projected hospital savings exceeded $53,000 in the first year after vaccination alone.