Objective. To compare rates of blood culture contamination across 3 strategies to prevent intensive care unit (ICU) infections: screening and isolation, targeted decolonization, and universal decolonization.
Patients. Patients admitted to adult ICUs from July 1, 2009, to September 30, 2011.
Methods. After a 6-month baseline period, hospitals were randomly assigned to 1 of 3 strategies, with all participating adult ICUs in a given hospital assigned to the same strategy. Arm 1 implemented methicillin-resistant Staphylococcus aureus (MRSA) nares screening and isolation; arm 2 implemented targeted decolonization (screening, isolation, and decolonization of MRSA carriers); and arm 3 conducted no screening but implemented universal decolonization of all patients with mupirocin and chlorhexidine gluconate (CHG) bathing. Blood culture contamination rates in the intervention period were compared with those in the baseline period across all 3 arms.
Results. During the 6-month baseline period, 7,926 blood cultures were collected from 3,399 unique patients: 1,099 sets in arm 1, 928 in arm 2, and 1,372 in arm 3. During the 18-month intervention period, 22,761 blood cultures were collected from 9,878 unique patients: 3,055 sets in arm 1, 3,213 in arm 2, and 3,610 in arm 3. Among all individual draws, the contamination rates for arms 1, 2, and 3 were 4.1%, 3.9%, and 3.8%, respectively, in the baseline period and 3.3%, 3.2%, and 2.4% in the intervention period. When we evaluated sets of blood cultures rather than individual draws, the contamination rate in arm 1 (screening and isolation) was 9.8% (N = 108 sets) in the baseline period and 7.5% (N = 228) in the intervention period. For arm 2 (targeted decolonization), the baseline rate was 8.4% (N = 78) compared with 7.5% (N = 241) in the intervention period. Arm 3 (universal decolonization) showed the greatest decrease in contamination, from 8.7% (N = 119) of blood culture sets during the baseline period to 5.1% (N = 184) during the intervention period. Logistic regression models demonstrated a significant difference across the arms when comparing the reduction in contamination between the baseline and intervention periods in both unadjusted (P = .02) and adjusted (P = .02) analyses. Arm 3 had the greatest reduction in blood culture contamination rates, with an unadjusted odds ratio (OR) of 0.56 (95% confidence interval [CI], 0.44-0.71) and an adjusted OR of 0.55 (95% CI, 0.43-0.71).
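The set-level rates and the unadjusted odds ratios can be checked directly from the counts reported above. The following Python sketch is an illustration only: the function name and the crude 2x2 Wald calculation are ours, whereas the study's published estimates come from logistic regression models (with the adjusted analysis accounting for covariates), so the values agree only approximately.

```python
from math import exp, log, sqrt

# Contaminated blood culture sets / total sets per arm, taken from the
# counts reported in the Results (baseline vs. intervention period).
counts = {
    "arm 1 (screening and isolation)":  {"baseline": (108, 1099), "intervention": (228, 3055)},
    "arm 2 (targeted decolonization)":  {"baseline": (78, 928),   "intervention": (241, 3213)},
    "arm 3 (universal decolonization)": {"baseline": (119, 1372), "intervention": (184, 3610)},
}

def odds_ratio_with_ci(a, n1, b, n2, z=1.96):
    """Crude OR of contamination (intervention vs. baseline) with a Wald 95% CI.

    a / n1: contaminated / total sets in the intervention period
    b / n2: contaminated / total sets in the baseline period
    """
    or_ = (a / (n1 - a)) / (b / (n2 - b))
    se = sqrt(1 / a + 1 / (n1 - a) + 1 / b + 1 / (n2 - b))
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, lo, hi

for arm, periods in counts.items():
    b, n_b = periods["baseline"]
    a, n_a = periods["intervention"]
    or_, lo, hi = odds_ratio_with_ci(a, n_a, b, n_b)
    print(f"{arm}: baseline {b / n_b:.1%}, intervention {a / n_a:.1%}, "
          f"crude OR {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

For arm 3, this raw 2x2 calculation yields an odds ratio close to the reported unadjusted OR of 0.56 with a similar confidence interval, but it is not the study's model-based estimate.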