To systematically assess enhanced personal protective equipment (PPE) doffing safety risks.
We employed a 3-part approach to this study: (1) hierarchical task analysis (HTA) of the PPE doffing process; (2) human factors-informed failure modes and effects analysis (FMEA); and (3) focus group sessions with a convenience sample of infection prevention (IP) subject matter experts.
A large academic US hospital with a regional Special Pathogens Treatment Center and enhanced PPE doffing protocol experience.
Eight IP experts.
The HTA was conducted jointly by 2 human-factors experts based on the Centers for Disease Control and Prevention PPE guidelines. The findings were used as a guide in 7 focus group sessions with IP experts to assess PPE doffing safety risks. For each HTA task step, IP experts identified failure mode(s), assigned priority risk scores, identified contributing factors and potential consequences, and identified potential risk mitigation strategies. Data were recorded in a tabular format during the sessions.
Of 103 identified failure modes, the highest priority scores were associated with team members moving between clean and contaminated areas, glove removal, apron removal, and self-inspection while preparing to doff. Contributing factors related to the individual (eg, technical/teamwork competency), task (eg, undetected PPE contamination), tools/technology (eg, PPE design characteristics), environment (eg, inadequate space), and organizational aspects (eg, training) were identified. Participants identified 86 types of risk mitigation strategies targeting the failure modes.
Despite detailed guidelines, our study revealed 103 enhanced PPE doffing failure modes. Analysis of the failure modes suggests potential mitigation strategies to decrease self-contamination risk during enhanced PPE doffing.
We describe the use of implementation science at the unit level and organizational level to guide an intervention to reduce central-line–associated bloodstream infections (CLABSIs) in a high-volume, regional, burn intensive care unit (BICU).
A single-center, observational, quasi-experimental study.
A regional BICU in Maryland serving 300–400 burn patients annually.
In 2011, an organizational-level and unit-level intervention was implemented to reduce the rates of CLABSI in a high-risk patient population in the BICU. At the organization level, leaders declared a goal of zero infections, created an infrastructure to support improvement efforts by creating a coordinating team, and engaged bedside staff. Performance data were transparently shared. At the unit level, the Comprehensive Unit-based Safety Program (CUSP)/Translating Research Into Practice (TRIP) model was used. A series of interventions were implemented: development of new blood culture procurement criteria, implementation of chlorhexidine bathing and chlorhexidine dressings, use of alcohol-impregnated caps, routine performance of root-cause analysis with executive engagement, and routine central venous catheter changes.
The use of an implementation science framework to guide multiple interventions resulted in the reduction of CLABSI rates from 15.5 per 1,000 central-line days to zero with a sustained rate of zero CLABSIs over 3 years (rate difference, 15.5; 95% confidence interval, 8.54–22.48).
CLABSIs in high-risk units may be preventable with the use of a structured organizational and unit-level paradigm.
Several studies demonstrating that central line–associated bloodstream infections (CLABSIs) are preventable prompted a national initiative to reduce the incidence of these infections.
We conducted a collaborative cohort study to evaluate the impact of the national “On the CUSP: Stop BSI” program on CLABSI rates among participating adult intensive care units (ICUs). The program goal was to achieve a unit-level mean CLABSI rate of less than 1 case per 1,000 catheter-days using standardized definitions from the National Healthcare Safety Network. Multilevel Poisson regression modeling compared infection rates before, during, and up to 18 months after the intervention was implemented.
A total of 1,071 ICUs from 44 states, the District of Columbia, and Puerto Rico, reporting 27,153 ICU-months and 4,454,324 catheter-days of data, were included in the analysis. The overall mean CLABSI rate significantly decreased from 1.96 cases per 1,000 catheter-days at baseline to 1.15 at 16–18 months after implementation. CLABSI rates decreased during all observation periods compared with baseline, with adjusted incidence rate ratios steadily decreasing to 0.57 (95% confidence interval, 0.50–0.65) at 16–18 months after implementation.
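A quick arithmetic sketch, using only the figures reported in this abstract, connects the crude rates, the adjusted incidence rate ratio, and the overall percentage decrease:

```python
# Relation between the reported CLABSI rates, the incidence rate ratio (IRR),
# and the overall percentage decrease. All figures are from the abstract above;
# the IRR of 0.57 is the model-adjusted value, so the crude ratio differs slightly.
baseline_rate = 1.96   # cases per 1,000 catheter-days at baseline
final_rate = 1.15      # cases per 1,000 catheter-days at 16-18 months

crude_irr = final_rate / baseline_rate
print(round(crude_irr, 2))       # 0.59 (unadjusted; the adjusted IRR was 0.57)

adjusted_irr = 0.57
print(round((1 - adjusted_irr) * 100))  # 43 -> the reported 43% overall decrease
```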
Coincident with the implementation of the national “On the CUSP: Stop BSI” program was a significant and sustained decrease in CLABSIs among a large and diverse cohort of ICUs, demonstrating an overall 43% decrease and suggesting the majority of ICUs in the United States can achieve additional reductions in CLABSI rates.
To evaluate the impact of a multifaceted intervention on compliance with evidence-based therapies and ventilator-associated pneumonia (VAP) rates.
Collaborative cohort before-after study.
Intensive care units (ICUs) predominantly in Michigan.
We implemented a multifaceted intervention to improve compliance with 5 evidence-based recommendations for mechanically ventilated patients and to prevent VAP. A standardized CDC definition of VAP was used and maintained at each site, and data on the number of VAPs and ventilator-days were obtained from the hospital's infection preventionists. Baseline data were reported, and postimplementation data were reported for 30 months. VAP rates were expressed as cases per 1,000 ventilator-days; compliance with evidence-based therapies was calculated as the proportion of ventilator-days per quarter on which patients received all 5 therapies in the ventilator care bundle. Two interventions to improve safety culture and communication were implemented first.
One hundred twelve ICUs reporting 3,228 ICU-months and 550,800 ventilator-days were included. The overall median VAP rate decreased from 5.5 cases (mean, 6.9 cases) per 1,000 ventilator-days at baseline to 0 cases (mean, 3.4 cases) at 16–18 months after implementation (P < .001) and 0 cases (mean, 2.4 cases) at 28–30 months after implementation (P < .001). Compared with baseline, VAP rates decreased during all observation periods, with incidence rate ratios of 0.51 (95% confidence interval, 0.41–0.64) at 16–18 months after implementation and 0.29 (95% confidence interval, 0.24–0.34) at 28–30 months after implementation. Compliance with evidence-based therapies increased from 32% at baseline to 75% at 16–18 months after implementation (P < .001) and 84% at 28–30 months after implementation (P < .001).
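The incidence rate ratios above translate directly into the percentage decreases summarized in the conclusion; a quick check using the reported values:

```python
# Percent decrease implied by the incidence rate ratios (IRRs) reported above.
irr_16_18 = 0.51   # IRR at 16-18 months after implementation
irr_28_30 = 0.29   # IRR at 28-30 months after implementation

print(round((1 - irr_16_18) * 100))  # 49 -> 49% decrease at 16-18 months
print(round((1 - irr_28_30) * 100))  # 71 -> the "up to 71%" sustained decrease
```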
A multifaceted intervention was associated with an increased use of evidence-based therapies and a substantial (up to 71%) and sustained (up to 2.5 years) decrease in VAP rates.
Central line–associated bloodstream infection (CLABSI) rates are gaining importance as they become publicly reported metrics and potential pay-for-performance indicators. However, the current conventional method by which they are calculated may be misleading and unfairly penalize high-acuity care settings, where patients often have multiple concurrent central venous catheters (CVCs).
We compared the conventional method of calculating CLABSI rates, in which the number of catheter-days is used (1 patient with n catheters for 1 day has 1 catheter-day), with a new method that accounts for multiple concurrent catheters (1 patient with n catheters for 1 day has n catheter-days), to determine whether the difference appreciably changes the estimated CLABSI rate.
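The two counting methods can be sketched as follows. This is an illustrative example with hypothetical census data (patient IDs, day numbers, and the infection count are invented for demonstration); it shows only how the choice of denominator changes the computed rate:

```python
# Contrast the two catheter-day counting methods described above.
# Each record: (patient_id, day, number_of_concurrent_CVCs) -- hypothetical data.
census = [
    ("A", 1, 3), ("A", 2, 3),   # patient A: 3 concurrent catheters for 2 days
    ("B", 1, 1), ("B", 2, 1),   # patient B: 1 catheter for 2 days
]

# Conventional method: a patient-day with any catheter counts as 1 catheter-day.
conventional = sum(1 for _, _, n in census if n > 0)

# Concurrent-catheter method: a patient-day with n catheters counts as n catheter-days.
per_catheter = sum(n for _, _, n in census)

def clabsi_rate(infections, catheter_days):
    """CLABSI rate expressed per 1,000 catheter-days."""
    return 1000 * infections / catheter_days

infections = 1  # hypothetical
print(conventional, per_catheter)                        # 4 8
print(clabsi_rate(infections, conventional))             # 250.0
print(clabsi_rate(infections, per_catheter))             # 125.0
```

The larger denominator under the second method mechanically lowers the computed rate for units with a high CVC burden, which is the effect the study quantifies.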
Academic, tertiary care hospital.
Adult patients who were consecutively admitted from June 10 through July 9, 2009, to a cardiac-surgical intensive care unit and a surgical intensive and surgical intermediate care unit.
Using the conventional method, we counted 485 catheter-days throughout the study period, with a daily mean of 18.6 catheter-days (95% confidence interval, 17.2–20.0 catheter-days) in the 2 intensive care units. In contrast, the new method identified 745 catheter-days, with a daily mean of 27.5 catheter-days (95% confidence interval, 25.6–30.3) in the 2 intensive care units. The difference was statistically significant (P < .001). The new method that accounted for multiple concurrent CVCs resulted in a 53.6% increase in the number of catheter-days; this increased denominator decreases the calculated CLABSI rate by 36%.
The undercounting of catheter-days for patients with multiple concurrent CVCs that occurs when the conventional method of calculating CLABSI rates is used inflates the CLABSI rate for care settings that have a high CVC burden and may not adjust for underlying medical illness. Additional research is needed to validate and generalize our findings.