
Real-Time, Automated Detection of Ventilator-Associated Events: Avoiding Missed Detections, Misclassifications, and False Detections Due to Human Error

Published online by Cambridge University Press: 17 May 2018

Erica S. Shenoy*
Affiliation:
Infection Control Unit, Massachusetts General Hospital, Boston, Massachusetts; Division of Infectious Diseases, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts; Health Sciences and Technology Program, Harvard Medical School, Boston, Massachusetts
Eric S. Rosenthal
Affiliation:
Harvard Medical School, Boston, Massachusetts; Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts; Health Sciences and Technology Program, Harvard Medical School, Boston, Massachusetts
Yu-Ping Shao
Affiliation:
Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts
Siddharth Biswal
Affiliation:
Department of Computer Science, Georgia Institute of Technology College of Computing, Atlanta, Georgia
Manohar Ghanta
Affiliation:
Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts
Erin E. Ryan
Affiliation:
Infection Control Unit, Massachusetts General Hospital, Boston, Massachusetts
Dolores Suslak
Affiliation:
Infection Control Unit, Massachusetts General Hospital, Boston, Massachusetts
Nancy Swanson
Affiliation:
Infection Control Unit, Massachusetts General Hospital, Boston, Massachusetts
Valdery Moura Junior
Affiliation:
Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts
David C. Hooper
Affiliation:
Infection Control Unit, Massachusetts General Hospital, Boston, Massachusetts; Division of Infectious Diseases, Massachusetts General Hospital, Boston, Massachusetts; Harvard Medical School, Boston, Massachusetts
M. Brandon Westover
Affiliation:
Harvard Medical School, Boston, Massachusetts; Department of Neurology, Massachusetts General Hospital, Boston, Massachusetts; Clinical Data Animation Center, Massachusetts General Hospital, Boston, Massachusetts; Health Sciences and Technology Program, Harvard Medical School, Boston, Massachusetts
* Address correspondence to Erica S. Shenoy, MD, PhD, Massachusetts General Hospital, 55 Fruit Street, Bulfinch 334, Boston, MA 02114 (eshenoy@mgh.harvard.edu).

Abstract

OBJECTIVE

To validate a system to detect ventilator-associated events (VAEs) autonomously and in real time.

DESIGN

Retrospective review of ventilated patients using a secure informatics platform to identify VAEs (ie, automated surveillance) compared to surveillance by infection control (IC) staff (ie, manual surveillance), including development and validation cohorts.

SETTING

The Massachusetts General Hospital, a tertiary-care academic health center, during January–March 2015 (development cohort) and January–March 2016 (validation cohort).

PATIENTS

Ventilated patients in 4 intensive care units.

METHODS

The automated process included (1) analysis of physiologic data to detect increases in positive end-expiratory pressure (PEEP) and fraction of inspired oxygen (FiO2); (2) querying the electronic health record (EHR) for leukopenia or leukocytosis and antibiotic initiation data; and (3) retrieval and interpretation of microbiology reports. The cohorts were evaluated as follows: (1) manual surveillance by IC staff with independent chart review; (2) automated surveillance detection of ventilator-associated condition (VAC), infection-related ventilator-associated complication (IVAC), and possible ventilator-associated pneumonia (PVAP); and (3) adjudication of discordance between manual and automated surveillance by senior IC staff. Outcomes included sensitivity, specificity, positive predictive value (PPV), and manual surveillance detection errors. Errors detected in the development cohort prompted algorithm updates that were applied to the validation cohort.
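
To make step (1) and the VAC classification it feeds concrete, the following is a minimal Python sketch of the published NHSN ventilator-associated condition rule applied to daily minimum ventilator settings. Names and the simplified two-day baseline handling are illustrative assumptions; this is not the authors' implementation.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class VentDay:
    day: int          # calendar day of mechanical ventilation (1-indexed)
    min_peep: float   # daily minimum PEEP, cm H2O
    min_fio2: float   # daily minimum FiO2, as a fraction (0.21-1.00)


def detect_vac(days: List[VentDay]) -> Optional[int]:
    """Return the ventilator day of VAC onset, or None if no VAC is detected."""
    days = sorted(days, key=lambda d: d.day)
    # A candidate onset day needs a 2-day baseline before it and at least one
    # more day after it to show the worsening is sustained for >=2 calendar days.
    for i in range(2, len(days) - 1):
        b1, b2 = days[i - 2], days[i - 1]   # 2-day baseline period
        onset, nxt = days[i], days[i + 1]   # onset day and the following day

        # Baseline stability or improvement in daily minimum PEEP / FiO2.
        peep_stable = b2.min_peep <= b1.min_peep
        fio2_stable = b2.min_fio2 <= b1.min_fio2

        # Worsening oxygenation sustained over the onset day and the next day:
        # PEEP rise >= 3 cm H2O or FiO2 rise >= 0.20 over the baseline minimum.
        peep_rise = min(onset.min_peep, nxt.min_peep) - b1.min_peep >= 3.0
        fio2_rise = min(onset.min_fio2, nxt.min_fio2) - b1.min_fio2 >= 0.20

        if (peep_stable and peep_rise) or (fio2_stable and fio2_rise):
            return onset.day
    return None


# Example: stable PEEP of 5 cm H2O for two days, then a sustained rise to 8.
course = [
    VentDay(1, min_peep=5, min_fio2=0.40),
    VentDay(2, min_peep=5, min_fio2=0.40),
    VentDay(3, min_peep=8, min_fio2=0.40),
    VentDay(4, min_peep=8, min_fio2=0.40),
]
print(detect_vac(course))  # prints 3

In the full algorithm, a VAC detected this way would then be checked against the EHR criteria from step (2) (white blood cell count, temperature, new antimicrobials) to classify IVAC, and against the microbiology reports from step (3) to classify PVAP.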

RESULTS

In the development cohort, there were 1,325 admissions, 479 ventilated patients, 2,539 ventilator days, and 47 VAEs. In the validation cohort, there were 1,234 admissions, 431 ventilated patients, 2,604 ventilator days, and 56 VAEs. With manual surveillance, in the development cohort, sensitivity was 40%, specificity was 98%, and PPV was 70%. In the validation cohort, sensitivity was 71%, specificity was 98%, and PPV was 87%. With automated surveillance, in the development cohort, sensitivity was 100%, specificity was 100%, and PPV was 100%. In the validation cohort, sensitivity was 85%, specificity was 99%, and PPV was 100%. Manual surveillance detection errors included missed detections, misclassifications, and false detections.
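
The performance measures above follow the standard confusion-matrix definitions. A minimal sketch of the arithmetic, using placeholder counts rather than the study's data, is:

def surveillance_metrics(tp: int, fp: int, tn: int, fn: int):
    sensitivity = tp / (tp + fn)   # true VAEs that were detected
    specificity = tn / (tn + fp)   # non-VAE episodes correctly left undetected
    ppv = tp / (tp + fp)           # detections that were true VAEs
    return sensitivity, specificity, ppv

# Placeholder counts for illustration only (not the study's counts).
print(surveillance_metrics(tp=40, fp=7, tn=420, fn=16))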

CONCLUSIONS

Manual surveillance is vulnerable to human error. Automated surveillance of VAEs is more accurate and more efficient.

Infect Control Hosp Epidemiol 2018;826–833

Type
Original Article
Copyright
© 2018 by The Society for Healthcare Epidemiology of America. All rights reserved. 


Footnotes

PREVIOUS PRESENTATION. This work was presented at ID Week 2017 (abstract no. 2151) on October 7, 2017, in San Diego, California.

a. Senior authors with equal contribution.
