
Validation of Surgical Wound Surveillance

Published online by Cambridge University Press: 21 June 2016

Pamela S. Falk
Affiliation:
Hospital Epidemiology Unit of The Regional Medical Center at Memphis
C. Glen Mayhall*
Affiliation:
Hospital Epidemiology Unit of The Regional Medical Center at Memphis
*Division of Infectious Diseases, University of Tennessee Memphis, 956 Court Avenue, Room H-308, Memphis, TN 38163

Abstract

Objective:

To determine the sensitivity and specificity of standard infection control surveillance techniques for the identification of surgical wound infections.

Design:

Surveillance data collected by three infection control practitioners (ICPs) were compared with surveillance data collected simultaneously by a gold-standard observer.

Setting:

University-affiliated, tertiary care hospital.

Methods:

Using standard infection control surveillance techniques (chart review and discussions with patients' nurses and physicians), ICPs collected surveillance data on patients on the General Surgery and Trauma Surgery Services on postoperative days 4 and 7, and then weekly for 30 days or until hospital discharge. Simultaneously, a hospital epidemiologist collected surveillance data and examined each patient's wound daily.
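
As a concrete illustration of this schedule only (not part of the study; the surveillance_days helper below is hypothetical), a short Python sketch lists the postoperative days on which ICP review would fall, assuming review on days 4 and 7 and then weekly through day 30 or discharge, whichever comes first:

    def surveillance_days(discharge_day, max_day=30):
        """Postoperative days on which an ICP reviews a patient's record."""
        days = [4, 7] + list(range(14, max_day + 1, 7))  # weekly after day 7
        cutoff = min(discharge_day, max_day)
        return [d for d in days if d <= cutoff]

    print(surveillance_days(discharge_day=21))  # [4, 7, 14, 21]
    print(surveillance_days(discharge_day=45))  # [4, 7, 14, 21, 28]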

Results:

Nine hundred twenty-five surgical patients, including 537 trauma cases and 388 elective general surgery cases, were followed postoperatively. The ICPs identified 67 surgical wound infections, and the hospital epidemiologist identified 80, for a sensitivity of 83.8% with a 95% confidence interval (CI95) of 75.7% to 91.9%. Specificity was 99.8%, with a CI95 of 99% to 100%. Sensitivity was the same for trauma surgery and general surgery, but incisional wound infections were more difficult to identify than deep wound infections. During a second validation period, sensitivity was 92.3%, with a CI95 of 62% to 100%.
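
As a quick check on the arithmetic (a minimal sketch, assuming a standard normal-approximation confidence interval; the paper's exact CI method is not restated here), the sensitivity figure corresponds to the ICPs detecting 67 of the 80 gold-standard infections:

    from math import sqrt

    def proportion_ci(successes, n, z=1.96):
        """Point estimate and approximate 95% CI for a binomial proportion."""
        p = successes / n
        half_width = z * sqrt(p * (1 - p) / n)
        return p, max(0.0, p - half_width), min(1.0, p + half_width)

    # Sensitivity: 67 of the 80 gold-standard infections were found by the ICPs.
    sens, lo, hi = proportion_ci(67, 80)
    print(f"sensitivity = {sens:.1%} (95% CI {lo:.1%} to {hi:.1%})")
    # Roughly reproduces the reported 83.8% (CI95, 75.7% to 91.9%).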

Conclusions:

Standard infection control surveillance techniques have the same sensitivity for detection of surgical wound infections as they do for identification of other nosocomial infections. Accurate data on surgical wound infections can be collected without direct examination of surgical wounds.

Type: Original Articles
Copyright © The Society for Healthcare Epidemiology of America 1993


Footnotes

Escola Paulista de Medicina in São Paulo, Brazil
