Methicillin-resistant Staphylococcus aureus (MRSA) is a significant nosocomial pathogen in the ICU. MRSA contamination of healthcare personnel (HCP) gloves and gowns after providing care to patients with MRSA occurs at a rate of 14%–16% in the ICU setting. Little is known about whether the MRSA isolates identified on HCP gown and gloves following patient care activities are the same as MRSA isolates identified as colonizing or infecting the patient.
From a multisite cohort of 388 independent patient MRSA isolates and their corresponding HCP gown and glove isolates, we selected 91 isolate pairs using probability-proportional-to-size (PPS) sampling. To determine whether the patient and HCP gown or glove isolates were genetically similar, we used 5 comparative genomic typing methods: phylogenetic analysis, spa typing, multilocus sequence typing (MLST), large-scale BLAST score ratio (LSBSR), and single-nucleotide variant (SNV) analysis.
We found that 56 isolate pairs (61.5%) were genetically similar by at least 4 of the 5 methods. By comparison, the spa typing and LSBSR analyses revealed that >75% of the examined isolate pairs were concordant under the thresholds established for each analysis.
Many of the patient MRSA isolates were genetically similar to those on the HCP gown or gloves following a patient care activity. This finding indicates that the patient is often the primary source of the MRSA isolates transmitted to the HCP, which can potentially be spread to other patients or hospital settings through HCP vectors. These results have important implications because they provide additional evidence for hospitals considering ending the use of contact precautions (gloves and gowns) for MRSA patients.
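For readers implementing a similar comparison, the ≥4-of-5 concordance tally described above reduces to a simple count over per-pair method calls. The sketch below is illustrative only; the data layout, names, and example values are assumptions, not the study's actual analysis pipeline.

```python
# Minimal sketch (hypothetical data layout): tally isolate pairs judged
# concordant by at least 4 of the 5 typing methods named in the abstract.
from typing import Dict, List

METHODS = ["phylogeny", "spa", "MLST", "LSBSR", "SNV"]

def concordant_pairs(results: List[Dict[str, bool]], threshold: int = 4) -> int:
    """Count pairs concordant by >= `threshold` of the methods.

    `results` holds one dict per patient/HCP isolate pair, mapping each
    method name to True if that method called the pair genetically similar.
    """
    return sum(sum(pair[m] for m in METHODS) >= threshold for pair in results)

# Example with two pairs: the first agrees by 4 methods, the second by 2.
pairs = [
    {"phylogeny": True, "spa": True, "MLST": True, "LSBSR": True, "SNV": False},
    {"phylogeny": False, "spa": True, "MLST": False, "LSBSR": True, "SNV": False},
]
print(concordant_pairs(pairs))  # -> 1
```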
To determine whether electronically available comorbidities and laboratory values on admission are risk factors for hospital-onset Clostridioides difficile infection (HO-CDI) across multiple institutions and whether they could be used to improve risk adjustment.
All patients at least 18 years of age admitted to 3 hospitals in Maryland between January 1, 2016, and January 1, 2018.
Comorbid conditions were assigned using the Elixhauser comorbidity index. Multivariable log-binomial regression was conducted for each hospital using covariates that were significant (P < .10) in bivariate analysis. Standardized infection ratios (SIRs) were computed using the current Centers for Disease Control and Prevention (CDC) risk-adjustment methodology and again with the addition of the Elixhauser score and individual comorbidities.
At hospital 1, 314 of 48,057 patient admissions (0.65%) had an HO-CDI; at community hospital 2, 41 of 8,791 patient admissions (0.47%) had an HO-CDI; and at community hospital 3, 75 of 29,211 patient admissions (0.26%) had an HO-CDI. In multivariable regression, Elixhauser score was a significant risk factor for HO-CDI at all hospitals when controlling for age, antibiotic use, and antacid use. An abnormal leukocyte level at hospital admission was a significant risk factor at hospitals 1 and 2. When the Elixhauser score was included in the risk-adjustment model, it was statistically significant (P < .01). Compared with the current CDC SIR methodology, the SIR of hospital 1 decreased by 2%, whereas the SIRs of hospitals 2 and 3 increased by 2% and 6%, respectively, but the hospital rankings did not change.
Electronically available patient comorbidities are important risk factors for HO-CDI and may improve risk-adjustment methodology.
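For readers reproducing this kind of analysis, a log-binomial model (binomial family, log link) yields relative risks directly, and an SIR is the ratio of observed to model-predicted infections. The sketch below uses simulated data and the statsmodels library; all variable names and coefficients are illustrative assumptions, not the study's fitted values, and log-binomial fits can fail to converge (Poisson regression with robust errors is a common fallback).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated admission-level data; covariates mirror the abstract
# (Elixhauser score, age, antibiotic use, antacid use).
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "elixhauser": rng.integers(0, 15, n),
    "age": rng.integers(18, 90, n),
    "antibiotic": rng.integers(0, 2, n),
    "antacid": rng.integers(0, 2, n),
})
log_risk = -7 + 0.2 * df["elixhauser"] + 0.5 * df["antibiotic"]  # illustrative
df["ho_cdi"] = (rng.random(n) < np.exp(log_risk)).astype(int)

# Log-binomial regression: exponentiated coefficients are relative risks.
X = sm.add_constant(df[["elixhauser", "age", "antibiotic", "antacid"]])
fit = sm.GLM(df["ho_cdi"], X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params))  # e.g., RR per additional Elixhauser point

# SIR: observed / model-predicted infections. In practice the predictions
# come from a model fit to referent data, then applied to one hospital.
print("SIR:", df["ho_cdi"].sum() / fit.predict(X).sum())
```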
Background: Healthcare personnel (HCP) acquire MRSA on their gowns and gloves during routine care activities for patients who are colonized or infected with MRSA at a rate of ∼15%. Certain care activities (eg, physical exam, care of the endotracheal tube, wound care, and bathing/hygiene) have been associated with a higher frequency of transmission from the patient to HCP gowns and gloves than other activities (ie, administration of oral medicines, glucose monitoring, and manipulation of IV tubing/medication delivery). However, quantification of MRSA contamination and the risk to subsequent patients are poorly defined. Objective: We sought to quantify the MRSA colony-forming units (CFU) found on the gloves and gowns of HCP who acquire MRSA after various care activities involving patients with MRSA. Methods: We conducted a prospective cohort study at the University of Maryland Medical Center from December 2018 to October 2019. We identified patients colonized or infected with MRSA based on culture data from the prior 7 days. HCP performing prespecified care activities on eligible patients were observed. To isolate the risk of each care activity, HCP donned new gloves and gown prior to a specific care activity. Once that care activity was performed, HCP gloves and gown were swabbed prior to any further care activities. HCP gloves were cultured with an E-swab by swabbing each digit up and down 3 times followed by 2 circles on the palm of each hand. HCP gowns were sampled by swabbing a 15 × 30-cm area along the beltline of the gown and along each inner forearm twice. E-swab liquid was then serially diluted and plated in triplicate on CHROMagar MRSA II (BD, Sparks, MD) to obtain CFU. We calculated the median CFU and the interquartile range (IQR) for each specific care activity, stratified by gown and gloves. Results: In total, 604 HCP–patient care interactions were observed. Table 1 displays the median MRSA CFU stratified by gown and gloves for each patient care activity of interest. Conclusions: The quantity of MRSA found on gowns and gloves varies depending on patient care activities. Recognition of differential transmission rates between various activities may allow different approaches to infection prevention, such as the use of personal protective equipment in high- versus low-risk activities and/or the use of more aggressive interventions for high-risk activities.
Background: Inappropriate antibiotic prescription leads to increased Clostridioides difficile infections, adverse effects including organ toxicity, and generation of antibiotic-resistant bacteria. Despite efforts to improve antibiotic use in acute-care settings, unnecessary and inappropriate prescriptions still occur in 30%–50% of patients. Objectives: We assessed factors associated with inappropriate antibiotic prescription at 2 time points: (1) initial, empiric therapy and (2) 3–5 days after therapy initiation. Methods: As part of a multicenter study investigating strategies to reduce antibiotic therapy after 3–5 days of use, antibiotic prescription data were collected from 11 adult and pediatric intensive care and general medical units at 6 hospitals in Maryland in 2014 and 2015. We performed a retrospective cohort study of all hospitalized patients who received any of 23 common antibiotics for at least 3 days. Each medical record was reviewed for demographics, admission and discharge dates, patient comorbidities, and antibiotic regimen by at least 1 infectious disease physician or pharmacist. Classification of antibiotic inappropriateness was based on each institution’s guidelines and standards. Bivariate analyses were performed using logistic regression for both initial therapy and therapy at days 3–5. Multivariable logistic regression was performed using covariates meeting the significance level of P < .05. Results: In total, 3,436 antibiotic courses were assessed at the time of initial therapy, and 1,541 regimens were continued and reviewed again at days 3–5 of therapy. For the initial therapy, 1,255 regimens (37%) were inappropriate; 45% of these were considered unnecessary and 41% were too broad in spectrum. In the multivariable regression, older age and antibiotic prescription during the summer were associated with receipt of inappropriate antibiotics (Table 1). Having end-stage renal disease as a comorbid condition was protective against inappropriate use. At days 3–5 of therapy, 688 (45%) of the antibiotic courses were inappropriate; regimens were considered inappropriate because of unnecessary antibiotic prescription (49%) and overly broad spectrum (38%). Older age and receiving cefepime or piperacillin-tazobactam on day 3 of therapy were factors associated with inappropriate use (Table 2). Having undergone a transplant or a surgical procedure was protective against inappropriate antimicrobial use at days 3–5 of therapy. Conclusions: Older patients are more likely to receive inappropriate antibiotics both at the initial regimen and 3–5 days later. Patients receiving cefepime or piperacillin-tazobactam are at greater risk of receiving inappropriate antibiotics at days 3–5 due to failure to de-escalate. Antibiotic stewardship strategies targeting these patient populations may limit inappropriate use.
Background: In October 2013, the University of Maryland Medical Center established a formal antibiotic prophylaxis protocol for patients undergoing ventricular assist device (VAD) placement, replacing a previous system of various broad-spectrum antibiotic combinations, typically given for prolonged durations based on surgeon preference. The new protocol consisted of a standardized regimen of 72 hours of vancomycin and ceftriaxone after the procedure. The objective of this project was to evaluate the rate of surgical site infection (SSI) related to VAD placement to ensure that implementing the new protocol did not increase SSI rates. Methods: This was a retrospective cohort study of patients who had undergone VAD placement before the protocol change (January 1, 2011, to October 1, 2013) and after the change (October 1, 2013, to November 15, 2015). The primary outcome was the difference in SSI rate before and after the protocol change using CDC NHSN definitions. Pertinent data points of interest included reason for VAD placement, duration and type of antibiotics used, delayed sternal closure, SSI, characterization of infection (bloodstream, driveline, or pocket), organism identified on culture, and mortality at 30 days and 1 year. SSI rates were assessed using the Fisher exact test, and descriptive statistics were used for other outcome variables. Results: In total, 75 patients were included before the protocol change and 46 after. Overall, 27% of patients in the preintervention group and 17% in the postintervention group were on therapeutic antibiotics prior to VAD placement (P = .23). Also, 8 patients (6.6%) in the preintervention group had an SSI compared with 1 patient (0.8%) in the postintervention group (P = .15). Adherence to the protocol was suboptimal: 27% of patients in the postintervention group received non–protocol-adherent antibiotics, and 65% of patients received antibiotics >96 hours postoperatively. When the patients were evaluated collectively, SSI rates did not differ significantly whether antibiotics were discontinued <72 hours postoperatively, continued beyond 72 hours postoperatively, or not given at all postoperatively (3.1% vs 10.7% vs 0%; P = .24). SSI rates also did not differ among patients who received cefazolin monotherapy (0%), vancomycin and ceftriaxone (2.7%), vancomycin and piperacillin-tazobactam (2%), or other antibiotic combinations (7.7%) for surgical prophylaxis (P = .10). Conclusions: No change in SSI rates was noted after a protocol change narrowing the spectrum and duration of antibiotic prophylaxis was implemented. Evaluation of optimal surgical prophylaxis in this patient population is difficult due to low event rates and frequent therapeutic indications for antibiotics outside the standard prophylaxis. Despite these challenges, this study supports the safety of studying reduced SSI prophylaxis in the VAD population. Further studies are reasonable and warranted.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before the HCWs doffed and exited the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared with touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and the environment was 2.78 (95% confidence interval [CI], 0.99–7.77), and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
To analyze whether electronically available comorbid conditions are risk factors for Centers for Disease Control and Prevention (CDC)-defined, hospital-onset Clostridium difficile infection (CDI) after controlling for antibiotic and gastric acid suppression therapy use.
Patients aged ≥18 years admitted to the University of Maryland Medical Center between November 7, 2015, and May 31, 2017.
Comorbid conditions were assessed using the Elixhauser comorbidity index. The Elixhauser comorbidity index and its component comorbid conditions were calculated using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes extracted from electronic medical records. Bivariate associations between CDI and potential covariates for the multivariable regression, including antibiotic use, gastric acid suppression therapy use, and comorbid conditions, were assessed, and final associations were estimated using log-binomial multivariable regression.
After controlling for antibiotic use, age, proton-pump inhibitor use, and histamine-blocker use, the Elixhauser comorbidity index was a significant risk factor for CDI. The risk of CDI increased by a factor of 1.26 (95% CI, 1.19–1.32) for each additional point in the total Elixhauser score.
An increase in Elixhauser score is associated with CDI. Our study and other studies have shown that comorbid conditions are important risk factors for CDI. Electronically available comorbid conditions and scores like the Elixhauser index should be considered for risk-adjustment of CDC CDI rates.
We assessed various locations and frequencies of environmental sampling to maximize information and maintain efficiency when sampling for Acinetobacter baumannii. Although sampling sites in closer proximity to the patient were more likely to be positive, we found value in sampling all sites across multiple days to fully capture environmental contamination.
Documentation of antibiotic indication provides helpful information for antimicrobial stewardship, but its accuracy is not well understood. A review of 396 antibiotic orders in a pediatric ICU and an adult medicine step-down unit found 90% agreement between the provider-selected indication and independent review. Prompts to enter an antibiotic indication during order entry provide largely accurate information.
Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and the standardized infection ratio (SIR). Hospitals were ranked by SIR under each model to examine and compare changes in rank.
Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
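A minimal sketch of the model comparison described above: fit an ICU-type-only classifier and an ICU-type-plus-case-mix classifier, then compare C statistics (areas under the ROC curve). The data, variable names, and coefficients below are simulated assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Simulated patient-level data: CLABSI outcome, ICU type, and
# electronically available comorbidity flags (all illustrative).
rng = np.random.default_rng(1)
n = 20000
df = pd.DataFrame({
    "icu_surgical": rng.integers(0, 2, n),
    "coagulopathy": rng.integers(0, 2, n),
    "renal_failure": rng.integers(0, 2, n),
    "age": rng.integers(18, 90, n),
})
logit = (-6.5 + 0.4 * df["icu_surgical"] + 0.8 * df["coagulopathy"]
         + 0.6 * df["renal_failure"] + 0.01 * df["age"])
df["clabsi"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

base_cols = ["icu_surgical"]
full_cols = ["icu_surgical", "coagulopathy", "renal_failure", "age"]
base = LogisticRegression().fit(df[base_cols], df["clabsi"])
full = LogisticRegression().fit(df[full_cols], df["clabsi"])

# C statistic = area under the ROC curve for each model's predicted risk.
print("ICU type only:",
      roc_auc_score(df["clabsi"], base.predict_proba(df[base_cols])[:, 1]))
print("ICU type + case mix:",
      roc_auc_score(df["clabsi"], full.predict_proba(df[full_cols])[:, 1]))
```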
Antibiotic resistance is a major threat to public health. Resistance is largely driven by antibiotic usage, which in many cases is unnecessary and can be improved. The impact of decreasing overall antibiotic usage on resistance is unknown and difficult to assess using standard study designs. The objective of this study was to explore the potential impact of reducing antibiotic usage on the transmission of multidrug-resistant organisms (MDROs).
We used agent-based modeling to simulate interactions between patients and healthcare workers (HCWs), with model inputs informed by the literature. We modeled the effect of antibiotic usage as (1) a microbiome effect, whereby antibiotic usage decreases competing bacteria and increases the MDRO transmission probability between patients and HCWs, and (2) a mutation effect, whereby a proportion of patients who receive antibiotics subsequently develop an MDRO via genetic mutation.
Intensive care unit.
Absolute reductions in overall antibiotic usage of 10% and 25%.
Reducing antibiotic usage absolutely by 10% (from 75% to 65%) and 25% (from 75% to 50%) reduced acquisition rates of high-prevalence MDROs by 11.2% (P<.001) and 28.3% (P<.001), respectively. We observed similar effect sizes for low-prevalence MDROs.
In a critical care setting, where up to 50% of antibiotic courses may be inappropriate, even a moderate reduction in antibiotic usage can reduce MDRO transmission.
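The following toy simulation illustrates the structure of such an agent-based model: antibiotic exposure scales the per-contact transmission probability (microbiome effect) and adds a small per-day probability of de novo resistance (mutation effect). All parameter values are invented for illustration and are not the authors' calibrated inputs.

```python
import random

random.seed(42)

N_PATIENTS, N_HCWS, DAYS, CONTACTS_PER_DAY = 20, 5, 365, 4
BASE_TRANSMISSION = 0.05   # per-contact MDRO transmission probability
MICROBIOME_MULT = 2.0      # antibiotics raise acquisition/shedding risk
MUTATION_PROB = 0.001      # per-day de novo resistance while on antibiotics
HCW_CLEARANCE = 0.9        # daily chance hand hygiene clears contamination

def simulate(antibiotic_usage: float) -> int:
    """Return the number of HCW-mediated MDRO acquisitions in one run."""
    patients = [{"mdro": random.random() < 0.10,
                 "abx": random.random() < antibiotic_usage}
                for _ in range(N_PATIENTS)]
    hcw = [False] * N_HCWS  # transiently contaminated hands/gown
    acquisitions = 0
    for _ in range(DAYS):
        for p in patients:
            # Mutation effect: antibiotics can select a resistant mutant.
            if p["abx"] and not p["mdro"] and random.random() < MUTATION_PROB:
                p["mdro"] = True
            for _ in range(CONTACTS_PER_DAY):
                h = random.randrange(N_HCWS)
                prob = BASE_TRANSMISSION * (MICROBIOME_MULT if p["abx"] else 1.0)
                if p["mdro"] and random.random() < prob:
                    hcw[h] = True                # patient -> HCW contamination
                elif hcw[h] and not p["mdro"] and random.random() < prob:
                    p["mdro"] = True             # HCW -> patient acquisition
                    acquisitions += 1
        # Daily hand hygiene clears most HCW contamination.
        hcw = [c and random.random() > HCW_CLEARANCE for c in hcw]
    return acquisitions

for usage in (0.75, 0.65, 0.50):
    print(f"antibiotic usage {usage:.0%}: acquisitions = {simulate(usage)}")
```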
To determine the typical microbial bioburden (overall bacterial and multidrug-resistant organisms [MDROs]) on high-touch healthcare environmental surfaces after routine or terminal cleaning.
Prospective 2.5-year microbiological survey of large surface areas (>1,000 cm²).
MDRO contact-precaution rooms from 9 acute-care hospitals and 2 long-term care facilities in 4 states.
Samples from 166 rooms (113 routine cleaned and 53 terminal cleaned rooms).
Using a standard sponge-wipe sampling protocol, 2 composite samples were collected from each room; a third sample was collected from each Clostridium difficile room. Composite 1 included the TV remote, telephone, call button, and bed rails. Composite 2 included the room door handle, IV pole, and overbed table. Composite 3 included toileting surfaces. Total bacteria and MDROs (ie, methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci [VRE], Acinetobacter baumannii, Klebsiella pneumoniae, and C. difficile) were quantified, confirmed, and tested for drug resistance.
The mean microbial bioburden was higher in routine cleaned room composites (2,700 colony-forming units [CFU]/100 cm²; range, ≤1–130,000 CFU/100 cm²) than in terminal cleaned room composites (353 CFU/100 cm²; range, ≤1–4,300 CFU/100 cm²). MDROs were recovered from 34% of routine cleaned room composites (range, ≤1–13,000 CFU/100 cm²) and 17% of terminal cleaned room composites (range, ≤1–524 CFU/100 cm²). Overall, MDROs were recovered from 40% of rooms; VRE was the most common (19%).
This multicenter bioburden summary provides a first step to determining microbial bioburden on healthcare surfaces, which may help provide a basis for developing standards to evaluate cleaning and disinfection as well as a framework for studies using an evidentiary hierarchy for environmental infection control.
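For context on how such CFU densities are derived, the arithmetic below back-calculates surface bioburden from a serial-dilution plate count. The eluate volume, plated volume, and sampling area are assumptions for illustration, not the study's exact protocol.

```python
def cfu_per_100cm2(colonies: int, dilution_factor: int,
                   plated_ml: float, eluate_ml: float, area_cm2: float) -> float:
    """Convert a dilution-plate colony count to CFU per 100 cm^2 of surface."""
    cfu_per_ml = colonies * dilution_factor / plated_ml   # CFU/mL of eluate
    total_cfu = cfu_per_ml * eluate_ml                    # CFU on sampled area
    return total_cfu / area_cm2 * 100

# Example: 27 colonies on a 10^-1 dilution plate, 0.1 mL plated from a
# 10 mL sponge eluate, composite sampling area of 1,000 cm^2.
print(cfu_per_100cm2(colonies=27, dilution_factor=10,
                     plated_ml=0.1, eluate_ml=10.0, area_cm2=1000.0))  # -> 2700.0
```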
To identify comorbid conditions associated with surgical site infection (SSI) among patients undergoing renal transplantation and improve existing risk adjustment methodology used by the Centers for Disease Control and Prevention National Healthcare Safety Network (NHSN).
Patients (≥18 years) who underwent renal transplantation at the University of Maryland Medical Center between January 1, 2010, and December 31, 2011.
Trained infection preventionists reviewed medical records to identify surgical site infections that developed within 30 days after transplantation, using NHSN criteria. Patient demographic characteristics and risk factors for surgical site infection were identified through a central data repository. International Classification of Diseases, Ninth Revision, Clinical Modification codes were used to analyze individual component comorbid conditions and to calculate the Charlson and Elixhauser comorbidity indices. These indices were compared with the current NHSN risk-adjustment methodology.
A total of 441 patients were included in the final cohort. In bivariate analysis, the Charlson components of cerebrovascular disease, peripheral vascular disease, and rheumatologic disorders and the Elixhauser components of obesity, rheumatoid arthritis, and weight loss were significantly associated with the outcome. A model using the variables from the NHSN methodology had a c statistic of 0.56 (95% CI, 0.48–0.63), whereas a model that also included comorbidities from the Charlson and Elixhauser indices had a c statistic of 0.65 (95% CI, 0.58–0.73). The model with all 3 risk-adjustment scores performed best and differed significantly from the NHSN model alone, as demonstrated by the improvement in the c statistic (0.65 vs 0.56).
Risk adjustment models should incorporate electronically available comorbid conditions.
To determine the prevalence of Pseudomonas aeruginosa colonization on intensive care unit (ICU) admission, risk factors for P. aeruginosa colonization, and the incidence of subsequent clinical culture with P. aeruginosa among those colonized and not colonized.
We conducted a cohort study of patients admitted to a medical or surgical intensive care unit of a tertiary care hospital. Patients had admission perirectal surveillance cultures performed. Risk factors analyzed included comorbidities at admission, age, sex, antibiotics received during current hospitalization before ICU admission, and type of ICU.
Of 1,840 patients, 213 (11.6%) were colonized with P. aeruginosa on ICU admission. Significant risk factors in the multivariable analysis for colonization were age (odds ratio, 1.02 [95% CI, 1.01–1.03]), anemia (1.90 [1.05–3.42]), and neurologic disorder (1.80 [1.27–2.54]). Of the 213 patients colonized with P. aeruginosa on admission, 41 (19.2%) had a subsequent clinical culture positive for P. aeruginosa on ICU admission and 60 (28.2%) had a subsequent clinical culture positive for P. aeruginosa in the current hospitalization (ICU period and post-ICU period). Of these 60 patients, 49 (81.7%) had clinical infections. Of the 1,627 patients not colonized on admission, only 68 (4.2%) had a subsequent clinical culture positive for P. aeruginosa in the current hospitalization. Patients colonized with P. aeruginosa were more likely to have a subsequent positive clinical culture than patients not colonized (incidence rate ratio, 6.74 [95% CI, 4.91–9.25]).
Prediction rules or rapid diagnostic testing will help clinicians more appropriately choose empirical antibiotic therapy for subsequent infections.
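As a quick check, the reported rate ratio follows directly from the counts above (treating the colonized and non-colonized groups as the denominators, as the abstract's figures imply):

```python
# Subsequent positive clinical cultures among colonized vs not colonized.
colonized_pos, colonized_n = 60, 213
noncolonized_pos, noncolonized_n = 68, 1627

irr = (colonized_pos / colonized_n) / (noncolonized_pos / noncolonized_n)
print(f"incidence rate ratio = {irr:.2f}")  # -> 6.74, as reported
```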
To assess antimicrobial utilization before and after a change in urine culture ordering practice in adult intensive care units (ICUs) whereby urine cultures were only performed when pyuria was detected.
A 700-bed academic medical center.
Patients admitted to any adult ICU.
Aggregate data for all adult ICUs were obtained for population-level antimicrobial use (days of therapy [DOT]), urine cultures performed, and bacteriuria, all measured per 1,000 patient days before the intervention (January–December 2012) and after the intervention (January–December 2013). These data were compared using interrupted time series negative binomial regression. Randomly selected patient charts from the population of adult ICU patients with orders for urine culture in the presence of indwelling or recently removed urinary catheters were reviewed for demographic, clinical, and antimicrobial use characteristics, and pre- and post-intervention data were compared.
Statistically significant reductions were observed in aggregate monthly rates of urine cultures performed and bacteriuria detected, but not in DOT. At the patient level, fewer patients in the post-intervention group (n=250) than in the pre-intervention group (n=250) started a new antimicrobial therapy based on urine culture results (23% vs 41%, P=.002), but no difference in mean total DOT was observed.
A change in urine-culture ordering practice was associated with a decrease in the percentage of patients starting a new antimicrobial therapy based on the index urine-culture order but not in total duration of antimicrobial use in adult ICUs. Other drivers of antimicrobial use in ICU patients need to be evaluated by antimicrobial stewardship teams.
Infect. Control Hosp. Epidemiol. 2016;37(4):448–454
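For readers unfamiliar with the design, an interrupted time series with negative binomial regression models the monthly count against an underlying trend, a level change at the intervention, and a post-intervention slope change, with patient days as the exposure offset. The sketch below uses simulated data and statsmodels; all numbers are illustrative assumptions, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated monthly counts: 12 pre- and 12 post-intervention months.
rng = np.random.default_rng(2)
months = np.arange(24)
post = (months >= 12).astype(int)
counts = rng.poisson(lam=np.where(post, 28, 40))  # urine cultures per month
patient_days = rng.integers(9000, 11000, size=24)

X = sm.add_constant(pd.DataFrame({
    "time": months,                     # underlying secular trend
    "post": post,                       # level change at the intervention
    "time_post": (months - 12) * post,  # slope change after the intervention
}))
fit = sm.GLM(counts, X, family=sm.families.NegativeBinomial(),
             offset=np.log(patient_days / 1000)).fit()
print(np.exp(fit.params["post"]))  # immediate rate change at the intervention
```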
Central-line–associated bloodstream infection (CLABSI) rate is an important quality measure, but it suffers from subjectivity and interrater variability, and decreasing national CLABSI rates may compromise its power to discriminate between hospitals. This study evaluates hospital-onset bacteremia (HOB, ie, any positive blood culture obtained 48 hours post admission) as a healthcare-associated infection–related outcome measure by assessing the association between HOB and CLABSI rates and comparing the power of each to discriminate quality among intensive care units (ICUs).
In this multicenter study, ICUs provided monthly CLABSI and HOB rates for 2012 and 2013. A Poisson regression model was used to assess the association between these 2 rates. We compared the power of each measure to discriminate between ICUs using standardized infection ratios (SIRs) with 95% confidence intervals (CIs). A measure was defined as having greater power to discriminate if more of the SIRs (with surrounding CIs) were different from 1.
In 80 ICUs from 16 hospitals in the United States and Canada, a total of 663 CLABSIs, 475,420 central line days, 11,280 HOBs, and 966,757 patient days were reported. An absolute change in HOB of 1 per 1,000 patient days was associated with a 2.5% change in CLABSI rate (P<.001). Among the 80 ICUs, 20 (25%) had a CLABSI SIR and 60 (75%) had an HOB SIR that was different from 1 (P<.001).
Change in HOB rate is strongly associated with change in CLABSI rate and has greater power to discriminate between ICU performances. Consideration should be given to using HOB to replace CLABSI as an outcome measure in infection prevention quality assessments.
Infect. Control Hosp. Epidemiol. 2016;37(2):143–148
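A sketch of the core association model from the abstract above: Poisson regression of monthly CLABSI counts on the HOB rate, with central-line days as the exposure. The data are simulated, and the coefficient is seeded to echo the reported ~2.5% change per unit HOB purely for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated ICU-month data (e.g., 80 ICUs x 6 months = 480 rows).
rng = np.random.default_rng(3)
n = 480
hob_rate = rng.gamma(shape=4.0, scale=3.0, size=n)  # HOB per 1,000 patient days
line_days = rng.integers(300, 1200, size=n)
lam = np.exp(-7 + 0.025 * hob_rate) * line_days     # ~2.5% change per unit HOB
clabsi = rng.poisson(lam)

X = sm.add_constant(pd.DataFrame({"hob_rate": hob_rate}))
fit = sm.GLM(clabsi, X, family=sm.families.Poisson(),
             offset=np.log(line_days)).fit()
print(np.exp(fit.params["hob_rate"]))  # CLABSI rate ratio per unit HOB change
```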
Using a validated air sampling method, we found Acinetobacter baumannii in the air surrounding only 1 of 12 patients known to be colonized or infected with A. baumannii. Patients’ closed-circuit ventilator status, frequent air exchanges in patient rooms, and short sampling time may have contributed to this low burden.
To identify factors associated with the development of surgical site infection (SSI) among adult patients undergoing renal transplantation.
A retrospective cohort study.
An urban tertiary-care center in Baltimore, Maryland, with a well-established renal transplantation program that performs ~200–250 renal transplant procedures annually.
A total of 441 adult patients underwent renal transplantation between January 1, 2010, and December 31, 2011. Of these 441 patients, 66 (15%) developed an SSI; of these 66, 31 (47%) were superficial incisional infections and 35 (53%) were deep-incisional or organ-space infections. The average body mass index (BMI) in this patient cohort was 29.7; 84 patients (42%) were obese (BMI >30). Patients who developed an SSI had a greater mean BMI (31.7 vs 29.4; P=.004) and were more likely to have a history of peripheral vascular disease, rheumatologic disease, and narcotic abuse. A history of cerebrovascular disease was protective. Multivariate analysis showed BMI (odds ratio [OR], 1.06; 95% confidence interval [CI], 1.02–1.11) and a past history of narcotic use/abuse (OR, 4.86; 95% CI, 1.24–19.12) to be significantly associated with the development of SSI after controlling for National Healthcare Safety Network (NHSN) risk score and the presence of cerebrovascular, peripheral vascular, and rheumatologic disease.
We identified higher BMI as a risk factor for the development of SSI following renal transplantation. Notably, neither aggregate comorbidity scores nor the NHSN risk index was associated with SSI in this population. Additional risk-adjustment measures and research in this area are needed to compare SSIs across transplant centers.
Centers for Disease Control and Prevention (CDC) risk adjustment methods for central-line–associated bloodstream infections (CLABSI) only adjust for type of intensive care unit (ICU). This cohort study explored risk factors for CLABSI using 2 comorbidity classification schemes, the Charlson Comorbidity Index (CCI) and the Chronic Disease Score (CDS). Our study supports the need for additional research into risk factors for CLABSI, including electronically available comorbid conditions.