Background: The CDC recommends routine use of contact precautions (CP) for patients infected or colonized with multidrug-resistant organisms (MDROs). There is variability in implementation of and adherence to this recommendation, which we hypothesized may have been exacerbated by the COVID-19 pandemic. Methods: In September 2022, we emailed an 8-question survey to Emerging Infections Network (EIN) physician members with infection prevention and hospital epidemiology responsibilities. The survey asked about the respondent’s primary hospital’s recommendations on transmission-based precautions, adjunctive measures to reduce MDRO transmission, and changes that occurred during the COVID-19 pandemic. We sent 2 reminder emails over a 1-month period. We used descriptive statistics to summarize the data and to compare results to a similar EIN survey (n = 336) administered in 2014 (Russell D, et al. doi:10.1017/ice.2015.246). Results: Of 708 EIN members, 283 (40%) responded to the survey, and 201 were involved in infection prevention. Most respondents were adult infectious diseases physicians (n = 228, 80%) with at least 15 years of experience (n = 174, 63%). Respondents were well distributed among community, academic, and nonuniversity teaching facilities (Table 1). Most respondents reported that their facility routinely used CP for methicillin-resistant Staphylococcus aureus (MRSA, 66%) and vancomycin-resistant Enterococcus (VRE, 69%), compared to 93% and 92%, respectively, in the 2014 survey. Nearly all (>90%) reported using CP for Candida auris, carbapenem-resistant Enterobacterales (CRE), and carbapenem-resistant Acinetobacter spp, but there was variability in the use of CP for carbapenem-resistant Pseudomonas aeruginosa and extended-spectrum β-lactamase–producing gram-negative organisms. In 2014, 81% reported that their hospital performed active surveillance testing for MRSA; in 2022, this rate fell to 54% (Table 2).
The duration of contact precautions varied by MDRO (Table 3). Compared to 2014, in 2022 facilities were less likely to use contact precautions indefinitely for MRSA (18% vs 6%) and VRE (31% vs 11%). Also, 180 facilities (90%) performed chlorhexidine bathing in at least some inpatients, and 106 facilities (53%) used ultraviolet light or hydrogen peroxide vapor disinfection at discharge in some rooms. Furthermore, 89 facilities (44%) reported institutional changes to contact-precautions policies after the start of the COVID-19 pandemic that remain in place. Conclusions: Use of contact precautions for patients with MDROs is heterogeneous, and policies vary by organism. Although most hospitals still routinely use contact precautions for MRSA and VRE, this practice has declined substantially since 2014. Changes in contact-precaution policies may have been influenced by the COVID-19 pandemic; contemporary public health guidance is needed to define who requires contact precautions and for what duration.
Initial specimen diversion devices (ISDDs) are a potential solution for reducing blood-culture contamination rates. We report the implementation of an ISDD associated with a sustained reduction in blood-culture contamination rates for >18 months after implementation. We did not observe a clinically significant reduction in inpatient vancomycin usage.
The incidence of infections from extended-spectrum β-lactamase (ESBL)–producing Enterobacterales (ESBL-E) is increasing in the United States. We describe the epidemiology of ESBL-E at 5 Emerging Infections Program (EIP) sites.
During October–December 2017, we piloted active laboratory- and population-based (New York, New Mexico, Tennessee) or sentinel (Colorado, Georgia) ESBL-E surveillance. An incident case was the first isolation from normally sterile body sites or urine of Escherichia coli or Klebsiella pneumoniae/oxytoca resistant to ≥1 extended-spectrum cephalosporin and nonresistant to all carbapenems tested at a clinical laboratory from a surveillance area resident in a 30-day period. Demographic and clinical data were obtained from medical records. The Centers for Disease Control and Prevention (CDC) performed reference antimicrobial susceptibility testing and whole-genome sequencing on a convenience sample of case isolates.
We identified 884 incident cases. The estimated annual incidence in sites conducting population-based surveillance was 199.7 per 100,000 population. Overall, 800 isolates (96%) were from urine, and 790 (89%) were E. coli. Also, 393 cases (47%) were community-associated. Among 136 isolates (15%) tested at the CDC, 122 (90%) met the surveillance definition phenotype; 114 (93%) of 122 were shown to be ESBL producers by clavulanate testing. In total, 111 (97%) of confirmed ESBL producers harbored a blaCTX-M gene. Among ESBL-producing E. coli isolates, 52 (54%) were ST131; 44% of these cases were community associated.
The burden of ESBL-E was high across surveillance sites, with nearly half of cases acquired in the community. EIP has implemented ongoing ESBL-E surveillance to inform prevention efforts, particularly in the community, and to detect the emergence of new ESBL-E strains.
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
Understand how the built environment can affect safety and efficiency outcomes during doffing of personal protective equipment (PPE) in the context of coronavirus disease 2019 (COVID-19) patient care.
We conducted (1) field observations and surveys administered to healthcare workers (HCWs) performing PPE doffing, (2) focus groups with HCWs and infection prevention experts, and (3) a design charrette with healthcare design experts.
This study was conducted in 4 inpatient units treating patients with COVID-19, in 3 hospitals of a single healthcare system.
The study included 24 nurses, 2 physicians, 1 respiratory therapist, and 2 infection preventionists.
The doffing task sequence and the layout of doffing spaces varied considerably across sites, with field observations showing most doffing tasks occurring around the patient room door and PPE support stations. Behaviors perceived as most risky included touching contaminated items and inadequate hand hygiene. Doffing space layout and types of PPE storage and work surfaces were often associated with inadequate cleaning and improper storage of PPE. Focus groups and the design charrette provided insights into how designs that afford standardization, accessibility, and flexibility can support PPE doffing safety and efficiency in this context.
There is a need to define, organize and standardize PPE doffing spaces in healthcare settings and to understand the environmental implications of COVID-19–specific issues related to supply shortage and staff workload. Low-effort and low-cost design adaptations of the layout and design of PPE doffing spaces may improve HCW safety and efficiency in existing healthcare facilities.
To describe the epidemiology of patients with nonintestinal carbapenem-resistant Enterobacterales (CRE) colonization and to compare clinical outcomes of these patients to those with CRE infection.
A secondary analysis of Consortium on Resistance Against Carbapenems in Klebsiella and other Enterobacteriaceae 2 (CRACKLE-2), a prospective observational cohort.
A total of 49 US short-term acute-care hospitals.
Patients hospitalized with CRE isolated from clinical cultures, April 30, 2016, through August 31, 2017.
We described characteristics of patients in CRACKLE-2 with nonintestinal CRE colonization and assessed the impact of site of colonization on clinical outcomes. We then compared outcomes of patients defined as having nonintestinal CRE colonization to all those defined as having infection. The primary outcome was a desirability of outcome ranking (DOOR) at 30 days. Secondary outcomes were 30-day mortality and 90-day readmission.
Of 547 patients with nonintestinal CRE colonization, 275 (50%) were from the urinary tract, 201 (37%) were from the respiratory tract, and 71 (13%) were from a wound. Patients with urinary tract colonization were more likely to have a more desirable clinical outcome at 30 days than those with respiratory tract colonization, with a DOOR probability of better outcome of 61% (95% confidence interval [CI], 53%–71%). When compared to 255 patients with CRE infection, patients with CRE colonization had a similar overall clinical outcome, as well as 30-day mortality and 90-day readmission rates when analyzed in aggregate or by culture site. Sensitivity analyses demonstrated similar results using different definitions of infection.
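The DOOR comparison reported above reduces to a pairwise probability: the chance that a randomly chosen patient from one group has a more desirable outcome rank than a randomly chosen patient from the other, with ties split evenly. A minimal sketch using hypothetical ordinal ranks (not CRACKLE-2 data):

```python
def door_probability(group_a, group_b):
    """Probability that a random patient from group_a has a more
    desirable outcome rank than one from group_b (higher = better);
    tied pairs contribute half."""
    wins = sum(1 for a in group_a for b in group_b if a > b)
    ties = sum(1 for a in group_a for b in group_b if a == b)
    return (wins + 0.5 * ties) / (len(group_a) * len(group_b))

# Hypothetical outcome ranks for two colonization sites (illustrative only)
urinary = [3, 3, 2, 1]
respiratory = [2, 1, 1, 3]
print(round(door_probability(urinary, respiratory), 3))  # → 0.656
```

A value above 0.5 favors the first group, which is how the 61% DOOR probability for urinary versus respiratory colonization should be read.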
Patients with nonintestinal CRE colonization had outcomes similar to those with CRE infection. Clinical outcomes may be influenced more by culture site than classification as “colonized” or “infected.”
To assess preventability of hospital-onset bacteremia and fungemia (HOB), we developed and evaluated a structured rating guide accounting for intrinsic patient and extrinsic healthcare-related risks.
The HOB preventability rating guide was compared against a reference-standard expert panel.
A 10-member panel of clinical experts was assembled as the reference standard for preventability assessment, and 2 physician reviewers applied the rating guide for comparison.
The expert panel independently rated 82 hypothetical HOB scenarios using a 6-point Likert scale collapsed into 3 categories: preventable, uncertain, or not preventable. Consensus was defined as concurrence on the same category among ≥70% experts. Scenarios without consensus were deliberated and followed by a second round of rating.
Two reviewers independently applied the rating guide to adjudicate the same 82 scenarios in 2 rounds, with interim revisions. Interrater reliability was evaluated using the κ (kappa) statistic.
Expert panel consensus criteria were met for 52 scenarios (63%) after 2 rounds.
After 2 rounds, guide-based rating matched expert panel consensus in 40 of 52 (77%) and 39 of 52 (75%) cases for reviewers 1 and 2, respectively. Agreement rates between the 2 reviewers were 84% overall (κ, 0.76; 95% confidence interval [CI], 0.64–0.88) and 87% (κ, 0.79; 95% CI, 0.65–0.94) for the 52 scenarios with expert consensus.
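Interrater agreement of the kind reported here is typically quantified with Cohen's κ, which corrects raw agreement for the agreement expected by chance from each rater's marginal category frequencies. A small illustration with made-up ratings (not the study's data):

```python
from collections import Counter

def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(ratings1) == len(ratings2)
    n = len(ratings1)
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    c1, c2 = Counter(ratings1), Counter(ratings2)
    # Chance agreement: sum over categories of the product of marginals
    expected = sum(c1[cat] * c2[cat] for cat in c1) / (n * n)
    return (observed - expected) / (1 - expected)

r1 = ["preventable", "preventable", "uncertain", "not preventable"]
r2 = ["preventable", "uncertain", "uncertain", "not preventable"]
print(round(cohens_kappa(r1, r2), 3))  # → 0.636
```

Here 3 of 4 items agree (75%), but κ is lower because some of that agreement would occur by chance.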
Preventability ratings of HOB scenarios by 2 reviewers using a rating guide matched expert consensus in most cases with moderately high interreviewer reliability. Although diversity of expert opinions and uncertainty of preventability merit further exploration, this is a step toward standardized assessment of HOB preventability.
We performed an epidemiological investigation and genome sequencing of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) to define the source and scope of an outbreak in a cluster of hospitalized patients. Lack of appropriate respiratory hygiene led to SARS-CoV-2 transmission to patients and healthcare workers during a single hemodialysis session, highlighting the importance of infection prevention precautions.
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years prior to 1 year after a stewardship intervention.
An academic healthcare system with 4 hospitals.
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. In the context of a 2% month-to-month decline starting in the preintervention period (P < .01), we observed a 21% reduction in fluoroquinolone days of therapy per 1,000 patient days after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, we detected a change in the trend of PD-CDI rates from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
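An interrupted time series of this kind is commonly fit as a segmented regression with four terms: a baseline level, a baseline trend, a level change at the intervention, and a trend change afterward. A minimal sketch on synthetic monthly rates using ordinary least squares (illustrative only; not the study's model or data):

```python
import numpy as np

# Synthetic monthly rates: flat at 10 before the intervention at
# month 12, then an immediate drop to 7 with no trend change
months = np.arange(24)
intervention = 12
rates = np.where(months < intervention, 10.0, 7.0)

post = (months >= intervention).astype(float)
# Design matrix: intercept, baseline trend, level change, trend change
X = np.column_stack([
    np.ones(len(months)),            # baseline level
    months.astype(float),            # pre-intervention slope
    post,                            # immediate level change
    post * (months - intervention),  # slope change after intervention
])
beta, *_ = np.linalg.lstsq(X, rates, rcond=None)
level, slope, level_change, trend_change = beta
print(round(level_change, 2))  # → -3.0, the immediate drop
```

The study's segmented analysis reports the same two quantities interpreted here: a level change (the 21% immediate reduction) and a trend change (the 2.5% monthly decline in PD-CDI).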
Our systemwide intervention reduced inpatient fluoroquinolone use immediately, but not HO-CDI. However, a downward trend in PD-CDI occurred. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
Understanding the cognitive determinants of healthcare worker (HCW) behavior is important for improving the use of infection prevention and control (IPC) practices. Given a patient requiring only standard precautions, we examined the dimensions along which different populations of HCWs cognitively organize patient care tasks (ie, their mental models).
HCWs read a description of a patient and then rated the similarities of 25 patient care tasks from an infection prevention perspective. Using multidimensional scaling, we identified the dimensions (ie, characteristics of tasks) underlying these ratings and the salience of each dimension to HCWs.
Adult inpatient hospitals across an academic hospital network.
In total, 40 HCWs, comprising infection preventionists and nurses from intensive care units, emergency departments, and medical-surgical floors, rated the similarity of tasks. To identify the meaning of each dimension, another 6 nurses rated each task in terms of specific characteristics of tasks.
Each HCW population perceived patient care tasks to vary along 3 common dimensions; most salient was the perceived magnitude of infection risk to the patient in a task, followed by the perceived dirtiness and risk of HCW exposure to body fluids, and lastly, the relative importance of a task for preventing versus controlling an infection in a patient.
For a patient requiring only standard precautions, different populations of HCWs have similar mental models of how various patient care tasks relate to IPC. Techniques for eliciting mental models open new avenues for understanding and ultimately modifying the cognitive determinants of IPC behaviors.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
To describe the epidemiology of carbapenem-resistant Enterobacterales (CRE) bacteriuria and to determine whether urinary catheters increase the risk of subsequent CRE bacteremia.
Using active population- and laboratory-based surveillance we described a cohort of patients with incident CRE bacteriuria and identified risk factors for developing CRE bacteremia within 1 year.
The study was conducted among the 8 counties of Georgia Health District 3 (HD3) in Atlanta, Georgia.
Residents of HD3 with CRE first identified in urine between 2012 and 2017.
We identified 464 patients with CRE bacteriuria (mean yearly incidence, 1.96 cases per 100,000 population). Of the 425 patients with chart review, most had a urinary catheter (56%), and many resided in long-term care facilities (48%), had a Charlson comorbidity index >3 (38%), or had a decubitus ulcer (37%). Overall, 21 patients (5%) developed CRE bacteremia with the same organism within 1 year. Risk factors for subsequent bacteremia included presence of a urinary catheter (odds ratio [OR], 8.0; 95% confidence interval [CI], 1.8–34.9), central venous catheter (OR, 4.3; 95% CI, 1.7–10.6) or another indwelling device (OR, 4.3; 95% CI, 1.6–11.4), urine culture obtained as an inpatient (OR, 5.7; 95% CI, 1.3–25.9), and being in the ICU in the week prior to urine culture (OR, 2.9; 95% CI, 1.1–7.8). In a multivariable analysis, urinary catheter presence increased the risk of CRE bacteremia (OR, 5.3; 95% CI, 1.2–23.6).
In patients with CRE bacteriuria, urinary catheters increase the risk of CRE bacteremia. Future interventions should aim to reduce inappropriate insertion of urinary catheters and to promote their early removal.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates were tested on either a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). 
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
To assess the clarity and efficacy of the World Health Organization (WHO) hand-rub diagram, develop a modified version, and compare the 2 diagrams.
Randomized group design preceded by controlled observation and iterative product redesigns.
The Cognitive Ergonomics Lab in the School of Psychology at the Georgia Institute of Technology.
We included participants who were unfamiliar with the WHO hand-rub diagram (convenience sampling) to ensure that performance was based on the diagram and not, for example, on prior experience.
We iterated through the steps of a human factors design procedure: (1) Participants simulated hand hygiene using ultraviolet (UV) absorbent lotion and a hand-rub technique diagram (ie, WHO or a redesign). (2) Coverage, confusion judgments, and behavioral videos informed potentially improved diagrams. (3) The redesigned diagrams were compared with the WHO version in a randomized group design. Coverage was assessed across 72 hand areas from multiple UV photographs.
The WHO diagram led to multiple omissions in hand-surface coverage, including inadequate coverage by up to 75% of participants for the ulnar edge. The redesigns improved coverage significantly overall and often substantially.
Human factors modification to the WHO diagram reduced inadequate coverage for naïve users. Implementation of an improved diagram should help in the prevention of healthcare-associated infections.
To determine the effect of an electronic medical record (EMR) nudge at reducing total and inappropriate orders testing for hospital-onset Clostridioides difficile infection (HO-CDI).
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
Four hospitals in an academic healthcare network.
All patients with a C. difficile order after hospital day 3.
Orders for C. difficile testing in patients administered a laxative or stool softener in <24 hours triggered an EMR alert defaulting to cancellation of the order (“nudge”).
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease (postintervention trend-change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the postintervention inappropriate order rate decreased over time (RR, 0.95; 95% CI, 0.93–0.97).
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
Accurately diagnosing urinary tract infections (UTIs) in hospitalized patients remains challenging, requiring correlation of frequently nonspecific symptoms and laboratory findings. Urine cultures (UCs) are often ordered indiscriminately, especially in patients with urinary catheters, despite the Infectious Diseases Society of America guidelines recommending against routine screening for asymptomatic bacteriuria (ASB).1,2 Positive UCs can be difficult for providers to ignore, leading to unnecessary antibiotic treatment of ASB.2,3 Using diagnostic stewardship to limit UCs to situations with a positive urinalysis (UA) can reduce inappropriate UCs since the absence of pyuria suggests the absence of infection.4–6 We assessed the impact of the implementation of a UA with reflex to UC algorithm (“reflex intervention”) on UC ordering practices, diagnostic efficiency, and UTIs using a quasi-experimental design.
The current methodology for calculating central-line–associated bloodstream infection (CLABSI) rates, used for pay-for-performance measures, does not account for multiple concurrent central lines.
To compare CLABSI rates using standard National Healthcare Safety Network (NHSN) denominators to rates accounting for multiple concurrent central lines.
Descriptive analysis and retrospective cohort analysis.
We identified all adult patients with central lines at 2 academic medical centers over an 18-month period. CLABSI rates were calculated for intensive care units (ICUs) and non-ICUs using the standard NHSN methodology and denominator (a patient can contribute only 1 central-line day for a given patient day) and a modified denominator (the number of central lines in 1 patient on 1 day counts as that number of line days). We also compared characteristics of patients with and without multiple concurrent central lines.
Among 18,521 hospital admissions, there were 156,574 central-line days and 239 CLABSIs (ICU, 105; non-ICU, 134). Our modified denominator reduced CLABSI rates by 25% in ICUs (1.95 vs 1.47 per 1,000 line days) and 6% (1.30 vs 1.22 per 1,000 line days) in non-ICUs. Patients with multiple concurrent central lines were more likely to be in an ICU, to have a longer admission, to have a dialysis catheter, and to have a CLABSI.
Using the number of central lines as the denominator decreased CLABSI rates in ICUs by 25%. The presence of multiple concurrent central lines may be a marker of severity of illness. The risk of CLABSI per lumen of a central line is similar in ICUs and wards.
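The two denominators differ only in how a patient-day with multiple concurrent lines is counted: the standard NHSN denominator caps each patient-day at 1 line day, while the modified denominator counts every line. A sketch with hypothetical line-day records (illustrative numbers, not the study's):

```python
def clabsi_rates(daily_line_counts, clabsi_count):
    """CLABSI rate per 1,000 line days under the standard (NHSN-style)
    and modified denominators.

    daily_line_counts: number of concurrent central lines on each
    patient-day (one entry per patient per day).
    """
    standard = sum(1 for n in daily_line_counts if n >= 1)  # capped at 1/day
    modified = sum(daily_line_counts)                       # every line counts
    return (1000 * clabsi_count / standard,
            1000 * clabsi_count / modified)

# Hypothetical unit: 400 patient-days, a quarter with 2 concurrent lines
days = [2] * 100 + [1] * 300
std_rate, mod_rate = clabsi_rates(days, clabsi_count=1)
print(round(std_rate, 2), round(mod_rate, 2))  # → 2.5 2.0
```

Because the modified denominator is always at least as large, the resulting rate can only stay the same or fall, which is the direction of the 25% ICU reduction reported above.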
OBJECTIVES/SPECIFIC AIMS: To describe the epidemiology of patients with carbapenem-resistant Enterobacteriaceae (CRE) bacteriuria in metropolitan Atlanta, GA and to identify risk factors associated with progression to an invasive CRE infection. We hypothesize that having an indwelling urinary catheter increases the risk of progression. METHODS/STUDY POPULATION: The Georgia Emerging Infections Program (EIP) performs active population- and laboratory-based surveillance to identify CRE isolated from a sterile site (e.g. blood) or urine among patients who reside in the 8-county metropolitan Atlanta area (population ~4 million). The Georgia EIP performs a chart review of each case to extract data on demographics, culture location, resistance patterns, healthcare exposures, and other underlying risk factors. We used a retrospective cohort study design to include all Georgia EIP cases with Escherichia coli, Klebsiella pneumoniae, Klebsiella oxytoca, Enterobacter cloacae, or Klebsiella (formerly Enterobacter) aerogenes, adapting the current EIP definition of resistance to only include isolates resistant to meropenem, imipenem or doripenem (minimum inhibitory concentration ≥ 4) first identified in a urine culture from 8/1/2011 to 7/31/2017. Patients with CRE identified in a sterile site culture prior to a urine culture will be excluded. Within this cohort, we will identify which patients had a subsequent similar CRE isolate identified from a sterile site between one day and one year after the original urine culture was identified (termed “progression”). CRE isolates will be defined as similar if they are the same species and have the same carbapenem susceptibility pattern. 
Univariable analyses using t tests or nonparametric tests, as appropriate, for continuous variables, and chi-square tests (or Fisher’s exact tests, as appropriate) for categorical variables will compare patient demographics, comorbidities, and presence of invasive devices, including urinary catheters, between patients who had progression to an invasive infection and those who did not. Covariates with a p-value of <0.2 will be eligible for inclusion in the multivariable logistic regression model with progression to invasive infection as the primary outcome. All statistical analyses will be done in SAS 9.4. RESULTS/ANTICIPATED RESULTS: From 8/1/2011 to 7/31/2017, we preliminarily identified 546 patients with CRE first identified in urine, representing an annual incidence rate of 1.1 cases per 100,000 population. Most cases were K. pneumoniae (352, 64%), followed by E. coli (117, 21%), E. cloacae (48, 9%), K. aerogenes (18, 3%), and K. oxytoca (11, 2%). The mean patient age was 64 ± 18 years, and the majority (308, 56%) were female. Clinical characterization through chart review was available for 507 patients. The majority of patients were black (301, 59%), followed by white (166, 33%), Asian (12, 2%), and other or unknown race (28, 6%). In total, 466 (92%) patients had at least one underlying comorbid condition, with a median Charlson comorbidity index of 3 (IQR, 1–5). Overall, 460 (91%) infections were considered healthcare-associated (366 community-onset and 94 hospital-onset), while 44 (9%) were community-associated. In addition, 279 (55%) patients had a urinary catheter within the two days prior to the CRE culture. The analysis of patients who progress to an invasive CRE infection, including the results of the univariable and multivariable analyses assessing risk factors for progression, is in progress and will be reported in the future.
DISCUSSION/SIGNIFICANCE OF IMPACT: In metropolitan Atlanta, the annual incidence of CRE first isolated in urine was estimated to be 1.1 cases per 100,000 population between 2011 and 2017, with the majority of the cases being K. pneumoniae. Most patients had prior healthcare exposure and more than 50% of the patients had a urinary catheter. Our anticipated results will identify risk factors associated with progression from CRE bacteriuria to an invasive infection with a specific focus on having a urinary catheter, as this is a potentially modifiable characteristic that could be a target of future interventions.
Hospital-onset bacteremia and fungemia (HOB), a potential measure of healthcare-associated infections, was evaluated in a pilot study among 60 patients across 3 hospitals. Two-thirds of all HOB events and half of nonskin commensal HOB events were judged as potentially preventable. Follow-up studies are needed to further develop this measure.