Objective: To examine temporal changes in coverage with a complete primary series of coronavirus disease 2019 (COVID-19) vaccination and staffing shortages among healthcare personnel (HCP) working in nursing homes in the United States before, during, and after the implementation of jurisdiction-based COVID-19 vaccination mandates for HCP.
Sample and setting:
HCP in nursing homes from 15 US jurisdictions.
Methods: We analyzed weekly COVID-19 vaccination data reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network from June 7, 2021, through January 2, 2022. We assessed 3 periods (preintervention, intervention, and postintervention) based on the announcement of vaccination mandates for HCP in 15 jurisdictions. We used interrupted time-series models to estimate the weekly percentage change in vaccination with complete primary series and the odds of reporting a staffing shortage for each period.
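As an illustration of the analytic setup, each reporting week can be assigned to one of the three interrupted time-series segments. A minimal sketch in Python, using hypothetical announcement and effective dates for a single jurisdiction (the study's jurisdiction-specific dates are not given here):

```python
from datetime import date, timedelta

def its_period(week, announced, effective):
    """Assign a reporting week to one of the three interrupted
    time-series segments used in the analysis."""
    if week < announced:
        return "preintervention"
    if week <= effective:
        return "intervention"
    return "postintervention"

# Hypothetical jurisdiction: mandate announced 2021-08-16,
# effective 2021-10-18 (illustrative dates only)
announced, effective = date(2021, 8, 16), date(2021, 10, 18)
weeks = [date(2021, 6, 7) + timedelta(weeks=i) for i in range(30)]
segments = [its_period(w, announced, effective) for w in weeks]
```

In the fitted models, these segment labels would enter as indicator and slope-change terms so that a separate weekly percentage change in vaccination coverage is estimated for each period.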
Results: Complete primary series vaccination among HCP increased from 66.7% at baseline to 94.3% at the end of the study period and increased at the fastest rate during the intervention period for 12 of 15 jurisdictions. The odds of reporting a staffing shortage were lowest after the intervention.
Conclusions: These findings demonstrate that COVID-19 vaccination mandates may be an effective strategy for improving HCP vaccination coverage in nursing homes without exacerbating staffing shortages. These data suggest that mandates can be considered to improve COVID-19 vaccination coverage among HCP in nursing homes to protect both HCP and vulnerable nursing home residents.
Background: The emergence and spread of drug-resistant pathogens continues to significantly impact patient safety and healthcare systems. Although antimicrobial susceptibility test (AST) results of clinical specimens are used by individual facilities for antimicrobial resistance surveillance, accurate tracking and benchmark comparison of a facility’s antimicrobial resistance using national data requires risk-adjusted methods to be more meaningful. The CDC NHSN Antimicrobial Resistance (AR) Option collects patient-level, deduplicated, isolate information, including AST results, for >20 organisms from cerebrospinal fluid, lower respiratory tract (LRT), blood, and urinary specimens. To provide risk-adjusted national benchmarks, we developed prediction models for incidence of hospital-onset isolates with antimicrobial resistance. Methods: We analyzed AST results of isolates reported through the NHSN AR Option for January through December 2019. Isolates from facilities that had >10% missing AST results for the organism-drug combinations or from hospitals that used outdated breakpoints were excluded. We assessed associations between facility-level factors and incidence rates of hospital-onset (specimen collected 3 days or more after hospital admission) isolates of specific drug-resistant phenotypes from blood, LRT, and urinary specimens. Factors included number of beds, length of stay, and prevalence of community onset isolates of the same phenotype. Drug-resistant phenotypes assessed included methicillin-resistant Staphylococcus aureus (MRSA), multidrug-resistant (MDR) Pseudomonas aeruginosa, carbapenem-resistant Enterobacterales (CRE), fluoroquinolone-resistant Pseudomonas aeruginosa, fluoroquinolone-resistant Enterobacterales, and extended-spectrum cephalosporin-resistant Enterobacterales. Isolates of different phenotypes and from different specimen sources were modeled separately. 
Negative binomial regression was used to evaluate the factors associated with antimicrobial resistance incidence. Variable entry into the models was based on a significance level of P < .05. Results: Among the models (1 for each drug-resistant phenotype–specimen type combination), the number of isolates with AST results ranged from 718 (Pseudomonas aeruginosa–fluoroquinolones, blood) to 16,412 (Enterobacterales–fluoroquinolones, urine). The pooled incidence rate was highest for fluoroquinolone-resistant Enterobacterales in urinary specimens (0.2179 isolates per 1,000 patient days) among all phenotype-specimen combinations evaluated (Table 1). The incidence of drug-resistant isolates was consistently associated with community-onset prevalence across the models evaluated. Other associated factors varied across phenotype-specimen combinations (Table 2). Conclusions: We developed statistical models to predict facility-level incidence rates of hospital-onset antimicrobial-resistant isolates based on community-onset drug-resistant prevalence and facility characteristics. These models will enable facilities to compare their antimicrobial resistance rates to national benchmarks and thereby inform their antimicrobial stewardship and infection prevention efforts.
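The risk-adjusted benchmark comparison the authors describe amounts to an observed-to-predicted ratio, where the predicted count comes from a negative binomial model with a log link and a patient-days offset. A minimal sketch, with entirely hypothetical coefficients and covariate values (the fitted model parameters are not reported in the abstract):

```python
import math

def predicted_count(intercept, coefs, covariates, patient_days):
    """Predicted hospital-onset isolate count from a log-link
    negative binomial model with a patient-days offset:
    mu = exp(intercept + sum(b_i * x_i)) * patient_days."""
    eta = intercept + sum(b * x for b, x in zip(coefs, covariates))
    return math.exp(eta) * patient_days

# Hypothetical model: intercept, coefficients for log(beds) and
# community-onset prevalence of the same phenotype
mu = predicted_count(-11.0, [0.3, 0.8], [math.log(250), 1.2], 40_000)
observed = 12
ratio = observed / mu  # facility rate relative to the national benchmark
```

A ratio above 1 would indicate more hospital-onset resistant isolates than predicted for a facility with these characteristics; below 1, fewer.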
We reviewed trimethoprim-sulfamethoxazole antibiotic susceptibility testing data among Staphylococcus aureus using 3 national inpatient databases. In all 3 databases, we observed an increase in the percentage of methicillin-resistant Staphylococcus aureus that were not susceptible to trimethoprim-sulfamethoxazole. Providers should select antibiotic regimens based on local resistance patterns and should report changes to the public health department.
During March 27–July 14, 2020, the Centers for Disease Control and Prevention’s National Healthcare Safety Network extended its surveillance to hospital capacities responding to the COVID-19 pandemic. The data showed wide variations across hospitals in case burden, bed occupancies, ventilator usage, and healthcare personnel and supply status. These data were used to inform emergency responses.
The rapid spread of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) throughout key regions of the United States in early 2020 placed a premium on timely, national surveillance of hospital patient censuses. To meet that need, the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN), the nation’s largest hospital surveillance system, launched a module for collecting hospital coronavirus disease 2019 (COVID-19) data. We present time-series estimates of the critical hospital capacity indicators from April 1 to July 14, 2020.
From March 27 to July 14, 2020, the NHSN collected daily data on hospital bed occupancy, number of hospitalized patients with COVID-19, and the availability and/or use of mechanical ventilators. Time series were constructed using multiple imputation and survey weighting to allow near–real-time daily national and state estimates to be computed.
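The survey-weighting step described above can be illustrated with a small sketch: each reporting hospital's daily count is scaled by a weight so that the sum estimates the national total. The counts and weights below are hypothetical, and the real estimation also involved multiple imputation for nonreporting days, which is omitted here:

```python
def weighted_total(values, weights):
    """Survey-weighted national total: each reporting hospital's
    daily count is inflated by its survey weight so that the sum
    estimates the full national hospital population."""
    return sum(v * w for v, w in zip(values, weights))

# Hypothetical daily counts of hospitalized COVID-19 patients from
# 4 reporting hospitals, with illustrative survey weights
covid_counts = [30, 12, 55, 8]
weights = [2.5, 1.8, 3.0, 2.0]
national_estimate = weighted_total(covid_counts, weights)
```

In practice the weights would be calibrated to the national hospital frame and recomputed as reporting coverage changed, which is what allowed near–real-time daily national and state estimates.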
During the pandemic’s April peak in the United States, among an estimated 431,000 total inpatients, 84,000 (19%) had COVID-19. Although the number of inpatients with COVID-19 decreased from April to July, the proportion of occupied inpatient beds increased steadily. COVID-19 hospitalizations increased from mid-June in the South and Southwest regions after stay-at-home restrictions were eased. The proportion of inpatients with COVID-19 on ventilators decreased from April to July.
The NHSN hospital capacity estimates served as important, near–real-time indicators of the pandemic’s magnitude, spread, and impact, providing quantitative guidance for the public health response. Use of the estimates detected the rise of hospitalizations in specific geographic regions in June after they declined from a peak in April. Patient outcomes appeared to improve from early April to mid-July.
Background: More than 450,000 patients receive outpatient hemodialysis in the United States. Patients on hemodialysis are at high risk of bloodstream infections (BSIs), which are associated with significant morbidity and mortality. National prevention efforts targeting hemodialysis facilities have resulted in widespread changes in practice, including modifications to central venous catheter (CVC) maintenance procedures. We analyzed dialysis event surveillance data submitted to the CDC NHSN to describe changes in BSI rates among hemodialysis outpatients from 2014 to 2018. Methods: Outpatient hemodialysis facilities report BSIs (ie, positive blood cultures collected in the outpatient setting or within 1 calendar day after hospital admission) and the number of hemodialysis outpatients treated during the first 2 working days of each month to the NHSN. For each BSI, the suspected source (ie, vascular access, another site, contamination, or uncertain) and the vascular access type (CVC, arteriovenous fistula [AVF], or arteriovenous graft [AVG]) are indicated. Pooled mean rates (per 100 patient months) were calculated for BSIs and access-related BSIs (ARBSIs), overall and stratified by vascular access type. Annual BSI rate trends were evaluated using a negative binomial regression model, which treated patient months as an offset variable and included access type, year, and an access-year interaction variable. Results: More than 6,000 outpatient hemodialysis facilities reported 134,961 BSIs from 2014 to 2018. Of these BSIs, 102,505 (76%) were categorized as access related. CVCs were present in 63% of BSIs and 70% of ARBSIs. Pooled mean BSI rates decreased 27% from 0.64 to 0.47 per 100 patient months; rates of ARBSIs decreased 27% from 0.49 to 0.36 per 100 patient months. Significant decreases in event rates occurred across vascular access strata (Fig. 1). The reduction in BSI and ARBSI burden was most pronounced among patients with CVCs.
BSI rates in patients with CVCs decreased 32% from 2.16 per 100 patient months to 1.46 (annual average decrease, 9.5%), and ARBSI rates in patients with CVCs decreased 32% from 1.83 per 100 patient months to 1.24 (annual average decrease, 9.4%). Conclusions: Substantial reductions in BSI and ARBSI rates among hemodialysis outpatients occurred during this 5-year period, and these reductions appear to be most prominent among CVC and AVF patients. Improvements in infection prevention and control practices, including CVC care, have likely contributed to these reductions. Additional efforts to increase the uptake of known prevention practices and to identify new strategies for prevention might contribute to continued decreases in infections among this highly vulnerable population.
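The pooled mean rates above are simple ratios of total events to total patient months, scaled to 100. A short sketch, using hypothetical event and patient-month totals chosen to reproduce the reported 2014 and 2018 BSI rates:

```python
def pooled_rate_per_100(events, patient_months):
    """Pooled mean rate: total events / total patient months x 100."""
    return 100 * events / patient_months

# Hypothetical totals consistent with the reported pooled rates
rate_2014 = pooled_rate_per_100(16_000, 2_500_000)  # 0.64 per 100 patient months
rate_2018 = pooled_rate_per_100(14_100, 3_000_000)  # 0.47 per 100 patient months
pct_change = 100 * (rate_2018 - rate_2014) / rate_2014  # about -27%
```

Pooling events and denominators before dividing (rather than averaging facility-level rates) weights each facility by its patient volume, which is the standard NHSN convention.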
Background: Clostridioides difficile infection (CDI) is one of the most common laboratory-identified (LabID) healthcare-associated events reported to the National Healthcare Safety Network (NHSN). CDI prevention remains a national priority, and efforts to reduce infection burden and improve antibiotic stewardship continue to expand across the healthcare spectrum. Beginning in 2013, the Centers for Medicare and Medicaid Services (CMS) required acute-care hospitals participating in the CMS Inpatient Quality Reporting program to report CDI LabID data to the NHSN and, in 2015, extended this reporting requirement to emergency departments (EDs) and 24-hour observation units. To assess national progress, we evaluated changes in hospital-onset CDI (HO-CDI) incidence during 2010–2018. Methods: Cases of HO-CDI were reported to the NHSN by hospitals using the NHSN LabID criteria. Generalized linear mixed-effects modeling was used to assess trends in HO-CDI, treating the hospital as a random intercept to account for the correlation of repeated responses over time. The data were summarized at the quarterly level, the main effect was time, and the covariates of interest were CDI test type, inpatient community-onset (CO) infection rate, hospital type, average length of stay, medical school affiliation, number of beds, number of ICU beds, number of infection control professionals, presence of an ED or observation unit, and an indicator for 2015 to account for CDI protocol changes that required hospitals to conduct surveillance in both inpatient and ED or observation unit settings. Results: During 2010–2013, the number of hospitals reporting CDI increased, then stabilized after 2013 (Table 1). Crude HO-CDI rates decreased over time, except for an increase in 2015 followed by a steeper reduction thereafter (Table 2). During 2010–2014, the adjusted quarterly rate of change was −0.45% (95% CI, −0.57% to −0.33%; P < .0001).
The rate of reduction in 2010–2014 was smaller than that in 2015–2018 (−2.82%; 95% CI, −3.10% to −2.54%; P < .0001). Compared to 2014, the adjusted rate in 2015 increased by 79.14% (95% CI, 72.42%–86.11%; P < .0001). Conclusions: The number of hospitals reporting CDI LabID data grew substantially in 2013 as a result of the CMS reporting requirement. Adjusted HO-CDI rates decreased over time, with an increase in 2015 and a rapid decrease thereafter. The increase in 2015 may be explained by changes in the NHSN CDI surveillance protocol and better test-type classification in later years. Overall decreases in HO-CDI rates may be influenced by prevention strategies.
Background: The Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) has included surveillance of laboratory-identified (LabID) methicillin-resistant Staphylococcus aureus (MRSA) bacteremia events since 2009. In 2013, the Centers for Medicare & Medicaid Services (CMS) began requiring acute-care hospitals (ACHs) that participate in the CMS Inpatient Quality Reporting program to report MRSA LabID events to the NHSN and, in 2015, ACHs were required to report MRSA LabID events from emergency departments (EDs) and/or 24-hour observation locations. Prior national studies observed a decline in hospital-onset MRSA (HO-MRSA) rates over shorter periods or in other surveillance systems. In this analysis, we review the national reporting trend for HO-MRSA bacteremia LabID events during 2010–2018. Methods: This analysis was limited to MRSA bacteremia LabID event data reported by ACHs that follow NHSN surveillance protocols. The data were restricted to events reported for overall inpatient facility-wide surveillance and, if applicable, EDs and 24-hour observation locations. MRSA events were classified as hospital onset (HO; collected >3 days after admission) or inpatient or outpatient community onset (CO; collected ≤3 days after admission). An interrupted time-series random-effects generalized linear model was used to examine the relationship between HO-MRSA incidence rates (per 1,000 patient days) and time (year) while controlling for potential risk factors as fixed effects. The following potential risk factors were evaluated: the facility’s annual survey data (facility type, medical affiliation, length of facility stay, number of beds, and number of intensive care unit beds) and quarterly summary data (inpatient and outpatient CO prevalence rates). Results: The number of reporting ACHs increased during this period, from 473 in 2010 to 3,651 in 2018.
The crude HO-MRSA incidence rates (per 1,000 patient days) declined over time, from a high of 0.067 in 2011 to 0.052 in 2018 (Table 1). Compared to 2014, the adjusted annual incidence rate increased in 2015 by 16.38% (95% confidence interval [CI], 10.26%–22.84%; P < .0001). After controlling for all significant risk factors, the estimated annual HO-MRSA incidence rates declined by 5.98% (95% CI, 5.17%–6.78%; P < .0001) (Table 2). Conclusions: HO-MRSA bacteremia incidence rates have decreased over the past 9 years, despite an increase in 2015. This national trend analysis reviewed a longer period while analyzing potential risk factors. The decline in HO-MRSA incidence rates has been gradual; given the current trend, however, the Healthy People 2020 objectives are not likely to be met. This analysis suggests the need for hospitals to continue and/or enhance HO-MRSA infection prevention efforts to reduce rates further.
Background: To provide a standardized, risk-adjusted method for summarizing antimicrobial use (AU), the Centers for Disease Control and Prevention developed the standardized antimicrobial administration ratio, an observed-to-predicted use ratio in which predicted use is estimated from a statistical model accounting for patient locations and hospital characteristics. Infection burden, which could drive AU, was not available for assessment. To inform AU risk adjustment, we evaluated the relationship between the burden of drug-resistant gram-positive infections and the use of anti-MRSA agents. Methods: We analyzed data from acute-care hospitals that reported ≥10 months of hospital-wide AU and microbiologic data to the National Healthcare Safety Network (NHSN) from January 2018 through June 2019. Hospital infection burden was estimated using the prevalence of deduplicated positive cultures per 1,000 admissions. Eligible cultures included blood and lower respiratory specimens that yielded oxacillin/cefoxitin–resistant Staphylococcus aureus (SA) and ampicillin-nonsusceptible enterococci, and cerebrospinal fluid that yielded SA. The anti-MRSA use rate is the total antimicrobial days of ceftaroline, dalbavancin, daptomycin, linezolid, oritavancin, quinupristin/dalfopristin, tedizolid, telavancin, and intravenous vancomycin per 1,000 days patients were present. AU rates were modeled using negative binomial regression to assess their association with infection burden and hospital characteristics. Results: Among 182 hospitals, the median (interquartile range [IQR]) anti-MRSA use rate was 86.3 (59.9–105.0) antimicrobial days per 1,000 days present, and the median (IQR) prevalence of drug-resistant gram-positive infections was 3.4 (2.1–4.8) per 1,000 admissions. A higher prevalence of drug-resistant gram-positive infections was associated with higher use of anti-MRSA agents after adjusting for facility type and percentage of beds in intensive care units (Table 1).
Number of hospital beds, average length of stay, and medical school affiliation were not significant. Conclusions: The prevalence of drug-resistant gram-positive infections was independently associated with the use of anti-MRSA agents. Infection burden should be used for risk adjustment when predicting the use of anti-MRSA agents. To make this possible, we recommend that hospitals reporting to the NHSN AU Option also report microbiologic culture results.
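The anti-MRSA use rate defined in the Methods is a straightforward rate computation over the listed agents. A minimal sketch, using hypothetical monthly antimicrobial-days data for one hospital:

```python
# Agents counted toward anti-MRSA use, per the definition above;
# "vancomycin_iv" denotes intravenous vancomycin
ANTI_MRSA = {"ceftaroline", "dalbavancin", "daptomycin", "linezolid",
             "oritavancin", "quinupristin/dalfopristin", "tedizolid",
             "telavancin", "vancomycin_iv"}

def anti_mrsa_rate(antimicrobial_days, days_present):
    """Anti-MRSA use rate: total antimicrobial days for the listed
    agents per 1,000 days patients were present."""
    total = sum(d for drug, d in antimicrobial_days.items() if drug in ANTI_MRSA)
    return 1000 * total / days_present

# Hypothetical monthly AU data; cefazolin is excluded from the numerator
au = {"vancomycin_iv": 820, "linezolid": 140, "cefazolin": 600}
rate = anti_mrsa_rate(au, 11_000)  # per 1,000 days present
```

Only the named anti-MRSA agents contribute to the numerator; all other antimicrobial days are ignored, which is why restricting the agent set matters for comparability across hospitals.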
Background: An indwelling urinary catheter is used in ~12%–16% of adult hospital inpatients during their hospitalization and poses a risk of catheter-associated urinary tract infection (CAUTI). CAUTI data have been reported to the NHSN since 2005, and national benchmarks are reported annually in NHSN progress reports. Trend analyses of CAUTI incidence reported to the NHSN over a long period have not been previously performed. Objective: We investigated national trends in CAUTI incidence separately for intensive care units (ICUs) and wards in acute-care hospitals (ACHs) from 2009 through 2018. Methods: We analyzed CAUTI data from ACHs reported to the NHSN in 2009–2018. To evaluate trends in CAUTI incidence (per 1,000 catheter days), we conducted interrupted time-series analyses using negative-binomial mixed-effects modeling, separately for ICUs (nonneonatal ICUs) and wards. Because CMS reporting requirements took effect for adult and pediatric ICUs in 2012 and for medical, surgical, and medical-surgical wards in 2015, and because the NHSN CAUTI definitional changes were instituted in 2015, calendar years 2012 and 2015 were treated as interruptions to the outcome in the ICU models, and 2015 was treated as a single interruption in the ward models. Regression models were assessed and adjusted, as appropriate, for patient care location type and facility-level characteristics such as hospital type, teaching status, bed size, number (and percentage) of ICU beds, and average length of inpatient stay. Random intercept and slope models were evaluated with covariance tests and were included to account for differential baseline incidence and trends among reporting hospitals. Results: The volume of patient care locations and hospitals reporting to the NHSN increased over time. Among the ICUs, the CAUTI incidence rate did not change in 2009–2012 and increased at an average of 5.6% per year in 2012–2014 (Fig. 1).
The CAUTI incidence rate dropped nearly 40% in 2015; thereafter, it decreased at an average of 8.9% per year. Among the wards, the CAUTI incidence rate decreased at an average of 4.3% per year beginning in 2009 (Fig. 2). The CAUTI incidence rate dropped almost 28% in 2015 and then decreased at an average of 4.3% per year. Conclusions: CAUTI incidence decreased substantially in 2015 in both ICUs and wards, which was partially attributable to the CAUTI definitional change (see also Fig. 7 at https://www.cdc.gov/hai/data/archive/data-summary-assessing-progress.html). The significant decline in CAUTI incidence in both location types since 2015 is encouraging, and continued CAUTI prevention efforts are vital to sustaining this decline.
Background: Staphylococcus aureus has long been an important cause of healthcare-associated infections (HAIs) and remains the second most common HAI pathogen in the United States. Often resistant to several antibiotics, S. aureus infections are difficult to treat and can leave patients at risk for serious complications such as pneumonia and sepsis. HAI pathogens and their antimicrobial susceptibility testing (AST) results have been reported to the NHSN since its inception in 2005. Previous NHSN surveillance reports have presented national annual benchmarks for antimicrobial resistance phenotypes, such as methicillin-resistant S. aureus (MRSA). Whether there have been any significant changes over time in the prevalence of methicillin resistance among S. aureus infections reported to the NHSN has not been previously assessed. Methods: S. aureus AST data from central-line–associated bloodstream infections, catheter-associated urinary tract infections, and inpatient surgical site infections reported from acute-care hospitals between 2009 and 2018 were analyzed. S. aureus was defined as MRSA if it was reported as resistant to oxacillin, cefoxitin, or methicillin. A national percentage resistant (%R) was calculated for each year as the number of resistant pathogens divided by the number of pathogens tested for susceptibility, multiplied by 100. A generalized linear mixed model with a logistic function was created to evaluate annual changes in the percentage resistant. Several patient-level and hospital-level characteristics were assessed as potential covariates. To account for differential baseline %R values between individual hospitals, random intercept and slope specifications were used during model creation. Differences in the trend of %R between HAI types were assessed using interaction terms. Data were analyzed using SAS v 9.3 software, and P < .05 was considered significant. Results: Overall, 3,317 hospitals reported at least 1 S. aureus pathogen tested for susceptibility between 2009 and 2018. The national unadjusted %R decreased from 49.2% (2009) to 41.2% (2018), with similar decreases seen in each HAI type (Table 1). After adjusting for significant covariates, a statistically significant annual 3% decrease in the prevalence of resistance was observed (Fig. 1). No significant differences between HAI types were observed. Conclusions: The percentage of healthcare-associated S. aureus resistant to oxacillin, cefoxitin, or methicillin has declined consistently over the past 10 years. Continued efforts in infection prevention and antimicrobial stewardship are vital to sustaining this decline.
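The %R metric defined in the Methods reduces to resistant isolates over tested isolates, times 100. A short sketch, using hypothetical counts chosen to reproduce the reported unadjusted 2009 and 2018 values:

```python
def percent_resistant(n_resistant, n_tested):
    """National %R: resistant pathogens / pathogens tested
    for susceptibility x 100."""
    return 100 * n_resistant / n_tested

# Hypothetical counts consistent with the reported unadjusted %R
r2009 = percent_resistant(4920, 10_000)  # 49.2%
r2018 = percent_resistant(4120, 10_000)  # 41.2%
```

Note that the denominator is pathogens with AST results, not all reported pathogens, so facilities with incomplete susceptibility testing contribute less to the national estimate.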
Background: Hospitals have submitted surveillance data for surgical site infections (SSIs) following colon surgeries (COLO) to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) since 2005. COLO SSI data submissions to the NHSN have increased substantially beginning in 2012 as a result of a Centers for Medicare and Medicaid Services (CMS) mandatory reporting requirement that began that year. A trend analysis of COLO SSIs, using data submitted to the NHSN, has not been previously reported. To estimate the national trend of COLO SSI rates, we analyzed data reported from acute-care hospitals during 2009–2018. Methods: We analyzed inpatient adult COLO procedures with primary closure and the resulting deep incisional primary and organ-space SSIs detected during the same hospitalization or rehospitalization in the same hospital. SSIs reported as infection present at time of surgery (PATOS) were included in the analysis. A protocol change that reprioritized COLO above small-bowel surgery (SB) in the multiprocedural abdominal operations selection list for SSI attribution beginning in 2013 was a potential interruption to the COLO SSI outcome. An interrupted time series with mixed-effects logistic regression was used to estimate the annual change in the log odds of COLO SSI. The estimates were adjusted for the following variables: hospital bed size, gender, emergency status, trauma, general anesthesia, scope, ASA score, wound classification, medical school affiliation type, procedure duration, and age. We also assessed the slope and level change of the log odds before and after 2013. Results: The number of hospitals and procedures increased and then stabilized after 2012 (Table 1). The annual crude SSI rates ranged from 2.40% to 3.10%. There was no statistically significant slope change in 2013 or thereafter. Compared to 2009–2012, the log odds of COLO SSI increased in 2013–2018 (OR, 1.1975; P < .0001).
Based on this model, we estimate a 0.58% annual decrease in the odds of having a COLO SSI during 2009–2012 and 2013–2018 after controlling for the aforementioned variables (Table 2). Conclusions: We observed a substantial increase in the volume of hospitals and procedures reported to the NHSN since 2012 and an increase in the odds of having a COLO SSI in 2013–2018 associated with surveillance protocol changes. After adjusting for these changes, we found a slight annual decrease in the overall odds of COLO SSI. Greater prevention efforts are needed for COLO SSI.
Background: Central-line–associated bloodstream infections (CLABSIs) are a major source of healthcare-associated infections (HAIs) in neonatal intensive care unit (NICU) patients, and they are associated with increased morbidity, mortality, and costs. CLABSI surveillance has been a critical component for hospitals participating in the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN) for many years. CLABSI reporting grew substantially as a result of state reporting mandates first introduced in 2005 and federal reporting requirements for all intensive care units that began in 2011. However, no recent assessment of NHSN CLABSI incidence rate changes has been performed. The objective of this analysis was to estimate the overall trends in annual CLABSI incidence rates in NICUs from 2009 to 2018. Methods: We analyzed NHSN CLABSI data reported from NICUs during 2009–2018. CLABSIs further classified as mucosal barrier injury were included in this analysis. To evaluate trends in CLABSI incidence (per 1,000 central-line days), and to account for the potential impact of definition changes introduced in 2015, we conducted an interrupted time-series analysis using mixed-effects negative binomial regression modeling. Birth weight category, patient care location type, and hospital-level characteristics (hospital type, medical affiliation, teaching status, bed size, and average length of inpatient stay) were assessed as potential covariates in the regression analysis. Random intercept and slope models were evaluated with covariance tests and used to account for differential baseline incidence and trends among reporting NICUs. Results: The number of NICUs reporting to the NHSN increased significantly following the federal mandate and has remained at slightly >1,000 since 2013. The crude incidence of CLABSI dropped from 2.24 infections per 1,000 central-line days in 2009 to 0.98 in 2018, except for an increase in 2015 (Table 1).
The CLABSI incidence, adjusted for birth weight category, decreased by an average of 11.6% per year from 2009 to 2018, except for a 35.8% increase in 2015 (Table 2). Conclusion: These findings suggest that hospitals have made significant strides in reducing the occurrence of CLABSIs in NICUs over the past 10 years. The increase in 2015 could be explained in part by the implementation of new definitional changes. Continued practices and policies that target, assess, and prevent CLABSI in this setting may have been effective and remain vital to sustaining this decline nationally in subsequent years.
Background: Central-line–associated bloodstream infections (CLABSIs) are an important cause of healthcare-associated morbidity and mortality in the United States. CLABSI surveillance in the CDC NHSN began in 2005 and has been propelled by state CLABSI reporting requirements, first introduced in 2005, and subsequently by CMS requirements for intensive care units (ICUs) in 2011 and for select ward locations in 2015. Although trend analyses were previously reported, no recent assessment of NHSN CLABSI incidence rate changes has been performed. In this analysis, we evaluated trends in CLABSI rates in nonneonatal ICUs and all wards reported from acute-care hospitals. Methods: CLABSI rates, including bloodstream infections attributed to mucosal barrier injury reported to the NHSN from 2009 to 2018, were analyzed. To evaluate trends in CLABSI incidence, and to account for the potential impact of definitional changes in catheter-associated urinary tract infections (CAUTIs) that indirectly affected CLABSI rates, as well as the CMS mandate for select wards, we conducted an interrupted time-series analysis using negative binomial random-effects modeling with an interruption in 2015. ICUs and ward locations were analyzed separately. Models were adjusted for patient care location type and hospital-level characteristics: hospital type, medical affiliation, teaching status, bed size, number of ICU beds, and average length of inpatient stay. Random intercept and slope models were used to account for differential baseline incidence and trends among reporting hospitals. Results: The overall crude incidence of CLABSI per 1,000 central-line days decreased from 1.6 infections in 2009 to 0.9 infections in 2018, except for an increase in 2015. Similar trends were observed by location type. Among the ICUs, adjusted CLABSI incidence decreased by 10% annually in 2009–2014, increased nearly 29% in 2015, and thereafter decreased at an average of 6.8% per year.
Among the wards, adjusted CLABSI incidence decreased at an average of 7.9% annually, except for a 29.3% increase in 2015. Conclusions: Substantial progress has been made in reducing CLABSIs in both ICUs and wards over the last 10 years. Indirect effects of CAUTI definitional changes may explain the immediate increase in ICUs, whereas the CMS mandate may explain the similar increase in wards in 2015. Despite this increase, these findings suggest that policies and practices aimed at prevention of CLABSI have likely been effective on a national level.
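The average annual percentage changes quoted above come from the fitted models, but the crude endpoint rates imply a geometric average annual change that can be computed directly. A minimal sketch (a crude endpoint calculation, not the adjusted model estimate, and it ignores the 2015 interruption):

```python
def avg_annual_pct_change(rate_start, rate_end, years):
    """Geometric average annual percent change implied by two
    endpoint rates: 100 * ((end/start)**(1/years) - 1)."""
    return 100 * ((rate_end / rate_start) ** (1 / years) - 1)

# Crude overall CLABSI rates reported for 2009 (1.6) and 2018 (0.9),
# spanning 9 intervening years
change = avg_annual_pct_change(1.6, 0.9, 9)  # roughly -6% per year
```

That the crude figure (about −6% per year) is broadly in line with the adjusted ICU and ward estimates is a useful sanity check on the modeled trends.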
Background: Escherichia coli is the third most common pathogen responsible for healthcare-associated infections (HAIs), but it is increasingly resistant to multiple antibiotics. Antimicrobial susceptibility test (AST) results for fluoroquinolones (FQs) among E. coli implicated in select HAIs are reported to the NHSN surveillance system. Trends in the prevalence of FQ resistance among healthcare-associated E. coli infections reported to the NHSN have not been previously assessed. Objective: We investigated national trends in the prevalence of FQ resistance among E. coli HAIs in acute-care hospitals from 2009 through 2018. Methods: We analyzed E. coli AST data from central-line–associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), and surgical site infections (SSIs) reported to the NHSN between 2009 and 2018. Fluoroquinolone resistance was defined as the number of E. coli isolates that tested resistant or intermediate to at least 1 of 3 fluoroquinolones (ciprofloxacin, levofloxacin, and moxifloxacin), divided by the number of pathogens tested for susceptibility, multiplied by 100. To evaluate trends in fluoroquinolone resistance over time, we conducted an interrupted time-series analysis using a generalized linear mixed model with a logistic function. Substantial HAI definitional changes, most consequentially the CAUTI changes in 2015, and a directional change in incidence in 2018 were treated as interruptions to the outcome. Regression models were adjusted for patient-level characteristics (ie, age, gender, HAI type) and facility-level characteristics (ie, facility type, teaching status, number of intensive care unit beds, and average length of stay) obtained from the NHSN annual hospital surveys. Random-intercept and slope models were evaluated with covariance tests and were included to account for differential baseline fluoroquinolone resistance and trends among reporting facilities.
Data were analyzed using SAS, with statistical significance defined at α = 0.05. Results: During 2009–2018, the number of E. coli isolates with AST results for FQs reported to the NHSN increased (Fig. 1). After adjusting for covariates, fluoroquinolone resistance increased significantly from 2009 through 2015, at an average of 4.2% per year (Fig. 2, β1). There was no significant change in fluoroquinolone resistance from 2015 through 2017 (Fig. 2, β1 + β3). In 2018, there was a 6.4% decline in fluoroquinolone resistance compared to 2017 (Fig. 2, β4). Conclusions: Fluoroquinolone resistance increased from 2009 through 2015 and was stable during 2015–2017. Although fluoroquinolone resistance declined in 2018, the absolute change was small (~2%). Sustaining this decline warrants continued efforts in infection prevention and antimicrobial stewardship.
Background: Staphylococcus aureus is frequently implicated in healthcare-associated infections in the United States, and a substantial proportion of these infections are attributed to methicillin-resistant Staphylococcus aureus (MRSA). Although MRSA infections have decreased in healthcare settings, accurate estimates of the rate of decline call for risk-adjusted methods for calculating the resistant proportion (%R), that is, the proportion of S. aureus isolates resistant to cefoxitin or oxacillin. A risk-adjusted %R also enables more accurate interhospital comparisons and can serve as a quantitative guide and evaluation metric for prevention efforts. Methods: To develop a risk-adjusted %R for S. aureus, we analyzed antimicrobial susceptibility test (AST) results for S. aureus isolates reported to the CDC NHSN Antimicrobial Resistance (AR) Option during 2017–2018. Isolates were reported for cerebrospinal fluid (CSF), blood, lower respiratory tract (LRT), and urine specimens. Isolates without cefoxitin or oxacillin test results, or from facilities with >10% missing test results, were excluded. Test results were classified as community-onset or hospital-onset (HO); HO was defined as results for isolates obtained 3 days or more after hospital admission. Logistic regression was used to evaluate factors associated with oxacillin/cefoxitin resistance. Hospital-, patient-, and isolate-level variables from the NHSN annual survey and the AR Option were assessed as covariates. Variable entry into the models was based on a significance level of P < .05. Results: Among 9,992 hospital-onset S. aureus isolates from 9,019 patients in 315 facilities, 5,488 (54.9%) were MRSA.
Logistic regression showed that a higher proportion of HO-MRSA was significantly associated with older age, female sex, particular specimen sources (urine and LRT), and selected hospital characteristics: not serving as a major teaching hospital, a higher proportion of MRSA among community-onset S. aureus isolates, a lower percentage of beds in intensive care units, and outsourcing of AST services (Table 1). Conclusions: HO-MRSA is independently associated with the community burden of MRSA, older and female patient populations, and hospital teaching status and AST practices, which highlights the importance of public health engagement and regional collaborations to prevent MRSA. Taking some of these factors into account should be considered when standardizing the MRSA proportion for public health surveillance.
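One common way to operationalize a risk-adjusted resistant proportion (our illustration, not necessarily the authors' exact method) is indirect standardization: fit a pooled logistic model on isolate-level covariates, then compare a facility's observed MRSA count with the count the model predicts for its case mix. A minimal sketch with simulated data and invented effect sizes:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical isolate-level records (1 = MRSA, 0 = MSSA); the predictors
# echo the abstract (age, sex, specimen source) but all values are simulated.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.integers(18, 90, size=n),
    "female": rng.integers(0, 2, size=n),
    "source": rng.choice(["blood", "urine", "lrt", "csf"], size=n),
})
logit = (-0.6 + 0.01 * (df["age"] - 50) + 0.3 * df["female"]
         + np.where(df["source"].isin(["urine", "lrt"]), 0.4, 0.0))
df["mrsa"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Pooled logistic model across all facilities.
fit = smf.logit("mrsa ~ age + female + C(source)", data=df).fit(disp=0)

# Indirect standardization for one (simulated) facility: observed MRSA count
# over the count expected from the pooled model given that facility's case mix.
facility = df.sample(300, random_state=2)
oe_ratio = facility["mrsa"].sum() / fit.predict(facility).sum()
print(round(oe_ratio, 2))
```

An observed/expected ratio above 1 indicates more MRSA than the facility's case mix predicts, which supports the interhospital comparisons the abstract motivates.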
Background: Surveillance data for surgical site infections (SSIs) following abdominal hysterectomy (HYST) have been reported to the CDC NHSN since 2005. Beginning in 2012, HYST SSI surveillance coverage expanded substantially as a result of a CMS mandatory reporting requirement under the Hospital Inpatient Quality Reporting Program. A trend analysis of HYST SSIs using data submitted to the NHSN has not been previously reported. To estimate the overall trend in HYST SSI incidence rates, we analyzed data reported from acute-care hospitals for surgeries performed between January 1, 2009, and December 31, 2018. Methods: We analyzed inpatient adult HYST procedures with primary closure resulting in deep incisional primary or organ-space SSIs detected during the same hospitalization or upon rehospitalization to the same hospital. SSIs reported as infection present at time of surgery (PATOS) were included in the analysis. Because the surveillance definition of primary closure changed in 2013 and 2015, these changes were tested separately as interruptions to the HYST SSI outcome using an interrupted time-series model with mixed-effects logistic regression. Because these changes were not significantly associated with changes in HYST SSI risk, mixed-effects logistic regression was used to estimate the annual change in the log odds of HYST SSI. The estimates were adjusted for the following covariates: hospital bed size, general anesthesia, scope, ASA score, wound classification, medical school affiliation type, procedure duration, and age. Results: The number of hospitals and procedures reported to the NHSN for HYST increased and then stabilized after 2012 (Table 1). Unadjusted annual SSI incidence rates ranged from 0.60% to 0.81%. Based on the model, we estimated a 2.58% annual decrease in the odds of HYST SSI after controlling for the variables listed above (Table 2).
Conclusions: The number of hospitals and HYST procedures reported to the NHSN increased substantially because of the CMS reporting requirement implemented in 2012. The adjusted odds of HYST SSI decreased annually over 2009–2018, indicating progress in preventing HYST SSIs.
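To interpret the reported 2.58% annual decrease in odds: it corresponds to a yearly log-odds coefficient of ln(1 − 0.0258), and compounding it over the 9 year-to-year steps of 2009–2018 gives the cumulative change. The arithmetic below is our check on that interpretation, not a figure from the report:

```python
import math

# Annual percent change in odds implies a yearly log-odds coefficient:
beta = math.log(1 - 0.0258)
# Recover the per-year percent change (exactly -2.58%, by construction).
annual_pct = (math.exp(beta) - 1) * 100
# Compound over the 9 annual steps spanning 2009-2018.
cumulative_pct = (math.exp(9 * beta) - 1) * 100
print(round(annual_pct, 2), round(cumulative_pct, 1))
```

That is, a steady 2.58% annual decline in odds compounds to roughly a 21% lower odds of HYST SSI at the end of the study period than at baseline.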
Prevention of Clostridioides difficile infection (CDI) is a national priority and may be facilitated by deployment of the Targeted Assessment for Prevention (TAP) Strategy, a quality improvement framework providing a focused approach to infection prevention. This article describes the process and outcomes of TAP Strategy implementation for CDI prevention in a healthcare system.
Hospital A was identified based on CDI surveillance data indicating an excess burden of infections above the national goal; hospitals B and C participated as part of systemwide deployment. TAP facility assessments were administered to staff to identify infection control gaps and inform CDI prevention interventions. Retrospective analysis was performed using negative-binomial, interrupted time series (ITS) regression to assess overall effect of targeted CDI prevention efforts. Analysis included hospital-onset, laboratory-identified C. difficile event data for 18 months before and after implementation of the TAP facility assessments.
The systemwide monthly CDI rate decreased significantly at the intervention (β2, −44%; P = .017), and the postintervention CDI rate trend showed a sustained decrease (β1 + β3, −12% per month; P = .008). At the individual hospital level, the CDI rate trend decreased significantly in the postintervention period at hospital A only (β1 + β3, −26% per month; P = .003).
This project demonstrates TAP Strategy implementation in a healthcare system, yielding a significant decrease in the laboratory-identified C. difficile rate trend in the postintervention period at the system level and at hospital A. It highlights the potential benefit of directing prevention efforts to facilities with the highest burden of excess infections to reduce CDI rates more efficiently.
Describe common pathogens and antimicrobial resistance patterns for healthcare-associated infections (HAIs) that occurred during 2015–2017 and were reported to the Centers for Disease Control and Prevention’s (CDC’s) National Healthcare Safety Network (NHSN).
Data from central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated events (VAEs), and surgical site infections (SSIs) were reported from acute-care hospitals, long-term acute-care hospitals, and inpatient rehabilitation facilities. This analysis included device-associated HAIs reported from adult location types, and SSIs among patients ≥18 years old. Percentages of pathogens with nonsusceptibility (%NS) to selected antimicrobials were calculated for each HAI type, location type, surgical category, and surgical wound closure technique.
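The %NS measure reduces to a grouped proportion: pathogens testing resistant or intermediate divided by pathogens tested, within each stratum. A minimal sketch with invented AST records (the column names and R/I/S codes are illustrative, not the NHSN schema):

```python
import pandas as pd

# Toy AST records: one row per pathogen tested against a drug class.
ast = pd.DataFrame({
    "hai_type": ["CLABSI", "CLABSI", "CLABSI", "CAUTI", "CAUTI", "SSI", "SSI", "SSI"],
    "result":   ["R",      "I",      "S",      "R",     "S",     "S",   "S",   "R"],
})
# %NS = (resistant or intermediate) / tested, per HAI type, x 100.
ast["ns"] = ast["result"].isin(["R", "I"])
pct_ns = (ast.groupby("hai_type")["ns"].mean() * 100).round(1)
print(pct_ns.to_dict())
```

In practice the grouping keys would extend to location type, surgical category, and wound closure technique, as described above.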
Overall, 5,626 facilities performed adult HAI surveillance during this period, most of which were general acute-care hospitals with <200 beds. Escherichia coli (18%), Staphylococcus aureus (12%), and Klebsiella spp (9%) were the 3 most frequently reported pathogens. Pathogens varied by HAI and location type, with oncology units having a distinct pathogen distribution compared to other settings. The %NS for most pathogens was significantly higher among device-associated HAIs than SSIs. In addition, pathogens from long-term acute-care hospitals had a significantly higher %NS than those from general hospital wards.
This report provides an updated national summary of pathogen distributions and antimicrobial resistance among select HAIs and pathogens, stratified by several factors. These data underscore the importance of tracking antimicrobial resistance, particularly in vulnerable populations such as long-term acute-care hospitals and intensive care units.
To describe common pathogens and antimicrobial resistance patterns for healthcare-associated infections (HAIs) among pediatric patients that occurred in 2015–2017 and were reported to the Centers for Disease Control and Prevention’s National Healthcare Safety Network (NHSN).
Antimicrobial resistance data were analyzed for pathogens implicated in central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), ventilator-associated pneumonias (VAPs), and surgical site infections (SSIs). This analysis was restricted to device-associated HAIs reported from pediatric patient care locations and SSIs among patients <18 years old. Percentages of pathogens with nonsusceptibility (%NS) to selected antimicrobials were calculated by HAI type, location type, and surgical category.
Overall, 2,545 facilities performed surveillance of pediatric HAIs in the NHSN during this period. Staphylococcus aureus (15%), Escherichia coli (12%), and coagulase-negative staphylococci (12%) were the 3 most commonly reported pathogens associated with pediatric HAIs. Pathogens and the %NS varied by HAI type, location type, and/or surgical category. Among CLABSIs, the %NS was generally lowest in neonatal intensive care units and highest in pediatric oncology units. Staphylococcus spp were particularly common among orthopedic, neurosurgical, and cardiac SSIs; however, E. coli was more common in abdominal SSIs. Overall, antimicrobial nonsusceptibility was less prevalent in pediatric HAIs than in adult HAIs.
This report provides an updated national summary of pathogen distributions and antimicrobial resistance patterns among pediatric HAIs. These data highlight the need for continued antimicrobial resistance tracking among pediatric patients and should encourage the pediatric healthcare community to use such data when establishing policies for infection prevention and antimicrobial stewardship.