Background: Antiviral chemoprophylaxis for influenza is recommended in nursing homes to prevent transmission and severe disease among residents at higher risk of severe influenza complications. Interim CDC guidance recommends that long-term care facilities initiate antiviral chemoprophylaxis with oral oseltamivir for all non-ill residents living in the same unit following the start of an outbreak in a facility (ie, ≥2 residents ill within 72 hours, at least 1 of whom has laboratory-confirmed influenza). Prophylaxis continues for a minimum of 2 weeks and for at least 7 days after the last laboratory-confirmed case. However, facilities may not strictly adhere to this guidance; 1 study found that up to 68% of facilities were nonadherent to national guidance (Silva et al, 2020). Here, we model the potential impacts of different antiviral prophylaxis strategies. Methods: We developed a susceptible–exposed–asymptomatic–infected–recovered (SEAIR) compartmental model of an average-sized nursing home comprising short-stay residents, long-stay residents, and healthcare personnel (HCP). Persons treated with antiviral chemoprophylaxis were less susceptible to infection and, if infected, had a lower probability of symptoms, a reduced viral load, and a shortened duration of infectiousness. We included influenza vaccination for residents and HCP through a reduced probability of symptomatic infection. Coverage rates were estimated from CDC FluVaxView and CMS COVID-19 nursing home data. As a base case, we modeled a scenario with prophylaxis implemented according to guidance. We varied uptake by residents and HCP (from 10% to 90%), case thresholds for prophylaxis initiation (1–5 cases identified), and timing of prophylaxis cessation: either time-dependent (ie, 10–14 days of prophylaxis) or case-dependent (ie, continuing prophylaxis until no cases have been identified for 1–7 days).
Results: In the scenario based on current guidance, prophylaxis reduced resident cases by 16% and resident hospitalizations by 45%, compared to no prophylaxis (Fig. 1A). Scenarios that deviated from the guidance altered case burden and timing: time-dependent prophylaxis cessation increased resident cases and hospitalizations (Fig. 1A), whereas the timing of prophylaxis initiation had slight effects on the timing of the epidemic and minimal effects on resident cases and hospitalizations (Fig. 1B). High resident uptake was important for reducing resident cases and hospitalizations (Fig. 1C), but increasing HCP uptake had minimal effect (Fig. 1D). Conclusions: Our findings support the current prophylaxis guidance. Promptly implementing prophylaxis, and continuing it until cases are no longer identified, reduces resident cases and hospitalizations.
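The SEAIR structure described above can be sketched in a few lines. This is a minimal illustration with invented parameter values: it models only the reduced-susceptibility effect of prophylaxis (not the reduced symptom probability, viral load, or infectious period) and collapses residents and HCP into a single population.

```python
# Minimal SEAIR sketch; all parameter values are illustrative assumptions,
# not those used in the study.

def seair_step(state, params, dt=0.1):
    """Advance the SEAIR model one Euler step.

    state = (S, E, A, I, R); prophylaxis scales the force of infection."""
    S, E, A, I, R = state
    N = S + E + A + I + R
    beta, sigma, p_sym, gamma, prophylaxis_eff = params
    # Prophylaxis reduces susceptibility; asymptomatic persons are assumed
    # half as infectious as symptomatic persons.
    lam = (1 - prophylaxis_eff) * beta * (I + 0.5 * A) / N
    dS = -lam * S
    dE = lam * S - sigma * E
    dA = sigma * E * (1 - p_sym) - gamma * A
    dI = sigma * E * p_sym - gamma * I
    dR = gamma * (A + I)
    return tuple(x + dt * dx for x, dx in zip(state, (dS, dE, dA, dI, dR)))

def run(days, prophylaxis_eff, n=100, dt=0.1):
    state = (n - 1.0, 0.0, 0.0, 1.0, 0.0)       # one infectious introduction
    # beta, 1/latent period, P(symptomatic), 1/infectious period, efficacy
    params = (0.6, 1 / 2.0, 0.5, 1 / 4.0, prophylaxis_eff)
    for _ in range(round(days / dt)):
        state = seair_step(state, params, dt)
    return state

no_ppx = run(90, 0.0)
with_ppx = run(90, 0.6)
print(with_ppx[0] > no_ppx[0])  # more susceptibles remain with prophylaxis
```

Under these assumed values, fewer cumulative infections occur when prophylaxis is in place, qualitatively matching the reported reductions.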
Background: Clostridioides difficile and multidrug-resistant organisms (MDROs) pose challenges due to treatment complexities and substantial morbidity and mortality. Susceptibility to colonization with these organisms, and potential onward transmission if colonized (ie, infectivity), is influenced by the human microbiome and its dynamics. Disruptive effects of antibiotics on the microbiome imply potential indirect effects of antibiotics on C. difficile colonization. Mathematical models can help explore the relative impact of key pathways linking antibiotic use to C. difficile colonization, including the relationship between population-level antibiotic use and colonization prevalence. Methods: We built a compartmental model of long-term C. difficile colonization prevalence among nursing home residents (though adaptable to any MDRO), allowing interactions between the microbiome and the colonization process. Based on the proportional abundance of microbial taxa, we classified individuals into high and low α-diversity groups, each further stratified into uncolonized or colonized with C. difficile. The rate of transition from the high- to the low-diversity microbiome group was proportional to the population-level rate of antibiotic use. Transmission dynamics followed a susceptible–infectious–susceptible framework with the possibility of increased susceptibility and infectivity for the low-diversity microbiome group. First, as a comparator, we used a “null model” in which microbiome diversity did not influence host susceptibility or infectivity. Next, we sampled from realistic (literature-informed) parameter ranges to analyze how the microbiome mediates the effect of antibiotics on colonization in this population. Results: Our analysis suggests that antibiotic use can catalyze colonization with C. difficile through interactions with the host microbiome, resulting in a sharp increase in colonization with a modest increase in antibiotic use (Fig 1).
Increasing the population-level antibiotic use by 5% led to a median 24% increase in long-term colonization prevalence in the model (Fig 2). In contrast, increasing susceptibility or infectivity rates by 5% resulted in slightly higher increases in total colonization (27% and 29%, respectively). However, there was considerable uncertainty around these estimates, with interquartile ranges of up to 20% for some parameters (Fig 2). Conclusions: Higher population-level antibiotic use likely increases colonization by C. difficile through indirect effects mediated by the microbiome. The increased colonization burden attributable to increasing antibiotic use may be substantial. Given the high uncertainty around some estimates, conducting observational studies to better understand key colonization and microbiome parameters (eg, the relative increase in susceptibility or infectivity with lower microbiome diversity) is critical for future efforts to estimate the impact of antibiotic use on colonization with C. difficile and MDROs.
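A minimal sketch of the two-strata susceptible–infectious–susceptible structure described above, crossing high- and low-diversity microbiome groups with colonization status; all rates and multipliers are illustrative assumptions, not the literature-informed ranges sampled in the study.

```python
def cdiff_step(state, abx_rate, dt=0.05):
    """One Euler step of a two-strata SIS model: (uncolonized, colonized)
    crossed with (high, low) microbiome diversity.

    Antibiotic use moves people from high to low diversity; the low-diversity
    group has (assumed) doubled susceptibility and infectivity."""
    UH, CH, UL, CL = state
    N = UH + CH + UL + CL
    beta, clearance, recovery = 0.3, 0.1, 0.05    # illustrative rates
    sus_mult, inf_mult = 2.0, 2.0                 # low-diversity penalties
    foi = beta * (CH + inf_mult * CL) / N         # force of infection
    dUH = -foi * UH + clearance * CH - abx_rate * UH + recovery * UL
    dCH = foi * UH - clearance * CH - abx_rate * CH + recovery * CL
    dUL = -sus_mult * foi * UL + clearance * CL + abx_rate * UH - recovery * UL
    dCL = sus_mult * foi * UL - clearance * CL + abx_rate * CH - recovery * CL
    return tuple(x + dt * d for x, d in zip(state, (dUH, dCH, dUL, dCL)))

def prevalence(abx_rate, steps=20000):
    """Long-run colonization prevalence at a given antibiotic-use rate."""
    state = (95.0, 5.0, 0.0, 0.0)
    for _ in range(steps):
        state = cdiff_step(state, abx_rate)
    _, CH, _, CL = state
    return (CH + CL) / sum(state)

low_abx, high_abx = prevalence(0.02), prevalence(0.10)
print(high_abx > low_abx)  # more antibiotic use, higher colonization
```

In this toy version, raising the antibiotic-use rate shifts the population into the low-diversity stratum and raises equilibrium colonization, mirroring the indirect pathway the abstract describes.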
Background: Multimodal approaches are often used to prevent transmission of antimicrobial-resistant pathogens among patients in healthcare settings; understanding the effect of individual interventions is challenging. We designed a model to compare the effectiveness of hand hygiene (HH) with or without decolonization in reducing patient colonization with carbapenem-resistant Enterobacterales (CRE). Methods: We developed an agent-based model to represent transmission of CRE in an acute-care hospital comprising 3 general wards and 2 ICUs, each with 20 single-occupancy rooms, located in a community of 85,000 people. The model accounted for the movement of healthcare personnel (HCP), including their visits to patients. CRE dynamics were modeled using a susceptible–infectious–susceptible framework with transmission occurring via HCP–patient contacts. The mean time to clearance of CRE colonization without intervention was 387 days (Zimmerman et al, 2013). Our baseline included a facility-level HH compliance of 30%, with an assumed efficacy of 50%. Contact precautions were employed for patients with CRE-positive cultures, with assumed adherence and efficacy of 80% and 50%, respectively. Intervention scenarios included decolonization of culture-positive CRE patients, with a mean time to decolonization of 3 days. We considered 2 hypothetical intervention scenarios: (A) decolonization of patients with the baseline HH compliance and (B) decolonization with a slightly improved HH compliance of 35%. The hospital-level CRE incidence rate was used to compare the results from these intervention scenarios. Results: CRE incidence rates were lower in intervention scenarios than the baseline scenario (Fig. 1). The baseline mean incidence rate was 29.1 per 10,000 patient days. For decolonization with the baseline HH, the mean incidence rate decreased to 14.5 per 10,000 patient days, a 50.2% decrease relative to the baseline incidence (Table 1).
The decolonization scenario with a slightly improved HH compliance of 35% produced a 71.9% reduction relative to the baseline incidence. Conclusions: Our analysis shows that decolonization, combined with a modest improvement in HH compliance, could lead to large decreases in pathogen transmission. This suggests that efforts to identify and improve decolonization strategies for better patient safety in health care are worth exploring.
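A toy agent-based sketch of the HCP-mediated transmission loop described above: HCP hands carry CRE between patients, hand hygiene interrupts carriage, and decolonization shortens the mean clearance time from 387 to 3 days. The ward size, contact pattern, and hand-transfer probability are invented for illustration and are far simpler than the study's model.

```python
import random

def simulate(hh_compliance, decolonize, days=200, seed=0):
    """Toy agent-based sketch: one 20-bed ward, 5 HCP who each visit every
    patient daily; CRE is carried between patients on HCP hands.

    All probabilities are illustrative assumptions, not the study's values."""
    rng = random.Random(seed)
    colonized = [True] + [False] * 19          # one colonized index patient
    mean_clearance = 3 if decolonize else 387  # days to clearance (abstract)
    acquisitions = 0
    for _ in range(days):
        for _hcp in range(5):
            contaminated = False
            for i in range(len(colonized)):
                # Hand hygiene before contact: performed with probability
                # hh_compliance, 50% efficacy when performed.
                if contaminated and rng.random() < hh_compliance * 0.5:
                    contaminated = False
                if colonized[i]:
                    contaminated = True
                elif contaminated and rng.random() < 0.005:  # assumed transfer
                    colonized[i] = True
                    acquisitions += 1
        # Daily chance of clearing colonization.
        for i in range(len(colonized)):
            if colonized[i] and rng.random() < 1.0 / mean_clearance:
                colonized[i] = False
    return acquisitions

baseline = sum(simulate(0.30, False, seed=s) for s in range(20))
decol = sum(simulate(0.30, True, seed=s) for s in range(20))
print(decol < baseline)  # decolonization lowers cumulative acquisitions
```

Even in this stripped-down sketch, faster clearance sharply reduces cumulative acquisitions, echoing the direction of the reported incidence reductions.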
Background: Pathogen transmission among staff and residents in nursing homes can vary depending on their interactions and by the amount of time a resident receives care in the facility. Understanding the relative differences in transmission rates between and among staff and residents can identify the pathways that contributed most to the spread of SARS-CoV-2 in US nursing homes. Further exploring relative differences by categorizing facilities by residents’ lengths of stay can identify priority categories for intervention. Methods: Using US National Healthcare Safety Network (NHSN) surveillance data on resident and staff cases, vaccination, and resident deaths during June 2020–June 2022, we estimated SARS-CoV-2 transmission among and between residents and staff. We used a Bayesian inversion of a susceptible–exposed–infected–removed–virus–death (SEIRVD) compartmental model to produce the estimates. The facilities were divided into those with a median resident length of stay (LOS) of <6 weeks, 6–10 weeks, or >10 weeks. Additional inputs included the incidence and vaccination levels of the county where each facility was located. For the compartmental model, all data were averaged to form a representative facility for each category. Transmission was estimated separately for 3 periods: (1) June 2020–March 2021, before the SARS-CoV-2 delta variant; (2) April 2021–October 2021, during SARS-CoV-2 delta variant dominance; and (3) November 2021–June 2022, during the prevalence of the SARS-CoV-2 omicron variant. Results: Regardless of facility category, transmission was highest from staff to residents or resident to resident (Fig.). These estimates of transmission were highest during the pre–SARS-CoV-2 delta variant phase.
Transmission in that phase was highest in the facilities with LOS >10 weeks from staff to residents at 0.88 per week (95% credible interval [CrI], 0.06–1.85), in the facilities with LOS 6–10 weeks from staff to residents at 0.68 per week (95% CrI, 0.03–1.78), and in the facilities with LOS <6 weeks between residents at 0.47 per week (95% CrI, 0.02–0.95). Conclusions: Staff-to-resident and resident-to-resident transmission were the dominant pathways of SARS-CoV-2 spread across all periods and facility categories. Facilities with LOS of 6 weeks or longer had higher median transmission estimates across periods and transmission routes than facilities with LOS of less than 6 weeks, implying that when prioritization of intervention resources is needed, facilities caring for populations with longer stays could be prioritized.
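The group-structured transmission terms estimated above can be illustrated with a two-group SEIR sketch in which beta[i][j] is the transmission rate to group i from group j. The staff-to-resident (0.88 per week) and resident-to-resident (0.47 per week) entries echo point estimates reported above (taken from different facility categories, so this mixing is purely illustrative), and the rates to staff are assumed.

```python
def two_group_step(state, beta, dt=0.1):
    """One Euler step of a two-group SEIR sketch (residents, then staff).

    beta[i][j] is the transmission rate to group i from group j (per day).
    Latent and infectious periods are illustrative assumptions."""
    sigma, gamma = 1 / 3.0, 1 / 7.0
    (Sr, Er, Ir, Rr), (Ss, Es, Is, Rs) = state
    Nr, Ns = Sr + Er + Ir + Rr, Ss + Es + Is + Rs
    prev = (Ir / Nr, Is / Ns)                 # infectious prevalence per group
    new = []
    for (S, E, I, R), row in zip(state, beta):
        lam = row[0] * prev[0] + row[1] * prev[1]   # force of infection
        dS, dE = -lam * S, lam * S - sigma * E
        dI, dR = sigma * E - gamma * I, gamma * I
        new.append(tuple(x + dt * d
                         for x, d in zip((S, E, I, R), (dS, dE, dI, dR))))
    return tuple(new)

# Weekly rates converted to daily; the staff row is assumed.
beta = [[0.47 / 7, 0.88 / 7],    # to residents: from residents, from staff
        [0.10 / 7, 0.20 / 7]]    # to staff: assumed values
state = ((100.0, 0.0, 0.0, 0.0), (29.0, 0.0, 1.0, 0.0))  # 1 infectious staff
for _ in range(700):             # 70 days at dt = 0.1
    state = two_group_step(state, beta)
print(state[0][3] > 0)  # staff-to-resident transmission seeds resident cases
```

A Bayesian inversion, as in the study, would treat the entries of beta as unknowns and fit them to observed case, vaccination, and death time series.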
Background: The CDC’s new Public Health Strategies to Prevent the Spread of Novel and Targeted Multidrug-Resistant Organisms (MDROs) were informed by mathematical models that assessed the impact of implementing preventive strategies directed at a subset of healthcare facilities characterized as influential or highly connected based on their predicted role in the regional spread of MDROs. We developed an interactive tool to communicate mathematical modeling results and visualize the regional patient transfer network for public health departments and healthcare facilities to assist in planning and implementing prevention strategies. Methods: An interactive RShiny application is currently hosted in the CDC network and is accessible to external partners through the Secure Access Management Services (SAMS). Patient transfer volumes (direct and indirect, that is, with up to 30 days in the community between admissions) were estimated from CMS fee-for-service claims data from 2019. The spread of a carbapenem-resistant Enterobacterales (CRE)–like MDRO within a US state was simulated using a deterministic model with susceptible and infectious compartments in the community and healthcare facilities, interconnected through patient transfers. Individuals determined to be infectious through admission screening, point-prevalence surveys (PPSs), or interfacility communication notifications were assigned lower transmissibility if enhanced infection prevention and control practices were in place at a facility. Results: The application consists of 4 interactive tabs. In the first tab, users can visualize the statewide patient-sharing network for any US state and selected territories (Fig. 1). A feature allows users to highlight a facility of interest and display downstream or upstream facilities that received or sent transfers from the facility of interest, respectively. A second tab lists influential facilities to aid in prioritizing screening and prevention activities.
A third tab lists all facilities in the state in descending order of their dispersal rate (ie, the rate at which patients are shared downstream to other facilities), which can help identify highly connected facilities. In the fourth tab, an interactive graph displays the predicted reduction of MDRO prevalence given a range of intervention scenarios (Fig. 2). Conclusions: Our RShiny application, which can be accessed by public health partners, can assist healthcare facilities and public health departments in planning and tailoring MDRO prevention activity bundles.
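The dispersal-rate ranking in the third tab can be illustrated on a toy transfer network; all facility names and transfer volumes below are invented.

```python
# transfers[src][dst] = annual patient transfers (toy numbers for illustration).
transfers = {
    "LTACH-A": {"SNF-1": 120, "Hosp-2": 80},
    "Hosp-1": {"SNF-1": 300, "SNF-2": 150, "LTACH-A": 60},
    "SNF-1": {"Hosp-1": 40},
    "SNF-2": {},
    "Hosp-2": {"SNF-2": 25},
}

# Dispersal rate: total volume a facility sends downstream (third tab).
dispersal = {src: sum(dsts.values()) for src, dsts in transfers.items()}
ranked = sorted(dispersal, key=dispersal.get, reverse=True)

# Downstream facilities of a highlighted facility (first-tab feature).
downstream_of_hosp1 = sorted(transfers["Hosp-1"])
print(ranked[0], downstream_of_hosp1)
```

Ranking facilities this way surfaces the highly connected senders that the second and third tabs are designed to prioritize for screening and prevention activities.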
Background: Multidrug-resistant organisms (MDROs), such as carbapenem-resistant Enterobacterales (CRE), can spread rapidly in a region. Facilities that care for high-acuity patients with long average lengths of stay (eg, long-term acute-care hospitals or LTACHs and ventilator-capable skilled nursing facilities or vSNFs) may amplify this spread. We assessed the impact of interventions on CRE spread within a region individually, bundled, and implemented at different facility types. Methods: We developed a deterministic compartmental model, parametrized using CRE data reported to the NHSN and patient transfer data from the CMS specific to a US state. The model includes the community and the healthcare facilities within the state. Individuals may be either susceptible or infected and infectious. Infected patients determined to have CRE through admission screening or point-prevalence surveys at a facility are placed in a state of lower transmissibility if enhanced infection prevention and control (IPC) practices are in place. Results: Intervention bundles that included periodic point-prevalence surveys and enhanced IPC at high-acuity postacute-care facilities had the greatest impact on regional prevalence 10 years into an outbreak; the benefits of including admission screening and improved interfacility communication were more modest (Fig. 1A). Delaying interventions by 3 years is predicted to result in smaller reductions in prevalence (Fig. 1B). Increasing the frequency of point-prevalence surveys from biannually to quarterly resulted in a substantial relative reduction in prevalence (from 25% to 44%) if conducted from the start of an outbreak. IPC improvements in vSNFs resulted in greater relative reductions than in LTACHs. 
Admission screening at LTACHs and vSNFs was predicted to have a greater impact on prevalence if in place prior to CRE introduction (~20% reduction), and the impact decreased by approximately half if implementation was delayed until 3 years after CRE introduction. In contrast, the effect of admission screening in acute-care hospitals (ACHs) was smaller (~10% reduction in prevalence) and did not change with implementation delays. Conclusions: Our model predicts that interventions that limit unrecognized MDRO introduction to, or dispersal from, LTACHs and vSNFs through screening will slow regional spread. Interventions to detect colonization and improve IPC practices within LTACHs and vSNFs may substantially reduce the regional burden. Prevention strategies are predicted to have the greatest impact when interventions are bundled and implemented before an MDRO is identified in a region, but reduction in overall prevalence is still possible if implemented after initial MDRO spread.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities, that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and by periodic point-prevalence surveys reported to public health, (2) cohorting or private rooms with contact precautions for CRE patients, (3) hand hygiene adherence monitoring combined with general infection control education and guidance by project coordinators and public health, and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs within a 13-mile radius of the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE cultures reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. In addition, 75 Illinois hospitals adopted automated alerts (56 during the intervention period).
Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during the intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive public health regional CRE intervention was implemented that included enhanced interfacility communication and targeted infection prevention. There was a significant decline in incident CRE clinical cases in Cook County, despite persistently high CRE colonization prevalence in intervention facilities. vSNFs, where understaffing and underresourcing were common and lengths of stay ranged from months to years, faced a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
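The interrupted time-series logic behind the segmented regression can be sketched on synthetic monthly counts (not the study data); a full analysis would jointly fit level and slope changes and report inference, but the core comparison is between the baseline and intervention segments.

```python
# Segmented (interrupted time-series) sketch on synthetic monthly counts.
def fit_line(ts, ys):
    """Ordinary least-squares slope and intercept for one segment."""
    n = len(ts)
    tbar, ybar = sum(ts) / n, sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    return slope, ybar - slope * tbar

months = list(range(48))                       # 24 baseline + 24 intervention
baseline_counts = [59.0 - 0.1 * t for t in months[:24]]          # near-flat
intervention_counts = [52.0 - 0.5 * (t - 24) for t in months[24:]]  # steeper

b_slope, _ = fit_line(months[:24], baseline_counts)
i_slope, _ = fit_line(months[24:], intervention_counts)
print(i_slope < b_slope)  # intervention slope is more negative
```

A change in segment slope (here, from about -0.1 to -0.5 cases per month, on invented numbers) is the kind of effect the study's segmented regression quantifies with P values.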
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
Background: Successful containment of regional outbreaks of emerging multidrug-resistant organisms (MDROs) relies on early outbreak detection. However, deploying regional containment is resource intensive; understanding the distribution of different types of outbreaks might aid in further classifying types of responses. Objective: We used a stochastic model of disease transmission in a region where healthcare facilities are linked by patient sharing to explore optimal strategies for early outbreak detection. Methods: We simulated the introduction and spread of Candida auris in a region using a lumped-parameter stochastic adaptation of a previously described deterministic model (Clin Infect Dis 2019 Mar 28. doi:10.1093/cid/ciz248). Stochasticity was incorporated to capture early-stage behavior of outbreaks with greater accuracy than was possible with a deterministic model. The model includes the real patient sharing network among healthcare facilities in an exemplary US state, using hospital claims data and the minimum data set from the CMS for 2015. Disease progression rates for C. auris were estimated from surveillance data and the literature. Each simulated outbreak was initiated with an importation to a Dartmouth Atlas of Health Care hospital referral region. To estimate the potential burden, we quantified the “facility-time” period during which infectious patients presented a risk of subsequent transmission within each healthcare facility. Results: Of the 28,000 simulated outbreaks initiated with an importation to the community, 2,534 resulted in patients entering the healthcare facility network. Among those, 2,480 (98%) initiated a short outbreak that died out or quickly attenuated within 2 years without additional intervention. In the simulations, if containment responses were initiated for each of those short outbreaks, facility time at risk decreased by only 3%. 
If containment responses were initiated for the 54 (2%) outbreaks lasting 2 years or longer, facility time at risk decreased by 79%. Sentinel surveillance through point-prevalence surveys (PPSs) at the 23 skilled-nursing facilities caring for ventilated patients (vSNFs) in the network detected 50 (93%) of the 54 longer outbreaks (median, 235 days to detection). Quarterly PPSs at the 23 largest acute-care hospitals (ie, most discharges) detected 48 longer outbreaks (89%), but the time to detection was longer (median, 716 days to detection). Quarterly PPSs also identified 76 short-term outbreaks (compared to only 14 via vSNF PPSs) that self-terminated without intervention. Conclusions: A vSNF-based sentinel surveillance system likely provides better information for guiding regional intervention for the containment of emerging MDROs than a similarly sized acute-care hospital–based system.
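The early die-out behavior that motivates stratifying responses by outbreak length can be illustrated with a simple branching-process sketch; the reproduction number and cutoff below are illustrative assumptions, not the model's fitted values.

```python
import math
import random

def outbreak_generations(r0, rng, max_gens=100):
    """Galton-Watson sketch: generations until an introduced outbreak dies
    out, with Poisson offspring. r0 < 1 makes extinction certain."""
    def poisson(lam):                      # Knuth's algorithm, stdlib-only
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1
    cases, gen = 1, 0
    while cases and gen < max_gens:
        cases = sum(poisson(r0) for _ in range(cases))
        gen += 1
    return gen

rng = random.Random(7)
durations = [outbreak_generations(0.9, rng) for _ in range(2000)]
short_frac = sum(d < 8 for d in durations) / len(durations)
print(short_frac > 0.8)  # most introductions die out within a few generations
```

Even near-critical introductions mostly self-terminate quickly, which is why targeting containment at the rare long-lived outbreaks recovers most of the avertable facility time at risk.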
Background: Shared Healthcare Intervention to Eliminate Life-threatening Dissemination of MDROs in Orange County, California (SHIELD OC) was a CDC-funded regional decolonization intervention from April 2017 through July 2019 involving 38 hospitals, nursing homes (NHs), and long-term acute-care hospitals (LTACHs) to reduce MDROs. Decolonization in NHs and LTACHs consisted of universal antiseptic bathing with chlorhexidine (CHG) for routine bathing and showering plus nasal iodophor decolonization (Monday through Friday, twice daily every other week). Hospitals used universal CHG in ICUs and provided daily CHG and nasal iodophor to patients in contact precautions. We sought to evaluate whether decolonization reduced hospitalization and associated healthcare costs due to infections among residents of NHs participating in SHIELD compared to nonparticipating NHs. Methods: Medicaid insurer data covering NH residents in Orange County were used to calculate hospitalization rates due to a primary diagnosis of infection (counts per member-quarter), hospital bed days per member-quarter, and expenditures per member-quarter from the fourth quarter of 2015 to the second quarter of 2019. We used a time-series design and a segmented regression analysis to evaluate changes attributable to the SHIELD OC intervention among participating and nonparticipating NHs. Results: Across the SHIELD OC intervention period, intervention NHs experienced a 44% decrease in hospitalization rates, a 43% decrease in hospital bed days, and a 53% decrease in Medicaid expenditures when comparing the last quarter of the intervention to the baseline period (Fig. 1). These data translated to a significant downward slope, with a reduction of 4% per quarter in hospital admissions due to infection (P < .001), a reduction of 7% per quarter in hospitalization days due to infection (P < .001), and a reduction of 9% per quarter in Medicaid expenditures (P = .019) per NH resident.
Conclusions: The universal CHG bathing and nasal decolonization intervention adopted by NHs in the SHIELD OC collaborative resulted in large, meaningful reductions in hospitalization events, hospitalization days, and healthcare expenditures among Medicaid-insured NH residents. The findings led CalOptima, the Medicaid provider in Orange County, California, to launch an NH incentive program that provides dedicated training and covers the cost of CHG and nasal iodophor for OC NHs that enroll.
Disclosures: Gabrielle M. Gussin, University of California, Irvine, Stryker (Sage Products): Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Clorox: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Medline: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes. Xttrium: Conducting studies in which contributed antiseptic product is provided to participating hospitals and nursing homes.
Background: Epidemiological studies have utilized administrative discharge diagnosis codes to identify methicillin-resistant and methicillin-sensitive Staphylococcus aureus (MRSA and MSSA) infections and trends, despite debate regarding the accuracy of utilizing codes for this purpose. We assessed the sensitivity and positive predictive value (PPV) of MRSA- and MSSA-specific diagnosis codes, and the trends, characteristics, and outcomes of S. aureus hospitalizations by method of identification. Methods: Clinical microbiology results and discharge data from geographically diverse US hospitals participating in the Premier Healthcare Database during 2012–2017 were used to identify monthly rates of MRSA and MSSA. Adult inpatient (aged ≥18 years) hospitalizations with positive MRSA or MSSA clinical cultures and/or an MRSA- or MSSA-specific International Classification of Diseases, Ninth/Tenth Revision, Clinical Modification (ICD-9/10-CM) diagnosis code were included as S. aureus hospitalizations. Septicemia was defined as a positive blood culture or an MRSA or MSSA septicemia code. Sensitivity and PPV for codes were calculated for hospitalizations where admission status was not listed as transfer; a positive clinical culture was considered true infection. Negative binomial regression models measured trends in rates of MRSA and MSSA per 1,000 hospital discharges. Results: We identified 168,634 MRSA and 148,776 MSSA hospitalizations in 256 hospitals; 17% of MRSA and 21% of MSSA hospitalizations involved septicemia. Less than half of all S. aureus hospitalizations (49% MRSA, 46% MSSA) and S. aureus septicemia hospitalizations (37% MRSA, 38% MSSA) had both a positive culture and a diagnosis code (Fig. 1). Sensitivity of MRSA codes in identifying positive cultures was 61% overall and 56% for septicemia; PPV was 62% overall and 53% for septicemia. MSSA codes had a sensitivity of 49% in identifying MSSA cultures and 52% for MSSA septicemia; PPV was 69% overall and 62% for septicemia.
Despite low sensitivity, MRSA trends were similar for cultures and codes, whereas MSSA trends diverged (Fig. 2). For hospitalizations with septicemia, mortality was highest among those with a blood culture only (31.3%), compared to hospitalizations with both a septicemia code and a blood culture (16.6%) or a septicemia code only (14.7%). Conclusions: ICD diagnosis code sensitivity and PPV for identifying S. aureus infections were consistently poor in recent years. Less than half of hospitalizations had concordant microbiology laboratory results and diagnosis codes. Rates and trend estimates for MSSA differed by method of identification. Using diagnosis codes to identify S. aureus infections may not be appropriate for descriptive epidemiology or assessing trends due to significant misclassification.
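Sensitivity and PPV here are straightforward functions of the concordance counts; the sketch below uses hypothetical counts chosen to mirror the reported overall MRSA values (61% sensitivity, 62% PPV), with culture positivity as the reference standard.

```python
def sensitivity_ppv(code_and_culture, culture_only, code_only):
    """Sensitivity and PPV of diagnosis codes, with culture-positive
    hospitalizations as the reference standard for true infection."""
    sensitivity = code_and_culture / (code_and_culture + culture_only)
    ppv = code_and_culture / (code_and_culture + code_only)
    return sensitivity, ppv

# Hypothetical counts per 10,000 culture-positive MRSA hospitalizations.
sens, ppv = sensitivity_ppv(6100, 3900, 3700)
print(round(sens, 2), round(ppv, 2))  # 0.61 0.62
```

Low sensitivity means codes miss culture-confirmed cases; low PPV means many coded hospitalizations lack culture confirmation, and both drive the misclassification described above.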
Disclosures: Scott Fridkin reports that his spouse receives consulting fees from the vaccine industry.
Objective: To compare risk of surgical site infection (SSI) following cesarean delivery between women covered by Medicaid and private health insurance.
Study population: Cesarean deliveries covered by Medicaid or private insurance and reported to the National Healthcare Safety Network (NHSN) and state inpatient discharge databases by hospitals in California (2011–2013).
Methods: Deliveries reported to NHSN and state inpatient discharge databases were linked to identify SSIs in the 30 days following cesarean delivery, primary payer, and patient and procedure characteristics. Additional hospital-level characteristics were obtained from public databases. Relative risk of SSI by primary payer was assessed using multivariable logistic regression adjusting for patient, procedure, and hospital characteristics, accounting for facility-level clustering.
Results: Of 291,757 cesarean deliveries included, 48% were covered by Medicaid. SSIs were detected following 1,055 deliveries covered by Medicaid (0.75%) and 955 deliveries covered by private insurance (0.63%) (unadjusted odds ratio, 1.2; 95% confidence interval [CI], 1.1–1.3; P < .0001). The adjusted odds of SSI following cesarean deliveries covered by Medicaid was 1.4 (95% CI, 1.2–1.6; P < .0001) times the odds of those covered by private insurance.
Conclusions: In this, the largest and only multicenter study to investigate SSI risk following cesarean delivery by primary payer, Medicaid-insured women had a higher risk of infection than privately insured women. These findings suggest the need to evaluate and better characterize the quality of maternal healthcare for, and the needs of, women covered by Medicaid to inform targeted infection prevention and policy.
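The unadjusted odds ratio can be reproduced from the counts reported above, assuming the Medicaid denominator is 48% of the 291,757 deliveries.

```python
def odds_ratio(cases_a, total_a, cases_b, total_b):
    """Unadjusted odds ratio for group a versus group b."""
    odds_a = cases_a / (total_a - cases_a)
    odds_b = cases_b / (total_b - cases_b)
    return odds_a / odds_b

total = 291_757
medicaid_total = round(total * 0.48)      # 48% covered by Medicaid
private_total = total - medicaid_total
or_medicaid = odds_ratio(1055, medicaid_total, 955, private_total)
print(round(or_medicaid, 1))  # 1.2, matching the reported unadjusted estimate
```

The adjusted estimate of 1.4 additionally controls for patient, procedure, and hospital characteristics via the study's multivariable regression, which this arithmetic does not reproduce.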
Background: As the US population ages, the number of hip and knee arthroplasties is expected to increase. Because surgical site infections (SSIs) following these procedures contribute substantial morbidity, mortality, and costs, we projected SSIs expected to occur from 2020 through 2030.
Methods: We used a stochastic Poisson process to project the number of primary and revision arthroplasties and SSIs. Primary arthroplasty rates were calculated using annual estimates of hip and knee arthroplasty stratified by age and gender from the 2012–2014 Nationwide Inpatient Sample and standardized by census population data. Revision rates, dependent on time from primary procedure, were obtained from published literature and were uniformly applied for all ages and genders. Stratified complex SSI rates for arthroplasties were obtained from 2012–2015 National Healthcare Safety Network data. To evaluate the possible impact of prevention measures, we recalculated the projections with an SSI rate reduced by 30%, the national target established by the US Department of Health and Human Services (HHS).
Results: Without a reduction in SSI rates, we projected a 14% increase in complex SSIs following hip and knee arthroplasty between 2020 and 2030. We projected a total burden of 77,653 SSIs; however, meeting the 30% rate reduction could prevent 23,297 of these SSIs.
Given current SSI rates, we project that the complex SSI burden for primary and revision arthroplasty may increase due to an aging population. Reducing the SSI rate to the national HHS target could prevent approximately 23,000 SSIs and reduce subsequent morbidity, mortality, and Medicare costs.
The purpose of this study was to quantify the effect of multidrug-resistant (MDR) gram-negative bacteria and methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) on mortality following infection, regardless of patient location.
We conducted a retrospective cohort study of patients with an inpatient admission in the US Department of Veterans Affairs (VA) system between October 1, 2007, and November 30, 2010. We constructed multivariate log-binomial regressions to assess the impact of a positive culture on mortality in the 30- and 90-day periods following the first positive culture, using a propensity-score–matched subsample.
Patients identified with positive cultures due to MDR Acinetobacter (n=218), MDR Pseudomonas aeruginosa (n=1,026), and MDR Enterobacteriaceae (n=3,498) were propensity-score matched to 14,591 patients without positive cultures due to these organisms. In addition, 3,471 patients with positive cultures due to MRSA were propensity-score matched to 12,499 patients without positive MRSA cultures. Multidrug-resistant gram-negative bacteria were associated with a significantly elevated risk of mortality both for invasive (RR, 2.32; 95% CI, 1.85–2.92) and noninvasive cultures (RR, 1.33; 95% CI, 1.22–1.44) during the 30-day period. Similarly, patients with MRSA HAIs (RR, 2.77; 95% CI, 2.39–3.21) and colonizations (RR, 1.32; 95% CI, 1.22–1.50) had an increased risk of death at 30 days.
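The risk ratios above can be illustrated with an unadjusted 2×2 computation. This is a sketch only: the study's estimates come from multivariate log-binomial regressions fit to propensity-score-matched samples, and the counts below are invented.

```python
import math

def risk_ratio(events_exp, n_exp, events_unexp, n_unexp):
    """Unadjusted risk ratio with a 95% Wald CI on the log-risk scale."""
    rr = (events_exp / n_exp) / (events_unexp / n_unexp)
    se = math.sqrt(1 / events_exp - 1 / n_exp
                   + 1 / events_unexp - 1 / n_unexp)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Invented counts: 30-day deaths among culture-positive patients vs
# propensity-matched controls (illustrative, not study data).
rr, lo, hi = risk_ratio(200, 1_000, 100, 1_000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these invented counts, 20% versus 10% mortality yields an RR of 2.0, in the same neighborhood as the invasive-culture estimates the abstract reports.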
We found that HAIs due to MDR gram-negative bacteria and MRSA conferred significantly elevated 30- and 90-day risks of mortality. This finding held true both for invasive cultures, which likely represent true infections, and for noninvasive cultures, which may represent colonization.
To determine the potential epidemiologic and economic value of implementing a multifaceted Clostridium difficile infection (CDI) control program at US acute care hospitals.
Markov model with a 5-year time horizon.
Patients whose data were used in our simulations were limited to hospitalized Medicare beneficiaries ≥65 years old.
CDI is an important public health problem with substantial associated morbidity, mortality, and cost. Multifaceted national prevention efforts in the United Kingdom, including antimicrobial stewardship, patient isolation, hand hygiene, environmental cleaning and disinfection, and audit, resulted in a 59% reduction in CDI cases reported from 2008 to 2012.
Our analysis was conducted from the federal perspective. The intervention we modeled included the following components: antimicrobial stewardship utilizing the Antimicrobial Use and Resistance module of the National Healthcare Safety Network (NHSN), use of contact precautions, and enhanced environmental cleaning. We parameterized our model using data from CDC surveillance systems, the AHRQ Healthcare Cost and Utilization Project, and literature reviews. To address uncertainty in our parameter estimates, we conducted sensitivity analyses for intervention effectiveness and cost, expenditures by other federal partners, and discount rate. Each simulation represented a cohort of 1,000 hospitalized patients over 1,000 trials.
In our base case scenario with 50% intervention effectiveness, we estimated that 509,000 CDI cases and 82,000 CDI-attributable deaths would be prevented over a 5-year time horizon. Nationally, the cost savings across all hospitalizations would be $2.5 billion (95% credible interval: $1.2 billion to $4.0 billion).
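The cohort simulation can be sketched as a simple Monte Carlo run over hospitalized patients. The transition probabilities below are invented stand-ins for the model's CDC- and HCUP-derived parameters, and this one-step sketch compresses the study's multi-state Markov structure and 5-year horizon into a single per-hospitalization risk; only the cohort size (1,000 patients), trial count (1,000 trials), and base-case effectiveness (50%) come from the abstract.

```python
import random

# Illustrative per-hospitalization probabilities (assumptions, not the
# study's actual parameter estimates).
P_CDI = 0.010              # baseline probability of hospital-onset CDI
P_DEATH_GIVEN_CDI = 0.09   # CDI-attributable mortality
EFFECTIVENESS = 0.50       # base-case intervention effectiveness

def simulate(n_patients=1_000, n_trials=1_000, effectiveness=0.0, seed=0):
    """Average CDI cases and attributable deaths per cohort of n_patients."""
    rng = random.Random(seed)
    p_cdi = P_CDI * (1 - effectiveness)
    cases = deaths = 0
    for _ in range(n_trials):
        for _ in range(n_patients):
            if rng.random() < p_cdi:
                cases += 1
                if rng.random() < P_DEATH_GIVEN_CDI:
                    deaths += 1
    return cases / n_trials, deaths / n_trials

base_cases, _ = simulate()
int_cases, _ = simulate(effectiveness=EFFECTIVENESS)
print(f"cases prevented per 1,000 patients: {base_cases - int_cases:.1f}")
```

Scaling such per-cohort differences up to national hospitalization volumes, and attaching per-case costs, is what yields aggregate estimates of the kind reported above.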
The potential benefits of a multifaceted national CDI prevention program are sizeable from the federal perspective.
Infect Control Hosp Epidemiol 2015;00(0): 1–7