Novel ST398 methicillin-susceptible Staphylococcus aureus (MSSA) was first observed in the United States in New York City (2004–2007); its diffusion across the country changed treatment options. Using outpatient antimicrobial susceptibility data from the Veterans Health Administration from 2010 to 2019, we document the spatiotemporal prevalence of potential ST398 MSSA.
The impact of hurricane-related flooding on infectious diseases in the US is not well understood. Using geocoded electronic health records for 62,762 veterans living in North Carolina counties impacted by Hurricane Matthew, coupled with flood maps, we explore the impact of hurricane and flood exposure on infectious outcomes in outpatient settings and emergency departments, as well as on antimicrobial prescribing. Declines in outpatient visits and antimicrobial prescribing are observed in weeks 0–2 following the hurricane compared with the baseline period and the year prior, while increases in antimicrobial prescribing are observed 3+ weeks following the hurricane. Taken together, hurricane and flood exposure appear to have had minor impacts on infectious outcomes in North Carolina veterans, not resulting in large increases in infections or antimicrobial prescribing.
Background: Antibiotic use during end-of-life (EOL) care is an increasingly important target for antimicrobial stewardship given the high prevalence of antibiotic use in this setting with limited evidence on safety and effectiveness to guide antibiotic decision making. We estimated antibiotic use during the last 6 months of life for patients under hospice or palliative care, and we identified potential targets (i.e., time points) during the EOL period when antimicrobial stewardship interventions could be targeted for maximal benefit. Methods: We conducted a retrospective cohort study of nationwide Veterans’ Affairs (VA) patients, 18 years and older, who died between January 1, 2014, and December 31, 2019, and who had been hospitalized within 6 months prior to death. Data from the VA’s integrated electronic medical record (EMR) were collected, including demographics, comorbid conditions, and duration of inpatient antibiotics administered, along with outpatient antibiotics dispensed. A propensity-score matched-cohort analysis was conducted to compare antibiotic use between patients placed into palliative care or hospice and matched patients not receiving palliative or hospice care. Repeated measures ANOVA and repeated measures linear regression methods were used to analyze the difference-in-difference (D-I-D) in days of therapy (DOT) between the 2 cohorts. Results: There were 251,822 patients in the cohort, including 23,746 in hospice care, 89,768 in palliative care, and 138,308 without palliative or hospice care. The median time from last discharge to death was 9 days. The most common comorbidities were chronic obstructive pulmonary disease (50%), malignancy (46%), and diabetes mellitus (43%). Overall, 18,296 (77%) of 23,746 hospice patients and 71,812 (80%) of 89,768 palliative care patients received at least 1 antibiotic, whereas 95,167 (69%) of 138,308 patients who were not placed in hospice and did not receive palliative care received antibiotics.
In the primary matched cohort analysis that compared patients placed into hospice or palliative care to propensity-score matched controls, entry into palliative care was associated with an 11% absolute increase in antibiotic prescribing, and entry into hospice was associated with a 4% absolute increase, during the 7–14 days after entry versus the 7–14 days before entry (Fig. 1). The stratified cohorts had covariate balance similar to the overall cohort. Conclusions: In our large cohort study, we observed that patients receiving EOL care had high levels of antibiotic exposure across the VA population, particularly on entry to hospice or during admissions when they received a palliative care consultation. Future studies are needed to identify the optimal EOL strategies for collaboration between antimicrobial stewardship and palliative care.
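The DOT metric and the D-I-D contrast described above can be sketched in a few lines. This is an illustrative outline under stated assumptions, not the study's analysis code, and all counts below are hypothetical.

```python
def days_of_therapy(admin_days):
    """Days of therapy (DOT): each antibiotic contributes one DOT for
    every calendar day on which it is administered."""
    return sum(len(days) for days in admin_days.values())

def difference_in_difference(exposed_pre, exposed_post, control_pre, control_post):
    """D-I-D: change in the exposed (hospice/palliative) group minus the
    change in the matched control group over the same time windows."""
    return (exposed_post - exposed_pre) - (control_post - control_pre)

# Vancomycin on days 1-3 plus cefepime on days 2-3 -> 5 DOT.
print(days_of_therapy({"vancomycin": {1, 2, 3}, "cefepime": {2, 3}}))  # 5

# Hypothetical mean DOT in the 7-14 day windows before/after entry.
print(difference_in_difference(3.0, 4.1, 3.2, 3.3))
```

The actual study estimated the D-I-D within repeated-measures regression models; the raw contrast here only illustrates the quantity being estimated.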
Temporal overlap of the Atlantic hurricane season and seasonal influenza vaccine rollout has the potential to result in delays or disruptions of vaccination campaigns. We documented seasonal influenza vaccination behavior over a 5-year period and explored associations between flooding following Hurricane Harvey and timing and uptake of vaccines, as well as how the impacts of Hurricane Harvey on vaccination vary by race, wealth, and rurality.
Retrospective cohort analysis.
Texas counties affected by Hurricane Harvey.
Active users of the Veterans’ Health Administration in 2017.
We used geocoded residential address data to assess flood exposure status following Hurricane Harvey. Days to receipt of seasonal influenza vaccines were calculated for each year from 2014 to 2019. Proportional hazards models were used to determine how the likelihood of vaccination varied according to flood status as well as the race, wealth, and rural–urban residence of patients.
The year of Hurricane Harvey was associated with a median delay of 2 weeks to vaccination and lower overall vaccination than in prior years. Residential status in flooded areas was associated with lower hazards of influenza vaccination in all years. White patients had higher proportional hazards of influenza vaccination than non-White patients, though this attenuated to 6.39% (hazard ratio [HR], 1.0639; 95% confidence interval [CI], 1.034–1.095) in the hurricane year.
Receipt of seasonal influenza vaccination following regional exposure to the effects of Hurricane Harvey was delayed among US veterans. White, non–low-income, and rural patients had higher likelihood of vaccination in all years of the study, but these gaps narrowed during the hurricane year.
We evaluated barriers and facilitators to patient adherence with a bundled intervention including chlorhexidine gluconate (CHG) bathing and decolonizing Staphylococcus aureus nasal carriers in a real-world setting. Survey data identified 85.5% adherence with home use of CHG as directed and 52.9% adherence with home use of mupirocin as directed.
To assess the effectiveness and acceptability of antimicrobial stewardship-focused implementation strategies on inpatient fluoroquinolones.
Stewardship champions at 15 hospitals were surveyed regarding the use and acceptability of strategies to improve fluoroquinolone prescribing. Antibiotic days of therapy (DOT) per 1,000 days present (DP) for sites with and without prospective audit and feedback (PAF) and/or prior approval were compared.
Among all of the sites, 60% had PAF or prior approval implemented for fluoroquinolones. Compared to sites using neither strategy (64.2 ± 34.4 DOT/DP), fluoroquinolone prescribing rates were lower for sites that employed PAF and/or prior approval (35.5 ± 9.8; P = .03) and decreased from 2017 to 2018 (P < .001). This decrease occurred without an increase in advanced-generation cephalosporins. Total antibiotic rates were 13% lower for sites with PAF and/or prior approval, but this difference did not reach statistical significance (P = .20). Sites reporting that PAF and/or prior approval were “completely” accepted had lower fluoroquinolone rates than sites where it was “moderately” accepted (34.2 ± 5.7 vs 48.7 ± 4.5; P < .01). Sites reported that clinical pathways and/or local guidelines (93%), prior approval (93%), and order forms (80%) “would” or “may” be effective in improving fluoroquinolone use. Although most sites (73%) indicated that requiring infectious disease consults would or may be effective in improving fluoroquinolones, 87% perceived implementation to be difficult.
PAF and prior approval implementation strategies focused on fluoroquinolones were associated with significantly lower fluoroquinolone prescribing rates and nonsignificant decreases in total antibiotic use, suggesting limited evidence for class substitution. The association of acceptability of strategies with lower rates highlights the importance of culture. These results may indicate increased acceptability of implementation strategies and/or sensitivity to FDA warnings.
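The DOT per 1,000 days present rates compared above come from a simple normalization; a minimal sketch follows, with counts that are hypothetical and chosen only to reproduce a plausible rate.

```python
def dot_per_1000_days_present(dot, days_present):
    """Antibiotic days of therapy (DOT) per 1,000 days present (DP),
    the standard inpatient antibiotic-use rate."""
    return 1000.0 * dot / days_present

# e.g., 321 fluoroquinolone DOT over 5,000 days present -> 64.2 DOT/1,000 DP
print(dot_per_1000_days_present(321, 5000))  # 64.2
```

Normalizing by days present lets sites of different sizes and census levels be compared on the same scale.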
To develop a fully automated algorithm using data from the Veterans’ Affairs (VA) electronic medical record (EMR) to identify deep-incisional surgical site infections (SSIs) after cardiac surgeries and total joint arthroplasties (TJAs) for use in research studies.
Retrospective cohort study.
This study was conducted in 11 VA hospitals.
Patients who underwent coronary artery bypass grafting or valve replacement between January 1, 2010, and March 31, 2018 (cardiac cohort) and patients who underwent total hip arthroplasty or total knee arthroplasty between January 1, 2007, and March 31, 2018 (TJA cohort).
Relevant clinical information and administrative code data were extracted from the EMR. The outcomes of interest were mediastinitis, endocarditis, or deep-incisional or organ-space SSI within 30 days after surgery. Multiple logistic regression analysis with a repeated regular bootstrap procedure was used to select variables and to assign points in the models. Sensitivities, specificities, positive predictive values (PPVs) and negative predictive values were calculated with comparison to outcomes collected by the Veterans’ Affairs Surgical Quality Improvement Program (VASQIP).
Overall, 49 (0.5%) of the 13,341 cardiac surgeries were classified as mediastinitis or endocarditis, and 83 (0.6%) of the 12,992 TJAs were classified as deep-incisional or organ-space SSIs. With at least 60% sensitivity, the PPVs of the SSI detection algorithms after cardiac surgeries and TJAs were 52.5% and 62.0%, respectively.
Considering the low prevalence rate of SSIs, our algorithms were successful in identifying a majority of patients with a true SSI while simultaneously reducing false-positive cases. As a next step, validation of these algorithms in different hospital systems with EMR will be needed.
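The validation metrics reported above follow directly from a 2×2 comparison of the algorithm's calls against the VASQIP reference standard. The sketch below is illustrative; the cell counts are hypothetical, chosen only to mirror the reported cardiac operating point (60% sensitivity, 52.5% PPV).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion
    matrix (algorithm calls vs. the chart-review reference standard)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts mirroring the cardiac operating point; the true
# negatives (tn) are arbitrary, reflecting the low SSI prevalence.
m = diagnostic_metrics(tp=21, fp=19, fn=14, tn=12000)
print(m["sensitivity"], m["ppv"])  # 0.6 0.525
```

Because SSI prevalence is well under 1%, even small false-positive rates pull the PPV down sharply, which is why the abstract emphasizes PPV at a fixed sensitivity.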
There are currently no guidelines for central-line insertion site evaluation. Our study revealed an association between insertion site inflammation (ISI) and the development of central-line–associated bloodstream infections (CLABSIs). Automated surveillance for ISI is feasible and could help prevent CLABSI.
Background: Studies of interventions to decrease rates of surgical site infections (SSIs) must include thousands of patients to be statistically powered to demonstrate a significant reduction. Therefore, it is important to develop methodology to extract data available in the electronic medical record (EMR) to accurately measure SSI rates. Prior studies have created tools that optimize sensitivity to prioritize chart review for infection control purposes. However, for research studies, positive predictive value (PPV) with reasonable sensitivity is preferred to limit the impact of false-positive results on the assessment of intervention effectiveness. Using information from the prior tools, we aimed to determine whether an algorithm using data available in the Veterans Affairs (VA) EMR could accurately and efficiently identify deep incisional or organ-space SSIs found in the VA Surgical Quality Improvement Program (VASQIP) data set for cardiac and orthopedic surgery patients. Methods: We conducted a retrospective cohort study of patients who underwent cardiac surgery or total joint arthroplasty (TJA) at 11 VA hospitals between January 1, 2007, and April 30, 2017. We used EMR data recorded in the 30 days after surgery on inflammatory markers; microbiology; antibiotics prescribed after surgery; International Classification of Diseases (ICD) and Current Procedural Terminology (CPT) codes for reoperation for an infection-related purpose; and ICD codes for mediastinitis, prosthetic joint infection, and other SSIs. These metrics were used in an algorithm to determine whether a patient had a deep or organ-space SSI. Sensitivity, specificity, PPV, and negative predictive values (NPV) were calculated for accuracy of the algorithm through comparison with 30-day SSI outcomes collected by nurse chart review in the VASQIP data set. Results: Among the 11 VA hospitals, there were 18,224 cardiac surgeries and 16,592 TJAs during the study period.
Of these, 20,043 were evaluated by VASQIP nurses and were included in our final cohort. Of the 8,803 cardiac surgeries included, manual review identified 44 (0.50%) mediastinitis cases. Of the 11,240 TJAs, manual review identified 71 (0.63%) deep or organ-space SSIs. Our algorithm identified 32 of the mediastinitis cases (73%) and 58 of the deep or organ-space SSI cases (82%). Sensitivity, specificity, PPV, and NPV are shown in Table 1. Of the patients that our algorithm identified as having a deep or organ-space SSI, only 21% (PPV) actually had an SSI after cardiac surgery or TJA. Conclusions: Use of the algorithm can identify most complex SSIs (73%–82%), but other data are necessary to separate false-positive from true-positive cases and to improve the efficiency of case detection to support research questions.
Background: Enhanced terminal room cleaning with ultraviolet C (UVC) disinfection has become more commonly used as a strategy to reduce the transmission of important nosocomial pathogens, including Clostridioides difficile, but its real-world effectiveness remains unclear. Objectives: We aimed to assess the association of UVC disinfection during terminal cleaning with the incidence of healthcare-associated C. difficile infection and positive test results for C. difficile within the nationwide Veterans Health Administration (VHA) system. Methods: Using a nationwide survey of VHA acute-care hospitals, information on UVC system utilization and date of implementation was obtained. Hospital-level incidence rates of clinically confirmed hospital-onset C. difficile infection (HO-CDI) and positive test results with recent healthcare exposure (both hospital-onset [HO-LabID] and community-onset healthcare-associated [CO-HA-LabID]) at acute-care units between January 2010 and December 2018 were obtained through routine surveillance, with bed days of care (BDOC) as the denominator. We analyzed the association of UVC disinfection with incidence rates of HO-CDI, HO-LabID, and CO-HA-LabID in a nonrandomized, stepped-wedge design, using a negative binomial regression model with a hospital-specific random intercept and with the presence or absence of UVC disinfection in each month, baseline trend, and seasonality as explanatory variables. Results: Among 143 VHA acute-care hospitals, 129 hospitals (90.2%) responded to the survey and were included in the analysis. UVC use was reported from 42 hospitals with various implementation start dates (range, June 2010 through June 2017). We identified 23,021 positive C. difficile test results (HO-LabID: 5,014) with 16,213 HO-CDI and 24,083,252 BDOC from the 129 hospitals during the study period. There were declining baseline trends nationwide (mean, −0.6% per month) for HO-CDI.
The use of UVC had no statistically significant association with incidence rates of HO-CDI (incidence rate ratio [IRR], 1.032; 95% CI, 0.963–1.106; P = .65) or with incidence rates of healthcare-associated positive C. difficile test results (HO-LabID). Conclusions: In this large quasi-experimental analysis within the VHA system, enhanced terminal room cleaning with UVC disinfection was not associated with a change in incidence rates of clinically confirmed hospital-onset CDI or of positive test results with recent healthcare exposure. Further research is needed to understand reasons for the lack of effectiveness, including barriers to utilization.
Background: Central lines (CL) are widely used in the inpatient setting and central-line–associated bloodstream infection (CLABSI) is a serious complication of CL use. Because CL insertion site inflammation (ISI) may precede the onset of CLABSI, we aimed to define ISI, to determine whether ISI was associated with CLABSI, and to develop an automated surveillance system for ISI. Methods: We extracted electronic medical records (EMRs) of adult patients hospitalized at the University of Iowa Hospitals & Clinics during January 2015–December 2018. Nurses routinely document CL insertion-site characteristics in specifically designed flow sheets in the EMR. An ISI was counted every time ≥1 of the following signs were documented during CL assessments: edema, erythema, induration, tenderness, or drainage. A 1:2 case-control investigation was performed by matching nonmucosal barrier injury (non-MBI) CLABSI patients (cases) to patients without a CLABSI diagnosis (controls). We matched for age (±10 years), sex, date (±30 days), inpatient unit, central-line days, and central-line type (temporary vs permanent). The main exposure of interest was having an ISI on or before CLABSI onset. CLABSIs were determined using CDC NHSN definitions. We then created a metric: ISI days (defined as the number of days with ≥1 ISI documented) and plotted ISI incidence (ISI days per central-line days) to quantify the burden of ISIs and to determine whether ISI and non-MBI CLABSI incidences were collinear. An automated surveillance system for ISI was created using structured query language queries to the EMR data repository and Tableau software. Results: During 2015–2018, we detected 194 CLABSI cases that were matched to 338 controls. CLABSI patients had greater odds of having an ISI (OR, 2.3; 95% CI, 1.3–4.0). Over the study period, ISI incidence decreased from ~80 to ~50 ISI days per 1,000 CL days. Non-MBI CLABSI rates also decreased from ~1.5 to ~1.0 CLABSIs per 1,000 CL days. 
Conclusions: ISI incidence is associated with non-MBI CLABSI incidence. Because ISI incidence is higher than CLABSI incidence, surveillance for ISI could be a more sensitive indicator for monitoring the impact of CLABSI prevention practices. Automated surveillance for novel process metrics is a promising infection prevention tool.
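The ISI-day metric defined above lends itself to a small sketch. The flowsheet rows below are hypothetical; the list of inflammation signs comes from the definition in the abstract, and this is an outline of the counting logic, not the study's SQL-based surveillance system.

```python
# Signs that qualify an assessment day as an ISI day, per the definition.
ISI_SIGNS = {"edema", "erythema", "induration", "tenderness", "drainage"}

def isi_days(assessments):
    """Count ISI days: calendar days with >=1 documented inflammation
    sign across all central-line assessments. `assessments` is an
    iterable of (day, findings) pairs from the nursing flowsheet."""
    return len({day for day, findings in assessments
                if ISI_SIGNS & set(findings)})

def isi_incidence(isi_day_count, central_line_days):
    """ISI days per 1,000 central-line days."""
    return 1000.0 * isi_day_count / central_line_days

# Day 2 has two inflammatory findings but counts once; day 1 is clean.
rows = [(1, ["dry", "intact"]), (2, ["erythema"]),
        (2, ["tenderness"]), (5, ["drainage"])]
print(isi_days(rows))                      # 2
print(isi_incidence(isi_days(rows), 40))   # 50.0
```

Counting days rather than individual findings keeps repeated documentation of the same inflamed site from inflating the rate.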
In the absence of pyuria, positive urine cultures are unlikely to represent infection. Conditional urine reflex culture policies have the potential to limit unnecessary urine culturing. We evaluated the impact of this diagnostic stewardship intervention.
We conducted a retrospective, quasi-experimental (nonrandomized) study, with interrupted time series, from August 2013 to January 2018 to examine rates of urine cultures before versus after the policy intervention. We compared 3 intervention sites to 3 control sites in an aggregated series using segmented negative binomial regression.
The study included 6 acute-care hospitals within the Veterans’ Health Administration across the United States.
Adult patients with at least 1 urinalysis ordered during acute-care admission, excluding pregnant patients or those undergoing urological procedures, were included.
At the intervention sites, urine cultures were performed if a preceding urinalysis met prespecified criteria. No such restrictions occurred at the control sites. The primary outcome was the rate of urine cultures performed per 1,000 patient days. The safety outcome was the rate of gram-negative bloodstream infection per 1,000 patient days.
The study included 224,573 urine cultures from 50,901 admissions in 24,759 unique patients. Among the intervention sites, the overall average number of urine cultures performed did not significantly decrease relative to the preintervention period (5.9% decrease; P = .80) but did decrease by 21% relative to control sites (P < .01). We detected no significant difference in the rates of gram-negative bloodstream infection among intervention or control sites (P = .49).
Conditional urine reflex culture policies were associated with a decrease in urine culturing without a change in the incidence of gram-negative bloodstream infection.
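The conditional reflex policy amounts to a single gate between urinalysis and culture. A minimal sketch follows, assuming a pyuria threshold expressed in white blood cells per high-power field; the threshold value is hypothetical, as the study's criteria were prespecified per site and are not reproduced here.

```python
def reflex_to_culture(wbc_per_hpf, threshold=10):
    """Return True when the urinalysis meets the pyuria criterion and
    the urine culture should be performed; otherwise the culture is
    cancelled. The threshold of 10 WBC/hpf is an illustrative value."""
    return wbc_per_hpf >= threshold

print(reflex_to_culture(25))  # True  -> proceed to culture
print(reflex_to_culture(2))   # False -> culture cancelled
```

In practice this gate runs in the laboratory information system, so clinicians order a single "urinalysis with reflex culture" test and the culture is performed only when the criterion is met.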
Although most hospitals report very high levels of hand hygiene compliance (HHC), the accuracy of these overtly observed rates is questionable due to the Hawthorne effect and other sources of bias. In this study, we aimed (1) to compare HHC rates estimated using the standard audit method of overt observation by a known observer and a new audit method that employed a rapid (<15 minutes) “secret shopper” method and (2) to pilot test a novel feedback tool.
Quality improvement project using a quasi-experimental stepped-wedge design.
This study was conducted in 5 acute-care hospitals (17 wards, 5 intensive care units) in the Midwestern United States.
Sites recruited a hand hygiene observer from outside the acute-care units to rapidly and covertly observe entry and exit HHC during the study period, October 2016–September 2017. After 3 months of observations, sites received a monthly feedback tool that communicated HHC information from the new audit method.
The absolute difference in HHC estimates between the standard and new audit methods was ~30%. No significant differences in HHC were detected between the baseline and feedback phases (OR, 0.92; 95% CI, 0.84–1.01), but the standard audit method had significantly higher estimates than the new audit method (OR, 9.83; 95% CI, 8.82–10.95).
HHC estimates obtained using the new audit method were substantially lower than estimates obtained using the standard audit method, suggesting that the rapid, secret-shopper method is less subject to bias. Providing feedback using HHC from the new audit method did not seem to impact HHC behaviors.
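The audit-method comparison reduces to an odds ratio on a 2×2 table of compliant versus missed hand hygiene opportunities. The sketch below uses hypothetical counts; the study's OR of 9.83 came from a regression model accounting for the stepped-wedge design, not from this raw calculation.

```python
def odds_ratio(events_a, nonevents_a, events_b, nonevents_b):
    """Unadjusted odds ratio comparing the odds of compliance between
    two observation methods (group A vs. group B)."""
    return (events_a * nonevents_b) / (nonevents_a * events_b)

# Hypothetical: overt audit records 95/100 opportunities as compliant,
# covert "secret shopper" audit records 65/100 as compliant.
print(round(odds_ratio(95, 5, 65, 35), 2))  # 10.23
```

An OR this far above 1 quantifies how strongly the Hawthorne effect can inflate overtly observed compliance relative to covert observation.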