The gold standard for hand hygiene (HH) while wearing gloves requires removing gloves, performing HH, and donning new gloves between WHO HH moments. The novel strategy of applying alcohol-based hand rub (ABHR) directly to gloved hands might be effective and efficient.
A mixed-method, multicenter, 3-arm, randomized trial.
Adult and pediatric medical-surgical, intermediate, and intensive care units at 4 hospitals.
Healthcare personnel (HCP).
HCP were randomized to 3 groups: ABHR applied directly to gloved hands, the gold standard, or usual care.
Gloved hands were sampled via direct imprint. Gold-standard and usual-care arms were compared with the ABHR intervention.
Bacteria were identified on gloved hands after 432 (67.4%) of 641 observations in the gold-standard arm versus 548 (82.8%) of 662 observations in the intervention arm (P < .01). HH required a mean of 14 seconds in the intervention arm and a mean of 28.7 seconds in the gold-standard arm (P < .01). Bacteria were identified on gloved hands after 133 (98.5%) of 135 observations in the usual-care arm versus 173 (76.6%) of 226 observations in the intervention arm (P < .01). Of 331 gloves tested, 6 (1.8%) were found to have microperforations; all were identified in the intervention arm [6 (2.9%) of 205].
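The contamination results above are comparisons of two proportions. As an illustration, here is a normal-approximation two-proportion z-test on the reported gold-standard-versus-intervention counts; this is a sketch and not necessarily the exact test the authors used:

```python
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled normal approximation."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Gold-standard arm (432/641 contaminated) vs intervention arm (548/662)
z, p = two_proportion_z(432, 641, 548, 662)
print(f"z = {z:.2f}, two-sided p = {p:.2g}")  # p is well below .01
```

The z-statistic is strongly negative (the gold-standard arm had the lower contamination proportion), consistent with the reported P < .01.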
Compared with usual care, contamination of gloved hands was significantly reduced by applying ABHR directly to gloved hands, although contamination remained statistically higher than with the gold standard. Given the time savings and microbiological benefit over usual care, and the lack of feasibility of adhering to the gold standard, the Centers for Disease Control and Prevention and the World Health Organization should consider advising HCP to decontaminate gloved hands with ABHR when HH moments arise during single-patient encounters.
Environmental cleaning is important in the interruption of pathogen transmission. Although prevention initiatives have targeted environmental cleaning, practice variations exist and compliance is low. Evaluation of human factors influencing variations in cleaning practices can be valuable in developing interventions to standardize practices. We conducted a work-system analysis using a human-factors engineering (HFE) framework to identify barriers and facilitators to environmental cleaning practices in acute and long-term care settings within the Veterans’ Affairs health system.
We conducted a qualitative study with key stakeholders at 3 VA facilities. We analyzed transcripts for thematic content and mapped themes to the HFE framework.
Staffing consistency was felt to improve cleaning practices and teamwork. We found that many environmental management service (EMS) staff were veterans who were motivated to serve fellow veterans, especially to prevent infections. However, hiring veterans comes with regulatory hurdles that affect staffing. Sites reported some form of monitoring their cleaning process, but there was variation in method and frequency. The EMS workload was affected by whether rooms were occupied by patients or were semiprivate rooms; both were reportedly more difficult to clean. Room design and surface finishes were identified as important to cleaning efficiency.
HFE work analysis identified barriers and facilitators to environmental cleaning. These findings highlight intervention entry points that may facilitate standardized work practices. There is a need to develop task-specific procedures such as cleaning occupied beds and semiprivate rooms. Future research should evaluate interventions that address these determinants of environmental cleaning.
Novel ST398 methicillin-susceptible Staphylococcus aureus (MSSA) in the United States was first observed in New York City (2004–2007); its diffusion across the country resulted in changing treatment options. Utilizing outpatient antimicrobial susceptibility data from the Veterans Health Administration from 2010 to 2019, we document the spatiotemporal prevalence of potential ST398 MSSA.
We performed a systematic literature review and meta-analysis on the effectiveness of coronavirus disease 2019 (COVID-19) vaccination against post-COVID conditions (long COVID) among fully vaccinated individuals.
Systematic literature review/meta-analysis.
We searched PubMed, Cumulative Index to Nursing and Allied Health, EMBASE, Cochrane Central Register of Controlled Trials, Scopus, and Web of Science from December 1, 2019, to June 2, 2023, for studies evaluating COVID-19 vaccine effectiveness (VE) against post-COVID conditions among fully vaccinated individuals who received two doses of COVID-19 vaccine. A post-COVID condition was defined as any symptom present four or more weeks after COVID-19 infection. We calculated the pooled diagnostic odds ratio (DOR) with 95% confidence interval (CI) for post-COVID conditions between fully vaccinated and unvaccinated individuals. Vaccine effectiveness was estimated as 100% × (1 − DOR).
Thirty-two studies with 775,931 individuals evaluated the effect of vaccination on post-COVID conditions, of which twenty-four studies were included in the meta-analysis. The pooled DOR for post-COVID conditions among fully vaccinated individuals was 0.680 (95% CI, 0.523–0.885) with an estimated VE of 32.0% (11.5%–47.7%). Vaccine effectiveness was 36.9% (23.1%–48.2%) among those who received two doses of COVID-19 vaccine before COVID-19 infection and 68.7% (64.7%–72.2%) among those who received three doses before COVID-19 infection. The stratified analysis demonstrated no protection against post-COVID conditions among those who received COVID-19 vaccination after COVID-19 infection.
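The VE figures above follow mechanically from the formula VE = 100% × (1 − DOR) given in the methods. A small sketch of the conversion, using the pooled estimate reported above:

```python
def vaccine_effectiveness(dor, ci_low, ci_high):
    """VE = 100% x (1 - DOR). The DOR confidence bounds swap under the
    transformation: the upper DOR bound yields the lower VE bound."""
    return 100 * (1 - dor), 100 * (1 - ci_high), 100 * (1 - ci_low)

# Pooled estimate reported above: DOR 0.680 (95% CI, 0.523-0.885)
ve, ve_lo, ve_hi = vaccine_effectiveness(0.680, 0.523, 0.885)
print(f"VE = {ve:.1f}% (95% CI, {ve_lo:.1f}%-{ve_hi:.1f}%)")
# -> VE = 32.0% (95% CI, 11.5%-47.7%)
```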
Receiving a complete COVID-19 vaccination prior to contracting the virus was associated with a significant reduction in post-COVID conditions throughout the study period, including during the Omicron era. Vaccine effectiveness increased when additional doses were administered.
Background: Many clinical guidelines recommend that clinicians use antibiograms to decide on empiric antimicrobial therapy. However, antibiograms aggregate epidemiologic data without consideration of any other factors that may affect the risk of antimicrobial resistance (AMR), and little is known about an antibiogram’s reliability in predicting antimicrobial susceptibility. We assessed the diagnostic accuracy of antibiograms in predicting the risk of AMR for individual patients’ E. coli clinical isolates. Methods: We extracted microbiologic and patient-level data from the nationwide clinical data warehouse of the Veterans Health Administration (VHA). We assessed the diagnostic accuracy of the antibiogram for 3 commonly used antimicrobials or antimicrobial classes for E. coli: ceftriaxone, fluoroquinolones, and trimethoprim-sulfamethoxazole. First, we retrospectively generated facility-level antibiograms for all VHA facilities from 2000 to 2019 using all clinical culture specimens positive for E. coli, according to the latest Clinical & Laboratory Standards Institute guideline. Second, we created a patient-level data set by including only patients who did not have a positive culture for E. coli in the preceding 12 months. Then we assessed the diagnostic accuracy of an antibiogram for E. coli in predicting resistance for the isolates in the following calendar year, using logistic regression models with the antibiogram percentages as predictor variables. We also set 5 stepwise thresholds at 80%, 85%, 90%, 95%, and 98%, and we calculated sensitivity, specificity, and accuracy for each antimicrobial. Results: Among 127 VHA hospitals, 1,484,038 isolates from 704,779 patients were available for analysis. The area under the ROC curve (AU-ROC) was 0.686 for ceftriaxone, 0.637 for fluoroquinolones, and 0.578 for trimethoprim-sulfamethoxazole, suggesting relatively poor predictive performance (Fig. 1).
The sensitivity and specificity of the antibiogram varied widely by antimicrobial group and threshold, with substantial trade-offs. Along with the AU-ROC, these metrics suggest poor predictive performance when antibiograms are used as the sole prediction tool (Fig. 2). Conclusions: Antibiograms for E. coli perform poorly in predicting the risk of AMR for individual patients when used as a sole tool, and their contribution to clinical decision making may be limited. Clinicians should also consider other clinical and epidemiologic data when interpreting antibiograms, and guideline statements that present the antibiogram as a valuable tool for decision making in empiric therapy may need to be reconsidered. Further studies are needed to evaluate the contribution of antibiograms when combined with other patient-level factors.
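The threshold analysis described in the methods can be sketched as follows. The isolates below are hypothetical stand-ins (not VHA data), and the threshold is applied to the facility antibiogram's percent-susceptible value to classify each isolate as predicted resistant:

```python
def threshold_metrics(suscept_pct, resistant, threshold):
    """Predict an isolate as resistant when the facility antibiogram's
    susceptibility percentage falls below the threshold, then score the
    prediction against the observed resistance (1 = resistant)."""
    tp = fp = tn = fn = 0
    for pct, res in zip(suscept_pct, resistant):
        predicted = pct < threshold
        if predicted and res:
            tp += 1
        elif predicted and not res:
            fp += 1
        elif not predicted and res:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    accuracy = (tp + tn) / len(resistant)
    return sensitivity, specificity, accuracy

# Hypothetical isolates: antibiogram % susceptible, observed resistance
pcts = [92, 88, 79, 96, 83, 90, 75, 99]
res = [0, 1, 1, 0, 1, 0, 1, 0]
for thr in (80, 85, 90, 95, 98):  # the 5 stepwise thresholds above
    sens, spec, acc = threshold_metrics(pcts, res, thr)
    print(f"threshold {thr}%: sens={sens:.2f} spec={spec:.2f} acc={acc:.2f}")
```

Sweeping the threshold this way makes the sensitivity/specificity trade-off visible directly: raising the threshold flags more isolates as resistant, increasing sensitivity at the cost of specificity.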
Background: Statistically significant decreases in methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) occurred in Veterans Health Administration (VA) facilities from 2007 to 2019 using active surveillance for facility admissions and contact precautions for patients colonized (CPC) or infected (CPI) with MRSA, but the value of these interventions is controversial. Objective: To determine the impact of active surveillance, CPC, and CPI on prevention of MRSA HAIs, we conducted a prospective cohort study between July 2020 and June 2022 in all 123 acute-care VA medical facilities. In April 2020, all facilities were given the option to suspend any combination of active surveillance, CPC, or CPI to free up laboratory resources for COVID-19 testing and conserve personal protective equipment. We measured MRSA HAIs (cases per 1,000 patient days) in intensive care units (ICUs) and non-ICUs by the infection control policy. Results: During the analysis period, there were 917,591 admissions, 5,225,174 patient days, and 568 MRSA HAIs. Only 20% of facilities continued all 3 MRSA infection control measures in July 2020, but this rate increased to 57% by June 2022. The MRSA HAI rate for all infection sites in non-ICUs was 0.07 (95% CI, 0.05–0.08) for facilities practicing active surveillance plus CPC plus CPI compared to 0.12 (95% CI, 0.08–0.19; P = .01) for those not practicing any of these strategies, and in ICUs the MRSA HAI rates were 0.20 (95% CI, 0.15–0.26) and 0.65 (95% CI, 0.41–0.98; P < .001) for the respective policies. Similar differences were seen when the analyses were restricted to MRSA bloodstream HAIs. Accounting for monthly COVID-19 admissions to facilities over the analysis period using a negative binomial regression model did not change the relationships between facility policy and MRSA HAI rates in the ICUs or non-ICUs.
There was no statistically significant difference in monthly facility urinary catheter-associated infection rates, a nonequivalent dependent variable, among the policy categories during the analysis period in either ICUs or non-ICUs. Conclusions: In Veterans Affairs medical centers, there were fewer MRSA HAIs when facilities practiced active surveillance and contact precautions for colonized or infected patients during the COVID-19 pandemic. The effect was greater in ICUs than in non-ICUs.
Patients diagnosed with coronavirus disease 2019 (COVID-19) aerosolize severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) via respiratory efforts, exposing and possibly infecting healthcare personnel (HCP). To prevent transmission of SARS-CoV-2, HCP have been required to wear personal protective equipment (PPE) during patient care. Early in the COVID-19 pandemic, face shields, which also provide eye protection, were used as an approach to control HCP exposure to SARS-CoV-2.
An MS2 bacteriophage was used as a surrogate for SARS-CoV-2 and was aerosolized using a coughing machine. A simulated HCP wearing a disposable plastic face shield was placed 0.41 m (16 inches) from the coughing machine. The aerosolized virus was sampled using SKC BioSamplers on the inside (near the mouth of the simulated HCP) and the outside of the face shield. The aerosolized virus collected by the SKC BioSampler was analyzed using a viability assay. Optical particle counters (OPCs) were placed next to the biosamplers to measure the particle concentration.
There was a statistically significant reduction (P < .0006) in viable virus concentration on the inside of the face shield compared to the outside of the face shield. The particle concentration was significantly lower on the inside of the face shield compared to the outside of the face shield for 12 of the 16 particle sizes measured (P < .05).
Reductions in virus and particle concentrations were observed on the inside of the face shield; however, viable virus was measured on the inside of the face shield, in the breathing zone of the HCP. Therefore, other exposure control methods need to be used to prevent transmission from virus aerosol.
To evaluate the impact of a multicenter automated dashboard on antimicrobial stewardship program (ASP) activities and its acceptance among ASP leaders.
Frontline stewards were asked to participate in semi-structured interviews before and after implementation of a web-based ASP information dashboard providing risk-adjusted benchmarking, longitudinal trends, and analysis of antimicrobial usage patterns at each facility.
The study was performed at the Iowa City VA Health Care System.
ASP team members from nine medical centers in the VA Midwest Health Care Network (VISN 23).
Semi-structured interviews were conducted pre- and post-implementation, with interview guides informed by clinical experiences and the Consolidated Framework for Implementation Research (CFIR). Participants evaluated the dashboard’s ease of use, applicability to ongoing ASP activities, perceived validity and reliability, and relative advantage over other ASP monitoring systems.
Compared to established stewardship data collection and reporting methods, participants found the dashboard more intuitive and accessible, allowing them to reduce dependence on other systems and staff to obtain and share data. Standardized and risk-adjusted rankings were largely accepted as a valuable benchmarking method; however, participants felt their facility’s characteristics significantly influenced the rankings’ validity. Participants recognized staffing, training, and uncertainty with using the dashboard as an intervention tool as barriers to consistent and comprehensive dashboard implementation.
Participants generally accepted the dashboard’s risk-adjusted metrics and appreciated its usability. While creating automated tools to rigorously benchmark antimicrobial use across hospitals can be helpful, the displayed metrics require further validation, and the longitudinal utility of the dashboard warrants additional study.
We assessed the implementation of telehealth-supported stewardship activities in acute-care units and long-term care (LTC) units in Veterans’ Administration medical centers (VAMCs).
Before-and-after, quasi-experimental implementation effectiveness study with a baseline period (2019–2020) and an intervention period (2021).
The study was conducted in 3 VAMCs without onsite infectious disease (ID) support.
The study included inpatient providers at participating sites who prescribe antibiotics.
During 2021, an ID physician met virtually 3 times per week with the stewardship pharmacist at each participating VAMC to review patients on antibiotics in acute-care units and LTC units. Real-time feedback on prescribing antibiotics was given to providers. Additional implementation strategies included stakeholder engagement, education, and quality monitoring.
The reach–effectiveness–adoption–implementation–maintenance (RE-AIM) framework was used for program evaluation. The primary outcome of effectiveness was antibiotic days of therapy (DOT) per 1,000 days present aggregated across all 3 sites. An interrupted time-series analysis was performed to compare this rate during the intervention and baseline periods. Electronic surveys, periodic reflections, and semistructured interviews were used to assess other RE-AIM outcomes.
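An interrupted time-series analysis of this kind is often fit as a segmented regression with terms for the baseline trend, an immediate level change at the intervention, and a slope change thereafter. A minimal sketch on synthetic monthly DOT rates (assumes NumPy is available; the numbers are illustrative, not the study's data):

```python
import numpy as np

# Synthetic monthly antibiotic DOT per 1,000 days present: 24 baseline
# months, then 12 intervention months with an immediate drop of about 30.
rng = np.random.default_rng(0)
t = np.arange(36)
post = (t >= 24).astype(float)                      # 1 after program start
since = np.where(t >= 24, t - 24, 0).astype(float)  # months since start
y = 120 + 0.2 * t - 30 * post + rng.normal(0, 2, size=36)

# Segmented regression: y ~ intercept + baseline trend + level change
# at the intervention + slope change after it.
X = np.column_stack([np.ones(36), t, post, since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change, slope_change = beta
print(f"estimated immediate level change: {level_change:.1f} DOT/1,000 days")
```

The coefficient on the post-intervention indicator is the "immediate decrease" quantity reported in the results; the coefficient on months-since-start tests whether the trend changed thereafter.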
The telehealth program reviewed 502 unique patients and made 681 recommendations to 24 providers; 77% of recommendations were accepted. After program initiation, antibiotic DOT immediately decreased in the LTC units (−30%; P < .01) without a significant immediate change in the acute-care units (+16%; P = .22); thereafter DOT remained stable in both settings. Providers generally appreciated feedback and collaborative discussions.
The implementation of our telehealth program was associated with reductions in antibiotic use in the LTC units but not in the smaller acute-care units. Overall, providers perceived the intervention as acceptable. Wider implementation of telehealth-supported stewardship activities may achieve reductions in antibiotic use.
To determine risk factors for the development of long coronavirus disease 2019 (COVID-19) in healthcare personnel (HCP).
We conducted a case–control study among HCP who had confirmed symptomatic COVID-19 working in a Brazilian healthcare system between March 1, 2020, and July 15, 2022. Cases were defined as those having long COVID according to the Centers for Disease Control and Prevention definition. Controls were defined as HCP who had documented COVID-19 but did not develop long COVID. Multiple logistic regression was used to assess the association between exposure variables and long COVID during 180 days of follow-up.
Of 7,051 HCP diagnosed with COVID-19, 1,933 (27.4%) who developed long COVID were compared to 5,118 (72.6%) who did not. The majority of those with long COVID (51.8%) had 3 or more symptoms. Factors associated with the development of long COVID were female sex (OR, 1.21; 95% CI, 1.05–1.39), age (OR, 1.01; 95% CI, 1.00–1.02), and 2 or more SARS-CoV-2 infections (OR, 1.27; 95% CI, 1.07–1.50). Those infected with the SARS-CoV-2 δ (delta) variant (OR, 0.30; 95% CI, 0.17–0.50) or the SARS-CoV-2 ο (omicron) variant (OR, 0.49; 95% CI, 0.30–0.78), and those receiving 4 COVID-19 vaccine doses prior to infection (OR, 0.05; 95% CI, 0.01–0.19) were significantly less likely to develop long COVID.
Long COVID was prevalent among HCP. Acquiring >1 SARS-CoV-2 infection was a major risk factor for long COVID, while maintenance of immunity via vaccination was highly protective.
The impact of hurricane-related flooding on infectious diseases in the US is not well understood. Using geocoded electronic health records for 62,762 veterans living in North Carolina counties impacted by Hurricane Matthew, coupled with flood maps, we explore the impact of hurricane and flood exposure on infectious outcomes in outpatient settings and emergency departments, as well as antimicrobial prescribing. Declines in outpatient visits and antimicrobial prescribing are observed in weeks 0–2 following the hurricane as compared with the baseline period and the year prior, while increases in antimicrobial prescribing are observed 3+ weeks following the hurricane. Taken together, hurricane and flood exposure appear to have had minor impacts on infectious outcomes in North Carolina veterans, not resulting in large increases in infections or antimicrobial prescribing.
Even though antimicrobial days of therapy did not significantly decrease during a period of robust stewardship activities at our center, we detected a significant downward trend in antimicrobial spectrum, as measured by days of antibiotic spectrum coverage (DASC). The DASC metric may help more broadly monitor the effect of stewardship activities.
Although multiple studies have revealed that coronavirus disease 2019 (COVID-19) vaccines can reduce COVID-19–related outcomes, little is known about their impact on post–COVID-19 conditions. We performed a systematic literature review and meta-analysis on the effectiveness of COVID-19 vaccination against post–COVID-19 conditions (ie, long COVID).
We searched PubMed, CINAHL, EMBASE, Cochrane Central Register of Controlled Trials, Scopus, and Web of Science from December 1, 2019, to April 27, 2022, for studies evaluating COVID-19 vaccine effectiveness against post–COVID-19 conditions among individuals who received at least 1 dose of Pfizer/BioNTech, Moderna, AstraZeneca, or Janssen vaccine. A post–COVID-19 condition was defined as any symptom that was present 3 or more weeks after having COVID-19. Editorials, commentaries, reviews, study protocols, and studies in the pediatric population were excluded. We calculated the pooled diagnostic odds ratios (DORs) for post–COVID-19 conditions between vaccinated and unvaccinated individuals. Vaccine effectiveness was estimated as 100% × (1 − DOR).
In total, 10 studies with 1,600,830 individuals evaluated the effect of vaccination on post–COVID-19 conditions, of which 6 studies were included in the meta-analysis. The pooled DOR for post–COVID-19 conditions among individuals vaccinated with at least 1 dose was 0.708 (95% confidence interval (CI), 0.692–0.725) with an estimated vaccine effectiveness of 29.2% (95% CI, 27.5%–30.8%). The vaccine effectiveness was 35.3% (95% CI, 32.3%–38.1%) among those who received the COVID-19 vaccine before having COVID-19, and 27.4% (95% CI, 25.4%–29.3%) among those who received it after having COVID-19.
COVID-19 vaccination both before and after having COVID-19 significantly decreased post–COVID-19 conditions for the circulating variants during the study period, although vaccine effectiveness was low.
The optimal metric for outpatient antimicrobial stewardship has not been well defined. The number of antibiotic prescriptions per clinic visit does not account for the therapeutic duration. We found only moderate association between prescription-based metrics and days-supplied–based metrics. Outpatient antibiotic consumption metrics should incorporate the duration of therapy.
Contaminated surfaces in healthcare settings contribute to the transmission of nosocomial pathogens. Adequate environmental cleaning is important for preventing the transmission of these pathogens and reducing healthcare-associated infections. However, cleaning practices vary considerably. We examined environmental management services (EMS) staff experiences and perceptions surrounding environmental cleaning to describe perceived challenges and ideas for promoting an effective environmental services program.
Frontline EMS staff.
From January to June 2019, we conducted individual semistructured interviews with key stakeholders (ie, EMS staff) at 3 facilities within the Veterans’ Affairs Healthcare System. We used the Systems Engineering Initiative for Patient Safety (SEIPS) framework (ie, people, environment, organization, tasks, tools) to guide this study. Interviews were audio-recorded, transcribed, and analyzed for thematic content.
In total, 13 EMS staff and supervisors were interviewed. A predominant theme that emerged was the challenges EMS staff saw as hindering their ability to be effective at their jobs. The EMS staff interviewed felt they understood their job requirements and were dedicated to their work; however, they described challenges related to feeling undervalued and staffing issues.
EMS staff play a critical role in infection prevention in healthcare settings. However, some do not believe their role is recognized or valued by the larger healthcare team and leadership. EMS staff provided ideas for improving feelings of value and job satisfaction, including higher pay, opportunities for certifications and advancement, as well as collaboration or integration with the larger healthcare team. Healthcare organizations should focus on utilizing these suggestions to improve the EMS work climate.
To investigate factors that influence antibiotic prescribing decisions, we interviewed 49 antibiotic stewardship champions and stakeholders across 15 hospitals. We conducted thematic analysis and subcoding of decisional factors. We identified 31 factors that influence antibiotic prescribing decisions. These factors may help stewardship programs identify educational targets and design more effective interventions.
To describe national trends in testing and detection of carbapenemases produced by carbapenem-resistant Enterobacterales (CRE) and to associate testing with culture and facility characteristics.
Retrospective cohort study.
Department of Veterans’ Affairs medical centers (VAMCs).
Patients seen at VAMCs between 2013 and 2018 with cultures positive for CRE, defined by national VA guidelines.
Microbiology and clinical data were extracted from national VA data sets. Carbapenemase testing was summarized using descriptive statistics, and characteristics associated with carbapenemase testing were assessed.
Of 5,778 standard cultures that grew CRE, 1,905 (33.0%) had evidence of molecular or phenotypic carbapenemase testing, and 1,603 (84.1%) of these had carbapenemases detected. Among these cultures confirmed as carbapenemase-producing CRE, 1,053 (65.7%) had molecular testing for ≥1 gene. Almost all testing included KPC (n = 1,047, 99.4%), with KPC detected in 914 of 1,047 (87.3%) cultures. Testing and detection of other enzymes were less frequent. Carbapenemase testing increased over the study period, from 23.5% of CRE cultures in 2013 to 58.9% in 2018. The South US Census region (38.6%) and the Northeast region (37.2%) had the highest proportions of CRE cultures with carbapenemase testing. High-complexity (vs low-complexity) and urban (vs rural) facilities were significantly associated with carbapenemase testing (P < .0001).
Between 2013 and 2018, carbapenemase testing and detection increased in the VA, largely reflecting increased testing and detection of KPC. Surveillance of other carbapenemases is important due to global spread and increasing antibiotic resistance. Efforts supporting the expansion of carbapenemase testing to low-complexity, rural healthcare facilities and standardization of reporting of carbapenemase testing are needed.
Background: Antimicrobials are frequently used during end-of-life care and may be prescribed without a clear clinical indication. Overuse of antimicrobials is a major public health concern because of the development of multidrug-resistant organisms (MDROs). Antimicrobial stewardship programs are associated with reductions in antibiotic resistance and antibiotic-associated adverse events. We sought to identify and describe opportunities to successfully incorporate stewardship strategies into end-of-life care. Methods: We completed semistructured interviews with 15 healthcare providers at 2 VA medical centers: 1 inpatient setting and 1 long-term care setting. Interviews were conducted via telephone between November 2020 and June 2021 and covered topics related to antibiotic prescribing for hospice and palliative-care patients, including how to improve antimicrobial stewardship during the end-of-life period. We targeted healthcare providers who are involved in prescribing antibiotics during the end-of-life period, including hospitalists, infectious disease physicians, palliative care and hospice physicians, and pharmacists. All interviews were recorded, transcribed, and analyzed using consensus-based inductive and deductive coding. Results: End-of-life care, particularly hospice care, was described as an underutilized resource for patients, who are often enrolled in their final days of life rather than earlier in the dying process. Even at facilities with established antimicrobial stewardship programs, healthcare providers interviewed believed that opportunities for antimicrobial stewardship in the hospice and palliative care settings were missed.
Recommendations for how stewardship should be incorporated in end-of-life care included receiving feedback on antimicrobial prescribing, increasing pharmacist involvement in prescribing decisions, and targeted education for providers on end-of-life care, including the value of shared decision making with patients around antibiotic use. Conclusions: Improved antibiotic prescribing during end-of-life care is critical in the effort to combat antimicrobial resistance. Healthcare providers discussed antimicrobial stewardship activities during end-of-life patient care as a potential avenue to improve appropriate antibiotic prescribing. Future research should evaluate the feasibility and effectiveness of incorporating these strategies into end-of-life patient care.
Background: Antibiotic use during end-of-life (EOL) care is an increasingly important target for antimicrobial stewardship given the high prevalence of antibiotic use in this setting with limited evidence on safety and effectiveness to guide antibiotic decision making. We estimated antibiotic use during the last 6 months of life for patients under hospice or palliative care, and we identified potential targets (ie, time points) during the EOL period when antimicrobial stewardship interventions could be targeted for maximal benefit. Methods: We conducted a retrospective cohort study of nationwide Veterans’ Affairs (VA) patients 18 years and older who died between January 1, 2014, and December 31, 2019, and who had been hospitalized within 6 months prior to death. Data from the VA’s integrated electronic medical record (EMR) were collected, including demographics, comorbid conditions, and duration of inpatient antibiotics administered, along with outpatient antibiotics dispensed. A propensity-score matched-cohort analysis was conducted to compare antibiotic use between patients placed into palliative care or hospice matched to patients not receiving palliative or hospice care. Repeated-measures ANOVA and repeated-measures linear regression methods were used to analyze the difference-in-differences (D-I-D) of days of therapy (DOT) between the 2 cohorts. Results: There were 251,822 patients in the cohort, including 23,746 in hospice care, 89,768 in palliative care, and 138,308 without palliative or hospice care. The median time from last discharge to death was 9 days. The most common comorbidities were chronic obstructive pulmonary diseases (50%), malignancy (46%), and diabetes mellitus (43%). Overall, 18,296 (77%) of 23,746 hospice patients and 71,812 (80%) of 89,768 palliative care patients received at least 1 antibiotic, whereas 95,167 (69%) of 138,308 who were not placed in hospice or did not receive palliative care received antibiotics.
In the primary matched-cohort analysis that compared patients placed into hospice or palliative care to propensity-score-matched controls, entry into palliative care was associated with an 11% absolute increase in antibiotic prescribing, and entry into hospice was associated with a 4% absolute increase, during the 7–14 days after entry versus the 7–14 days before entry (Fig. 1). The stratified cohorts had covariate balance very similar to that of the overall cohort. Conclusions: In our large cohort study, we observed that patients receiving EOL care had high levels of antibiotic exposure across the VA population, particularly on entry to hospice or during admissions when they received a palliative care consultation. Future studies are needed to identify the optimal EOL strategies for collaboration between antimicrobial stewardship and palliative care.
Background: Antimicrobial stewardship programs (ASPs) are advised to audit antimicrobial consumption as a metric to feed back to clinicians. However, many ASPs lack the tools necessary for appropriate risk adjustment and standardized data collection, which are critical for peer-program benchmarking. We evaluated the impact of deploying a dashboard that displays these metrics and its acceptance among ASP members and antimicrobial prescribers. Materials/methods: We conducted semistructured interviews of ASP stewards and antimicrobial prescribers before and after implementation of a web-based ASP information dashboard (Fig. 1) implemented in the VA Midwest Health Care Network (VISN23). The dashboard provides risk-adjusted benchmarking, longitudinal trends, and analysis of antimicrobial usage patterns at each facility. Risk-adjusted benchmarking was based on an observed-to-expected comparison of antimicrobial days of therapy at each facility, after adjusting for differences in patient case mix and facility-level variables. Respondents were asked to evaluate several aspects of the dashboard, including its ease of use, applicability to ongoing ASP activities, perceived validity and reliability, and advantages compared to other ASP monitoring systems. All interviews were digitally recorded and transcribed verbatim. The analysis was conducted using MaxQDA 2020.4 and the Consolidated Framework for Implementation Research (CFIR) constructs. Results: We completed 4 preimplementation interviews and 11 postimplementation interviews with ASP champions and antimicrobial prescribers from 6 medical centers. We derived 4 key themes from the data that map onto CFIR constructs.
These themes were interconnected so that implementation of the dashboard (ie, adapting and adopting) was influenced by respondents’ perception of a facility’s size, patient population, and priority placed on stewardship (ie, structural and cultural context), the availability of dedicated stewardship staff and training needed to implement the dashboard (ie, resources needed), and how the dashboard compared to established stewardship activities (ie, relative advantage). ASP champions and antimicrobial prescribers indicated that dashboard metrics were useful for identifying antimicrobial usage and for comparing metrics among similar facilities. Respondents also specified barriers to acceptance of the risk-adjusted metric, such as disagreement regarding how antimicrobials were grouped by the current NHSN protocol, uncertainty of factors involved in risk adjustments, and difficulty developing a clear interpretation of hospital rankings. Conclusions: Given the limited resources for antimicrobial stewardship personnel, automated, risk-adjusted, antimicrobial-use dashboards provided by ASPs are an attractive method to both facilitate compliance and improve efficiency. To increase the uptake of surveillance systems in antimicrobial stewardship, our study highlights the need for clear descriptions of methods and metrics.
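The risk-adjusted benchmark described in the methods is an observed-to-expected comparison of antimicrobial days of therapy. A minimal sketch of the ratio itself; the facility numbers below are hypothetical, and the "expected" value is assumed to come from a case-mix-adjusted model as described above:

```python
def oe_ratio(observed_dot, expected_dot):
    """Observed-to-expected antimicrobial use; values > 1 indicate more
    days of therapy than the risk-adjustment model predicts."""
    return observed_dot / expected_dot

# Hypothetical facility: 4,200 observed DOT vs 5,000 model-expected DOT
print(f"O/E = {oe_ratio(4200, 5000):.2f}")  # -> O/E = 0.84
```

Ranking facilities by this ratio, rather than by raw DOT, is what allows benchmarking across hospitals with different patient populations; as the results note, acceptance of such rankings still depends on trust in the underlying risk-adjustment factors.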