Hospitals are increasingly consolidating into health systems. Some systems have appointed healthcare epidemiologists to lead system-level infection prevention programs. Ideal program infrastructure and support resources have not been described. We informally surveyed 7 healthcare epidemiologists with recent experience building and leading system-level infection prevention programs. Key facilitators and barriers for program structure and implementation are described.
Skin and soft-tissue infections (SSTIs) account for 3% of all emergency department (ED) encounters and are frequently associated with inappropriate antibiotic prescribing. We characterized barriers and facilitators to optimal antibiotic use for SSTIs in the ED using a systems engineering framework and matched them with targeted stewardship interventions.
Design and participants:
We conducted semistructured interviews with a purposefully selected sample of emergency physicians.
Methods:
An interview guide was developed using the Systems Engineering Initiative for Patient Safety (SEIPS) framework. Interviews were recorded, transcribed, and analyzed iteratively until conceptual saturation was achieved. Themes were identified using deductive directed content analysis guided by the SEIPS model.
Results:
We conducted 20 interviews with physicians of varying experience and from different practice settings. Identified barriers to optimal antibiotic prescribing for SSTIs included poor access to follow-up (organization), need for definitive diagnostic tools (tools and technology) and fear over adverse outcomes related to missed infections (person). Identified potential interventions included programs to enhance follow-up care; diagnostic aides (eg, rapid MRSA assays for purulent infections and surface thermal imaging for cellulitis); and shared decision-making tools.
Conclusions:
Using a systems engineering informed qualitative approach, we successfully characterized barriers and developed targeted antibiotic stewardship interventions for SSTIs managed in the ED work system. The interventions span multiple components of the ED work system and should inform future efforts to improve antibiotic stewardship for SSTIs in this challenging care setting.
Narrow-windrow burning (NWB) is a form of harvest weed seed control in which crop residues and weed seeds collected by the combine are concentrated into windrows and subsequently burned. The objectives of this study were to 1) determine how NWB affects seed survival of Italian ryegrass in wheat and Palmer amaranth in soybean and 2) determine whether a relationship exists between NWB heat index (HI; the sum of temperatures above ambient) or effective burn time (EBT; the cumulative number of seconds temperatures exceed 200 C) and the post-NWB seed survival of both species. Average soybean and wheat windrow HI totaled 140,725 ± 14,370 and 66,196 ± 6,224 C, and 259 ± 27 and 116 ± 12 s of EBT, respectively. Pre-NWB versus post-NWB germinability testing revealed an estimated seed kill rate of 79.7% for Italian ryegrass and 86.3% for Palmer amaranth. Nonlinear two-parameter exponential regressions between seed kill and HI or EBT indicated that NWB at an HI of 146,000 C and 277 s of EBT potentially kills 99% of Palmer amaranth seed. Seventy-six percent of soybean windrow burning events resulted in estimated Palmer amaranth seed kill rates greater than 85%. Predicted Italian ryegrass seed kill was greater than 97% in all but two wheat NWB events; therefore, relationships were not calculated. These results validate the ability of NWB to reduce seed survival, thereby improving weed management and combating herbicide resistance.
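The two burn-intensity metrics defined above can be sketched directly from their definitions. This is a minimal illustration, not the authors' analysis code: the 1 Hz sampling rate, the ambient temperature, and the synthetic temperature trace are all assumptions for the example.

```python
AMBIENT_C = 25.0        # assumed ambient temperature for the example
EBT_THRESHOLD_C = 200.0  # EBT threshold given in the abstract

def heat_index(temps_c, ambient_c=AMBIENT_C):
    """HI: sum of temperature excesses above ambient across the burn (C)."""
    return sum(t - ambient_c for t in temps_c if t > ambient_c)

def effective_burn_time(temps_c, threshold_c=EBT_THRESHOLD_C):
    """EBT: cumulative seconds the temperature exceeds 200 C.

    Assumes one temperature reading per second (1 Hz sampling).
    """
    return sum(1 for t in temps_c if t > threshold_c)

def seed_kill_rate(pre_germinable, post_germinable):
    """Estimated seed kill from pre- vs post-burn germinability counts."""
    return 1.0 - post_germinable / pre_germinable

# Example with a short synthetic burn trace sampled at 1 Hz:
trace = [25, 150, 450, 600, 300, 180, 25]
print(heat_index(trace))           # 1555.0
print(effective_burn_time(trace))  # 3
print(seed_kill_rate(100, 14))     # 0.86
```

The regression step in the study then relates these per-event HI and EBT values to the observed kill rates.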
We implemented a preoperative staphylococcal decolonization protocol for colorectal surgeries in an effort to further reduce surgical site infections (SSIs).
Design:
Retrospective observational study.
Setting:
Tertiary-care, academic medical center.
Patients:
Adult patients who underwent colorectal surgery, as defined by the National Healthcare Safety Network (NHSN), between July 2015 and June 2020. Emergent cases were excluded.
Methods:
Simple and multivariable logistic regression were performed to evaluate the relationship between decolonization and subsequent SSI. Other predictive variables included age, sex, body mass index, procedure duration, American Society of Anesthesiology (ASA) score, diabetes, smoking, and surgical oncology service.
Results:
In total, 1,683 patients underwent nonemergent NHSN-defined colorectal surgery, and 33.7% underwent the staphylococcal decolonization protocol. SSI occurred in 92 (5.5%); 53 were organ-space infections and 39 were superficial wound infections. We detected no difference in overall SSIs between those decolonized and not decolonized (P = .17). However, superficial wound infections were reduced in the group that received decolonization versus those that did not: 7 (1.2%) of 568 versus 32 (2.9%) of 1,115 (P = .04).
Conclusions:
Staphylococcal decolonization may prevent a subset of SSIs in patients undergoing colorectal surgery.
Hospital epidemiologists, infection preventionists, and antimicrobial stewards are integral to the pandemic workforce. However, regardless of pandemic surge or postsurge conditions, their workload remains high due to constant vigilance for new variants, emerging data, and evolving public health guidance. We describe the factors that have led to burnout and suggest strategies to enhance resilience.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess the needs of incorporating monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staff, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
Challenges for infection prevention and antimicrobial stewardship programs have arisen with the fourth wave of the coronavirus disease 2019 (COVID-19) pandemic, fueled by the delta variant. These challenges include breakthrough infections in vaccinated individuals, decisions to re-escalate infection prevention measures, critical medication shortages, and provider burnout. Various strategies are needed to meet these challenges.
In this study, atom probe tomography (APT) was used to investigate strontium-containing bioactive glass particles (BG-Sr10) and strontium-releasing bioactive glass-based scaffolds (pSrBG), both of which are attractive biomaterials with applications in critical bone damage repair. We outline the challenges of preparing and running APT experiments on this nonconductive biomaterial and the corresponding countermeasures, such as avoiding direct contact between focussed ion beam micromanipulators and the extracted cantilever to reduce damage during liftout. Using a low imaging voltage (≤3 kV) and current (≤500 pA) in the scanning electron microscope and a low acceleration voltage (≤2 kV) and current (≤200 pA) in the focussed ion beam prevents tip bending in the final stages of annular milling. To identify the optimal laser pulse conditions for BG-Sr10 bioactive glass, we considered five factors: total detected hits, multiple hits, the background level, the charge-state ratio, and the accuracy of the measured compositions. We show that a stage temperature of 30 K, 200–250 pJ laser pulse energy, 0.3% detection rate, and 200 kHz pulse rate are optimized experimental parameters for bioactive glass. The use of improved preparation methods and optimized parameters resulted in a successful APT yield of 90% for pSrBG samples.
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
Design:
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
Setting:
A 399-bed, tertiary-care, Veterans’ Affairs medical center.
Participants:
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
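The NHSN AU rate used throughout these results is a simple normalization: days of therapy divided by days present, scaled to 1,000. A minimal sketch, with illustrative numbers that are not taken from the study:

```python
def dot_per_1000_dp(days_of_therapy, days_present):
    """Antimicrobial consumption as DOT per 1,000 days present (NHSN AU rate)."""
    return 1000 * days_of_therapy / days_present

# Example: 120 meropenem days of therapy in a month with 4,000 days present
print(round(dot_per_1000_dp(120, 4000), 1))  # 30.0
```

Reporting per 1,000 days present lets consumption be compared across units and months with different census sizes, which is what makes the interrupted time series comparison below meaningful.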
Intervention:
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
Results:
Following the implementation of cascade reporting, mean monthly meropenem (P =.005) and piperacillin/tazobactam (P = .002) consumption decreased and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not significantly change.
Conclusion:
Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
The rate at which the coronavirus disease (COVID-19) spread required a rapid response across many, if not all, industries. Academic medical centers had to rapidly evaluate, prioritize, and coordinate the multiple requests for clinical trial participation. This involved redirecting resources and developing a collaborative system for assessment, decision making, and implementation. Our institution formed a team with diverse representation from multiple stakeholders to review and prioritize all research protocols related to COVID-19. To accomplish this, a prioritization matrix was developed to help determine the order in which the protocols should be placed for consideration by the treating clinician. The purpose of the team was to review the COVID-19 clinical trials in the pipeline, prioritize those trials that best met the needs of our patients, oversee training and resource needs, and lead the formulation of procedures for integration with clinical care. Resources from the Clinical Research Unit were then allocated to support the swift execution of such studies. This manuscript describes that process, the challenges encountered, and the lessons learned on how to make all clinical trials more successful in a complex and dynamic environment.