The coronavirus disease 2019 (COVID-19) pandemic caused substantial changes to healthcare delivery and antibiotic prescribing beginning in March 2020. To assess the pandemic's impact on Clostridioides difficile infection (CDI) rates, we described patients and trends in facility-level incidence, testing rates, and percent positivity during 2019–2020 in a large cohort of US hospitals.
We estimated and compared rates of community-onset CDI (CO-CDI) per 10,000 discharges, hospital-onset CDI (HO-CDI) per 10,000 patient days, and C. difficile testing rates per 10,000 discharges in 2019 and 2020. We calculated percent positivity as the number of inpatients diagnosed with CDI over the total number of discharges with a test for C. difficile. We used an interrupted time series (ITS) design with negative binomial and logistic regression models to describe level and trend changes in rates and percent positivity before and after March 2020.
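For illustration, here is a minimal sketch of the kind of interrupted time series (ITS) model described above, using Python's statsmodels. The monthly counts, discharge totals, and variable names are hypothetical placeholders, not the study's actual data or code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly data: CDI case counts, discharges, and an
# indicator for the post-March-2020 period (all invented for illustration).
rng = np.random.default_rng(0)
months = pd.date_range("2019-01-01", "2020-12-01", freq="MS")
df = pd.DataFrame({
    "time": np.arange(len(months)),                  # baseline trend
    "post": (months >= "2020-03-01").astype(int),    # level change
    "cases": rng.poisson(50, len(months)),
    "discharges": rng.integers(9000, 11000, len(months)),
})
df["time_post"] = df["time"] * df["post"]            # trend change

# Negative binomial regression of counts with log(discharges) as offset;
# exponentiated coefficients are interpretable as incidence rate ratios.
model = sm.GLM(
    df["cases"],
    sm.add_constant(df[["time", "post", "time_post"]]),
    family=sm.families.NegativeBinomial(alpha=1.0),
    offset=np.log(df["discharges"]),
).fit()
print(model.summary())
```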
In pairwise comparisons, overall CO-CDI rates decreased from 20.0 to 15.8 per 10,000 discharges between 2019 and 2020 (P < .0001). HO-CDI rates did not change. Using ITS, we detected decreasing monthly trends in CO-CDI (−1% per month, P = .0036) and HO-CDI incidence (−1% per month, P < .0001) during the baseline period, prior to the COVID-19 pandemic declaration. We detected no change in monthly trends for CO-CDI or HO-CDI incidence or percent positivity after March 2020 compared with the baseline period.
While there was a slight downward trajectory in CDI trends prior to March 2020, no significant change in CDI trends occurred during the COVID-19 pandemic despite changes in infection control practices, antibiotic use, and healthcare delivery.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables might help clinicians better predict which patients will respond to lithium treatment.
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
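A minimal sketch of leave-site-out cross-validation of this kind, using scikit-learn; the feature matrix, site labels, and model settings below are illustrative assumptions, not the ConLi+Gen pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge, ElasticNet
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Hypothetical data: clinical + PRS features, a continuous lithium-response
# score, and a study-site label per patient (all randomly generated here).
rng = np.random.default_rng(0)
X = rng.normal(size=(692, 20))         # clinical + PRS features
y = rng.normal(size=692)               # lithium response score
sites = rng.integers(0, 10, size=692)  # ten study sites

logo = LeaveOneGroupOut()  # each fold holds out one entire site
for name, model in [("ridge", Ridge(alpha=1.0)),
                    ("elastic net", ElasticNet(alpha=0.1)),
                    ("random forest", RandomForestRegressor(n_estimators=200))]:
    scores = cross_val_score(model, X, y, cv=logo, groups=sites, scoring="r2")
    print(f"{name}: mean leave-site-out R^2 = {scores.mean():.3f}")
```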
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future, this approach may help inform which patients are most likely to respond to lithium treatment.
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years prior to 1 year after a stewardship intervention.
An academic healthcare system with 4 hospitals.
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. In the context of a 2% month-to-month decline that began in the preintervention period (P < .01), we observed a 21% reduction in fluoroquinolone days of therapy per 1,000 patient days after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, PD-CDI rates changed from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
Our systemwide intervention immediately reduced inpatient fluoroquinolone use but did not change HO-CDI rates. However, PD-CDI rates trended downward after the intervention. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
The coronavirus disease 2019 (COVID-19) pandemic has resulted in shortages of personal protective equipment (PPE), underscoring the urgent need for simple, efficient, and inexpensive methods to decontaminate masks and respirators exposed to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). We hypothesized that methylene blue (MB) photochemical treatment, which has various clinical applications, could decontaminate PPE contaminated with coronavirus.
The 2 arms of the study included (1) PPE inoculation with coronaviruses followed by MB with light (MBL) decontamination treatment and (2) PPE treatment with MBL for 5 cycles of decontamination to determine maintenance of PPE performance.
MBL treatment was used to inactivate coronaviruses on 3 N95 filtering facepiece respirator (FFR) models and 2 medical mask models. We inoculated FFR and medical mask materials with 3 coronaviruses, including SARS-CoV-2, treated them with 10 µM MB, and exposed them to 50,000 lux of white light or 12,500 lux of red light for 30 minutes. In parallel, integrity was assessed after 5 cycles of decontamination using multiple US and international test methods, and the process was compared with the FDA-authorized vaporized hydrogen peroxide plus ozone (VHP+O3) decontamination method.
Overall, MBL robustly and consistently inactivated all 3 coronaviruses with 99.8% to >99.9% virus inactivation across all FFRs and medical masks tested. FFR and medical mask integrity was maintained after 5 cycles of MBL treatment, whereas 1 FFR model failed after 5 cycles of VHP+O3.
MBL treatment decontaminated respirators and masks by inactivating 3 tested coronaviruses without compromising integrity through 5 cycles of decontamination. MBL decontamination is effective, is low cost, and does not require specialized equipment, making it applicable in low- to high-resource settings.
Previously reported associations between hospital-level antibiotic use and hospital-onset Clostridioides difficile infection (HO-CDI) were reexamined using 2012–2018 data from a new cohort of US acute-care hospitals. This analysis revealed significant positive associations between HO-CDI rates and the use of total antibiotics, third- and fourth-generation cephalosporins, fluoroquinolones, carbapenems, and piperacillin-tazobactam, confirming previous findings.
The oceans have a vast capacity to store, release, and transport heat, water, and various chemical species on timescales from seasons to centuries. Their transports affect global energy, water, and biogeochemical cycles and are crucial elements of Earth's climate system. Ocean variability, as represented, for example, by sea surface temperature (SST) variations, can result in anomalous diabatic heating or cooling of the overlying atmosphere, which can in turn alter atmospheric circulation in such a way as to feed back on ocean thermal and current structures and modify the original SST variations. Ocean–atmosphere interactions in one ocean basin can also influence remote regions via interbasin teleconnections that can trigger responses having both local and far-field impacts. This chapter highlights the defining aspects of the climate in individual ocean basins, including mean states, seasonal cycles, interannual-to-interdecadal variability, and interactions with other basins. Key components of the global and tropical ocean observing system are also described.
Single-particle reconstruction can be used to perform three-dimensional (3D) imaging of homogeneous populations of nano-sized objects, in particular viruses and proteins. Here, it is demonstrated that it can also be used to obtain 3D reconstructions of heterogeneous populations of inorganic nanoparticles. An automated acquisition scheme in a scanning transmission electron microscope is used to collect images of thousands of nanoparticles. Particle images are subsequently semi-automatically clustered in terms of their properties and separate 3D reconstructions are performed from selected particle image clusters. The result is a 3D dataset that is representative of the full population. The study demonstrates a methodology that allows 3D imaging and analysis of inorganic nanoparticles in a fully automated manner that is truly representative of large particle populations.
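As a rough illustration of the clustering step described above, here is a minimal sketch using scikit-learn. The per-particle descriptors are hypothetical stand-ins; a real pipeline would derive features from the acquired particle images:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-particle descriptors (e.g., size, aspect ratio, intensity),
# randomly generated here purely for illustration.
rng = np.random.default_rng(4)
features = rng.normal(size=(5000, 3))

# Group particles so that a separate 3D reconstruction can be run per cluster.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
for k in range(4):
    print(f"cluster {k}: {(labels == k).sum()} particle images")
```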
Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by their baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing.

Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receipt of systemic antibiotic therapy on the same or next day) who were found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (eg, methicillin/oxacillin resistance in S. aureus) and were used to estimate the baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistance phenotypes, sepsis or septic shock, hospital type, and baseline resistance.

Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Also, 4,428 (52%) recipients of DDEAT received no antibiotics on culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among recipients of DDEAT collectively occurred among patients with S. aureus and Enterobacteriales BSIs. For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged between 25% (14%–37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile and 40% (31%–50%) in five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2).

Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy for septic shock and hospital-onset BSI needs more support. Earlier, reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.
Funding: This study was funded in part by the National Institutes of Health Clinical Center, the National Institute of Allergy and Infectious Diseases, the National Cancer Institute (NCI contract no. HHSN261200800001E), and the Agency for Healthcare Research and Quality.
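As an illustration of the DDEAT definition used in the Methods above, here is a minimal sketch in Python; the field names and susceptibility lookup are hypothetical, not the study's actual data model:

```python
# Hypothetical record for one BSI episode: antibiotics given on culture
# sampling day, and in vitro susceptibility of the bloodstream isolate.
def is_ddeat(antibiotics_on_culture_day, isolate_susceptible_to):
    """Delayed or discordant empiric antibiotic therapy (DDEAT):
    no antibiotic on culture day, or none with in vitro activity."""
    if not antibiotics_on_culture_day:
        return True  # delayed: no antibiotic given
    # discordant: none of the given agents is active against the isolate
    return not any(isolate_susceptible_to.get(abx, False)
                   for abx in antibiotics_on_culture_day)

# Example: vancomycin alone against an isolate susceptible only to cefepime.
print(is_ddeat(["vancomycin"], {"cefepime": True, "vancomycin": False}))  # True
```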
To evaluate the Orange County Clostridium difficile infection (CDI) prevention collaborative’s effect on rates of CDI in acute-care hospitals (ACHs) in Orange County, California.
Controlled interrupted time series.
We convened a CDI prevention collaborative with healthcare facilities in Orange County to reduce CDI incidence in the region. Collaborative participants received onsite infection control and antimicrobial stewardship assessments, interactive learning and discussion sessions, and an interfacility transfer communication improvement initiative during June 2015–June 2016. We used segmented regression to evaluate changes in monthly hospital-onset (HO) and community-onset (CO) CDI rates for ACHs. The baseline period comprised 17 months (January 2014–June 2015) and the follow-up period comprised 28 months (September 2015–December 2017). All 25 Orange County ACHs were included in the CO-CDI model to account for direct and indirect effects of the collaborative. For comparison, we assessed HO-CDI and CO-CDI rates among 27 ACHs in 3 San Francisco Bay Area counties.
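A minimal sketch of a controlled segmented regression of this general form, contrasting participant and comparison hospitals (a Poisson model is used here for brevity; the data, cut points, and variable names are hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical monthly HO-CDI counts for participant vs comparison hospitals.
rng = np.random.default_rng(1)
n = 45  # months spanning baseline and follow-up
frames = []
for grp in (0, 1):  # 0 = comparison ACHs, 1 = collaborative participants
    frames.append(pd.DataFrame({
        "group": grp,
        "time": np.arange(n),
        "followup": (np.arange(n) >= 17).astype(int),
        "cases": rng.poisson(30, n),
        "patient_days": rng.integers(40000, 60000, n),
    }))
df = pd.concat(frames, ignore_index=True)

# The group:followup:time term tests whether the participants' follow-up
# slope differs from the comparison hospitals' (the controlled ITS contrast).
fit = smf.poisson(
    "cases ~ group * followup * time",
    data=df, offset=np.log(df["patient_days"]),
).fit()
print(np.exp(fit.params))  # incidence rate ratios
```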
HO-CDI rates in the 15 participating Orange County ACHs decreased 4% per month (incidence rate ratio [IRR], 0.96; 95% CI, 0.95–0.97; P < .0001) during the follow-up period compared with the baseline period and 3% per month (IRR, 0.97; 95% CI, 0.95–0.99; P = .002) compared with the San Francisco Bay Area nonparticipant ACHs. Orange County CO-CDI rates declined 2% per month (IRR, 0.98; 95% CI, 0.96–1.00; P = .03) between the baseline and follow-up periods. This decline was not statistically different from that in the San Francisco Bay Area ACHs (IRR, 0.97; 95% CI, 0.95–1.00; P = .09).
Our analysis of ACHs in Orange County provides evidence that coordinated, regional multifacility initiatives can reduce CDI incidence.
To compare risk of surgical site infection (SSI) following cesarean delivery between women covered by Medicaid and private health insurance.
Cesarean deliveries covered by Medicaid or private insurance and reported to the National Healthcare Safety Network (NHSN) and state inpatient discharge databases by hospitals in California (2011–2013).
Deliveries reported to NHSN and state inpatient discharge databases were linked to identify SSIs in the 30 days following cesarean delivery, primary payer, and patient and procedure characteristics. Additional hospital-level characteristics were obtained from public databases. Relative risk of SSI by primary payer was assessed using multivariable logistic regression adjusting for patient, procedure, and hospital characteristics, accounting for facility-level clustering.
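A minimal sketch of such a payer-adjusted model with facility-level clustering, using GEE logistic regression in statsmodels; the covariates and data below are illustrative assumptions, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical delivery-level data: SSI outcome, payer indicator, one
# covariate, and a hospital identifier used as the clustering unit.
rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "ssi": rng.binomial(1, 0.007, n),
    "medicaid": rng.binomial(1, 0.48, n),
    "age": rng.normal(29, 6, n),
    "hospital": rng.integers(0, 120, n),
})

# An exchangeable working correlation accounts for clustering within hospitals.
fit = smf.gee(
    "ssi ~ medicaid + age",
    groups="hospital", data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(np.exp(fit.params["medicaid"]))  # adjusted odds ratio for Medicaid
```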
Of 291,757 cesarean deliveries included, 48% were covered by Medicaid. SSIs were detected following 1,055 deliveries covered by Medicaid (0.75%) and 955 deliveries covered by private insurance (0.63%) (unadjusted odds ratio, 1.2; 95% confidence interval [CI], 1.1–1.3; P < .0001). The adjusted odds of SSI following cesarean deliveries covered by Medicaid was 1.4 (95% CI, 1.2–1.6; P < .0001) times the odds of those covered by private insurance.
In this study, the largest and, to our knowledge, the only multicenter study to investigate SSI risk following cesarean delivery by primary payer, Medicaid-insured women had a higher risk of infection than privately insured women. These findings suggest the need to evaluate and better characterize the quality of maternal healthcare for, and the needs of, women covered by Medicaid to inform targeted infection prevention and policy.
As the US population ages, the number of hip and knee arthroplasties is expected to increase. Because surgical site infections (SSIs) following these procedures contribute substantial morbidity, mortality, and costs, we projected SSIs expected to occur from 2020 through 2030.
We used a stochastic Poisson process to project the number of primary and revision arthroplasties and SSIs. Primary arthroplasty rates were calculated using annual estimates of hip and knee arthroplasty stratified by age and gender from the 2012–2014 Nationwide Inpatient Sample and standardized by census population data. Revision rates, dependent on time from primary procedure, were obtained from published literature and were uniformly applied for all ages and genders. Stratified complex SSI rates for arthroplasties were obtained from 2012–2015 National Healthcare Safety Network data. To evaluate the possible impact of prevention measures, we recalculated the projections with an SSI rate reduced by 30%, the national target established by the US Department of Health and Human Services (HHS).
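A minimal sketch of this kind of stochastic projection; the procedure counts and SSI rate below are invented placeholders, not the study's estimates:

```python
import numpy as np

# Hypothetical inputs: projected annual procedures and a complex SSI rate.
rng = np.random.default_rng(3)
procedures_per_year = np.linspace(1_200_000, 1_600_000, 11)  # 2020-2030
ssi_rate = 0.006            # complex SSIs per procedure (placeholder)
reduction = 0.30            # HHS national reduction target

def project_ssis(rate, n_sims=10_000):
    """Draw annual SSI counts as Poisson(procedures * rate), summed 2020-2030."""
    totals = rng.poisson(procedures_per_year * rate, size=(n_sims, 11)).sum(axis=1)
    return totals.mean()

baseline = project_ssis(ssi_rate)
reduced = project_ssis(ssi_rate * (1 - reduction))
print(f"projected SSIs: {baseline:,.0f}; prevented by 30% reduction: "
      f"{baseline - reduced:,.0f}")
```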
Without a reduction in SSI rates, we projected a 14% increase in complex SSIs following hip and knee arthroplasty between 2020 and 2030. We projected a total burden of 77,653 SSIs; however, meeting the 30% rate reduction target could prevent 23,297 of these SSIs.
Given current SSI rates, we project that complex SSI burden for primary and revision arthroplasty may increase due to an aging population. Reducing the SSI rate to the national HHS target could prevent 23,000 SSIs and reduce subsequent morbidity, mortality, and Medicare costs.
Among dialysis facilities participating in a bloodstream infection (BSI) prevention collaborative, access-related BSI incidence rate improvements observed immediately following implementation of a bundle of BSI prevention interventions were sustained for up to 4 years. Overall, BSI incidence remained unchanged from baseline in the current analysis.
Coronary artery bypass graft (CABG) and primary arthroplasty surgical site infection (SSI) rates are declining more slowly than other healthcare-associated infection rates. We examined antimicrobial prophylaxis (AMP) regimens used for these operations and compared their spectrum of activity against reported SSI pathogens.
Pathogen distributions of CABG and hip/knee arthroplasty complex SSIs (deep and organ/space) reported to the National Healthcare Safety Network (NHSN) from 2006 through 2009 and AMP regimens (same procedures and time period) reported to the Surgical Care Improvement Project (SCIP) were analyzed. Regimens were categorized as standard (cefazolin or cefuroxime), β-lactam allergy (vancomycin or clindamycin with or without an aminoglycoside), and extended spectrum (vancomycin and/or an aminoglycoside with cefazolin or cefuroxime). AMP activity of each regimen was predicted on the basis of pathogen susceptibility reports and published spectra of antimicrobial activity.
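A minimal sketch of how such predicted-coverage proportions can be computed; the pathogen shares and activity table below are simplified placeholders, not NHSN/SCIP data:

```python
# Hypothetical pathogen shares and predicted regimen activity (True = active).
pathogen_share = {"MSSA": 0.23, "MRSA": 0.18, "CoNS": 0.17, "Enterococcus": 0.07}
activity = {
    "standard (cefazolin/cefuroxime)": {"MSSA": True, "MRSA": False,
                                        "CoNS": False, "Enterococcus": False},
    "extended (+ vancomycin)":         {"MSSA": True, "MRSA": True,
                                        "CoNS": True, "Enterococcus": True},
}

# Weight each regimen's predicted activity by the pathogens' reported shares.
for regimen, active in activity.items():
    covered = sum(share for bug, share in pathogen_share.items() if active[bug])
    print(f"{regimen}: {covered:.0%} of listed pathogens covered")
```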
There were 6,263 CABG and arthroplasty complex SSIs reported (680,489 procedures; 880 NHSN hospitals). Among 6,574 pathogens reported, methicillin-sensitive Staphylococcus aureus (23%), methicillin-resistant S. aureus (18%), coagulase-negative staphylococci (17%), and Enterococcus species (7%) were most common. AMP regimens for 2,435,703 CABG and arthroplasty procedures from 3,330 SCIP hospitals were analyzed. The proportions of pathogens predicted to be susceptible to standard (used in 75% of procedures), β-lactam allergy (12%), and extended-spectrum (8%) regimens were 41%–45%, 47%–96%, and 81%–96%, respectively.
Standard AMP, used in three-quarters of CABG and primary arthroplasty procedures, has inadequate activity against more than half of SSI pathogens reported. Alternative strategies may be needed to prevent SSIs caused by pathogens resistant to standard AMP regimens.