We sought to determine the incidence of community-onset and hospital-acquired coinfection in patients hospitalized with coronavirus disease 2019 (COVID-19) and to evaluate associated predictors and outcomes.
In this multicenter retrospective cohort study of patients hospitalized for COVID-19 from March 2020 to August 2020 across 38 Michigan hospitals, we assessed prevalence, predictors, and outcomes of community-onset and hospital-acquired coinfections. In-hospital and 60-day mortality, readmission, discharge to long-term care facility (LTCF), and mechanical ventilation duration were assessed for patients with versus without coinfection.
Of 2,205 patients with COVID-19, 141 (6.4%) had a coinfection: 3.0% community onset and 3.4% hospital acquired. Of patients without coinfection, 64.9% received antibiotics. Community-onset coinfection predictors included admission from an LTCF (OR, 3.98; 95% CI, 2.34–6.76; P < .001) and admission to intensive care (OR, 4.34; 95% CI, 2.87–6.55; P < .001). Hospital-acquired coinfection predictors included fever (OR, 2.46; 95% CI, 1.15–5.27; P = .02) and advanced respiratory support (OR, 40.72; 95% CI, 13.49–122.93; P < .001). Patients with (vs without) community-onset coinfection had longer mechanical ventilation (OR, 3.31; 95% CI, 1.67–6.56; P = .001) and higher in-hospital mortality (OR, 1.90; 95% CI, 1.06–3.40; P = .03) and 60-day mortality (OR, 1.86; 95% CI, 1.05–3.29; P = .03). Patients with (vs without) hospital-acquired coinfection had higher discharge to LTCF (OR, 8.48; 95% CI, 3.30–21.76; P < .001), in-hospital mortality (OR, 4.17; 95% CI, 2.37–7.33; P < .001), and 60-day mortality (OR, 3.66; 95% CI, 2.11–6.33; P < .001).
Despite community-onset and hospital-acquired coinfection being uncommon, most patients hospitalized with COVID-19 received antibiotics. Admission from LTCF and to ICU were associated with increased risk of community-onset coinfection. Future studies should prospectively validate predictors of COVID-19 coinfection to facilitate the reduction of antibiotic use.
Background: Prevention of central-line–associated bloodstream infections (CLABSIs) and methicillin-resistant Staphylococcus aureus (MRSA) infections requires a multifaceted approach including strategies to decrease cutaneous bacterial colonization. Prior studies have shown benefit from chlorhexidine gluconate (CHG) skin application on CLABSI and MRSA infection rates in intensive care units (ICUs); however, the use of CHG in the non-ICU population has not been well studied. Methods: We performed a quasi-experimental before-and-after study to evaluate the use of daily 2% CHG wipes in non-ICU patients at a 1,000-bed acute-care teaching hospital beginning in November 2017. The study population included adult and pediatric patients with central venous catheters on non-ICU units, excluding patients on the following units: stem cell transplant and hematologic malignancy (these units had already established use of CHG skin application as a standard prior to the intervention), labor and delivery, and psychiatry. CHG was applied according to the manufacturer’s instructions by nurses or nurse aides, and random monthly auditing of compliance was performed. NHSN CLABSI, hospital-onset MRSA bacteremia, and hospital-onset MRSA LabID rates were compared for the 24-month period before the intervention (November 1, 2015, through October 31, 2017) to the 24-month period after the intervention (November 1, 2017, through October 31, 2019) using a paired t test. Notably, the health system also discontinued the use of contact precautions for patients with MRSA (excluding MRSA from open, draining wounds) 11 months prior to the onset of this intervention. Results: The CLABSI rate decreased by 26%, from 0.594 events per 1,000 central-line days (n = 50) before the intervention to 0.438 events per 1,000 central-line days (n = 38) after the intervention (P = 0.19). The number of CLABSIs with gram-positive organisms also decreased by 29%.
MRSA LabID rates decreased by 37% from 0.301 events per 1,000 patient days (n = 119) to 0.189 events per 1,000 patient days (n = 75) (P = 0.01). MRSA bacteremia rates decreased by 79% from 0.058 events per 1,000 patient days (n = 23) to 0.012 events per 1,000 patient days (n = 5) (P < 0.01). Compliance with the intervention was 83% (n = 225). Conclusions: Daily CHG skin application in non-ICU patients with central venous catheters is an effective strategy to prevent CLABSIs and MRSA infections. We observed a decrease in MRSA LabID and bacteremia rates despite discontinuation of contact precautions. These findings suggest that a horizontal prevention approach of daily CHG skin application may be an effective alternative to contact isolation to interrupt transmission of MRSA in hospitalized patients outside the ICU setting.
To evaluate whether incorporating mandatory prior authorization for Clostridioides difficile testing into antimicrobial stewardship pharmacist workflow could reduce testing in patients with alternative etiologies for diarrhea.
Single center, quasi-experimental before-and-after study.
Tertiary-care, academic medical center in Ann Arbor, Michigan.
Adult and pediatric patients admitted between September 11, 2019 and December 10, 2019 were included if they had an order placed for 1 of the following: (1) C. difficile enzyme immunoassay (EIA) in patients hospitalized >72 hours and received laxatives, oral contrast, or initiated tube feeds within the prior 48 hours, (2) repeat molecular multiplex gastrointestinal pathogen panel (GIPAN) testing, or (3) GIPAN testing in patients hospitalized >72 hours.
A best-practice alert prompting prior authorization by the antimicrobial stewardship program (ASP) for EIA or GIPAN testing was implemented. Approval required the provider to page the ASP pharmacist and discuss rationale for testing. The provider could not proceed with the order if ASP approval was not obtained.
An average of 2.5 requests per day were received over the 3-month intervention period. The weekly rate of EIA and GIPAN orders per 1,000 patient days decreased significantly from 6.05 ± 0.94 to 4.87 ± 0.78 (IRR, 0.72; 95% CI, 0.56–0.93; P = .010) and from 1.72 ± 0.37 to 0.89 ± 0.29 (IRR, 0.53; 95% CI, 0.37–0.77; P = .001), respectively.
We identified an efficient, effective C. difficile and GIPAN diagnostic stewardship approval model.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
An estimated 293,300 healthcare-associated cases of Clostridium difficile infection (CDI) occur annually in the United States. To date, research has focused on developing risk prediction models for CDI that work well across institutions. However, this one-size-fits-all approach ignores important hospital-specific factors. We focus on a generalizable method for building facility-specific models. We demonstrate the applicability of the approach using electronic health records (EHR) from the University of Michigan Hospitals (UM) and the Massachusetts General Hospital (MGH).
We utilized EHR data from 191,014 adult admissions to UM and 65,718 adult admissions to MGH. We extracted patient demographics, admission details, patient history, and daily hospitalization details, resulting in 4,836 features from patients at UM and 1,837 from patients at MGH. We used L2 regularized logistic regression to learn the models, and we measured the discriminative performance of the models on held-out data from each hospital.
Using the UM and MGH test data, the models achieved area under the receiver operating characteristic curve (AUROC) values of 0.82 (95% confidence interval [CI], 0.80–0.84) and 0.75 (95% CI, 0.73–0.78), respectively. Some predictive factors were shared between the 2 models, but many of the top predictive factors differed between facilities.
A data-driven approach to building models for estimating daily patient risk for CDI was used to build institution-specific models at 2 large hospitals with different patient populations and EHR systems. In contrast to traditional approaches that focus on developing models that apply across hospitals, our generalizable approach yields risk-stratification models tailored to an institution. These hospital-specific models allow for earlier and more accurate identification of high-risk patients and better targeting of infection prevention strategies.
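The modeling approach described above (L2-regularized logistic regression trained separately per facility and evaluated by AUROC on held-out data) can be sketched as follows. This is an illustrative sketch on synthetic data, assuming scikit-learn; the feature matrix, outcome, and dimensions are simulated stand-ins, not the study's actual EHR features.

```python
# Sketch: facility-specific L2-regularized logistic regression for daily
# CDI risk, evaluated by AUROC on a held-out split. Synthetic data only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_admissions, n_features = 5000, 50          # hypothetical sizes
X = rng.normal(size=(n_admissions, n_features))
true_w = rng.normal(size=n_features)
y = (X @ true_w + rng.normal(size=n_admissions) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# penalty="l2" with C = 1/lambda controlling regularization strength
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
model.fit(X_tr, y_tr)

auroc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(round(auroc, 2))
```

Training one such model per hospital, on that hospital's own extracted features, is what yields the institution-specific risk stratification the abstract describes.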
Peripherally inserted central catheters (PICCs) are associated with central-line–associated bloodstream infections (CLABSIs). However, no tools to predict risk of PICC-CLABSI have been developed.
To operationalize CLABSI risk factors in decisions regarding PICC use by developing a risk model that estimates an individual’s risk of PICC-CLABSI prior to device placement.
Using data from the Michigan Hospital Medicine Safety consortium, patients who experienced PICC-CLABSI between January 2013 and October 2016 were identified. A Cox proportional hazards model with robust sandwich standard-error estimates was then used to identify factors associated with PICC-CLABSI. Based on regression coefficients, points were assigned to each predictor and summed for each patient to create the Michigan PICC-CLABSI (MPC) score. The predictive performance of the score was assessed using time-dependent area-under-the-curve (AUC) values.
Of 23,088 patients who received PICCs during the study period, 249 (1.1%) developed a CLABSI. Significant risk factors associated with PICC-CLABSI included hematological cancer (3 points), CLABSI within 3 months of PICC insertion (2 points), multilumen PICC (2 points), solid cancers with ongoing chemotherapy (2 points), receipt of total parenteral nutrition (TPN) through the PICC (1 point), and presence of another central venous catheter (CVC) at the time of PICC placement (1 point). The MPC score was significantly associated with risk of CLABSI (P < .0001). For every 1-point increase, the hazard of CLABSI increased by a factor of 1.63 (95% confidence interval, 1.56–1.71). The area under the receiver operating characteristic curve was 0.67 to 0.77 for PICC dwell times of 6 to 40 days, indicating good model discrimination.
The MPC score offers a novel way to inform decisions regarding PICC use, surveillance of high-risk cohorts, and utility of blood cultures when PICC-CLABSI is suspected. Future studies validating the score are necessary.
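Because the abstract reports the exact point values for each predictor, the MPC score itself reduces to a small additive lookup. The sketch below encodes those published point values; the patient-record keys and dictionary format are hypothetical illustrations, not part of the published score.

```python
# Published MPC point values (from the abstract); record keys are hypothetical.
MPC_POINTS = {
    "hematologic_cancer": 3,        # hematological cancer
    "clabsi_within_3_months": 2,    # CLABSI within 3 months of insertion
    "multilumen_picc": 2,           # multilumen PICC
    "solid_cancer_on_chemo": 2,     # solid cancer with ongoing chemotherapy
    "tpn_through_picc": 1,          # TPN received through the PICC
    "other_cvc_present": 1,         # another CVC present at PICC placement
}

def mpc_score(patient: dict) -> int:
    """Sum the points for each risk factor present in the patient record."""
    return sum(pts for factor, pts in MPC_POINTS.items() if patient.get(factor))

# Example: multilumen PICC placed for TPN in a patient with another CVC
example = {"multilumen_picc": True, "tpn_through_picc": True, "other_cvc_present": True}
print(mpc_score(example))  # 4
```

A higher total corresponds to a higher hazard of PICC-CLABSI (hazard increasing by a factor of 1.63 per point, per the abstract).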
Inappropriate treatment of asymptomatic bacteriuria (ASB) in the hospital setting is common. We sought to evaluate the treatment rate of ASB at 3 hospitals and assess the impact of a hospitalist-focused improvement intervention.
Prospective, interventional trial.
Two community hospitals and a tertiary-care academic center.
Adult patients with a positive urine culture admitted to hospitalist services were included in this study. Exclusions included pregnancy, intensive care unit admission, history of a major urinary procedure, and actively being treated for a urinary tract infection (UTI) at the time of admission or >48 hours prior to urine collection.
An educational intervention using a pocket card was implemented at all sites followed by a pharmacist-based intervention at the academic center. Medical records of the first 50 eligible patients at each site were reviewed at baseline and after each intervention for signs and symptoms of UTI, microbiological results, antimicrobials used, and duration of treatment for positive urine cultures. Diagnosis of ASB was determined through adjudication by 2 hospitalists and 2 infectious diseases physicians.
Treatment rates of ASB decreased (23.5%; P=.001) after the educational intervention. Reductions in treatment rates for ASB differed by site and were greatest in patients without classic signs and symptoms of UTI (34.1%; P<.001) or urinary catheters (31.2%; P<.001). The pharmacist-based intervention was most effective at reducing ASB treatment rates in catheterized patients.
A hospitalist-focused educational intervention significantly reduced ASB treatment rates. The impact varied across sites and by patient characteristics, suggesting that a tailored approach may be useful.
Treatment of asymptomatic bacteriuria contributes to antimicrobial overuse in hospitalized patients. Indications for urine culture, treatment, and targets for improvement were evaluated in 153 patients. Drivers of antimicrobial overuse included fever with an alternative source, altered mental status, and leukocytosis, which led to 435 excess days of antimicrobial therapy.
Carbapenem-resistant Enterobacteriaceae (CRE) are clinically challenging, threaten patient safety, and represent an emerging public health issue. CRE reporting is not mandated in Michigan.
The Michigan Department of Community Health–led CRE Surveillance and Prevention Initiative enrolled 21 facilities (17 acute care and 4 long-term acute care facilities) across the state. Baseline data collection began September 1, 2012, and ended February 28, 2013 (duration, 6 months). Enrolled facilities voluntarily reported cases of Klebsiella pneumoniae and Escherichia coli according to the surveillance algorithm. Patient demographic characteristics, laboratory testing, microbiology, clinical, and antimicrobial information were captured via standardized data collection forms. Facilities reported admissions and patient-days each month.
One-hundred two cases over 957,220 patient-days were reported, resulting in a crude incidence rate of 1.07 cases per 10,000 patient-days. Eighty-nine case patients had test results positive for K. pneumoniae, whereas 13 had results positive for E. coli. CRE case patients had a mean age of 63 years, and 51% were male. Urine cultures (61%) were the most frequently reported specimen source. Thirty-five percent of cases were hospital onset; sixty-five percent were community onset (CO), although 75% of CO case patients reported healthcare exposure within the previous 90 days. Cardiovascular disease, renal failure, and diabetes mellitus were the most frequently reported comorbid conditions. Common risk factors included surgery within the previous 90 days, recent infection or colonization with a multidrug-resistant organism, and recent exposures to antimicrobials, especially third- or fourth-generation cephalosporins.
CRE are found throughout Michigan healthcare facilities. Implementing a regional, coordinated surveillance and prevention initiative may prevent CRE from becoming hyperendemic in Michigan.
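As a quick arithmetic check (not part of the original report), the crude incidence rate follows directly from the counts given in the abstract:

```python
# 102 CRE cases observed over 957,220 patient-days, expressed per 10,000
# patient-days as in the abstract.
cases = 102
patient_days = 957_220
rate_per_10k_patient_days = cases / patient_days * 10_000
print(round(rate_per_10k_patient_days, 2))  # 1.07
```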
Urine cultures are frequently obtained for hospitalized patients. We reviewed documented indications for culture and compared these with professional society guidelines. Lack of documentation and important clinical scenarios (before orthopedic procedures and when the patient has altered mental status without a urinary catheter) are highlighted as areas of use outside of current guidelines.
To determine relative rates of blood culture contamination for 3 skin antisepsis interventions—10% povidone iodine aqueous solution (PI), 2% iodine tincture (IT), and 2% chlorhexidine gluconate in 70% isopropyl alcohol (CHG)—when used by dedicated phlebotomy teams to obtain peripheral blood cultures.
Randomized crossover trial with hospital floor as the unit of randomization.
Teaching hospital with 885 beds.
All adult patients undergoing peripheral blood culture collection on 3 medical-surgical floors from May 2009 through September 2009.
Each antisepsis intervention was used for 5 months on each study floor, with random crossover after a 1-month washout period. Phlebotomy teams collected all peripheral blood cultures. Each positive blood culture was adjudicated by physicians blinded to the intervention and scored as a true positive or contaminated blood culture. The primary outcome was the rate of blood culture contamination for each antisepsis agent.
In total, 12,904 peripheral blood culture sets were evaluated, of which 735 (5.7%) were positive. There were 98 contaminated cultures, representing 13.3% of all positive cultures. The overall blood culture contamination rate for the study population was 0.76%. Intent-to-treat rates of contaminated blood cultures were not significantly different among the 3 antiseptics (P = .18), yielding 0.58% with PI (95% confidence interval [CI], 0.38%-0.86%), 0.76% with IT (95% CI, 0.52%-1.07%), and 0.93% with CHG (95% CI, 0.67%-1.27%).
Choice of antiseptic agent does not impact contamination rates when blood cultures are obtained by a phlebotomy team and should, therefore, be based on costs or preference.
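As a quick arithmetic check (not part of the original study), the summary percentages in the results above follow directly from the reported counts:

```python
# Reported counts: 12,904 peripheral blood culture sets, 735 positive,
# 98 adjudicated as contaminated.
total_sets = 12_904
positives = 735
contaminated = 98

print(round(positives / total_sets * 100, 1))    # 5.7  -> % of sets positive
print(round(contaminated / positives * 100, 1))  # 13.3 -> % of positives contaminated
print(round(contaminated / total_sets * 100, 2)) # 0.76 -> overall contamination rate, %
```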
Central line-associated bloodstream infections (CLABSIs) have been reduced in number but not eliminated in our intensive care units with use of central line bundles. We performed an analysis of remaining CLABSIs. Many bloodstream infections that met the definition of CLABSI had sources other than central lines or represented contaminated blood samples.
To describe the rate of infection, associated organisms, and potential risk factors for ventilator-associated pneumonia (VAP) in patients receiving mechanical ventilation at home.
Retrospective cohort study.
University-affiliated home care service.
Patients receiving mechanical ventilation at home from June 1995 through December 2001.
Fifty-seven patients underwent ventilation at home for a total of 50,762 ventilator-days (mean ± SD, 890.6 ± 644.43 days; range, 76-2,458 days). Seventy-nine episodes of VAP occurred in 27 patients (rate, 1.55 episodes per 1,000 ventilator-days). The first episode of VAP occurred after a mean (±SD) of 245 ± 318.07 ventilator-days. VAP was most common during the first 500 days of ventilation. Rates of VAP were higher among patients who required ventilation for longer daily durations, compared with those who required it for shorter daily durations. There was no association of VAP with age, sex, underlying disease, reason for ventilation, antacid therapy, or steroid use. Microorganisms isolated from 33 episodes of VAP with available culture results included Pseudomonas species (17 isolates), Staphylococcus aureus (11), Serratia species (7), and Stenotrophomonas species (5). Eight patients died during the study; no deaths were attributed to pneumonia.
Although the organisms associated with VAP in the home setting are similar to those associated with hospital-acquired VAP, the incidence and mortality are much lower in the home care setting. Interventions to reduce the risk of VAP among patients receiving home care should focus on patients who require ventilation for longer daily durations or who are new to receiving mechanical ventilation at home.