Background: Prompt identification of patients colonized or infected with carbapenem-resistant Enterobacterales (CRE) upon admission can help ensure rapid initiation of infection prevention measures and may reduce intrafacility transmission of CRE. The Chicago CDC Prevention Epicenters Program previously created a CRE prediction model using state-wide public health data (doi: 10.1093/ofid/ofz483). We evaluated how well a similar model performed using data from a single academic healthcare system in Atlanta, Georgia, and we sought to determine whether including additional variables improved performance. Methods: We performed a case–control study using electronic medical record data. We defined cases as adult encounters to acute-care hospitals in a 4-hospital academic healthcare system from January 1, 2014, to December 31, 2021, with CRE identified from a clinical culture within the first 3 hospital days. Only the first qualifying encounter per patient was included. We frequency matched cases to control admissions (no CRE identified) from the same hospital and year. Using multivariable logistic regression, we compared 2 models. The “public health model” included 4 variables from the Chicago Epicenters model (age, number of hospitalizations in the prior 365 days, mean length of stay in hospitalizations in the prior 365 days, and hospital admission with an infection diagnosis in the prior 365 days). The “healthcare system model” added 4 additional variables (admission to the ICU in the prior 365 days, malignancy diagnosis, Elixhauser score, and inpatient antibiotic days of therapy in the prior 365 days) to the public health model. We used billing codes to determine Elixhauser score, malignancy status, and recent infection diagnoses. We compared model performance using the area under the receiver operating characteristic curve (AUC). Results: We identified 105 cases and 441,460 controls (Table 1). CRE was most frequently identified in urine cultures (46%).
The 4 variables in the public health model and the 4 additional variables in the healthcare system model were all significantly associated with being a case in unadjusted analyses (Table 1). The AUC for the public health model was 0.76, and the AUC for the healthcare system model was 0.79 (Table 2; Fig. 1). In both models, a prior admission with an infection diagnosis was the most significant risk factor. Conclusions: A modified CRE prediction model developed using public health data and focused on prior healthcare exposures performed reasonably well when applied to a different academic healthcare system. The addition of variables accessible in large healthcare networks did not meaningfully improve model discrimination.
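The model comparison above hinges on the AUC, which for a logistic-type risk score equals the probability that a randomly chosen case outscores a randomly chosen control. A minimal sketch of that comparison on synthetic data (the effect sizes, counts, and variable names below are invented for illustration and are not the study's data):

```python
import numpy as np

def auc(y_true, scores):
    """Rank-based AUC: probability a random case outscores a random control
    (ties count one-half). Equivalent to the Mann-Whitney U statistic."""
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    pos, neg = scores[y_true == 1], scores[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Synthetic case-control data: a "base" risk score, and an "extended" score
# that adds one extra informative variable. All numbers are invented.
rng = np.random.default_rng(0)
n_cases, n_controls = 100, 1000
base_cases = rng.normal(1.0, 1.0, n_cases)        # cases score higher on average
base_controls = rng.normal(0.0, 1.0, n_controls)
extra_cases = base_cases + rng.normal(0.5, 0.5, n_cases)       # added signal
extra_controls = base_controls + rng.normal(0.0, 0.5, n_controls)

y = np.r_[np.ones(n_cases), np.zeros(n_controls)]
auc_base = auc(y, np.r_[base_cases, base_controls])
auc_extended = auc(y, np.r_[extra_cases, extra_controls])
print(f"base AUC {auc_base:.2f}, extended AUC {auc_extended:.2f}")
```

As in the abstract, a modest gain in AUC from extra variables may or may not be worth the added data-collection burden.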
Background: Environmental contamination is a major risk factor for multidrug-resistant organism (MDRO) exposure and transmission in the healthcare setting. Sponge-stick sampling methods have been developed and validated for MDRO epidemiological investigations, leading to their recommendation by public health agencies. However, similar bacteriological yields with more readily available methods that require less processing time or specialized equipment have also been reported. We compared the ability of 4 sampling methods to recover a variety of MDRO taxa from a simulated contaminated surface. Methods: We assessed the ability of (1) cotton swabs moistened with phosphate buffer solution (PBS), (2) E-swabs moistened with E-swab solution, (3) cellulose-containing sponge sticks (CSS), and (4) non–cellulose-containing sponge sticks (NCS) to recover extended-spectrum β-lactamase (ESBL)–producing Escherichia coli, carbapenem-resistant Pseudomonas aeruginosa (CRPA), carbapenem-resistant Acinetobacter baumannii (CRAB), methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus faecium (VRE), and a mixture that contained VRE, MRSA, and ESBL organisms. A solution of known bacterial inoculum (~10⁵ CFU/mL) was made for each MDRO. Then, 1 mL of solution was pipetted onto a stainless-steel surface (8 × 12 inch) in 5 µL dots and allowed to dry for 1 hour. All samples were collected by 1 individual to minimize variation in technique. Sponge sticks were expressed in PBS containing 0.02% Tween 80 using a stomacher, were centrifuged, and were then resuspended in PBS. Cotton and E-swabs were spun in a vortexer. Then, 1 mL of fluid from each method was plated to selective and nonselective media in duplicate and incubated at 35°C for 24 hours (MRSA plates, 48 hours) (Fig. 1). CFU per square inch and percentage recovery were calculated. Results: Table 1 shows the CFU per square inch and percentage recovery for each sampling method–MDRO taxa combination.
The percentage recovery varied across MDRO taxa: across all methods, the lowest recovery was for CRPA and the highest was for VRE. Regardless of taxon, percentage recovery was higher for the sponge-stick methods (CSS and NCS) than for the swab methods (cotton and E-swab) (Table 1 and Fig. 2).
Conclusions: These findings support the preferential use of sponge sticks for the recovery of MDROs from the healthcare environment, despite the additional processing time and equipment they require. Further studies are needed to assess the robustness of these findings in noncontrived specimens, as well as the comparative effectiveness of different sampling methods for non–culture-based MDRO detection.
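The study's two outcome measures reduce to simple arithmetic once the applied inoculum and sampled area are fixed. A short sketch with a hypothetical recovered count (only the ~10⁵ CFU/mL inoculum, 1 mL application, and 8 × 12 inch surface come from the abstract; the 42,000 CFU figure is invented):

```python
# Outcome measures from the sampling experiment, as plain arithmetic.
applied_cfu = 1e5            # CFU deposited: 1 mL of a ~1e5 CFU/mL suspension
area_sq_in = 8 * 12          # stainless-steel surface, square inches
recovered_cfu = 42_000       # hypothetical CFU recovered by one method

cfu_per_sq_in = recovered_cfu / area_sq_in
pct_recovery = 100 * recovered_cfu / applied_cfu
print(f"{cfu_per_sq_in:.1f} CFU per square inch, {pct_recovery:.0f}% recovery")
```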
The incidence of infections from extended-spectrum β-lactamase (ESBL)–producing Enterobacterales (ESBL-E) is increasing in the United States. We describe the epidemiology of ESBL-E at 5 Emerging Infections Program (EIP) sites.
During October–December 2017, we piloted active laboratory- and population-based (New York, New Mexico, Tennessee) or sentinel (Colorado, Georgia) ESBL-E surveillance. An incident case was the first isolation, within a 30-day period, of Escherichia coli or Klebsiella pneumoniae/oxytoca from normally sterile body sites or urine of a surveillance-area resident, resistant to ≥1 extended-spectrum cephalosporin and nonresistant to all carbapenems tested at a clinical laboratory. Demographic and clinical data were obtained from medical records. The Centers for Disease Control and Prevention (CDC) performed reference antimicrobial susceptibility testing and whole-genome sequencing on a convenience sample of case isolates.
We identified 884 incident cases. The estimated annual incidence in sites conducting population-based surveillance was 199.7 per 100,000 population. Overall, 800 isolates (96%) were from urine, and 790 (89%) were E. coli. Also, 393 cases (47%) were community-associated. Among 136 isolates (15%) tested at the CDC, 122 (90%) met the surveillance definition phenotype; 114 (93%) of 122 were shown to be ESBL producers by clavulanate testing. In total, 111 (97%) of confirmed ESBL producers harbored a blaCTX-M gene. Among ESBL-producing E. coli isolates, 52 (54%) were ST131; 44% of these cases were community associated.
The burden of ESBL-E was high across surveillance sites, with nearly half of cases acquired in the community. EIP has implemented ongoing ESBL-E surveillance to inform prevention efforts, particularly in the community, and to monitor for the emergence of new ESBL-E strains.
In the UK, postnatal depression is more common in British South Asian women than in White Caucasian women. Cognitive–behavioural therapy (CBT) is recommended as a first-line treatment, but there is little evidence on adapting CBT for postnatal depression to ensure its applicability to different ethnic groups.
To evaluate the clinical and cost-effectiveness of a CBT-based positive health programme group intervention in British South Asian women with postnatal depression.
We have designed a multicentre, two-arm, partially nested, randomised controlled trial with 4- and 12-month follow-up, comparing a 12-session group CBT-based intervention (positive health programme) plus treatment as usual with treatment as usual alone, for British South Asian women with postnatal depression. Participants will be recruited from primary care and appropriate community venues in areas of high South Asian density across the UK. It has been estimated that randomising 720 participants (360 into each group) will be sufficient to detect a clinically important difference between a 55% recovery rate in the intervention group and a 40% recovery rate in the treatment-as-usual group. An economic analysis will estimate the cost-effectiveness of the positive health programme. A qualitative process evaluation will explore barriers and enablers to study participation and examine the acceptability and impact of the programme from the perspective of British South Asian women and other key stakeholders.
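For readers wanting to sanity-check the sample-size target, the standard two-proportion formula gives the unadjusted per-group n for detecting 55% vs 40% recovery. A sketch assuming 90% power and a two-sided α of 0.05 (the protocol's exact power assumption, design-effect inflation for the partially nested design, and attrition allowance are not stated in this summary, which is why the result is well below the 360 per group being randomised):

```python
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.90):
    """Unadjusted per-group sample size for comparing two proportions
    (normal approximation, pooled variance under H0). Deliberately ignores
    clustering and attrition, which inflate the trial's real target."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return numerator / (p1 - p2) ** 2

n = n_per_group(0.55, 0.40)   # 55% vs 40% recovery, as in the trial
print(round(n))               # well under 360 per group before design inflation
```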
To describe the epidemiology of carbapenem-resistant Enterobacterales (CRE) bacteriuria and to determine whether urinary catheters increase the risk of subsequent CRE bacteremia.
Using active population- and laboratory-based surveillance we described a cohort of patients with incident CRE bacteriuria and identified risk factors for developing CRE bacteremia within 1 year.
The study was conducted among the 8 counties of Georgia Health District 3 (HD3) in Atlanta, Georgia.
Residents of HD3 with CRE first identified in urine between 2012 and 2017.
We identified 464 patients with CRE bacteriuria (mean yearly incidence, 1.96 cases per 100,000 population). Of the 425 patients with chart review, most had a urinary catheter (56%), and many resided in long-term care facilities (48%), had a Charlson comorbidity index >3 (38%), or had a decubitus ulcer (37%). Overall, 21 patients (5%) developed CRE bacteremia with the same organism within 1 year. Risk factors for subsequent bacteremia included presence of a urinary catheter (odds ratio [OR], 8.0; 95% confidence interval [CI], 1.8–34.9), central venous catheter (OR, 4.3; 95% CI, 1.7–10.6) or another indwelling device (OR, 4.3; 95% CI, 1.6–11.4), urine culture obtained as an inpatient (OR, 5.7; 95% CI, 1.3–25.9), and being in the ICU in the week prior to urine culture (OR, 2.9; 95% CI, 1.1–7.8). In a multivariable analysis, urinary catheter increased the risk of CRE bacteremia (OR, 5.3; 95% CI, 1.2–23.6).
In patients with CRE bacteriuria, urinary catheters increase the risk of CRE bacteremia. Future interventions should aim to reduce inappropriate insertion of urinary catheters and to promote their early removal.
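The unadjusted odds ratios above come from 2×2 tables, with CIs from the Wald method. A sketch with illustrative counts (the counts below are invented, chosen so the output lands near the abstract's unadjusted urinary catheter estimate of OR 8.0; 95% CI, 1.8–34.9):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
       a = exposed cases,   b = exposed non-cases,
       c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    half_width = z * sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - half_width), exp(log(or_) + half_width)

# Invented counts: 19 of 21 bacteremia patients catheterized, vs 219 of 404
# patients without bacteremia.
or_, lo, hi = odds_ratio_ci(a=19, b=219, c=2, d=185)
print(f"OR {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```

The very wide interval reflects the small number of unexposed cases, a typical feature of rare-outcome analyses like this one.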
Background: Carbapenem-resistant Enterobacteriaceae (CRE), particularly carbapenemase-producing (CP) CRE, pose a major public health threat. In 2016, the phenotypic definition of CRE expanded to include ertapenem resistance to improve sensitivity for detecting CP-CRE. We compared characteristics of CRE resistant to ertapenem only (CRE-EO) to CRE resistant to ≥1 other carbapenem (CRE-O). Methods: The Georgia Emerging Infections Program performs active, population-based CRE surveillance in metropolitan Atlanta. CRE cases were defined as any Escherichia coli, Klebsiella pneumoniae, K. oxytoca, K. variicola, Enterobacter cloacae complex, or Enterobacter aerogenes resistant to ≥1 carbapenem by the clinical laboratory and isolated from urine or a sterile site between 2016 and 2018. Data were collected via retrospective chart review, and 90-day mortality was obtained from Georgia vital statistics for 2016–2017. Polymerase chain reaction (PCR) testing for carbapenemase genes was performed on a convenience sample of isolates by the CDC or the Georgia Public Health Laboratory. We compared characteristics of CRE-EO cases to CRE-O cases using χ2 tests or t tests. Results: Among 927 CRE isolates, 553 (60%) were CRE-EO. CRE-EO were less frequently isolated from blood (5% vs 12%; P < .01) and were less commonly K. pneumoniae (21% vs 58%; P < .01) than CRE-O. CRE-EO cases were more often women (65% vs 50%; P < .01), had a lower Charlson comorbidity index (mean ± SD, 2.4±2.3 vs 3.0±2.6; P < .01), and were less commonly at a long-term care facility (24% vs 31%) or hospital (15% vs 21%; P < .01) in the 4 days prior to the CRE culture. CRE-EO were more susceptible to all antibiotics tested at the clinical laboratory (P < .01) except tigecycline (P = 1.0) (Table 1). Of the 300 (32%) isolates tested for carbapenemase genes, 98 (33%) were positive (7% of CRE-EO vs 62% of CRE-O; P < .01). Among the CP isolates, we identified blaKPC in 93 cases (95%), blaNDM in 3 cases (3%), and blaOXA-48-like in 2 cases (2%).
CRE-EO cases had lower 90-day mortality (13% vs 21%; P < .01). Conclusions: CRE-EO are epidemiologically distinct from CRE-O and are less likely to harbor carbapenemase genes. CRE-EO may require less intensive infection prevention interventions and have more therapeutic options.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) is an important cause of healthcare-associated infections with limited treatment options and high mortality. To describe risk factors for mortality, we evaluated characteristics associated with 30-day mortality in patients with CRAB identified through the Emerging Infections Program (EIP). Methods: From January 2012 through December 2017, 8 EIP sites (CO, GA, MD, MN, NM, NY, OR, TN) participated in active, laboratory- and population-based surveillance for CRAB. An incident case was defined as a patient's first isolation in a 30-day period of A. baumannii complex from a sterile site or urine with resistance to ≥1 carbapenem (excluding ertapenem). Medical records were abstracted. Patients were matched to state vital records to assess mortality within 30 days of incident culture collection. We developed 2 multivariable logistic regression models (1 for sterile-site cases and 1 for urine cases) to evaluate characteristics associated with 30-day mortality. Results: We identified 744 patients contributing 863 cases; 185 cases (21.4%) died within 30 days of culture, including 113 of 257 sterile-site cases (44.0%) and 72 of 606 urine cases (11.9%). Among 628 hospitalized cases, death occurred in 159 cases (25.3%). Among fatal cases that had been hospitalized, death occurred after hospital discharge in 27 of 57 urine cases (47.4%) and 21 of 102 sterile-site cases (20.6%). Among sterile-site cases, female sex, intensive care unit (ICU) stay after culture, location in a healthcare facility, including a long-term care facility (LTCF), 3 days before culture, and diagnosis of septic shock were associated with increased odds of death in the model (Fig. 1).
In urine cases, age 40–54 or ≥75 years, ICU stay after culture, presence of an indwelling device other than a urinary catheter or central line (eg, endotracheal tube), location in a LTCF 3 days before culture, diagnosis of septic shock, and Charlson comorbidity score ≥3 were associated with increased odds of mortality (Fig. 2). Conclusion: Overall 30-day mortality was high among patients with CRAB, including patients with CRAB isolated from urine. A substantial fraction of mortality occurred after discharge, especially among patients with urine cases. Although there were some differences in characteristics associated with mortality in patients with CRAB isolated from sterile sites versus urine, LTCF exposure and severe illness were associated with mortality in both patient groups. CRAB was associated with major mortality in these patients with evidence of healthcare experience and complex illness. More work is needed to determine whether prevention of CRAB infections would improve outcomes.
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates were tested on either a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). 
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
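The surveillance phenotype under discussion is mechanical to compute once MICs are in hand: an isolate is CRE if it tests resistant to at least one carbapenem. A sketch using CLSI resistant breakpoints for Enterobacterales (the breakpoint values below are stated from the revised post-2010 CLSI criteria and should be confirmed against the M100 edition in use):

```python
# Resistant MIC breakpoints (ug/mL) for Enterobacterales, per revised CLSI
# criteria -- an assumption here; confirm against the current M100 edition.
RESISTANT_MIC = {"doripenem": 4, "ertapenem": 2, "imipenem": 4, "meropenem": 4}

def is_cre(mics):
    """Surveillance phenotype: resistant to >=1 carbapenem tested.
    `mics` maps drug name -> MIC (ug/mL) for the drugs actually tested."""
    return any(mic >= RESISTANT_MIC[drug] for drug, mic in mics.items())

# An isolate resistant only to ertapenem still meets the definition,
# which is exactly the group driving the nonconfirmation pattern above.
print(is_cre({"ertapenem": 4, "meropenem": 0.5, "imipenem": 1}))   # True
print(is_cre({"ertapenem": 0.25, "meropenem": 0.25}))              # False
```

Because ertapenem has the lowest resistant breakpoint, a one- or two-dilution shift between instruments can flip an ertapenem-only isolate across the CRE threshold, consistent with the nonconfirmation findings.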
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are a major public health problem. Ceftazidime-avibactam (CZA) is a treatment option for CRE approved in 2015; however, it does not have activity against isolates with metallo-β-lactamases (MBLs). Emerging resistance to CZA is a cause for concern. Our objective was to describe the microbiologic and epidemiologic characteristics of CZA-resistant (CZA-R) CRE. Methods: From 2015 to 2017, 9 states participated in laboratory- and population-based surveillance for carbapenem-resistant Escherichia coli, Klebsiella pneumoniae, K. oxytoca, K. aerogenes, and Enterobacter cloacae complex isolates from a normally sterile site or urine. A convenience sample of isolates from this surveillance was sent to the CDC for antimicrobial susceptibility testing (AST) using reference broth microdilution (BMD) including an MBL screen, species confirmation with MALDI-TOF, and real-time PCR to detect blaKPC, blaNDM, and blaOXA-48–like genes. Additional AST by BMD was performed on CZA-R isolates using meropenem-vaborbactam (MEV), imipenem-relebactam (IMR), plazomicin (PLZ), and eravacycline (ERV). Epidemiologic data were obtained from a medical record review. Community-associated cases were defined as having no healthcare exposures in the year prior to culture, no devices in place 2 days prior to culture, and culture collected before calendar day 3 after hospital admission. Data were analyzed in 3 groups: CRE that were CZA-susceptible (CZA-S), CZA-R due to blaNDM, and CZA-R without blaNDM. Results: Among 606 confirmed CRE tested with CZA, 33 (5.4%) were CZA-R. Of the CZA-R isolates, 16 (48.5%) harbored a blaNDM gene, of which 2 coharbored blaNDM and blaOXA-48-like genes; 9 (27.3%) harbored only a blaKPC gene. Of the 17 CZA-R isolates without blaNDM, all were MBL screen negative.
CZA-R cases due to blaNDM were more frequently community-associated (43.8%) than CZA-S cases or CZA-R cases without blaNDM (11.0% and 5.9%, respectively); a higher percentage of CZA-R cases due to blaNDM also had recent international travel (25%) compared to the other groups (1.8% and 5.9%, respectively). CZA-R isolates without blaNDM were more susceptible to MEV (76%), IMR (71%), PLZ (88%), and ERV (65%) than CZA-R isolates due to blaNDM (19%, 6%, 56%, and 44%, respectively). Conclusions: The emergence of CZA-R isolates without blaNDM is concerning; however, these isolates are more susceptible to newer antimicrobials than those with blaNDM. In addition to high rates of resistance to newer antimicrobials, isolates with blaNDM are more frequently community-associated than other CRE. This underscores the need for more aggressive measures to stop the spread of CRE.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) represent a significant antibiotic resistance threat, in part because carbapenemase genes can spread on mobile genetic elements. Here, we describe the molecular epidemiology and outcomes of patients with CRE bacteriuria from the same city in a nonoutbreak setting. Methods: The Georgia Emerging Infections Program performs active, population-based CRE surveillance in Atlanta. We studied a cohort of patients with CRE (resistant to all tested third-generation cephalosporins and ≥1 carbapenem, excluding ertapenem) first identified in urine, and not in a prior or simultaneous sterile site, between 2012 and 2015. Whole-genome sequencing (WGS) was performed on a convenience sample. We obtained epidemiologic and outcome data through chart review and Georgia Vital Statistics records (90-day mortality). Using WGS, we created a core-genome alignment-based phylogenetic tree of the Klebsiella pneumoniae isolates and calculated the pairwise SNP differences between samples. Using SAS version 9.4 software, we performed Fisher exact tests and computed univariable odds ratios (ORs) with 95% CIs to compare patient isolates with and without a carbapenemase gene. Results: Among 81 patients included, the median age was 68 (IQR, 57–74) years, and most were female (58%), black (60%), and resided in a long-term care facility 4 days prior to culture isolation (53%). Organisms isolated were K. pneumoniae (84%), Escherichia coli (7%), Enterobacter cloacae (7%), and Klebsiella oxytoca (1%). WGS identified at least 1 β-lactamase gene in 91% of the isolates; 85% contained a carbapenemase gene, the most frequent of which was blaKPC-3 (94%). Patients with CRE containing a carbapenemase gene were more likely to be black (OR, 3.7; 95% CI, 1.0–13.8) and to have K. pneumoniae (OR, 8.9; 95% CI, 2.2–35.0). Using a core-genome alignment of 3,708 genes (~63% of the complete genome), we identified a median of 67 (IQR, 23–3,881) SNP differences between pairs of K. pneumoniae isolates. A phylogenetic tree showed clustering by carbapenemase gene and multilocus sequence type (84% were ST258) but not by referring laboratory or county of residence (Fig. 1). Although 7% of patients developed an invasive CRE infection within 1 year and 21% died within 90 days, having a carbapenemase gene was not associated with these outcomes. Conclusions: Molecular sequencing of a convenience sample of CRE bacteriuria isolates supports K. pneumoniae ST258 harboring blaKPC-3 being distributed throughout the Atlanta area, across the healthcare continuum. Overall mortality was high in this population, but the presence of carbapenemase genes was not associated with worse outcomes.
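The pairwise SNP distances reported above are, at bottom, counts of differing positions in a core-genome alignment. A minimal sketch (real pipelines use dedicated alignment and SNP-calling tools; this toy version simply counts mismatches between two already-aligned sequences, skipping gaps and ambiguous bases):

```python
def snp_distance(seq_a, seq_b):
    """Count differing positions between two aligned sequences, skipping
    gaps ('-') and ambiguous bases (anything outside A/C/G/T)."""
    assert len(seq_a) == len(seq_b), "sequences must come from one alignment"
    valid = set("ACGT")
    return sum(1 for x, y in zip(seq_a, seq_b)
               if x in valid and y in valid and x != y)

print(snp_distance("ACGTAC-TA", "ACGTGCNTT"))   # -> 2
```

Small distances between isolates from different facilities are the signal behind the conclusion that a single clone circulates across the healthcare continuum.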
Background: Extended-spectrum β-lactamase–producing (ESBL) Escherichia coli infection incidence is increasing in the United States. This increase may be due to the rapid expansion of ST131, which is now the predominant ESBL strain globally, is often multidrug resistant, and has been shown to establish longer-term human colonization than other E. coli strains. We assessed potential risk factors that distinguish ST131 from other ESBL E. coli. Methods: From October 1 through December 31, 2017, 5 CDC Emerging Infections Program (EIP) sites pilot tested active, laboratory-based surveillance in selected counties in Colorado, Georgia, New Mexico, New York, and Tennessee. An E. coli case was defined as the first isolation, in a 30-day period, from a normally sterile body site or urine of a surveillance-area resident, of E. coli resistant to ≥1 extended-spectrum cephalosporin antibiotic and susceptible or intermediate to all carbapenem antibiotics tested. Epidemiologic data were collected from case patients' medical records. A convenience sample of 117 E. coli isolates from case patients was collected. All isolates underwent whole-genome sequencing to determine sequence type and the presence of ESBL genes. We compared the epidemiology of ST131 E. coli to that of other ESBL E. coli. Results: Among 117 E. coli isolates, 97 (83%) were ESBL producers. Of the 97 ESBL E. coli, 52 (54%) were ST131 (range for the 4 EIP sites submitting >10 isolates, 25%–88%; P < .001). Other common STs were ST38 (12%) and ST10 (5%). ST131 infections were more likely to be healthcare-associated than non-ST131 infections (56% vs 36%; P = .05) (Table 1). Among specific prior healthcare exposures, only residence in long-term care facilities (LTCFs) in the year before culture was more common among ST131 case patients (29% vs 11%; P = .03). Notably, 85% of ESBL E. coli collected from LTCF residents were ST131. ST131 E. coli were more common among patients with underlying medical conditions (81% vs 60%; P = .02).
No statistically significant differences by sex, race, age, culture source, location of culture collection, or frequency of antibiotic use in the prior 30 days were observed. Conclusions: The prevalence of ST131 E. coli varies regionally. The association between ST131 and LTCFs suggests that these may be particularly important settings for ST131 acquisition. Improving infection control measures that limit ESBL transmission in these settings and preventing dissemination in facilities receiving patients from LTCFs may be necessary to contain ST131 spread.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) is a serious threat to patient safety due to limited treatment options and propensity to spread in healthcare settings. Using Emerging Infections Program (EIP) data, we describe changes in CRAB incidence and epidemiology. Methods: During January 2012 to December 2018, 9 sites (Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee) participated in active laboratory- and population-based surveillance. An incident case was defined as the first isolation of A. baumannii complex, in a 30-day period, resistant to ≥1 carbapenem (excluding ertapenem) from a normally sterile site or urine of a surveillance area resident. Cases were considered hospital-onset (HO) if the culture was collected >3 days after hospital admission; all others were community-onset (CO). Cases were classified as device-associated (DA) if the patient had ≥1 medical device (ie, urinary catheter, central venous catheter [CVC], endotracheal/nasotracheal tube, tracheostomy, or another indwelling device) present in the 2 days prior to culture collection. Temporal trends were estimated using generalized linear models adjusted for age, race, sex, and EIP site. Results: Overall, 984 incident CRAB cases were identified, representing 849 patients. Among these patients, 291 (34%) were women, 510 (61%) were nonwhite, and the median age was 62 years (mean, 59; range, 0–102). Among the cases, 226 (23%) were HO; 758 (77%) were CO; and 793 (81%) were DA. Overall incidence rates in 2012 and 2018 were 1.58 (95% CI, 1.29–1.90) and 0.60 (95% CI, 0.40–0.67) per 100,000 population, respectively. There was a 15% annual decrease in incidence (adjusted rate ratio [aRR], 0.85; 95% CI, 0.82–0.88; P < .0001). Decreases were observed among sterile-site (aRR, 0.88; 95% CI, 0.84–0.93) and urine cases (aRR, 0.83; 95% CI, 0.80–0.87).
Annual decreases occurred for HO cases (aRR, 0.78; 95% CI, 0.73–0.85) and CO cases (aRR, 0.86; 95% CI, 0.83–0.90). The DA cases decreased 16% annually overall (aRR, 0.84; 95% CI, 0.81–0.88). Decreases among cases in patients with a CVC (aRR, 0.85; 95% CI, 0.80–0.90) or urinary catheter (aRR, 0.84; 95% CI, 0.80–0.88) were smaller than those seen in patients with other indwelling devices (aRR, 0.81; 95% CI, 0.77–0.86). Discussion: Overall, from 2012 to 2018, the incidence of CRAB decreased >60%. Decreases were observed in all case groups, regardless of source, infection onset location, or type of device. Smaller annual decreases in rates of CO-CRAB than HO-CRAB suggest that there may be opportunities to accelerate prevention outside the hospital to further reduce the incidence of these difficult-to-treat infections.
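As a crude cross-check of the reported trend: if incidence declines log-linearly, the annual rate ratio is the exponential of the per-year slope of the log rate. Using only the two endpoint rates from the abstract (this ignores the model's adjustment for age, race, sex, and site, so close agreement with the adjusted aRR is a feature of these particular data, not guaranteed in general):

```python
from math import exp, log

# Two-point check: under a log-linear decline, the annual rate ratio is
# exp(slope of log incidence per year). Uses only the endpoint rates above.
r_2012, r_2018 = 1.58, 0.60            # cases per 100,000 population
annual_rr = exp(log(r_2018 / r_2012) / (2018 - 2012))
print(round(annual_rr, 2))             # -> 0.85
```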
Collaborative care can support the treatment of depression in people with long-term conditions, but long-term benefits and costs are unknown.
To explore the long-term (24-month) effectiveness and cost-effectiveness of collaborative care in people with mental-physical multimorbidity.
A cluster randomised trial compared collaborative care (integrated physical and mental healthcare) with usual care for depression alongside diabetes and/or coronary heart disease. Depression symptoms were measured by the symptom checklist-depression scale (SCL-D13). The economic evaluation was from the perspective of the English National Health Service.
In total, 191 participants were allocated to collaborative care and 196 to usual care. At 24 months, the mean SCL-D13 score was 0.27 points lower in the collaborative care group (95% CI, −0.48 to −0.06), alongside a gain of 0.14 quality-adjusted life-years (QALYs; 95% CI, 0.06–0.21). The cost per QALY gained was £13 069.
In the long term, collaborative care reduces depression and is potentially cost-effective at internationally accepted willingness-to-pay thresholds.
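The cost-effectiveness figure above is an incremental cost-effectiveness ratio (ICER): incremental cost divided by incremental QALYs. Rearranging the reported numbers back-calculates the implied incremental cost per patient (approximate, since the published figures are rounded):

```python
# ICER = incremental cost / incremental QALYs. Back-calculate the implied
# incremental cost from the two reported (rounded) figures.
qaly_gain = 0.14                 # QALYs gained with collaborative care
cost_per_qaly = 13_069           # reported cost per QALY gained (GBP)
incremental_cost = cost_per_qaly * qaly_gain
print(f"implied incremental cost: ~GBP {incremental_cost:.0f} per patient")
```

An ICER of roughly £13 000 per QALY sits below commonly cited UK willingness-to-pay thresholds, which is the basis for the "potentially cost-effective" conclusion.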
Objectives: Parkinson’s disease (PD) results in a range of non-motor deficits that can affect mood, cognition, and language, and many of these issues are unresponsive to pharmacological intervention. Aerobic exercise can improve mood and cognition in healthy older adults, although only a few studies have examined exercise effects on these domains in PD. The current study assesses the effects of aerobic exercise on aspects of cognition, mood, and language production in people with PD. Methods: This study compares the effects of aerobic exercise to stretch-balance training and a no-contact control group in participants with idiopathic PD. The aerobic and stretch-balance groups trained three times a week for 16 weeks, while controls continued normal activities. Outcome measures included disease severity, mood, cognition (speed of processing, memory, and executive function), and language production (picture descriptions). Cognition and language were assessed in single and dual task conditions. Results: Depressive symptoms increased only in the control group (p<.02). Executive function improved in the aerobic exercise group only in the single task (p=.007) and declined in controls in the dual task. Completeness of picture descriptions improved significantly more in the aerobic group than in the stretch-balance group (p<.02). Conclusions: Aerobic exercise is a viable intervention for PD that can be protective against increased depressive symptoms, and can improve several non-motor domains, including executive dysfunction and related aspects of language production. (JINS, 2016, 22, 878–889)
Background. There are significant barriers to accessing effective psychological therapy in primary care, resulting from a lack of suitably trained therapists to meet current demand. More efficient service delivery using minimal interventions (such as bibliotherapy) provided by paraprofessional therapists may be one method of overcoming these problems, and this approach is attracting attention in the UK and elsewhere. A randomized trial was conducted to test the clinical effectiveness of this model: assistant psychologists delivered a guided self-help intervention to patients with anxiety and depression who were waiting for psychological therapy.
Method. A total of 114 patients were randomized either to guided self-help or a waiting-list control group. All patients were followed up 3 months later, prior to starting conventional psychological therapy. Measures included self-reported adherence to the intervention, anxiety and depressive symptoms, social functioning and patient satisfaction.
Results. Adherence to the guided self-help intervention was acceptable and patients reported satisfaction with the intervention. However, there were no statistically significant differences between groups in anxiety and depression symptoms at 3 months.
Conclusions. The results demonstrate that this model of guided self-help did not provide additional benefit to patients on a waiting list for psychological therapy. The results are considered in the context of possible internal and external validity threats, and compared with previous trials of minimal interventions. The implications of the results for the design of future minimal interventions are considered.
Self-help interventions in mental health are increasingly seen as one way of overcoming problems with access to psychological therapy, but there is insufficient evidence of their effectiveness in routine care settings. This paper investigates the process and outcome of a non-guided self-help manual for anxiety and depression compared with a waiting-list control in a primary care setting. Patients with mild to moderate mental health problems were recruited from routine GP referrals to the local Primary Care Mental Health Team. Thirty patients were randomly assigned to either non-guided self-help or a waiting-list control group. Patients completed outcome measures at baseline, 6 weeks and 12 weeks. Intention-to-treat analysis found no significant differences between the two groups on measures of anxiety or depression at 12 weeks. Between 40% and 50% of patients in both groups were no longer clinical cases at the end of the trial. However, there was a high level of satisfaction with the self-help manual. Within the limitations of the small sample size, the study does not support the hypothesis that non-guided self-help is superior to a waiting-list control in the treatment of anxiety and/or depression in primary care.
Experiments were carried out to determine the cooling power density of SiGe/Si superlattice microcoolers by integrating thin film metal resistor heaters on the cooling surface. By evaluating the maximum cooling of the device under different heat load conditions, the cooling power density was measured directly. Both micro thermocouple probes and the resistance of the thin film heaters were used to obtain accurate temperature measurements on top of the device. Superlattice structures were used to enhance device performance by reducing the thermal conductivity and by providing selective emission of hot carriers through thermionic emission. Various device sizes were characterized; the maximum cooling and the cooling power density had different dependences on the micro refrigerator size. Net cooling of over 4.1 K below ambient and a cooling power density of 598 W/cm² for 40 × 40 μm² devices were measured at room temperature.
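The reported cooling power density can be converted into the absolute heat load removed by a single device. A short sketch using the two figures above (598 W/cm² over a 40 × 40 μm² cooler):

```python
# Convert the reported cooling power density into the absolute heat
# load absorbed by one 40 x 40 um^2 microcooler.
power_density_w_per_cm2 = 598.0   # reported cooling power density
side_um = 40.0                    # device edge length in micrometres

side_cm = side_um * 1e-4          # 1 um = 1e-4 cm
area_cm2 = side_cm ** 2           # 1.6e-5 cm^2
heat_load_w = power_density_w_per_cm2 * area_cm2
print(f"Absorbed heat load: {heat_load_w * 1e3:.2f} mW")  # ~9.57 mW
```

The small absolute load (under 10 mW) reflects the tiny device area; the size dependence of maximum cooling versus power density noted above is why multiple device sizes were characterized.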
We present experimental and theoretical characterization of InP-based heterostructure integrated thermionic (HIT) coolers. In particular, the effect of doping on overall device performance is characterized. Several thin-film cooler devices have been fabricated and analyzed. The coolers consist of a 1 μm-thick superlattice structure composed of 25 periods of InGaAs well and InGaAsP (λgap ≈ 1.3 μm) barrier layers, 10 nm and 30 nm thick, respectively. The superlattice is surrounded by highly doped InGaAs layers that serve as the cathode and anode. All layers are lattice-matched to the n-type InP substrate. N-type doping of the well layers varies from 1.5×10¹⁸ cm⁻³ to 8×10¹⁸ cm⁻³ between devices, while the barrier layers are undoped. Device cooling performance was measured at room temperature. Device current-versus-voltage relationships were measured from 45 K to room temperature. Detailed models of electron transport in superlattice structures were used to simulate device performance. Experimental results indicate that low-temperature electron transport is a strong function of well layer doping and that maximum cooling will decrease as this doping is increased. Theoretical models of both I-V curves and maximum cooling agree well with experimental results. The findings indicate that low-temperature electron transport measurements are useful for characterizing potential barriers and energy filtering in HIT coolers.
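The stated layer dimensions are internally consistent, which a one-line check confirms: 25 periods of a 10 nm well plus a 30 nm barrier give exactly the quoted 1 μm superlattice thickness.

```python
# Consistency check of the reported superlattice geometry:
# 25 periods x (10 nm InGaAs well + 30 nm InGaAsP barrier).
periods = 25
well_nm, barrier_nm = 10, 30

total_nm = periods * (well_nm + barrier_nm)
print(f"Total superlattice thickness: {total_nm} nm")  # 1000 nm = 1 um
```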
ZrO2 and HfO2 and their alloys with SiO2 are currently among the leading high-k materials for replacing SiOxNy as the gate dielectric for the sub-100 nm technology nodes. International SEMATECH (ISMT) is currently investigating integration issues associated with this required change in materials. Our work has focused on the integration of ALCVD-deposited ZrO2 and HfO2 with an industry-standard conventional MOSFET process flow with a poly-Si electrode. Since the impact of contamination by these new high-k materials introduced in a production fab has not yet been established, it is critical to prevent cross-contamination through the process tools in the fab. A baseline study was completed within ISMT's fab, and appropriate protocols for handling high-k materials have been established. The integrated high-k gate stack in a conventional transistor flow must not only meet all the performance requirements of scaled transistors, but the gate dielectric film must also be able to withstand high-temperature anneal steps. Reactions between ZrO2 and Si have been observed at temperatures as low as 560°C (during the amorphous Si deposition process). Various wet chemistries were also evaluated for removing high-k film inadvertently deposited on the wafer backside, and it was found that ZrO2 etches at extremely slow rates in the majority of the common wet etch chemistries available in a fab. A new hot-HF-based process was found to be successful in lowering Zr contamination on the wafer backside to as low as 1.8×10¹⁰ atoms/cm². The patterning of a high-k gate stack with a poly-Si electrode is another area that required considerable focus. Various dry (plasma) etch and wet etch chemistries were evaluated for etching ZrO2 using both blanket films and wafers with a patterned poly-Si gate over the high-k films.
On full CMOS flow device wafers, most of these wet chemistries resulted in severe pitting in the ZrO2 film remaining over the source/drain (S/D) areas, as well as in the Si substrate and the field oxide. A poly-Si gate over a ZrO2 gate dielectric film was successfully patterned using the standard poly-Si gate etch (Cl2/HBr) for the main etch, followed by a combination of HF and H2SO4 cleans to remove all of the ZrO2 remaining over the S/D areas. This allowed the fabrication of low-resistance contacts to the transistor S/D areas, which ultimately resulted in the demonstration of functional transistors with high-k gate dielectric films.
Fabrication and characterization of SiGe/Si superlattice microcoolers integrated with thin film resistors are described. Superlattice structures were used to enhance device performance by reducing the thermal conductivity and by providing selective emission of hot carriers through thermionic emission. Thin film metal resistors were integrated on top of the cooler devices and used as the heat load for cooling power density measurements. Various device sizes were characterized. Net cooling of over 4.1 K and a cooling power density of 598 W/cm² for 40 × 40 μm² devices were measured at room temperature.