Not all patients who acquire carbapenemase-producing Enterobacteriaceae (CPE) develop infections with these organisms; many remain only colonized. Of 54 CPE-colonized patients, 16 (30%) developed CPE infections. We identified indwelling urinary catheter exposure, exposure to intravenous colistin, and overseas transfer as variables associated with the development of CPE infection among colonized patients.
Long-term care facilities (LTCFs) and their populations have been greatly affected by the coronavirus disease 2019 (COVID-19) pandemic. In this review, we summarize the literature to describe the current epidemiology of COVID-19 in LTCFs, clinical presentations and outcomes in the LTCF population with COVID-19, containment interventions, and the role of healthcare workers in SARS-CoV-2 transmission in these facilities.
The association between Clostridioides difficile colonization and C. difficile infection (CDI) is unknown in solid-organ transplant (SOT) patients. We examined C. difficile colonization and healthcare-associated exposures as risk factors for development of CDI in SOT patients.
The retrospective study cohort included all consecutive SOT patients with at least 1 screening test between May 2017 and April 2018. CDI was defined as the presence of diarrhea (without laxatives), a positive C. difficile clinical test, and the use of C. difficile-directed antimicrobial therapy as ordered by managing clinicians. In addition to demographic variables, exposures to antimicrobials, immunosuppressants, and gastric acid suppressants were evaluated from the time of first screening test to the time of CDI, death, or final discharge.
Of the 348 SOT patients included in our study, 33 (9.5%) were colonized with toxigenic C. difficile. In total, 11 patients (3.2%) developed CDI. Only C. difficile colonization (odds ratio [OR], 13.52; 95% CI, 3.46–52.83; P = .0002), age (OR, 1.09; 95% CI, 1.02–1.17; P = .0135), and hospital days (OR, 1.05; 95% CI, 1.02–1.08; P = .0017) were independently associated with CDI.
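As a hedged illustration only (the authors' code and data are not shown; the file name and columns below are hypothetical placeholders), odds ratios and 95% CIs of this form can be obtained from a multivariable logistic regression along these lines:

```python
# Illustrative sketch, not the study's code: multivariable logistic
# regression yielding odds ratios and 95% CIs for CDI risk factors.
# "sot_cohort.csv" and its columns (cdi, colonized, age, hospital_days)
# are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("sot_cohort.csv")
X = sm.add_constant(df[["colonized", "age", "hospital_days"]])
fit = sm.Logit(df["cdi"], X).fit()

# Exponentiating coefficients and confidence limits gives ORs and 95% CIs.
ci = np.exp(fit.conf_int())
print(pd.DataFrame({"OR": np.exp(fit.params),
                    "2.5%": ci[0], "97.5%": ci[1],
                    "P": fit.pvalues}))
```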
Although CDI was more frequent in C. difficile colonized SOT patients, the overall incidence of CDI was low in this cohort.
Hearing loss affects over 1.3 billion individuals worldwide, with the greatest burden among adults. Little is known regarding the association between adult-onset hearing loss and employment.
Seven databases (PubMed, Embase, Cochrane Library, ABI/Inform Collection, Business Source Ultimate, Web of Science and Scopus) were searched through to October 2018. Keyword terms related to hearing loss and employment; studies of paediatric or congenital hearing loss and of deaf or culturally Deaf populations were excluded.
The initial search resulted in 13 144 articles. A total of 7494 articles underwent title and abstract screening, and 243 underwent full-text review. Twenty-five articles met the inclusion criteria. Studies were set in 10 predominantly high-income countries. Seven of the 25 studies analysed regionally or nationally representative datasets and controlled for key variables. Six of these seven studies reported associations between hearing loss and employment.
The highest quality studies currently available indicate that adult-onset hearing loss is associated with unemployment. However, considerable heterogeneity exists, and more rigorous studies that include low- and middle-income countries are needed.
Single nucleotide polymorphisms (SNPs) contribute small increases in risk for late-onset Alzheimer's disease (LOAD). LOAD SNPs cluster around genes with similar biological functions (pathways). Polygenic risk scores (PRS) aggregate the effect of SNPs genome-wide. However, this approach has not been widely used for SNPs within specific pathways.
We investigated whether pathway-specific PRS were significant predictors of LOAD case/control status.
We mapped SNPs to genes within 8 pathways implicated in LOAD. For our polygenic analysis, the discovery sample comprised 13,831 LOAD cases and 29,877 controls. LOAD risk alleles for SNPs in our 8 pathways were identified at a P-value threshold of 0.5. Pathway-specific PRS were calculated in a target sample of 3,332 cases and 9,832 controls. The genetic data were pruned with R² > 0.2 while retaining the SNPs most significantly associated with AD. We tested whether pathway-specific PRS were associated with LOAD using logistic regression, adjusting for age, sex, country, and principal components. We report the proportion of variance in liability explained by each pathway.
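As a hedged illustration of the scoring step only (this is not the authors' pipeline; the function, its inputs, and the column names SNP, BETA, and P are assumptions in the style of PLINK summary statistics), a pathway-specific PRS can be computed as a weighted sum of risk-allele dosages over the pathway's SNPs passing the P-value threshold:

```python
# Hedged sketch of a pathway-specific polygenic risk score: the weighted sum
# of risk-allele dosages (0-2) over SNPs that map to the pathway's genes and
# pass the P-value threshold. All inputs and names are hypothetical.
import pandas as pd

def pathway_prs(dosages: pd.DataFrame, sumstats: pd.DataFrame,
                pathway_snps: set, p_threshold: float = 0.5) -> pd.Series:
    """dosages: individuals x SNPs; sumstats: columns SNP, BETA (log OR), P."""
    keep = sumstats[(sumstats["P"] < p_threshold)
                    & sumstats["SNP"].isin(pathway_snps)]
    snps = [s for s in keep["SNP"] if s in dosages.columns]
    weights = keep.set_index("SNP").loc[snps, "BETA"]
    # Per-individual score: dosage matrix weighted column-wise by effect size.
    return dosages[snps].mul(weights, axis=1).sum(axis=1)
```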
The most strongly associated pathways were the immune response (NSNPs = 9,304, P = 5.63 × 10⁻¹⁹, R² = 0.04) and hemostasis (NSNPs = 7,832, P = 5.47 × 10⁻⁷, R² = 0.015). Regulation of endocytosis, hematopoietic cell lineage, cholesterol transport, clathrin and protein folding were also significantly associated but accounted for less than 1% of the variance. With APOE excluded, all pathways remained significant except proteasome-ubiquitin activity and protein folding.
Genetic risk for LOAD can be split into contributions from different biological pathways. These offer a means to explore disease mechanisms and to stratify patients.
Duchenne muscular dystrophy is associated with progressive cardiorespiratory failure, including left ventricular dysfunction.
Methods and Results:
Males with a probable or definite diagnosis of Duchenne muscular dystrophy, diagnosed between 1 January, 1982 and 31 December, 2011, were identified from the Muscular Dystrophy Surveillance Tracking and Research Network database. Two non-mutually exclusive groups were created: patients with ≥2 echocardiograms and non-invasive positive pressure ventilation-compliant patients with ≥1 recorded ejection fraction. Quantitative left ventricular dysfunction was defined as an ejection fraction <55%. Qualitative dysfunction was defined as mild, moderate, or severe. Progression of quantitative left ventricular dysfunction was modelled as a continuous time-varying outcome. Change in qualitative left ventricular function was assessed by the percentage of patients within each category at each age. Forty-one percent of patients (n = 403) had ≥2 recorded ejection fractions, comprising 998 qualitative assessments, with a mean age at first echo of 10.8 ± 4.6 years and an average first ejection fraction of 63.1 ± 12.6%. Mean age at first echo with an ejection fraction <55% was 15.2 ± 3.9 years. Thirty-five percent (140/403) were non-invasive positive pressure ventilation-compliant and had ejection fraction information. The estimated rate of decline in ejection fraction from the first ejection fraction was 1.6% per year, and initiation of non-invasive positive pressure ventilation did not change this rate.
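The abstract models ejection fraction as a continuous time-varying outcome but does not specify the exact model. One standard approach to estimating such a per-year decline from repeated measurements is a random-slope mixed model; the sketch below is an assumption-laden illustration (the file "echos.csv" and its columns patient_id, age_years, ef are hypothetical), not the authors' analysis.

```python
# Hedged sketch: random-intercept/random-slope mixed model for repeated
# ejection-fraction measurements; the fixed-effect coefficient on age
# approximates the average change in EF (% per year).
import pandas as pd
import statsmodels.formula.api as smf

echo = pd.read_csv("echos.csv")  # long format: one row per echocardiogram
fit = smf.mixedlm("ef ~ age_years", echo,
                  groups=echo["patient_id"], re_formula="~age_years").fit()
print(fit.summary())
```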
In our cohort, left ventricular function in patients with Duchenne muscular dystrophy declined over time, independent of non-invasive positive pressure ventilation use. Future studies are needed to examine the impact of respiratory support on cardiac function.
Previously, we showed that disinfection of sink drains is effective at decreasing bacterial loads. Here, we report our evaluation of the ideal frequency of sink-drain disinfection and our comparison of 2 different hydrogen peroxide disinfectants.
Red meat is an important dietary source of protein and many other essential nutrients, including omega-3 (n-3) polyunsaturated fatty acids (PUFA), which provide numerous benefits to human health. It is well known that grass-fed meat has a more favourable fatty acid profile than meat from other feeding regimes, but grass finishing is becoming less feasible for many farmers/producers. Therefore, alternative methods of enhancing the fatty acid profile of red meats, such as beef, are needed to meet increasing consumer demand for ‘healthier’ products. This study compared plasma PUFA concentrations across cattle finished on three different feeding regimes. Three farms supplied livestock to the current study, where cattle were fed one of three feeding regimes for a minimum of 15 weeks prior to slaughter: ad libitum concentrate (negative control), n-3-enriched ad libitum concentrate (treatment) or grass only (positive control). Blood was collected at slaughter into EDTA tubes, and plasma aliquots were stored at −80°C until analysis. A validated gas chromatography–mass spectrometry (GC-MS) method was used to quantify individual PUFA concentrations in mg/ml [linoleic acid (LA); arachidonic acid (AA); alpha-linolenic acid (ALA); eicosapentaenoic acid (EPA); docosapentaenoic acid (DPA); docosahexaenoic acid (DHA)]. Samples from 23, 49 and 40 animals (control, treatment and grass groups, respectively) were available for the current analysis. One-way ANOVA revealed significant differences between groups in all PUFA concentrations quantified (all P < 0.026). Post-hoc (LSD) tests showed that mean ± SD n-3 PUFA concentrations differed significantly between all three groups (all P < 0.04), increasing from negative control (0.049 ± 0.013 mg/ml) to treatment (0.095 ± 0.034 mg/ml) to grass-fed (0.461 ± 0.132 mg/ml). The opposite pattern was observed for mean ± SD n-6 PUFA concentrations (1.060 ± 0.297 vs. 0.918 ± 0.267 vs. 0.355 ± 0.085 mg/ml, respectively; all P < 0.02). Cattle finished on either the treatment or grass regime had a more favourable n-6:n-3 PUFA ratio than the negative control (11.98 and 0.79 vs. 22.65, respectively). This study demonstrates that the finishing diet can affect plasma PUFA concentrations in beef cattle. Animals finished on the n-3-enriched concentrate had, on average, double the total n-3 PUFA concentration of control cattle, as well as an improved n-6:n-3 ratio. These results provide preliminary data on an alternative n-3-enriched feeding regime for beef cattle to improve PUFA concentrations. Further research, however, is required to confirm whether such beneficial changes are also observed in bovine muscle, which would have direct benefits for consumers.
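For reference, the reported analysis pattern (one-way ANOVA across the three feeding regimes followed by pairwise post-hoc comparisons) looks roughly like the sketch below. This is not the study's code: the data file and column names are hypothetical, and plain pairwise t-tests stand in for Fisher's LSD, which additionally pools variance across all groups.

```python
# Hedged sketch of the analysis pattern only: one-way ANOVA on plasma n-3
# PUFA concentration by feeding regime, then pairwise comparisons.
from itertools import combinations
import pandas as pd
from scipy import stats

plasma = pd.read_csv("plasma_pufa.csv")  # hypothetical: group, n3_total (mg/ml)
by_group = {name: g["n3_total"].to_numpy()
            for name, g in plasma.groupby("group")}

f, p = stats.f_oneway(*by_group.values())
print(f"ANOVA: F = {f:.2f}, P = {p:.4f}")

for a, b in combinations(by_group, 2):
    t, p = stats.ttest_ind(by_group[a], by_group[b])
    print(f"{a} vs {b}: t = {t:.2f}, P = {p:.4f}")
```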
A new carbon isotope record for two high-latitude sedimentary successions that span the Jurassic–Cretaceous boundary interval in the Sverdrup Basin of Arctic Canada is presented. This study, combined with other published Arctic data, shows a large negative isotopic excursion of organic carbon (δ13Corg) of 4‰ (V-PDB), to a minimum of −30.7‰, in the probable middle Volgian Stage. This is followed by a return to less negative values of c. −27‰. A smaller positive excursion of c. 2‰ in the Valanginian Stage, reaching maximum values of −24.6‰, is related to the Weissert Event. The Volgian isotopic trends are consistent with other high-latitude records but do not appear in δ13Ccarb records of Tethyan Tithonian strata. In the absence of any obvious definitive cause for the depleted δ13Corg anomaly, we suggest several possible contributing factors. The Sverdrup Basin and other Arctic areas may have evolved compositionally away from open-marine δ13C values during the Volgian Age owing to low global or large-scale regional sea levels, and later became effectively coupled to the global oceans by Valanginian time, when sea level rose. A geologically sudden increase in volcanism may have caused the strongly negative δ13Corg values seen in the Arctic Volgian records, but the lack of precise geochronological age control for the Jurassic–Cretaceous boundary precludes direct comparison with potentially coincident events, such as the eruption of the Shatsky Rise. This study offers improved correlation constraints and a refined C-isotope curve for the Boreal region throughout latest Jurassic and earliest Cretaceous time.
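For readers unfamiliar with the notation, δ13C expresses a sample's 13C/12C ratio as a per mil (‰) deviation from the V-PDB standard. The sketch below is a minimal illustration; the standard ratio used is an approximate literature value, not taken from this study.

```python
# Delta notation used above: per mil (‰) deviation of a sample's 13C/12C
# ratio from the V-PDB standard. R_VPDB is an approximate literature value.
R_VPDB = 0.011180  # approximate 13C/12C of the V-PDB standard

def delta13c(r_sample: float) -> float:
    """Return delta-13C in per mil relative to V-PDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Example: a ratio 3% below the standard gives -30‰, in the range of the
# Volgian excursion minimum reported above.
print(delta13c(R_VPDB * 0.970))  # -30.0
```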
To evaluate the clinical impact of an antimicrobial stewardship program (ASP) on high-risk pediatric patients.
Retrospective cohort study.
Free-standing pediatric hospital.
This study included patients who received an ASP review between March 3, 2008, and March 2, 2017, and were considered high risk, including patients receiving care from the neonatal intensive care unit (NICU), hematology/oncology (H/O), or pediatric intensive care unit (PICU) medical teams.
The ASP recommendations included stopping antibiotics; modifying antibiotic type, dose, or duration; or obtaining an infectious diseases consultation. The outcomes evaluated in all high-risk patients with ASP recommendations were (1) hospital-acquired Clostridium difficile infection, (2) mortality, and (3) 30-day readmission. Subanalyses were conducted to evaluate hospital length of stay (LOS) and tracheitis treatment failure. Multivariable generalized linear models were performed to examine the relationship between ASP recommendations and each outcome after adjusting for clinical service and indication for treatment.
The ASP made 2,088 recommendations, 50% of which were to stop antibiotics. Recommendation agreement occurred in 70% of cases. Agreement with an ASP recommendation was not associated with higher odds of mortality or hospital readmission. Patients with a single ASP review and an agreed-upon recommendation had a shorter median LOS (10.2 days vs 13.2 days; P < .05). ASP recommendations were not associated with higher rates of tracheitis treatment failure.
ASP recommendations do not result in worse clinical outcomes among high-risk pediatric patients. Most ASP recommendations are to stop or to narrow antimicrobial therapy. Further work is needed to enhance stewardship efforts in high-risk pediatric patients.
To describe the epidemiological and molecular characteristics of an outbreak of Klebsiella pneumoniae carbapenemase (KPC)–producing organisms and the novel use of a cohorting unit for its control.
A 566-room academic teaching facility in Milwaukee, Wisconsin.
Solid-organ transplant recipients.
Infection control bundles were used throughout the time of observation. All KPC cases were intermittently housed in a cohorting unit with dedicated nurses and nursing aides. The rooms used in the cohorting unit had anterooms where clean supplies and linens were placed. Spread of KPC-producing organisms was determined using rectal surveillance cultures taken on admission and weekly thereafter among all consecutive patients admitted to the involved units. KPC-positive strains underwent pulsed-field gel electrophoresis and whole-genome sequencing.
A total of 8 KPC cases (5 identified by surveillance) were identified from April 2016 to April 2017. After the index patient, 3 patients acquired KPC-producing organisms despite implementation of an infection control bundle. This prompted the use of a cohorting unit, which immediately halted transmission; the unit was then closed, and the single remaining KPC case was transferred out. However, additional KPC cases were identified within 2 months. Once the cohorting unit was reopened, no additional KPC cases occurred. The KPC-positive species identified during this outbreak included Klebsiella pneumoniae, Enterobacter cloacae complex, and Escherichia coli. blaKPC was identified on at least 2 plasmid backbones.
A complex KPC outbreak involving both clonal and plasmid-mediated dissemination was controlled using weekly surveillance cultures and a cohorting unit.
Introduction: Depending on the time and day of initial Emergency Department (ED) presentation, some patients may be asked to return to the ED the following day for an ultrasound examination. Return visits for ultrasound may be time- and resource-intensive for both patients and the ED. Qualitative experience suggests that a proportion of return ultrasounds could be performed at a non-ED facility. Our objective was to undertake a retrospective audit of return-for-ultrasound usage, patterns and outcomes at 2 academic EDs. Methods: A retrospective review of all adult patients returning to the ED for ultrasound at both LHSC ED sites in 2016 was undertaken. Each chart was independently reviewed by two emergency medicine consultants. Charts were assessed for day and time of initial presentation and return, type of ultrasound ordered, and length of ED stay on the initial presentation and return visit. Opinion-based questions were considered by reviewers, including the urgency of the diagnostic clarification required, whether symptoms were still present on return, and whether any medical or surgical treatment or follow-up was arranged based on ultrasound results. Agreement between reviewers was assessed. Results: After eliminating charts for which the return visit was not for a scheduled ultrasound examination, 328 patient charts were reviewed. Of these patients, 63% were female, and the median [IQR] age was 40 years [27-56]. Abdomen/pelvis represented 50% of the ultrasounds; renal, 24%; venous Doppler, 15.9%. Symptoms were still present and documented in 79% of cases. A medical intervention was required in 22% of cases and an immediate surgical intervention in 9%. Hospital admission on the return visit occurred in 11% of patients. Outpatient follow-up based on ultrasound results was initiated in 29% of cases. Median [IQR] combined LOS was 479.5 minutes [358.5-621.75]. Agreement between reviewers on opinion-based questions varied widely (63%-96%). Conclusion: Ideally, formal ultrasound should be available on a 24-hour basis for ED patients to avoid return visits. A proportion of return-for-ultrasound examinations do not result in any significant change in treatment. Emergency departments should consider developing pathways to avoid return visits for follow-up ultrasound when possible. The low incidence of surgical treatment among those returning for ultrasound suggests that this population could be served in a non-hospital setting. Further research is required to support this conclusion.
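Since raw percent agreement can overstate reliability on opinion-based items, a chance-corrected statistic such as Cohen's kappa is a common complement. The sketch below is a hypothetical illustration with toy data, not the study's analysis.

```python
# Hypothetical sketch: chance-corrected inter-rater agreement (Cohen's
# kappa) between two chart reviewers on a yes/no opinion-based item.
from sklearn.metrics import cohen_kappa_score

reviewer_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no"]  # toy data
reviewer_b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no"]
print(cohen_kappa_score(reviewer_a, reviewer_b))  # 1.0 = perfect, 0 = chance
```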
The short-term impact of prolonged grief disorder (PGD) following bereavement is well documented. The longer-term sequelae of PGD, however, are poorly understood, possibly unrecognized, and may be incorrectly attributed to other mental health disorders and hence undertreated.
The aims of this study were to prospectively evaluate the prevalence of PGD three years post bereavement and to examine the predictors of long-term PGD in a population-based cohort of bereaved cancer caregivers.
A cohort of primary family caregivers of patients admitted to one of three palliative care services in Melbourne, Australia, participated in the study (n = 301). Sociodemographic, mental health, and bereavement-related data were collected from the caregiver upon the patient's admission to palliative care (T1). Further data addressing circumstances around the death and psychological health were collected at six (T2, n = 167), 13 (T3, n = 143), and 37 months (T4, n = 85) after bereavement.
At T4, 5% and 14% of bereaved caregivers met criteria for PGD and subthreshold PGD, respectively. Using the total PGD score at T4 as the outcome, linear regression analysis found that preloss anticipatory grief measured at T1 and self-reported coping measured at T2 were highly statistically significant predictors (both P < 0.0001) of PGD in the longer term.
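As an illustration only (not the authors' code; the file and column names are hypothetical), the reported model corresponds to an ordinary least-squares regression of the T4 PGD score on the two baseline predictors:

```python
# Hedged sketch: OLS regression of the 37-month (T4) total PGD score on
# preloss anticipatory grief (T1) and self-reported coping (T2).
# "caregivers.csv" and its column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

cg = pd.read_csv("caregivers.csv")  # one row per caregiver followed to T4
fit = smf.ols("pgd_total_t4 ~ grief_t1 + coping_t2", data=cg).fit()
print(fit.summary())
```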
For almost 20% of caregivers, the symptoms of PGD appear to persist at least three years post bereavement. These findings support the importance of screening caregivers upon the patient's admission to palliative care and at six months after bereavement to ascertain their current mental health. Ideally, caregivers at risk of developing PGD can be identified and treated before PGD becomes entrenched.
In low- and middle-income countries (LMIC) in general, and sub-Saharan African (SSA) countries in particular, there is both a large treatment gap for mental disorders and a relative paucity of empirical evidence about how to fill this gap. This is especially so for severe mental disorders, such as psychosis, which impose an additional vulnerability to human rights abuses on those affected. A major factor in the lack of evidence is the small number of active mental health (MH) researchers on the continent and the distance between the little evidence generated and the policy-making process.
The Partnership for Mental Health Development in Africa (PaM-D) aimed to bring together diverse MH stakeholders in SSA, working collaboratively with colleagues from the global north, to create an infrastructure to develop MH research capacity in SSA, advance global MH science by conducting innovative public health-relevant MH research in the region and work to link research to policy development. Participating SSA countries were Ghana, Kenya, Liberia, Nigeria and South Africa. The research component of PaM-D focused on the development and assessment of a collaborative shared care (CSC) program between traditional and faith healers (T&FHs) and biomedical providers for the treatment of psychotic disorders, as a way of improving the outcome of persons suffering from these conditions. The capacity building component aimed to develop research capacity and appreciation of the value of research in a broad range of stakeholders through bespoke workshops and fellowships targeting specific skill-sets as well as mentoring for early career researchers.
In the research component of PaM-D, a series of formative studies was implemented to inform the development of an intervention package consisting of the essential features of a CSC for psychosis, implemented by primary care providers and T&FHs. A cluster randomised controlled trial was then designed to test the effectiveness of this package on the outcome of psychosis. In the capacity-building component, 35 early- and mid-career researchers participated in the training workshops, and several established mentor-mentee relationships with senior PaM-D members. By the end of the funding period, 60 papers had been published and 21 successful grant applications had been made.
The success of PaM-D in energising young researchers and implementing a cutting-edge research program attests to the importance of partnership among researchers in the global south working with those from the north in developing MH research and service in LMIC.
In 2018, the Clostridium difficile LabID event methodology changed so that hospitals performing 2-step testing, a nucleic acid amplification test (NAAT) plus enzyme immunoassay (EIA), had their adjustment modified to EIA-based testing, and only positive final tests (eg, EIA) were counted in the numerator. We report the immediate impact of this methodological change at 3 Milwaukee hospitals.
Post-translocation monitoring is fundamental for assessing translocation success and identifying potential threats. We measured outcomes for four cohorts of tuatara Sphenodon punctatus translocated to warmer climates outside of their ecological region, to understand effects of climate warming. Translocation sites were on average 2–4 °C warmer than the source site. We used three short-term measures of success: survival, growth and reproduction. Data on recaptures, morphometric measurements, and reproduction were gathered over 2.5 years following release. Although decades of monitoring will be required to determine long-term translocation success in this species, we provide an interim measure of population progress and translocation site suitability. We found favourable recapture numbers, growth of founders and evidence of reproduction at most sites, with greater increases in body mass observed at warmer, less densely populated sites. Variable growth in the adult population at one translocation site suggested that higher population density, intraspecific competition, and lower water availability could be responsible for substantial weight loss in multiple individuals, and we make management recommendations to reduce population density. Overall, we found that sites with warmer climates and lower population densities were potentially beneficial to translocated tuatara, probably because of enhanced temperature-dependent and density-dependent growth rates. We conclude that tuatara could benefit from translocations to warmer sites in the short term, but further monitoring of this long-lived species is required to determine longer-term population viability following translocation. Future vulnerability to rising air temperatures, associated water availability, and community and ecosystem changes beyond the scope of this study must be considered.