A comparison of computer-extracted and facility-reported counts of hospitalized coronavirus disease 2019 (COVID-19) patients for public health reporting at 36 hospitals revealed that the two data sources matched on only 42% of days. Miscategorization of suspected cases was a primary driver of discordance. Clear reporting definitions and data validation facilitate emerging disease surveillance.
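Conceptually, the concordance metric above reduces to counting the days on which the two sources agree. A minimal sketch with invented daily counts (the dates and numbers are hypothetical, not the study's data):

```python
# Share of days on which two reporting streams agree; counts are invented.
extracted = {"2020-07-01": 12, "2020-07-02": 15, "2020-07-03": 9}
reported  = {"2020-07-01": 12, "2020-07-02": 14, "2020-07-03": 9}

matches = sum(extracted[day] == reported[day] for day in extracted)
print(f"{100 * matches / len(extracted):.0f}% of days matched")
```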
Cardiac intensivists frequently assess patients' readiness to be weaned from mechanical ventilation with an extubation readiness trial, despite it being no more effective than clinician judgement alone. We evaluated the utility of high-frequency physiologic data and machine learning for improving the prediction of extubation failure in children with cardiovascular disease.
Methods:
This was a retrospective analysis of clinical registry data and streamed physiologic extubation readiness trial data from one paediatric cardiac ICU (12/2016–3/2018). We analysed each patient's final extubation readiness trial. Machine learning methods (classification and regression tree, Boosting, Random Forest) were applied to clinical/demographic data, physiologic data, and both datasets combined (a minimal sketch of this approach follows the abstract). Extubation failure was defined as reintubation within 48 hours. Classifier performance was assessed on prediction accuracy and area under the receiver operating characteristic curve.
Results:
Of 178 episodes, 11.2% (N = 20) failed extubation. Using clinical/demographic data, our machine learning methods identified variables such as age, weight, height, and ventilation duration as important predictors of extubation failure. The best classifier with these data was Boosting (prediction accuracy: 0.88; area under the receiver operating characteristic curve: 0.74). Using physiologic data, the methods found oxygen saturation extremes and descriptors of dynamic compliance, central venous pressure, and heart/respiratory rate to be important. The best classifier in this setting was Random Forest (prediction accuracy: 0.89; area under the receiver operating characteristic curve: 0.75). Combining both datasets produced classifiers that highlighted the importance of physiologic variables in determining extubation failure, though predictive performance was not improved.
Conclusion:
Physiologic variables not routinely scrutinised during extubation readiness trials were identified as potential extubation failure predictors. Larger analyses are necessary to investigate whether these markers can improve clinical decision-making.
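The modelling workflow described in the Methods above can be illustrated with a short, self-contained sketch. This is not the authors' pipeline: the data are synthetic, the feature names merely echo those reported in the abstract, and the hyperparameters are arbitrary.

```python
# Illustrative sketch only: synthetic data stand in for the registry and
# physiologic features named in the abstract.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n = 178  # episodes, matching the cohort size reported above

# Hypothetical predictors: age (months), weight (kg), ventilation duration (h),
# and a dynamic-compliance descriptor from streamed physiologic data.
X = np.column_stack([
    rng.uniform(1, 120, n),
    rng.uniform(3, 40, n),
    rng.uniform(6, 500, n),
    rng.normal(0.5, 0.15, n),
])
y = rng.random(n) < 0.112  # ~11.2% failure rate, as in the abstract

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

for clf in (GradientBoostingClassifier(random_state=0),
            RandomForestClassifier(n_estimators=500, random_state=0)):
    clf.fit(X_tr, y_tr)
    prob = clf.predict_proba(X_te)[:, 1]
    print(type(clf).__name__,
          "accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 2),
          "AUC:", round(roc_auc_score(y_te, prob), 2))
```

With random labels the AUC hovers near 0.5; the point of the sketch is the train/evaluate structure, not the numbers.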
Seed retention (and ultimately seed shatter) is extremely important for the efficacy of harvest weed seed control (HWSC) and is likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter: weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter, but individual plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
Healthcare personnel (HCP) with unprotected exposures to aerosol-generating procedures (AGPs) on patients with coronavirus disease 2019 (COVID-19) are at risk of infection with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). A retrospective review at an academic medical center demonstrated an infection rate of <1% among HCP involved in AGPs without a respirator and/or eye protection.
In 2020, a group of U.S. healthcare leaders formed the National Organization to Prevent Hospital-Acquired Pneumonia (NOHAP) to issue a call to action to address non–ventilator-associated hospital-acquired pneumonia (NVHAP). NVHAP is one of the most common and morbid healthcare-associated infections, yet most hospitals do not track, report, or actively prevent it. This national call to action includes (1) launching a national healthcare conversation about NVHAP prevention; (2) adding NVHAP prevention measures to education for patients, healthcare professionals, and students; (3) challenging healthcare systems and insurers to implement and support NVHAP prevention; and (4) encouraging researchers to develop new strategies for NVHAP surveillance and prevention. The purpose of this document is to outline the research needed to support the NVHAP call to action. Primary needs include better models to estimate the economic cost of NVHAP; elucidation of its pathophysiology and identification of the most promising pathways for prevention; objective and efficient surveillance methods to track NVHAP; rigorous tests of the impact of proposed prevention strategies; and identification of the policy levers that will best engage hospitals in NVHAP surveillance and prevention. A joint task force developed this document, including stakeholders from the Veterans Health Administration (VHA), the U.S. Centers for Disease Control and Prevention (CDC), The Joint Commission, the American Dental Association, the Patient Safety Movement Foundation, Oral Health Nursing Education and Practice (OHNEP), Teaching Oral-Systemic Health (TOSH), industry partners, and academia.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Weeds at southern latitudes retained greater proportions of their seeds, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Cohorting patients who are colonized or infected with multidrug-resistant organisms (MDROs) protects uncolonized patients from acquiring MDROs in healthcare settings. The potential for cross transmission within the cohort, and the possibility of colonized patients acquiring secondary isolates with additional antibiotic resistance traits, are often neglected. We searched for evidence of cross transmission of KPC-producing Klebsiella pneumoniae (KPC-Kp) colonization among cohorted patients in a long-term acute-care hospital (LTACH), and we evaluated the impact of secondary acquisitions on resistance potential.
Design:
Genomic epidemiological investigation.
Setting:
A high-prevalence LTACH during a bundled intervention that included cohorting KPC-Kp–positive patients.
Methods:
Whole-genome sequencing (WGS) and location data were analyzed to identify potential cases of cross transmission between cohorted patients.
Results:
Secondary KPC-Kp isolates from 19 of 28 admission-positive patients were more closely related to another patient's isolate than to their own admission isolate. Of these 19 cases, 14 showed strong genomic evidence for cross transmission (<10 single-nucleotide variants [SNVs]), and most of these patients occupied shared cohort floors (12 patients) or rooms (4 patients) at the same time (the pairing logic is sketched after this abstract). Of the 14 patients with strong genomic evidence of acquisition, 12 acquired antibiotic resistance genes not found in their primary isolates.
Conclusions:
Acquisition of secondary KPC-Kp isolates carrying distinct antibiotic resistance genes was detected in nearly half of cohorted patients. These results highlight the importance of healthcare provider adherence to infection prevention protocols within cohort locations, and they indicate the need for future studies to assess whether multiple-strain acquisition increases risk of adverse patient outcomes.
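The pairing logic behind the genomic evidence above can be sketched as follows. Everything here is hypothetical (the patient IDs, isolate names, SNV distances, and floor assignments are invented); the study's actual WGS analysis is not reproduced.

```python
# Flag putative cross-transmission pairs from pairwise SNV distances plus
# overlapping cohort-floor stays. All data below are invented for illustration.
from itertools import combinations

SNV_THRESHOLD = 10  # the abstract's <10-SNV criterion for strong evidence

# patient -> (secondary isolate id, cohort floor, (admit_day, discharge_day))
stays = {
    "pt_A": ("iso_A2", 3, (0, 40)),
    "pt_B": ("iso_B1", 3, (10, 55)),
    "pt_C": ("iso_C2", 5, (5, 30)),
}

# Hypothetical pairwise SNV distances between secondary isolates.
snv_distance = {
    frozenset({"iso_A2", "iso_B1"}): 4,
    frozenset({"iso_A2", "iso_C2"}): 87,
    frozenset({"iso_B1", "iso_C2"}): 92,
}

def overlaps(a, b):
    """True if two (start, end) day intervals intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

for p1, p2 in combinations(stays, 2):
    iso1, floor1, days1 = stays[p1]
    iso2, floor2, days2 = stays[p2]
    d = snv_distance[frozenset({iso1, iso2})]
    if d < SNV_THRESHOLD and floor1 == floor2 and overlaps(days1, days2):
        print(f"putative cross transmission: {p1} <-> {p2} "
              f"({d} SNVs, shared floor {floor1})")
```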
The science of studying diamond inclusions to understand Earth history has developed significantly over recent decades, with new instrumentation and techniques applied to diamond sample archives revealing the stories those inclusions contain. This chapter reviews what diamonds can tell us about the deep carbon cycle over the course of Earth's history: how the geochemistry of diamonds and their inclusions informs us about the deep carbon cycle, the origin of diamonds in Earth's mantle, and the evolution of diamonds through time.
The rocky shores of the north-east Atlantic have long been studied. Our focus spans Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity, and long-term, broadscale studies have shown the responses of biota to past climate fluctuations and to more recent anthropogenic climate change. Inter- and intraspecific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts from fucoid domination in sheltered sites to barnacle/mussel domination in exposed sites are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or being restricted to estuarine refuges, driven by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, thereby affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer's disease (AD). Given this consistent association, there is interest in whether E4 also influences the risk of other neurodegenerative diseases, and there is an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson's disease, and (5) vascular cognitive impairment.
Methods:
APOE allele and MAPT haplotype calls were defined for each participant, and logistic regression analyses were performed to identify associations with the presentation of each neurodegenerative disease (a minimal sketch follows this abstract).
Results:
Our work confirmed the association of the E4 allele with a dose-dependent increase in the presentation of AD, and an association between the E4 allele alone and MCI; the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts; however, after subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
Conclusion:
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
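A minimal sketch of the kind of logistic regression named in the Methods above, using simulated genotypes and an arbitrary, hypothetical effect size; this is not the study's model, covariate set, or data.

```python
# Logistic regression of case status on E4 allele dose; data are simulated
# and the coefficients below are invented for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000

e4_dose = rng.choice([0, 1, 2], size=n, p=[0.72, 0.25, 0.03])  # E4 allele count
age = rng.normal(70, 8, n)

# Simulated dose-dependent effect of E4 on case status (hypothetical betas).
logit = -2.0 + 0.8 * e4_dose + 0.02 * (age - 70)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([e4_dose, age]))
fit = sm.Logit(y, X).fit(disp=False)
print(fit.summary(xname=["const", "e4_dose", "age"]))
print("OR per E4 allele:", np.exp(fit.params[1]).round(2))
```

Exponentiating the e4_dose coefficient yields the odds ratio per additional E4 allele, which is how a dose-dependent association of the kind reported above is typically quantified.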
The Comprehensive Assessment of Neurodegeneration and Dementia (COMPASS-ND) cohort study of the Canadian Consortium on Neurodegeneration in Aging (CCNA) is a national initiative to catalyze research on dementia, set up to support the research agendas of CCNA teams. This cross-country longitudinal cohort of 2310 deeply phenotyped subjects with various forms of dementia and mild memory loss or concerns, along with cognitively intact elderly subjects, will test hypotheses generated by these teams.
Methods:
The authors created the description of the study provided here from the COMPASS-ND protocol, the initial grant proposal for funding, the fifth semi-annual CCNA Progress Report submitted to the Canadian Institutes of Health Research in December 2017, and other documents, supplemented by modifications made and lessons learned after implementation.
Results:
The CCNA COMPASS-ND cohort includes participants from across Canada with various cognitive conditions associated with, or at risk of, neurodegenerative disease. Participants will undergo a wide range of experimental, clinical, imaging, and genetic investigations to specifically address the causes, diagnosis, treatment, and prevention of these conditions in the aging population. Data derived from clinical and cognitive assessments, biospecimens, brain imaging, genetics, and brain donations will be used to test hypotheses generated by CCNA research teams and other Canadian researchers. COMPASS-ND is the most comprehensive and ambitious Canadian study of dementia to date. Initial data posting occurred in 2018, with the full cohort to be accrued by 2020.
Conclusion:
Availability of data from the COMPASS-ND study will provide a major stimulus for dementia research in Canada in the coming years.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
Although most hospitals report very high levels of hand hygiene compliance (HHC), the accuracy of these overtly observed rates is questionable due to the Hawthorne effect and other sources of bias. In this study, we aimed (1) to compare HHC rates estimated using the standard audit method (overt observation by a known observer) with rates from a new audit method employing rapid (<15-minute), covert "secret shopper" observations and (2) to pilot test a novel feedback tool.
Design
Quality improvement project using a quasi-experimental stepped-wedge design.
Setting
This study was conducted in 5 acute-care hospitals (17 wards, 5 intensive care units) in the Midwestern United States.
Methods
Sites recruited a hand hygiene observer from outside the acute-care units to rapidly and covertly observe entry and exit HHC during the study period, October 2016–September 2017. After 3 months of observations, sites received a monthly feedback tool that communicated HHC information from the new audit method.
Results
The absolute difference in HHC estimates between the standard and new audit methods was ~30%. No significant differences in HHC were detected between the baseline and feedback phases (OR, 0.92; 95% CI, 0.84–1.01), but the standard audit method had significantly higher estimates than the new audit method (OR, 9.83; 95% CI, 8.82–10.95).
Conclusions
HHC estimates obtained using the new audit method were substantially lower than estimates obtained using the standard audit method, suggesting that the rapid, secret-shopper method is less subject to bias. Providing feedback using HHC from the new audit method did not seem to impact HHC behaviors.
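For readers unfamiliar with how odds ratios like those reported above are derived, here is a minimal sketch computing an OR and a Wald 95% CI from a 2x2 table of compliant versus non-compliant observations. The counts are invented; the study's OR of 9.83 came from its own, much larger observation data.

```python
# Odds ratio and Wald 95% CI from a 2x2 table; counts below are hypothetical.
import math

standard = (950, 50)   # (compliant, non-compliant), overt known observer
secret = (700, 300)    # (compliant, non-compliant), covert "secret shopper"

a, b = standard
c, d = secret
or_hat = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
lo = math.exp(math.log(or_hat) - 1.96 * se_log_or)
hi = math.exp(math.log(or_hat) + 1.96 * se_log_or)
print(f"OR = {or_hat:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

An OR well above 1 here means the standard method records compliance at far higher odds than the covert method, which is the direction of the discrepancy the study reports.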
Posthodiplostomum minimum has a three-host life cycle with multiple developmental stages. The metacercarial stage, commonly known as 'white grub', infects the visceral organs of many freshwater fishes and was historically considered a host generalist because of its limited morphological variation across a wide range of hosts. In this study, infection data and molecular techniques were used to evaluate the host and tissue specificity of Posthodiplostomum metacercariae in centrarchid fishes. Eleven centrarchid species from three genera were collected from the Illinois portion of the Ohio River drainage and necropsied. Posthodiplostomum infection levels differed significantly by host age, host genus, and locality. Three Posthodiplostomum spp. were identified by DNA sequencing, two of which were relatively common in centrarchid hosts. Both common species were host specialists at the genus level: one was restricted to Micropterus hosts and the other preferentially infected Lepomis. Host specificity is likely dictated by physiological compatibility, and deviations from Lepomis host specificity may be related to host hybridization. The Posthodiplostomum species also differed in their utilization of host tissues. Neither common species displayed strong genetic structure at the scale of this study, likely owing to the use of bird definitive hosts.
To determine the impact of recurrent Clostridium difficile infection (RCDI) on patient behaviors following illness.
METHODS
Using a computer algorithm, we searched the electronic medical records of 7 Chicago-area hospitals to identify patients with RCDI (2 episodes of CDI within 15 to 56 days of each other). RCDI was validated by medical record review. Patients were asked to complete a telephone survey. The survey included questions regarding general health, social isolation, symptom severity, emotional distress, and prevention behaviors.
RESULTS
In total, 119 patients completed the survey (32% response rate). On average, respondents were 57.4 years old (standard deviation, 16.8); 57% were white, and ~50% reported hospitalization for CDI. At the time of their most recent illness, patients rated their diarrhea as highly severe (58.5%) and their exhaustion as extreme (30.7%). Respondents indicated that they were very worried about getting sick again (41.5%) and about infecting others (31%). Almost half said that they had washed their hands more frequently (47%) and had increased their use of soap and water (45%) since their illness. Some patients (22%–32%) reported eating out less, avoiding certain medications and public areas, and increasing probiotic use. Most behavioral changes were unrelated to disease severity.
CONCLUSION
Having had RCDI appears to increase prevention-related behaviors in some patients. While some behaviors are appropriate (eg, handwashing), others are not supported by evidence of decreased risk and may negatively impact patient quality of life. Providers should discuss appropriate prevention behaviors with their patients and should clarify that other behaviors (eg, eating out less) will not affect their risk of future illness.
We analyzed glacier surface elevations (1957, 2010 and 2015) and surface mass-balance measurements (2008–2015) on the 30 km² Eklutna Glacier in the Chugach Mountains of southcentral Alaska. The geodetic mass balances from 1957 to 2010 and from 2010 to 2015 are −0.52 ± 0.46 and −0.74 ± 0.10 m w.e. a⁻¹, respectively. The glaciological mass balance of −0.73 m w.e. a⁻¹ from 2010 to 2015 is indistinguishable from the geodetic value. Even after accounting for loss of firn in the accumulation zone, we found that most of the mass loss over both periods came from a broad, low-slope basin that includes much of the accumulation zone of the main branch. Ice-equivalent surface-elevation changes in the basin were −1.0 ± 0.8 m a⁻¹ from 1957 to 2010 and −0.6 ± 0.1 m a⁻¹ from 2010 to 2015, shifting the glacier hypsometry downward and producing more negative mass balances: an altitude-mass-balance feedback. Net mass loss from Eklutna Glacier accounts for 7 ± 1% of the average inflow to Eklutna Reservoir, which is used entirely for water and power by Anchorage, Alaska's largest city. If the altitude-mass-balance feedback continues, this 'deglaciation discharge dividend' is likely to increase over the short term before eventually decreasing as glacier area diminishes.
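The unit conversion behind figures like those above follows standard geodetic mass-balance arithmetic: an ice-equivalent thickness change is scaled by the ratio of ice to water density and averaged over the measurement period. A minimal sketch, assuming a conventional ice density of 900 kg m^-3 (the paper's exact density assumption is not restated here):

```python
# Convert an ice-equivalent elevation change into a mean specific mass
# balance in m w.e. a^-1. Densities are common assumptions, not the paper's.
RHO_ICE = 900.0     # kg m^-3, conventional value for glacier ice
RHO_WATER = 1000.0  # kg m^-3

def mwe_per_year(dh_ice_m: float, years: float) -> float:
    """Mean specific mass balance (m w.e. a^-1) from an ice-equivalent
    thickness change dh_ice_m accumulated over `years`."""
    return dh_ice_m * (RHO_ICE / RHO_WATER) / years

# e.g. ice-equivalent thinning of 0.6 m a^-1 sustained over a 5-year period:
print(round(mwe_per_year(-0.6 * 5, 5), 2), "m w.e. a^-1")  # -> -0.54
```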
Standard estimates of the impact of Clostridium difficile infections (CDI) on inpatient lengths of stay (LOS) may overstate inpatient care costs attributable to CDI. In this study, we used multistate modeling (MSM) of CDI timing to reduce bias in estimates of excess LOS.
METHODS
We conducted a retrospective cohort study of all hospitalizations at any of 120 acute-care facilities within the US Department of Veterans Affairs (VA) between 2005 and 2012. We estimated the excess LOS attributable to CDI using an MSM to address time-dependent bias, and we used bootstrapping to generate 95% confidence intervals (CIs). These estimates were compared with unadjusted differences in mean LOS for hospitalizations with and without CDI (the bias at stake is illustrated in the sketch following this abstract).
RESULTS
During the study period, there were 3.96 million hospitalizations and 43,540 CDIs. A comparison of unadjusted means suggested an excess LOS of 14.0 days (19.4 vs 5.4 days). In contrast, the MSM estimated an attributable LOS of only 2.27 days (95% CI, 2.14–2.40). The excess LOS for mild-to-moderate CDI was 0.75 days (95% CI, 0.59–0.89), and for severe CDI it was 4.11 days (95% CI, 3.90–4.32). Substantial variation across Veterans Integrated Service Networks (VISNs) was observed.
CONCLUSIONS
CDI significantly contributes to LOS, but the magnitude of its estimated impact is smaller when methods are used that account for the time-varying nature of infection. The greatest impact on LOS occurred among patients with severe CDI. Significant geographic variability was observed. MSM is a useful tool for obtaining more accurate estimates of the inpatient care costs of CDI.
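The time-dependent bias that the MSM addresses can be demonstrated with a small simulation: because infection can only occur while a patient is still hospitalized, longer stays are more likely to include a CDI, and a naive comparison of mean LOS attributes pre-infection days to the infection itself. The sketch below is not the study's multistate model; it uses invented parameters and a simple timing-matched estimator purely to show the direction and size of the bias.

```python
# Simulation of time-dependent bias in "excess LOS" estimates; all
# parameters are invented and do not come from the study above.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
TRUE_EXCESS = 2.0    # extra days actually caused by infection (hypothetical)
DAILY_RISK = 0.005   # per-day probability of acquiring CDI while hospitalized

base_los = rng.lognormal(mean=1.5, sigma=0.8, size=n)   # baseline stay, days
infect_day = rng.geometric(DAILY_RISK, size=n).astype(float)
infected = infect_day <= base_los                       # infection during stay
los = base_los + np.where(infected, TRUE_EXCESS, 0.0)

# Naive estimate: difference in mean LOS, ignoring when infection occurred.
naive = los[infected].mean() - los[~infected].mean()
print(f"naive excess LOS: {naive:.1f} days")            # far above TRUE_EXCESS

# Timing-aware estimate: compare each infected patient's total LOS with the
# mean LOS of uninfected patients still in hospital on that infection day.
uninf = np.sort(los[~infected])
suffix_mean = np.cumsum(uninf[::-1])[::-1] / np.arange(uninf.size, 0, -1)
idx = np.searchsorted(uninf, infect_day[infected])
ok = idx < uninf.size                                   # controls still at risk
adjusted = (los[infected][ok] - suffix_mean[idx[ok]]).mean()
print(f"timing-adjusted excess LOS: {adjusted:.1f} days")  # near TRUE_EXCESS
```

The naive estimate lands several times above the true two days because infected patients are drawn disproportionately from long stays, mirroring the gap between the 14.0-day unadjusted figure and the 2.27-day MSM estimate reported above.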