Objective:
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and a long-standing low rate of healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart), followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent MRSA carriers was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, “slow and sustained” outbreaks may be more common in units with strong existing infection prevention practices, where a series of breaches must align to result in a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen allowed previously persistent carriers to safely continue work duties.
One of six nursing home residents and staff with positive SARS-CoV-2 tests ≥90 days after initial infection had specimen cycle threshold (Ct) values <30. Individuals with specimen Ct values <30 were more likely to report symptoms but did not differ from individuals with high-Ct specimens on other clinical and testing data.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
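As a hedged illustration of the general approach (not the authors' implementation), the sketch below estimates empirical daily transition probabilities between ordinal illness states from longitudinal patient-day records; the state labels and example data are hypothetical.

```python
# Minimal sketch of a discrete-time multistate transition model:
# estimate empirical daily transition probabilities between illness states
# from longitudinal patient-day records. Hypothetical states and data.
import numpy as np
import pandas as pd

STATES = ["ward", "icu", "dead", "discharged"]  # hypothetical ordinal states

# Long-format daily assessments: one row per patient per day.
records = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2, 3, 3],
    "day":     [0, 1, 2, 0, 1, 2, 0, 1],
    "state":   ["ward", "icu", "icu", "ward", "ward", "discharged", "icu", "dead"],
})

# Pair each patient-day with the next day's state.
records = records.sort_values(["patient", "day"])
records["next_state"] = records.groupby("patient")["state"].shift(-1)
pairs = records.dropna(subset=["next_state"])

# Count transitions and normalize rows into a transition probability matrix.
counts = pd.crosstab(pairs["state"], pairs["next_state"]).reindex(
    index=STATES, columns=STATES, fill_value=0)
probs = counts.div(counts.sum(axis=1).replace(0, np.nan), axis=0)
print(probs.round(2))
```

In a full analysis, transition probabilities would typically be modeled as functions of patient covariates (for example, with multinomial or proportional-odds regression) rather than taken from raw counts.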
Introduced mammalian predators are responsible for the decline and extinction of many native species, with rats (genus Rattus) being among the most widespread and damaging invaders worldwide. In a naturally fragmented landscape, we demonstrate the multi-year effectiveness of snap traps in removing Rattus rattus and Rattus exulans from lava-surrounded forest fragments ranging in size from <0.1 to >10 ha. Relative to other studies, we observed low levels of fragment recolonization. Larger rats were the first to be trapped, and the average size of trapped rats decreased over time. Rat removal led to distinct shifts in the foraging height and location of mongooses and mice, emphasizing the need to focus control efforts on multiple invasive species at once. Furthermore, because of a specially designed trap casing, we observed low non-target capture rates, suggesting that on Hawai‘i and similar islands lacking native rodents, the risk of killing non-target species in snap traps may be lower than with rodenticide application, which has the potential to contaminate food webs. These efforts demonstrate that targeted snap-trapping is an effective removal method for invasive rats in fragmented habitats and that, where it is used, monitoring of recolonization should be included as part of a comprehensive biodiversity management strategy.
Background
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
Methods
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
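The sketch below illustrates, under assumed variable names (not the actual WIHS dataset or codebook), how such adjusted logistic models could be specified with statsmodels formulas.

```python
# Sketch of multivariable logistic regressions for psychotropic medication use
# with food security (FS) as the exposure. File and column names are
# hypothetical placeholders, not the WIHS variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_sample.csv")  # hypothetical analytic file

base = ("any_psychotropic ~ C(food_security) + age + C(race_ethnicity) "
        "+ C(income) + C(education) + C(alcohol_use) + C(substance_use)")

# Model 1: adjusted for sociodemographic, alcohol, and substance-use covariates.
m1 = smf.logit(base, data=df).fit()

# Model 2: additionally adjusted for depression (CESD) and anxiety (GAD-7) scores.
m2 = smf.logit(base + " + cesd_score + gad7_score", data=df).fit()

print(m1.summary())
print(m2.summary())
```

Exponentiated coefficients from models of this form correspond to the adjusted odds ratios reported below.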
Results
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Conclusions
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Objective:
To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention.
Design:
A pre- and postintervention, quasi-experimental quality improvement study.
Setting and participants:
Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center.
Methods:
We evaluated the impact of the CLISA score on the incidence of insertion site inflammation and infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate analyses were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated.
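The following sketch shows one common way to specify a segmented (interrupted time-series) regression with a level and slope change at the intervention start; the simulated data and column names are illustrative only, not the study's actual model or dataset.

```python
# Sketch of a segmented regression for an interrupted time series:
# estimates a level change and a slope change at the intervention start.
# Simulated monthly data; illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_months = 36
df = pd.DataFrame({"month": np.arange(n_months)})
df["intervention"] = (df["month"] >= 12).astype(int)   # post-intervention indicator
df["months_post"] = np.maximum(df["month"] - 12, 0)    # time since intervention start
# Simulated outcome: baseline trend, immediate drop, steeper post-intervention slope.
df["pct_clisa_2_or_3"] = (20 - 0.1 * df["month"]
                          - 8 * df["intervention"]
                          - 0.2 * df["months_post"]
                          + rng.normal(0, 1, n_months))

# Coefficient on `intervention` = immediate level change;
# coefficient on `months_post` = change in slope after the intervention.
its = smf.ols("pct_clisa_2_or_3 ~ month + intervention + months_post", data=df).fit()
print(its.summary())
```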
Results:
Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). According to the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3 after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the multivariable regression, removal of lines with a CLISA score of 2 or 3 was 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039).
Conclusions:
The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin, respectively. The finished surveys cover declinations between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
Vegetation affects feedbacks in Earth's hydrologic system but is constrained by physiological adaptations. In extant ecosystems, the mechanisms controlling plant water use can be measured experimentally; for extinct plants in the recent geological past, water use can be inferred from nearest living relatives, assuming minimal evolutionary change. In deep time, where no close living relatives exist, fossil material provides the only information for inferring plant water use. However, mechanistic models for extinct plant water use must be built on first principles and tested on extant plants. Plants serve as a conduit for water movement from the soil to the atmosphere, constrained by tissue-level construction and gross architecture. No single feature, such as stomata or veins, encompasses enough of the complexity underpinning water-use physiology to serve as the basis of a model of functional water use in all (or perhaps any) extinct plants. Rather, a “functional whole plant” model must be used. To understand the interplay between plant and atmosphere, water use in relation to environmental conditions is investigated in an extinct plant, the seed fern Medullosa (Division Pteridospermatophyta), by reviewing methods for reconstructing physiological variables such as leaf and stem hydraulic capacity, photosynthetic rate, transpiration rate, stomatal conductance, and albedo. Medullosans had the potential for extremely high photosynthetic and assimilation rates, water transport, stomatal conductance, and transpiration, with rates comparable to those of later angiosperms. When these high growth and gas exchange rates of medullosans are combined with the unique atmospheric gas composition of the late Paleozoic, complex vegetation-environment feedbacks are expected despite the medullosans' basal phylogenetic position relative to post-Paleozoic seed plants.
METHODS
We conducted a time-series analysis to evaluate the impact of the ASP over a 6.25-year period (July 1, 2008–September 30, 2014) while controlling for trends during a 3-year preintervention period (July 1, 2005–June 30, 2008). The primary outcome measures were total antibacterial and antipseudomonal use in days of therapy (DOT) per 1,000 patient-days (PD). Secondary outcomes included antimicrobial costs and resistance, hospital-onset Clostridium difficile infection, and other patient-centered measures.
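As a minimal sketch of how the primary outcome might be computed from administration-level and census data, the code below tallies days of therapy (DOT) per 1,000 patient-days (PD) by quarter; the file and column names are hypothetical, not the study's actual data pipeline.

```python
# Sketch: compute days of therapy (DOT) per 1,000 patient-days (PD) by quarter.
# One DOT is counted for each distinct antimicrobial a patient receives on a day.
# Hypothetical file and column names.
import pandas as pd

admin = pd.read_csv("antibiotic_administrations.csv",
                    parse_dates=["date"])        # columns: patient_id, drug, date
census = pd.read_csv("daily_census.csv",
                     parse_dates=["date"])       # columns: date, patient_days

# DOT: unique patient-drug-day combinations, summed by calendar quarter.
dot = (admin.drop_duplicates(["patient_id", "drug", "date"])
            .assign(quarter=lambda d: d["date"].dt.to_period("Q"))
            .groupby("quarter").size().rename("dot"))

patient_days = (census.assign(quarter=lambda d: d["date"].dt.to_period("Q"))
                      .groupby("quarter")["patient_days"].sum())

dot_per_1000_pd = (dot / patient_days * 1000).rename("dot_per_1000_pd")
print(dot_per_1000_pd)
```

Quarterly rates of this kind are the inputs to the segmented time-series analysis described above.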
RESULTS
During the preintervention period, total antibacterial and antipseudomonal use were declining (−9.2 and −5.5 DOT/1,000 PD per quarter, respectively). During the stewardship period, both continued to decline, although at lower rates (−3.7 and −2.2 DOT/1,000 PD, respectively), resulting in a slope change of 5.5 DOT/1,000 PD per quarter for total antibacterial use (P=.10) and 3.3 DOT/1,000 PD per quarter for antipseudomonal use (P=.01). Antibiotic expenditures declined markedly during the stewardship period (−$295.42/1,000 PD per quarter, P=.002). There were variable changes in antimicrobial resistance and few apparent changes in C. difficile infection and other patient-centered outcomes.
CONCLUSION
In a hospital with low baseline antibiotic use, implementation of an ASP was associated with sustained reductions in total antibacterial and antipseudomonal use and declining antibiotic expenditures. Common ASP outcome measures have limitations.
The woylie Bettongia penicillata is categorized as Critically Endangered, having declined by c. 90% between 1999 and 2006. The decline continues and the cause is not fully understood. Within a decline diagnosis framework we characterized the nature of the decline and identified potential causes, with a focus on the species’ largest populations, located in south-west Western Australia. We described the spatio-temporal pattern of the decline, and several attributes that are common across sites. We categorized the potential causes of the decline as resources, predators, disease and direct human interference. Based on the available evidence the leading hypothesis is that disease may be making woylies more vulnerable to predation but this remains to be tested. No substantial recoveries have been sustained to date, and one of the three remaining indigenous populations now appears to be extinct. Therefore, verifying the factors causing the decline and those limiting recovery is becoming increasingly urgent. Active adaptive management can be used to test putative agents, such as introduced predators. Insurance populations and ecological monitoring should also be included in an integrated conservation and management strategy for the species.
Background:
The Australian Imaging, Biomarkers and Lifestyle (AIBL) Flagship Study of Ageing is a prospective study of 1,112 individuals (211 with Alzheimer's disease (AD), 133 with mild cognitive impairment (MCI), and 768 healthy controls (HCs)). Here we report diagnostic and cognitive findings at the first (18-month) follow-up of the cohort. The first aim was to compute rates of transition from HC to MCI and from MCI to AD. The second aim was to characterize the cognitive profiles of individuals who transitioned to a more severe disease stage compared with those who did not.
Methods:
Eighteen months after baseline, participants underwent comprehensive cognitive testing and diagnostic review, provided an 80 ml blood sample, and completed health and lifestyle questionnaires. A subgroup also underwent amyloid PET and MRI neuroimaging.
Results:
The diagnostic status of 89.9% of the cohort was determined (972 were reassessed, 28 had died, and 112 did not return for reassessment). The 18-month cohort comprised 692 HCs, 82 MCI cases, 197 AD patients, and one Parkinson's disease dementia case. The transition rate from HC to MCI was 2.5%, and cognitive decline in HCs who transitioned to MCI was greatest in the memory and naming domains compared with HCs who remained stable. The transition rate from MCI to AD was 30.5%.
Conclusion:
There was a high retention rate after 18 months. Rates of transition from healthy aging to MCI, and MCI to AD, were consistent with established estimates. Follow-up of this cohort over longer periods will elucidate robust predictors of future cognitive decline.