Debate about the nature of climate and the magnitude of ecological change across Australia during the last glacial maximum (LGM; 26.5–19 ka) persists despite considerable research into the late Pleistocene. This is partly due to a lack of detailed paleoenvironmental records and reliable chronological frameworks. Geochemical and geochronological analyses of a 60 ka sedimentary record from Brown Lake, subtropical Queensland, are presented and considered in the context of climate-controlled environmental change. Optically stimulated luminescence dating of dune crests adjacent to prominent wetlands across North Stradbroke Island (Minjerribah) returned a mean age of 119.9 ± 10.6 ka, indicating relative dune stability soon after formation in Marine Isotope Stage 5. A synthesis of wetland sediment geochemistry across the island was used to identify dust accumulation and applied as an aridification proxy over the last glacial-interglacial cycle. Dust deposition increased from ca. 50 ka, with the highest influx occurring in the lead-up to the LGM. Local variation in the accumulation of exogenic material highlights the complexities of comparing sedimentary records and the need for robust age models. An inter-site comparison suggests enhanced moisture stress regionally during the last glaciation and throughout the LGM, with a return to a more positive moisture balance by ca. 8 ka.
The Patient Health Questionnaire (PHQ-9), the Beck Depression Inventory (BDI-II) and the Generalised Anxiety Disorder Assessment (GAD-7) are widely used in the evaluation of interventions for depression and anxiety. The smallest reduction in depressive symptoms that matters to patients is known as the Minimum Clinically Important Difference (MCID). Little empirical study of the MCID for these scales exists.
A prospective cohort of 400 patients in UK primary care was interviewed on four occasions, 2 weeks apart. At each time point, participants completed all three questionnaires and a ‘global rating of change’ scale (GRS). MCID estimation relied on estimated changes in symptoms according to reported improvement on the GRS, stratified by baseline severity on the Clinical Interview Schedule (CIS-R).
For moderate baseline severity, those who reported improvement on the GRS had a reduction of 21% (95% confidence interval (CI) −26.7 to −14.9) on the PHQ-9; 23% (95% CI −27.8 to −18.0) on the BDI-II and 26.8% (95% CI −33.5 to −20.1) on the GAD-7. The corresponding threshold scores below which participants were more likely to report improvement were −1.7, −3.5 and −1.5 points on the PHQ-9, BDI-II and GAD-7, respectively. Patients with milder symptoms required much larger reductions as a percentage of their baseline to endorse improvement.
An MCID representing a 20% reduction in scores on these scales is a useful guide for patients with moderately severe symptoms. If treatment had the same effect on patients irrespective of baseline severity, those with low symptom scores would be unlikely to notice a benefit.
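As a rough illustration of how these findings might be applied, the sketch below flags improvement against the reported point thresholds and computes the percentage reduction behind the ~20% guide. The patient scores and function names are invented for illustration, not drawn from the study.

```python
# Minimal sketch: applying the reported MCID guides to hypothetical scores.
# Threshold values come from the abstract above; all patient data are invented.

THRESHOLDS = {"PHQ-9": -1.7, "BDI-II": -3.5, "GAD-7": -1.5}

def meets_mcid(scale: str, baseline: float, follow_up: float) -> bool:
    """Return True if the score change crosses the reported MCID threshold."""
    change = follow_up - baseline
    return change <= THRESHOLDS[scale]

def percent_reduction(baseline: float, follow_up: float) -> float:
    """Reduction expressed as a percentage of the baseline score."""
    return 100.0 * (baseline - follow_up) / baseline

# Example: a moderately severe patient dropping from 16 to 12 on the PHQ-9.
print(meets_mcid("PHQ-9", 16, 12))    # True: change of -4 crosses -1.7
print(percent_reduction(16, 12))      # 25.0, above the ~20% guide
```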
We estimated the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
The study covered Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States, using 2013–2017 data from CMS Hospital Compare, the Provider of Service file and Medicare Cost Reports.
We fitted a difference-in-differences model with hospital fixed effects to compare California with all other states before and after the ASP mandate, with standardized infection ratios (SIRs) for MRSA and CDI as the outcomes. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had statistically significant (P < .05) increases of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. California hospitals showed a 20% (P < .001) decrease in the CDI SIR only in 2017.
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
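For readers unfamiliar with the design, the sketch below fits a difference-in-differences regression with hospital and year fixed effects of the kind described above, using statsmodels on synthetic data. The variable names, the 2015 post-mandate cutoff and the planted effect size are illustrative assumptions, not the study's data.

```python
# Illustrative difference-in-differences with hospital and year fixed effects.
# All data below are synthetic; 'sir', 'ca' and 'post' are placeholder names.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
hospitals, years = 60, [2013, 2014, 2015, 2016, 2017]
df = pd.DataFrame(
    [(h, y) for h in range(hospitals) for y in years],
    columns=["hospital", "year"],
)
df["ca"] = (df["hospital"] < 20).astype(int)    # treated group (California)
df["post"] = (df["year"] >= 2015).astype(int)   # assumed post-mandate period
df["sir"] = (0.9 + 0.1 * rng.standard_normal(len(df))
             + 0.2 * df["ca"] * df["post"])     # synthetic outcome with planted effect

# Hospital fixed effects absorb the group indicator; the interaction term
# is the difference-in-differences estimate.
model = smf.ols("sir ~ ca:post + C(hospital) + C(year)", data=df).fit()
print(model.params["ca:post"])
```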
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and nondrug addictive behaviors due to a dopamine deficiency (“hypodopaminergia”). There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a system that is not working. This shift involves recognizing dopamine homeostasis, achieved via precision, genetically guided KB220 variants, as the ultimate treatment of RDS; this approach is called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in a future DSM-6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
N95 respirators are personal protective equipment most often used to control exposures to infections transmitted via the airborne route. Supplies of N95 respirators can become depleted during pandemics or when otherwise in high demand. In this paper, we offer strategies for optimizing supplies of N95 respirators in health care settings while maximizing the level of protection offered to health care personnel when there is limited supply in the United States during the coronavirus disease 2019 (COVID-19) pandemic. The strategies are intended for use by professionals who manage respiratory protection programs, occupational health services, and infection prevention programs in health care facilities to protect health care personnel from job-related risks of exposure to infectious respiratory illnesses. Consultation with federal, state, and local public health officials is also important. We use the framework of surge capacity and the occupational health and safety hierarchy of controls approach to discuss specific engineering control, administrative control, and personal protective equipment measures that may help in optimizing N95 respirator supplies.
Healthcare personnel (HCP) who perform invasive procedures and are living with HIV or hepatitis B have been required to self-notify the North Carolina state health department since 1992. State-coordinated review of HCP uses a panel of experts to evaluate transmission risk and recommend infection prevention measures. We describe how this practice balances HCP privacy with patient safety and health.
We measured the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus (MRSA) bacteremia and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements would have significantly lower MRSA and CDI rates.
We conducted an observational longitudinal study covering all US states, using 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were the MRSA standardized infection ratio (SIR) and the CDI SIR. The key explanatory variable was the percentage of hospitals meeting the Core Elements in each state. We estimated state and time fixed-effects models with time-variant controls, weighting our analyses by the number of hospitals in each state.
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported ASP compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short study period and the variety of stewardship strategies that ASPs may encompass.
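A minimal sketch of the weighted state and time fixed-effects specification described above, again on synthetic data; the per-state hospital counts used as weights, the column names and the planted coefficient are all invented.

```python
# Sketch of a weighted state/time fixed-effects regression on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
states, years = 50, [2014, 2015, 2016]
df = pd.DataFrame([(s, y) for s in range(states) for y in years],
                  columns=["state", "year"])
df["pct_core"] = rng.uniform(20, 90, len(df))           # % hospitals meeting Core Elements
df["n_hosp"] = rng.integers(10, 300, states).repeat(len(years))  # weight: hospitals per state
df["cdi_sir"] = 1.0 - 0.003 * df["pct_core"] + 0.05 * rng.standard_normal(len(df))

# Weighted least squares with state and year fixed effects.
model = smf.wls("cdi_sir ~ pct_core + C(state) + C(year)",
                data=df, weights=df["n_hosp"]).fit()
print(model.params["pct_core"])
```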
Although death by neurologic criteria (brain death) is legally recognized throughout the United States, state laws and clinical practice vary concerning three key issues: (1) the medical standards used to determine death by neurologic criteria, (2) management of family objections before determination of death by neurologic criteria, and (3) management of religious objections to declaration of death by neurologic criteria. The American Academy of Neurology and other medical stakeholder organizations involved in the determination of death by neurologic criteria have undertaken concerted action to address variation in clinical practice in order to ensure the integrity of brain death determination. To complement this effort, state policymakers must revise legislation on the use of neurologic criteria to declare death. We review the legal history and current laws regarding neurologic criteria to declare death and offer proposed revisions to the Uniform Determination of Death Act (UDDA) and the rationale for these recommendations.
Cognitive-behavioural therapy (CBT) is an effective treatment for depressed adults. CBT interventions are complex, as they include multiple content components and can be delivered in different ways. We compared the effectiveness of different types of therapy, different components and combinations of components, and aspects of delivery used in CBT interventions for adult depression. We conducted a systematic review of randomised controlled trials in adults with a primary diagnosis of depression, which included a CBT intervention. Outcomes were pooled using a component-level network meta-analysis. Our primary analysis classified interventions according to the type of therapy and delivery mode. We also fitted more advanced models to examine the effectiveness of each content component or combination of components. We included 91 studies and found strong evidence that CBT interventions yielded a larger short-term decrease in depression scores compared with treatment-as-usual, with a standardised difference in mean change of −1.11 (95% credible interval −1.62 to −0.60) for face-to-face CBT, −1.06 (−2.05 to −0.08) for hybrid CBT, and −0.59 (−1.20 to 0.02) for multimedia CBT, whereas wait list control showed a detrimental effect of 0.72 (0.09 to 1.35). We found no evidence of specific effects of any content components or combinations of components. Technology is increasingly used in the context of CBT interventions for depression. Multimedia and hybrid CBT might be as effective as face-to-face CBT, although results need to be interpreted cautiously. The effectiveness of specific combinations of content components and delivery formats remains unclear. Wait list controls should be avoided if possible.
Environmental risk factors for dementia are poorly understood. Aluminium and fluoride in drinking water have been linked with dementia, but uncertainties remain about this relationship.
In the largest longitudinal study in this context, we set out to explore the individual effects of aluminium and fluoride in drinking water on dementia risk and, because fluoride can increase the absorption of aluminium, to examine any synergistic influence on dementia.
We used Cox models to investigate the association of mean aluminium and fluoride levels in drinking water at participants’ residential locations (collected 2005–2012 by the Drinking Water Quality Regulator for Scotland) with dementia in members of the Scottish Mental Survey 1932 cohort who were alive in 2005.
A total of 1972 out of 6990 individuals developed dementia by the linkage date in 2012. Dementia risk was raised with increasing mean aluminium levels in women (hazard ratio per s.d. increase 1.09, 95% CI 1.03–1.15, P < 0.001) and men (1.12, 95% CI 1.03–1.21, P = 0.004). A dose-response pattern of association was observed between mean fluoride levels and dementia in women (1.34, 95% CI 1.28–1.41, P < 0.001) and men (1.30, 95% CI 1.22–1.39, P < 0.001), with dementia risk more than doubled in the highest quartile compared with the lowest. There was no statistical interaction between aluminium and fluoride levels in relation to dementia.
Higher levels of aluminium and fluoride were related to dementia risk in a population of men and women whose drinking water contained relatively low levels of both.
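As an illustration of the modelling approach, the sketch below fits a Cox model with a per-standard-deviation exposure using the lifelines library. The data are synthetic and the column names are ours, so the output bears no relation to the reported hazard ratios.

```python
# Minimal sketch of a per-1-SD Cox regression on synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "aluminium": rng.lognormal(3.0, 0.5, n),       # synthetic exposure levels
    "follow_up_years": rng.uniform(0.5, 7.0, n),   # time to event or censoring
    "dementia": rng.integers(0, 2, n),             # event indicator (0/1)
})
# Standardise the exposure so the hazard ratio is per 1-SD increase.
df["aluminium_sd"] = (df["aluminium"] - df["aluminium"].mean()) / df["aluminium"].std()

cph = CoxPHFitter()
cph.fit(df[["aluminium_sd", "follow_up_years", "dementia"]],
        duration_col="follow_up_years", event_col="dementia")
print(cph.hazard_ratios_["aluminium_sd"])
```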
Healthcare organizations are required to provide workers with respiratory protection (RP) to mitigate hazardous airborne inhalation exposures. This study sought to identify gaps between RP guidance and clinical practice, and to understand which issues would benefit from additional research or clarification.
Breakthrough Listen is a 10-yr initiative to search for signatures of technologies created by extraterrestrial civilisations at radio and optical wavelengths. Here, we detail the digital data recording system deployed for Breakthrough Listen observations at the 64-m aperture CSIRO Parkes Telescope in New South Wales, Australia. The recording system currently implements two modes: a dual-polarisation, 1.125-GHz bandwidth mode for single-beam observations, and a 26-input, 308-MHz bandwidth mode for the 21-cm multibeam receiver. The system is also designed to support a 3-GHz single-beam mode for the forthcoming Parkes ultra-wideband feed. In this paper, we present details of the system architecture, provide an overview of hardware and software, and present initial performance results.
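To put the two recording modes in perspective, a back-of-envelope calculation of their aggregate voltage data rates is sketched below. It assumes critically sampled complex voltages at 8 bits per component, which is an assumption on our part rather than a documented parameter of the Parkes recorder.

```python
# Back-of-envelope data rates for the two recording modes described above.
# Assumes critically sampled complex voltages at 8 bits per component;
# actual bit depths and framing overheads may differ.

def data_rate_gbps(bandwidth_mhz: float, n_inputs: int, bits: int = 8) -> float:
    """Aggregate rate for complex-sampled voltage recording, in Gb/s."""
    samples_per_sec = bandwidth_mhz * 1e6                  # complex samples per input
    bits_per_sec = samples_per_sec * 2 * bits * n_inputs   # 2 components (I/Q) per sample
    return bits_per_sec / 1e9

print(data_rate_gbps(1125.0, n_inputs=2))   # dual-pol 1.125-GHz mode: 36.0 Gb/s
print(data_rate_gbps(308.0, n_inputs=26))   # 26-input 308-MHz mode: ~128.1 Gb/s
```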
Intermittent energy restriction (IER) involves short periods of severe energy restriction interspersed with periods of adequate energy intake, and can induce weight loss. Insulin sensitivity is impaired by short-term, complete energy restriction, but the effects of IER are not well known. In randomised order, fourteen lean men (age: 25 (sd 4) years; BMI: 24 (sd 2) kg/m2; body fat: 17 (sd 4) %) consumed 24-h diets providing 100 % (10 441 (sd 812) kJ; energy balance (EB)) or 25 % (2622 (sd 204) kJ; energy restriction (ER)) of estimated energy requirements, followed by an oral glucose tolerance test (OGTT; 75 g glucose drink) after an overnight fast. Plasma/serum glucose, insulin, NEFA, glucagon-like peptide-1 (GLP-1), glucose-dependent insulinotropic peptide (GIP) and fibroblast growth factor 21 (FGF21) were assessed before and after (0 h) each 24-h dietary intervention, and throughout the 2-h OGTT. The homoeostatic model assessment of insulin resistance (HOMA2-IR) was used to assess the fasting response, and incremental AUC (iAUC) or total AUC (tAUC) was calculated during the OGTT. At 0 h, HOMA2-IR was 23 % lower after ER compared with EB (P<0·05). During the OGTT, serum glucose iAUC (P<0·001), serum insulin iAUC (P<0·05) and plasma NEFA tAUC (P<0·01) were greater during ER, but GLP-1 (P=0·161), GIP (P=0·473) and FGF21 (P=0·497) tAUC were similar between trials. These results demonstrate that severe energy restriction acutely impairs postprandial glycaemic control in lean men, despite reducing HOMA2-IR. Chronic intervention studies are required to elucidate the long-term effects of IER on indices of insulin sensitivity, particularly in the absence of weight loss.
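For clarity, the sketch below shows one common convention for the tAUC and iAUC calculations named above: the trapezoid rule, with the iAUC counting only area above the fasting baseline. The time points and glucose values are invented.

```python
# Total and incremental AUC for an OGTT response via the trapezoid rule.
import numpy as np

time_h = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0])   # h after glucose drink
glucose = np.array([5.0, 7.8, 8.9, 8.1, 6.9, 5.9])   # mmol/l, synthetic values

tauc = np.trapz(glucose, time_h)                      # total AUC
# Incremental AUC: area above the fasting (time 0) baseline only;
# excursions below baseline contribute zero under this convention.
iauc = np.trapz(np.clip(glucose - glucose[0], 0, None), time_h)
print(round(tauc, 2), round(iauc, 2))
```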
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies have reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference = 0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference = 0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
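The fixed-effects pooling named above reduces to inverse-variance weighting of the per-cohort estimates; the sketch below shows the calculation on invented placeholder values.

```python
# Minimal inverse-variance fixed-effects meta-analysis on invented estimates.
import numpy as np

betas = np.array([1.0, 1.3, 0.9, 1.5])   # e.g. ml FEV1 per nmol/l, per cohort
ses = np.array([0.3, 0.4, 0.2, 0.5])     # corresponding standard errors

w = 1.0 / ses**2                          # inverse-variance weights
beta_pooled = np.sum(w * betas) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci = (beta_pooled - 1.96 * se_pooled, beta_pooled + 1.96 * se_pooled)
print(beta_pooled, ci)
```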
There is a paucity of data examining the effect of cutlery size on the microstructure of within-meal eating behaviour or food intake. Therefore, the present studies examined how manipulation of spoon size influenced these eating behaviour measures in lean young men. In study one, subjects ate a semi-solid porridge breakfast ad libitum, until satiation. In study two, subjects ate a standardised amount of porridge, with mean bite size and mean eating rate covertly measured by observation through a one-way mirror. Both studies involved subjects completing a familiarisation visit and two experimental visits, where they ate with a teaspoon (SMALL) or dessert spoon (LARGE), in randomised order. Subjective appetite measures (hunger, fullness, desire to eat and satisfaction) were made before and after meals. In study one, subjects ate 8 % less food when they ate with the SMALL spoon (SMALL 532 (SD 189) g; LARGE 575 (SD 227) g; P=0·006). In study two, mean bite size (SMALL 10·5 (SD 1·3) g; LARGE 13·7 (SD 2·6) g; P<0·001) and eating rate (SMALL 92 (SD 25) g/min; LARGE 108 (SD 29) g/min; P<0·001) were reduced in the SMALL condition. There were no condition or interaction effects for subjective appetite measures. These results suggest that eating with a small spoon decreases ad libitum food intake, possibly via a cascade of effects on within-meal eating microstructure. A small spoon might be a practical strategy for decreasing bite size and eating rate, likely increasing oral processing, and subsequently decreasing food intake, at least in lean young men.
The Pueblo population of Chaco Canyon during the Bonito Phase (AD 800–1130) employed agricultural strategies and water-management systems to enhance food cultivation in this unpredictable environment. Scepticism concerning the timing and effectiveness of this system, however, remains common. Using optically stimulated luminescence dating of sediments and LiDAR imaging, the authors located Bonito Phase canal features at the far west end of the canyon. Additional ED-XRF and strontium isotope (87Sr/86Sr) analyses confirm the diversion of waters from multiple sources during Chaco’s occupation. The extent of this water-management system raises new questions about social organisation and the role of ritual in facilitating responses to environmental unpredictability.
Potentially modifiable risk factors for developing dementia have been identified. However, risk factors for increased mortality in patients with diagnosed dementia are not well understood. Identifying factors that influence prognosis would help clinicians plan care and address unmet needs.
To investigate diagnosed depression and sociodemographic factors as predictors of mortality in patients with dementia in UK secondary clinical care services.
We conducted a cohort study of patients with a dementia diagnosis in an electronic health records database in a UK National Health Service mental health trust.
In 3374 patients with 10 856 person-years of follow-up, comorbid depression was not associated with mortality (adjusted hazard ratio 0.94; 95% CI 0.71–1.24). Single patients had higher mortality than those who were married (adjusted hazard ratio 1.25; 95% CI 1.03–1.50). Patients of Asian ethnicity had lower mortality rates than White British patients (adjusted hazard ratio 0.50; 95% CI 0.34–0.73).
Clinically diagnosed depression does not increase mortality in patients with dementia. Patients who are single are a potential high-mortality risk group. The lower mortality rates in Asian patients with dementia that have been reported in the USA also apply in the UK.