Though diet quality is widely recognised as linked to risk of chronic disease, health systems have been challenged to find a user-friendly, efficient way to obtain information about diet. The Penn Healthy Diet (PHD) survey was designed to fill this void. The purposes of this pilot project were to assess the patient experience with the PHD, to validate the accuracy of the PHD against related items in a diet recall and to explore scoring algorithms with relationship to the Healthy Eating Index (HEI)-2015 computed from the recall data. A convenience sample of participants in the Penn Health BioBank was surveyed with the PHD, the Automated Self-Administered 24-hour recall (ASA24) and experience questions. Kappa scores and Spearman correlations were used to compare related questions in the PHD to the ASA24. Numerical scoring, regression tree and weighted regressions were computed for scoring. Participants assessed the PHD as easy to use and were willing to repeat the survey at least annually. The three scoring algorithms were strongly associated with HEI-2015 scores using National Health and Nutrition Examination Survey 2017–2018 data from which the PHD was developed and moderately associated with the pilot replication data. The PHD is acceptable to participants and at least moderately correlated with the HEI-2015. Further validation in a larger sample will enable the selection of the strongest scoring approach.
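The two agreement measures used to validate the PHD against the ASA24 (kappa for categorical items, Spearman correlation for ordinal items) can be sketched as follows. This is a minimal illustration with made-up data, not the study's analysis code; the variable names and responses are hypothetical.

```python
# Illustrative sketch (hypothetical data, not the study's code) of the two
# agreement measures used to compare PHD items with ASA24-derived items:
# Cohen's kappa for categorical agreement, Spearman's rho for ordinal items.

def cohen_kappa(a, b):
    # Observed agreement vs. agreement expected by chance from the marginals.
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    cats = set(a) | set(b)
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

def spearman_rho(x, y):
    # Spearman correlation = Pearson correlation of the ranks (no ties here).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired yes/no responses (PHD item vs. ASA24-derived item)
kappa = cohen_kappa([1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
                    [1, 0, 1, 0, 0, 1, 0, 1, 1, 1])

# Hypothetical ordinal frequency responses with no tied ranks
rho = spearman_rho([1, 4, 2, 7, 5, 3, 6], [2, 3, 1, 7, 6, 4, 5])
```

In practice one would use library implementations (e.g. `sklearn.metrics.cohen_kappa_score`, `scipy.stats.spearmanr`, which also handle ties); the hand-rolled versions above only show what the statistics measure.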
In sub-Saharan Africa, there are no validated screening tools for delirium in older adults, despite the known vulnerability of older people to delirium and the associated adverse outcomes. This study aimed to assess the effectiveness of a brief smartphone-based assessment of arousal and attention (DelApp) in the identification of delirium amongst older adults admitted to the medical department of a tertiary referral hospital in Northern Tanzania.
Consecutive admissions were screened using the DelApp during a larger study of delirium prevalence and risk factors. All participants subsequently underwent detailed clinical assessment for delirium by a research doctor. Delirium and dementia were identified against DSM-5 criteria by consensus.
Complete data for 66 individuals were collected of whom 15 (22.7%) had delirium, 24.5% had dementia without delirium, and 10.6% had delirium superimposed on dementia. Sensitivity and specificity of the DelApp for delirium were 0.87 and 0.62, respectively (AUROC 0.77) and 0.88 and 0.73 (AUROC 0.85) for major cognitive impairment (dementia and delirium combined). Lower DelApp score was associated with age, significant visual impairment (<6/60 acuity), illness severity, reduced arousal and DSM-5 delirium on univariable analysis, but on multivariable logistic regression only arousal remained significant.
In this setting, the DelApp performed well in identifying delirium and major cognitive impairment but did not differentiate delirium and dementia. Performance is likely to have been affected by confounders including uncorrected visual impairment and reduced level of arousal without delirium. Negative predictive value was nevertheless high, indicating excellent ‘rule out’ value in this setting.
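The screening metrics reported above all derive from a 2x2 table of test results against the reference (DSM-5) diagnosis. The counts below are assumed for illustration, chosen only to approximate the reported sensitivity and specificity; they are not the study's actual data.

```python
# Hedged illustration of how screening accuracy measures such as those
# reported for the DelApp are computed from a 2x2 table. The counts are
# hypothetical (roughly matching the reported 0.87 sensitivity and 0.62
# specificity), not taken from the study.

def screening_accuracy(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)   # cases correctly flagged by the test
    specificity = tn / (tn + fp)   # non-cases correctly passed by the test
    npv = tn / (tn + fn)           # negative predictive ('rule out') value
    return sensitivity, specificity, npv

# Hypothetical counts: 13 of 15 cases detected; 31 of 50 non-cases negative
sens, spec, npv = screening_accuracy(tp=13, fn=2, fp=19, tn=31)
```

Note how a modest specificity can coexist with a high negative predictive value when the condition is uncommon, which is why the abstract can report good 'rule out' performance despite imperfect specificity.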
Prisons are susceptible to outbreaks. Control measures focusing on isolation and cohorting negatively affect wellbeing. We present an outbreak of coronavirus disease 2019 (COVID-19) in a large male prison in Wales, UK, October 2020 to April 2021, and discuss control measures.
We gathered case-information, including demographics, staff-residence postcode, resident cell number, work areas/dates, test results, staff interview dates/notes and resident prison-transfer dates. Epidemiological curves were mapped by prison location. Control measures included isolation (exclusion from work or cell-isolation), cohorting (new admissions and work-area groups), asymptomatic testing (case-finding), removal of communal dining and movement restrictions. Facemask use and enhanced hygiene were already in place. Whole-genome sequencing (WGS) and interviews determined the genetic relationship between cases and the plausibility of transmission.
Of 453 cases, 53% (n = 242) were staff, most aged 25–34 years (11.5% females, 27.15% males) and symptomatic (64%). Crude attack-rate was higher in staff (29%, 95% CI 26–64%) than in residents (12%, 95% CI 9–15%).
Whole-genome sequencing can help differentiate multiple introductions from person-to-person transmission in prisons. It should be introduced alongside asymptomatic testing as soon as possible to control prison outbreaks. Timely epidemiological investigation, including data visualisation, allowed dynamic risk assessment and proportionate control measures, minimising the reduction in resident welfare.
Wetland sediments are valuable archives of environmental change but can be challenging to date. Terrestrial macrofossils are often sparse, resulting in radiocarbon (14C) dating of less desirable organic fractions. An alternative approach for capturing changes in atmospheric 14C is the use of terrestrial microfossils. We 14C date pollen microfossils from two Australian wetland sediment sequences and compare these to ages from other sediment fractions (n = 56). For the Holocene Lake Werri Berri record, pollen 14C ages are consistent with 14C ages on bulk sediment and humic acids (n = 14), whilst Stable Polycyclic Aromatic Carbon (SPAC) 14C ages (n = 4) are significantly younger. For Welsby Lagoon, pollen concentrate 14C ages (n = 21) provide a stratigraphically coherent sequence back to 50 ka BP. 14C ages from humic acid and >100 µm fractions (n = 13) are inconsistent, and often substantially younger than pollen ages. Our comparison of Bayesian age-depth models, developed in OxCal, Bacon and Undatable, highlights the strengths and weaknesses of the different programs for straightforward and more complex chrono-stratigraphic records. All models display broad similarities but differ in modeled age-uncertainty, particularly when age constraints are sparse. Intensive dating of wetland sequences improves the identification of outliers and the generation of robust age models, regardless of the program used.
There is evidence that the COVID-19 pandemic has negatively affected mental health, but most studies have been conducted in the general population.
To identify factors associated with mental health during the COVID-19 pandemic in individuals with pre-existing mental illness.
Participants (N = 2869, 78% women, ages 18–94 years) from a UK cohort (the National Centre for Mental Health) with a history of mental illness completed a cross-sectional online survey in June to August 2020. Mental health assessments were the GAD-7 (anxiety), PHQ-9 (depression) and WHO-5 (well-being) questionnaires, and a self-report question on whether their mental health had changed during the pandemic. Regressions examined associations between mental health outcomes and hypothesised risk factors. Secondary analyses examined associations between specific mental health diagnoses and mental health.
A total of 60% of participants reported that mental health had worsened during the pandemic. Younger age, difficulty accessing mental health services, low income, income affected by COVID-19, worry about COVID-19, reduced sleep and increased alcohol/drug use were associated with increased depression and anxiety symptoms and reduced well-being. Feeling socially supported by friends/family/services was associated with better mental health and well-being. Participants with a history of anxiety, depression, post-traumatic stress disorder or eating disorder were more likely to report that mental health had worsened during the pandemic than individuals without a history of these diagnoses.
We identified factors associated with worse mental health during the COVID-19 pandemic in individuals with pre-existing mental illness, in addition to specific groups potentially at elevated risk of poor mental health during the pandemic.
The mental health impact of the initial years of military service is an under-researched area. This study is the first to explore mental health trajectories and associated predictors in military members across the first 3–4 years of their career to provide evidence to inform early interventions.
This prospective cohort study surveyed Australian Defence personnel (n = 5329) at four time-points across their early military career. Core outcomes were psychological distress (K10+) and posttraumatic stress symptoms [four-item PTSD Checklist (PCL-4)] with intra-individual, organizational and event-related trajectory predictors. Latent class growth analyses (LCGAs) identified subgroups within the sample that followed similar longitudinal trajectories for these outcomes, while conditional LCGAs examined the variables that influenced patterns of mental health.
Three clear trajectories emerged for psychological distress: resilient (84.0%), worsening (9.6%) and recovery (6.5%). Four trajectories emerged for post-traumatic stress, including resilient (82.5%), recovery (9.6%), worsening (5.8%) and chronic subthreshold (2.3%) trajectories. Across both outcomes, prior trauma exposure alongside modifiable factors, such as maladaptive coping styles, and increased anger and sleep difficulties were associated with the worsening and chronic subthreshold trajectories, whilst members in the resilient trajectories were more likely to be male, report increased social support from family/friends and Australian Defence Force (ADF) sources, and use adaptive coping styles.
The emergence of symptoms of mental health problems occurs early in the military lifecycle for a significant proportion of individuals. Modifiable factors associated with wellbeing identified in this study are ideal targets for intervention, and should be embedded and consolidated throughout the military career.
Relapse and recurrence of depression are common, contributing to the overall burden of depression globally. Accurate prediction of relapse or recurrence while patients are well would allow the identification of high-risk individuals and may effectively guide the allocation of interventions to prevent relapse and recurrence.
To review prognostic models developed to predict the risk of relapse, recurrence, sustained remission, or recovery in adults with remitted major depressive disorder.
We searched the Cochrane Library (current issue); Ovid MEDLINE (1946 onwards); Ovid Embase (1980 onwards); Ovid PsycINFO (1806 onwards); and Web of Science (1900 onwards) up to May 2021. We included development and external validation studies of multivariable prognostic models. We assessed risk of bias of included studies using the Prediction model risk of bias assessment tool (PROBAST).
We identified 12 eligible prognostic model studies (11 unique prognostic models): 8 model development-only studies, 3 model development and external validation studies and 1 external validation-only study. Multiple estimates of performance measures were not available, and meta-analysis was therefore not possible. Eleven of the 12 included studies were assessed as being at high overall risk of bias, and none examined clinical utility.
Due to the high risk of bias of the included studies, poor predictive performance and limited external validation of the models identified, presently available clinical prediction models for relapse and recurrence of depression are not yet sufficiently developed for deployment in clinical settings. There is a need for improved prognosis research in this clinical area, and future studies should conform to best-practice methodological and reporting guidelines.
Psychosis is a major mental illness with first onset in young adults. The prognosis is poor in around half of the people affected, and difficult to predict. The few tools available to predict prognosis have major weaknesses which limit their use in clinical practice. We aimed to develop and validate a risk prediction model of symptom non-remission in first-episode psychosis.
Our development cohort consisted of 1027 patients with first-episode psychosis recruited between 2005 and 2010 from 14 early intervention services across the National Health Service in England. Our validation cohort consisted of 399 patients with first-episode psychosis recruited between 2006 and 2009 from a further 11 English early intervention services. The one-year non-remission rate was 52% and 54% in the development and validation cohorts, respectively. Multivariable logistic regression was used to develop a risk prediction model for non-remission, which was externally validated.
The prediction model showed good discrimination (C-statistic 0.74 (0.72, 0.76)) and adequate calibration, with intercept alpha of 0.13 (0.03, 0.23) and slope beta of 0.99 (0.87, 1.12). Our model improved the net-benefit by 16% at a risk threshold of 50%, equivalent to 16 more detected non-remitted first-episode psychosis individuals per 100, without incorrectly classifying remitted cases.
Once prospectively validated, our first episode psychosis prediction model could help identify patients at increased risk of non-remission at initial clinical contact.
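The net-benefit figure quoted above comes from decision-curve analysis. A minimal sketch of the standard net-benefit formula follows; the counts are hypothetical, chosen purely to reproduce a 0.16 (i.e. 16 per 100) net-benefit difference at a 50% threshold, and are not the study's data.

```python
# Hedged illustration of decision-curve net benefit. At a risk threshold
# pt = 0.5 the false-positive weight pt/(1-pt) equals 1, so a net-benefit
# difference of 0.16 corresponds to 16 net true positives per 100 patients.

def net_benefit(tp, fp, n, pt):
    # Standard formula: credit for true positives minus a penalty for
    # false positives, weighted by the threshold odds.
    return tp / n - (fp / n) * (pt / (1 - pt))

pt = 0.5
nb_model    = net_benefit(tp=35, fp=15, n=100, pt=pt)  # hypothetical model
nb_treatall = net_benefit(tp=52, fp=48, n=100, pt=pt)  # treat-all, 52% prevalence
delta = nb_model - nb_treatall                          # ~0.16 by construction
```

The "treat-all" strategy is the usual comparator: its net benefit at a 50% threshold is simply prevalence minus (1 - prevalence), so any useful model must beat that baseline.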
To determine whether age, gender and marital status are associated with prognosis for adults with depression who sought treatment in primary care.
Medline, Embase, PsycINFO and Cochrane Central were searched from inception to 1st December 2020 for randomised controlled trials (RCTs) of adults seeking treatment for depression from their general practitioners, that used the Revised Clinical Interview Schedule so that there was uniformity in the measurement of clinical prognostic factors, and that reported on age, gender and marital status. Individual participant data were gathered from all nine eligible RCTs (N = 4864). Two-stage random-effects meta-analyses were conducted to ascertain the independent association between: (i) age, (ii) gender and (iii) marital status, and depressive symptoms at 3–4, 6–8 and 9–12 months post-baseline and remission at 3–4 months. Risk of bias was evaluated using QUIPS and quality was assessed using GRADE. PROSPERO registration: CRD42019129512. Pre-registered protocol https://osf.io/e5zup/.
There was no evidence of an association between age and prognosis before or after adjusting for depressive ‘disorder characteristics’ that are associated with prognosis (symptom severity, durations of depression and anxiety, comorbid panic disorder and a history of antidepressant treatment). Difference in mean depressive symptom score at 3–4 months post-baseline per 5-year increase in age = 0 (95% CI: −0.02 to 0.02). There was no evidence for a difference in prognoses for men and women at 3–4 months or 9–12 months post-baseline, but men had worse prognoses at 6–8 months (percentage difference in depressive symptoms for men compared to women: 15.08% (95% CI: 4.82 to 26.35)). However, this was largely driven by a single study that contributed data at 6–8 months and not the other time points. Further, there was little evidence for an association after adjusting for depressive ‘disorder characteristics’ and employment status (12.23% (−1.69 to 28.12)). Participants who were either single (percentage difference in depressive symptoms for single participants: 9.25% (95% CI: 2.78 to 16.13)) or no longer married (8.02% (95% CI: 1.31 to 15.18)) had worse prognoses than those who were married, even after adjusting for depressive ‘disorder characteristics’ and all available confounders.
Clinicians and researchers will continue to routinely record age and gender, but despite their importance for incidence and prevalence of depression, they appear to offer little information regarding prognosis. Patients that are single or no longer married may be expected to have slightly worse prognoses than those that are married. Ensuring this is recorded routinely alongside depressive ‘disorder characteristics’ in clinic may be important.
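The two-stage approach this abstract describes first estimates the association within each RCT and then pools the per-study estimates in a random-effects meta-analysis. A minimal sketch of the pooling stage (DerSimonian-Laird) with toy numbers, not the review's data:

```python
# Minimal sketch (toy estimates/variances, not the review's data) of the
# second stage of a two-stage IPD meta-analysis: DerSimonian-Laird
# random-effects pooling of per-study effect estimates.
import math

def pool_random_effects(estimates, variances):
    w = [1 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se

# Three hypothetical per-study effect estimates with their variances
pooled, se = pool_random_effects([0.10, 0.05, 0.12], [0.004, 0.006, 0.005])
```

With these toy inputs the heterogeneity statistic Q falls below its degrees of freedom, so tau-squared is truncated to zero and the random-effects estimate collapses to the fixed-effect one; with more heterogeneous studies the weights would flatten and the pooled standard error would grow.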
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Frailty prevalence is higher in low- and middle-income countries (LMICs) compared with high-income countries when measured by biomedical frailty models, the most widely used being the frailty phenotype. Frailty in older people is becoming of global public health interest as a means of promoting health in old age in LMICs. As yet, little work has been done to establish to what extent the concept of frailty, as conceived according to ‘western’ biomedicine, has cross-cultural resonance for a low-income rural African setting. This study aimed to investigate the meaning of frailty contextually, using the biomedical concept of the frailty phenotype as a framework. Qualitative interviews were conducted with a purposive sample of older adults, their care-givers and community representatives in rural northern Tanzania. Thirty interview transcripts were transcribed, translated from Kiswahili to English and thematically analysed. Results reveal that despite superficial similarities in the understanding of frailty, to a great extent the physical changes highlighted by the frailty phenotype were naturalised, except when these were felt to be due to a scarcity of resources. Frailty was conceptualised as less of a physical problem of the individual, but rather, as a social problem of the community, suggesting that the frailty construct may be usefully applied cross-culturally when taking a social equity focus to the health of older people in LMICs.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for RCTs that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses, in primary care depression RCTs (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
HIV-associated neurocognitive disorders (HANDs) are prevalent in older people living with HIV (PLWH) worldwide. HAND prevalence and incidence studies of the newly emergent population of combination antiretroviral therapy (cART)-treated older PLWH in sub-Saharan Africa are currently lacking. We aimed to estimate HAND prevalence and incidence using robust measures in stable, cART-treated older adults under long-term follow-up in Tanzania and report cognitive comorbidities.
A systematic sample of consenting HIV-positive adults aged ≥50 years attending routine clinical care at an HIV Care and Treatment Centre was recruited during March–May 2016 and followed up during March–May 2017.
HAND was identified by consensus panel using Frascati criteria, based on a detailed locally normed low-literacy neuropsychological battery, structured neuropsychiatric clinical assessment and collateral history. Demographic and etiological factors were assessed by self-report and clinical records.
In this cohort (n = 253, 72.3% female, median age 57), HAND prevalence was 47.0% (95% CI 40.9–53.2, n = 119) despite well-managed HIV disease (median CD4 count 516 (98–1719), 95.5% on cART). Of these, 64 (25.3%) had asymptomatic neurocognitive impairment, 46 (18.2%) mild neurocognitive disorder and 9 (3.6%) HIV-associated dementia. One-year incidence was high (37.2%, 95% CI 25.9–51.8), but some reversibility (17.6%, 95% CI 10.0–28.6, n = 16) was observed.
HANDs appear highly prevalent in older PLWH in this setting, where the demographic profile differs markedly from that of high-income cohorts and comorbidities are frequent. Incidence and reversibility also appear high. Future studies should focus on etiologies and potentially reversible factors in this setting.
Debate about the nature of climate and the magnitude of ecological change across Australia during the last glacial maximum (LGM; 26.5–19 ka) persists despite considerable research into the late Pleistocene. This is partly due to a lack of detailed paleoenvironmental records and reliable chronological frameworks. Geochemical and geochronological analyses of a 60 ka sedimentary record from Brown Lake, subtropical Queensland, are presented and considered in the context of climate-controlled environmental change. Optically stimulated luminescence dating of dune crests adjacent to prominent wetlands across North Stradbroke Island (Minjerribah) returned a mean age of 119.9 ± 10.6 ka; indicating relative dune stability soon after formation in Marine Isotope Stage 5. Synthesis of wetland sediment geochemistry across the island was used to identify dust accumulation and applied as an aridification proxy over the last glacial-interglacial cycle. A positive trend of dust deposition from ca. 50 ka was found with highest influx occurring leading into the LGM. Complexities of comparing sedimentary records and the need for robust age models are highlighted with local variation influencing the accumulation of exogenic material. An inter-site comparison suggests enhanced moisture stress regionally during the last glaciation and throughout the LGM, returning to a more positive moisture balance ca. 8 ka.
During the past decade, genetics research has allowed scientists and clinicians to explore the human genome in detail and reveal many thousands of common genetic variants associated with disease. Genetic risk scores, known as polygenic risk scores (PRSs), aggregate risk information from the most important genetic variants into a single score that describes an individual’s genetic predisposition to a given disease. This article reviews recent developments in the predictive utility of PRSs in relation to a person’s susceptibility to breast cancer and coronary artery disease. Prognostic models for these disorders are built using data from the UK Biobank, controlling for typical clinical and underwriting risk factors. Furthermore, we explore the possibility of adverse selection where genetic information about multifactorial disorders is available for insurance purchasers but not for underwriters. We demonstrate that prediction of multifactorial diseases, using PRSs, provides population risk information additional to that captured by normal underwriting risk factors. This research using the UK Biobank is in the public interest as it contributes to our understanding of predicting risk of disease in the population. Further research is imperative to understand how PRSs could cause adverse selection if consumers use this information to alter their insurance purchasing behaviour.
We have previously shown that higher intake of cruciferous vegetables is inversely associated with carotid artery intima-media thickness. To further test the hypothesis that an increased consumption of cruciferous vegetables is associated with reduced indicators of structural vascular disease in other areas of the vascular tree, we aimed to investigate the cross-sectional association between cruciferous vegetable intake and extensive calcification in the abdominal aorta. Dietary intake was assessed, using a FFQ, in 684 older women from the Calcium Intake Fracture Outcome Study. Cruciferous vegetables included cabbage, Brussels sprouts, cauliflower and broccoli. Abdominal aortic calcification (AAC) was scored using the Kauppila AAC24 scale on dual-energy X-ray absorptiometry lateral spine images and was categorised as ‘not extensive’ (0–5) or ‘extensive’ (≥6). Mean age was 74·9 (sd 2·6) years, median cruciferous vegetable intake was 28·2 (interquartile range 15·0–44·7) g/d and 128/684 (18·7 %) women had extensive AAC scores. Those with higher intakes of cruciferous vegetables (>44·6 g/d) had 46 % lower odds of extensive AAC than those with lower intakes (<15·0 g/d), after adjustment for lifestyle, dietary and CVD risk factors (OR Q4 v. Q1 0·54, 95 % CI 0·30, 0·97, P = 0·036). Total vegetable intake and each of the other vegetable types were not related to extensive AAC (P > 0·05 for all). This study strengthens the hypothesis that higher intake of cruciferous vegetables may protect against vascular calcification.