Reported childhood adversity (CA) is associated with development of depression in adulthood and predicts a more severe course of illness. Although elevated serotonin 1A receptor (5-HT1AR) binding potential, especially in the raphe nuclei, has been shown to be a trait associated with major depression, we did not replicate this finding in an independent sample using the partial agonist positron emission tomography tracer [11C]CUMI-101. Evidence suggests that CA can induce long-lasting changes in expression of 5-HT1AR, and thus, a history of CA may explain the disparate findings.
Methods
Following up on our initial report, 28 unmedicated participants in a current depressive episode (bipolar n = 16, unipolar n = 12) and 19 non-depressed healthy volunteers (HVs) underwent [11C]CUMI-101 imaging to quantify 5-HT1AR binding potential. Participants in a depressive episode were stratified into mild/moderate and severe CA groups via the Childhood Trauma Questionnaire. We hypothesized higher hippocampal and raphe nuclei 5-HT1AR binding potential in the severe CA group compared with the mild/moderate CA group and HVs.
Results
There was a group-by-region effect (p = 0.011) when considering HV, depressive episode mild/moderate CA, and depressive episode severe CA groups, driven by significantly higher hippocampal 5-HT1AR binding potential in participants in a depressive episode with severe CA relative to HVs (p = 0.019). Contrary to our hypothesis, no significant binding potential differences were detected in the raphe nuclei (p-values > 0.05).
Conclusions
With replication in larger samples, elevated hippocampal 5-HT1AR binding potential may serve as a promising biomarker through which to investigate the neurobiological link between CA and depression.
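The group-by-region effect reported above can be made concrete with a minimal sketch of the kind of mixed-effects analysis involved. This is an illustration only, not the authors' exact model; the column names (subject, group, region, bpnd) and the input file are hypothetical.

```python
# Sketch: test a group-by-region interaction on 5-HT1AR binding potential
# (BPND) with a subject-level random intercept. Illustrative only; column
# names and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("binding_potential_long.csv")  # one row per subject x region

# Fixed effects: group (HV, mild/moderate CA, severe CA), region, and their
# interaction; a random intercept per subject absorbs within-person correlation.
model = smf.mixedlm("bpnd ~ C(group) * C(region)", data=df, groups=df["subject"])
result = model.fit(reml=False)
print(result.summary())
# A significant C(group):C(region) term corresponds to the group-by-region
# effect; follow-up contrasts (e.g. severe CA vs HV in hippocampus) would
# then probe which regions drive it.
```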
Structural compromises are one of the important underpinnings of the developmental origins of health and disease. Quantifying anatomic changes during development is difficult, but improved technology for clinical imaging has brought new research opportunities for visualizing such alterations. During prenatal life, maternal malnutrition, toxic social stress and exposure to toxic chemicals change fetal organ structures in specific ways. High placental resistance suppresses cardiomyocyte endowment. New imaging techniques allow quantification of nephrons in cadaveric kidneys without tedious dissection. High-fat diets can lead to fatty liver and fibrosis. Pancreatic islet numbers and function are compromised by poor maternal diets. Both social and nutritional stressors change the wiring and cellular composition of the brain for life. Advances in optical imaging also offer exciting new technologies for viewing structure and function in cells stressed during development.
Little is known about environmental factors that may influence associations between genetic liability to suicidality and suicidal behavior.
Methods
This study examined whether a suicidality polygenic risk score (PRS) derived from a large genome-wide association study (N = 122,935) was associated with suicide attempts in a population-based sample of European-American US military veterans (N = 1664; 92.5% male), and whether cumulative lifetime trauma exposure moderated this association.
Results
Eighty-five veterans (weighted 6.3%) reported a history of suicide attempt. After adjusting for sociodemographic and psychiatric characteristics, suicidality PRS was associated with lifetime suicide attempt (odds ratio 2.65; 95% CI 1.37–5.11). A significant suicidality PRS-by-trauma exposure interaction emerged, such that veterans with higher suicidality PRS and greater trauma burden had the highest probability of lifetime suicide attempt (16.6%), whereas the probability of attempts was substantially lower among those with high suicidality PRS and low trauma exposure (1.4%). The PRS-by-trauma interaction effect was enriched for genes implicated in cellular and developmental processes and nervous system development, with variants annotated to the DAB2 and SPNS2 genes, which are implicated in inflammatory processes. Drug repurposing analyses revealed upregulation of suicide gene-sets in the context of medrysone, a drug targeting chronic inflammation, and clofibrate, a triglyceride-lowering agent.
Conclusion
Results suggest that genetic liability to suicidality is associated with increased risk of suicide attempt among veterans, particularly in the presence of high levels of cumulative trauma exposure. Additional research is warranted to investigate whether incorporation of genomic information may improve suicide prediction models.
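As an illustration of the moderation analysis described above, the sketch below fits a logistic regression with a PRS-by-trauma interaction term. Variable names (attempt, prs, trauma, age, sex) and the input file are hypothetical placeholders; the published model adjusted for a fuller set of sociodemographic and psychiatric covariates.

```python
# Sketch: PRS-by-trauma moderation of lifetime suicide attempt via logistic
# regression. Illustrative only; variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("veterans.csv")  # hypothetical analytic file

# Standardize continuous predictors so main effects and the interaction are
# interpretable on comparable scales.
for col in ("prs", "trauma"):
    df[col] = (df[col] - df[col].mean()) / df[col].std()

fit = smf.logit("attempt ~ prs * trauma + age + C(sex)", data=df).fit()
print(fit.summary())
print(np.exp(fit.params))  # odds ratios; prs:trauma is the moderation effect
```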
Hercules Dome, Antarctica, has long been identified as a prospective deep ice core site due to its undisturbed internal layering, climatic setting and potential to yield proxy records from the Last Interglacial (LIG) period, when the West Antarctic ice sheet may have collapsed. We performed a geophysical survey using multiple ice-penetrating radar systems to identify potential locations for a deep ice core at Hercules Dome. The surface topography, as revealed by recent satellite observations, is more complex than previously recognized. The most prominent dome, which we term ‘West Dome’, is the most promising region for a deep ice core for the following reasons: (1) bed-conformal radar reflections indicate minimal layer disturbance and extend to within tens of meters of the ice bottom; (2) the bed is likely frozen, as evidenced by both the shape of the measured vertical ice velocity profiles beneath the divide and modeled ice temperature using three remotely sensed estimates of geothermal flux; and (3) models of layer thinning place 132 ka old ice at 45–90 m above the bed with an annual layer thickness of ~1 mm, satisfying the resolution and preservation needed for detailed analysis of the LIG period.
Neurocognitive testing may advance the goal of predicting near-term suicide risk. The current study examined whether performance on a Go/No-go (GNG) task, and computational modeling to extract latent cognitive variables, could enhance prediction of suicide attempts within the next 90 days among individuals at high risk for suicide.
Method
136 veterans at high risk for suicide previously completed a computer-based GNG task requiring rapid responding (Go) to target stimuli while withholding responses (No-go) to infrequent foil stimuli; behavioral variables included false alarms to foils (failure to inhibit) and missed responses to targets. We conducted a secondary analysis of these data, with outcomes defined as actual suicide attempt (ASA), other suicide-related event (OtherSE) such as an interrupted/aborted attempt or preparatory behavior, or neither (noSE), within 90 days after GNG testing, to examine whether GNG variables could improve ASA prediction over standard clinical variables. A computational model (linear ballistic accumulator, LBA) was also applied to elucidate cognitive mechanisms underlying group differences.
Results
On GNG, increased miss rate selectively predicted ASA, while increased false alarm rate predicted OtherSE (without ASA) within the 90-day follow-up window. In LBA modeling, ASA (but not OtherSE) was associated with decreases in decisional efficiency to targets, suggesting differences in the evidence accumulation process were specifically associated with upcoming ASA.
Conclusions
These findings suggest that GNG may improve prediction of near-term suicide risk, with distinct behavioral patterns in those who will attempt suicide within the next 90 days. Computational modeling suggests qualitative differences in cognition in individuals at near-term risk of suicide attempt.
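To illustrate the LBA approach mentioned in the Methods, the sketch below simulates choices and response times from a two-accumulator linear ballistic accumulator. All parameter values are arbitrary illustrations, not estimates from the study.

```python
# Minimal sketch of a linear ballistic accumulator (LBA): on each trial,
# independent accumulators race from a random start point to a threshold
# at a normally distributed drift rate; the fastest one determines the
# response. Parameter values are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(0)

def simulate_lba(v_means, n_trials=1000, A=0.5, b=1.0, s=0.3, t0=0.2):
    """Simulate choices/RTs from independent LBA accumulators.

    v_means: mean drift rate per response accumulator (e.g. [go, no-go]).
    A: max start point; b: threshold; s: drift SD; t0: non-decision time.
    """
    n_acc = len(v_means)
    starts = rng.uniform(0, A, size=(n_trials, n_acc))
    drifts = rng.normal(v_means, s, size=(n_trials, n_acc))
    drifts = np.clip(drifts, 1e-6, None)   # keep drift rates positive
    times = (b - starts) / drifts          # time to threshold per accumulator
    choices = times.argmin(axis=1)         # fastest accumulator wins
    rts = times.min(axis=1) + t0
    return choices, rts

# Lowering the "go" drift rate (decisional efficiency to targets) produces
# more miss-like outcomes, mirroring the pattern reported for ASA.
choices, rts = simulate_lba([1.5, 0.8])
print("p(go response):", (choices == 0).mean(), "mean RT:", round(rts.mean(), 3))
```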
This chapter is reflective in nature and aims to demonstrate the dynamic, non-linear and complex trajectories involved in the journey towards long-term recovery from substance misuse. The authors first discuss some of the challenges in identifying and analysing long-term recovery. They then present two real-life examples of people with lived experience of long-term recovery and offer clinical reflections on these. The chapter ends with a discussion of these case examples and their limitations, offering recommendations for future research and a way forward that is more sensitive to the needs of people in long-term recovery.
Context
Substance misuse continues to pose a risk to public health and the safety of people living in the United Kingdom (UK). The 2019 European Drug Report indicated that the number of people entering treatment for opiate and crack cocaine use in the UK has recently increased (EMCDDA, 2019). In 2019, the UK had the highest overdose mortality rate in Europe (National Records of Scotland, 2019). Although this information does not provide a complete picture of the drug situation, it could indicate that drug treatment providers are struggling to keep up with the British government's target of ‘increas[ing] the rate of individuals recovering from their dependence’ (H.M. Government 2017, p 6). This shows that there remains a need for the scientific community to focus on recovery from substance misuse and its various trajectories and manifestations.
Traditionally, recovery has been analysed and understood by addiction science as either the non-use of substances or the absence of symptoms relating to addictive behaviours; in other words, as a quantifiable treatment outcome (Ashford et al, 2018). However, Brown and Ashford (2019, p 2) argue that ‘the absence of pathology reveals little information about the initialisation and sustainment of recovery’. In other words, the traditional way of studying recovery from substance misuse predominantly involves studying the causes and effects of the substance-use disease (that is, its pathology), which reduces recovery to the mere absence of substance-misuse symptoms (Brown and Ashford, 2019). When, for example, an individual with co-occurring substance misuse and mental health issues stops using substances, they may still suffer from mental health issues, which makes it hard to consider them recovered.
Paramedics received training in point-of-care ultrasound (POCUS) to assess cardiac contractility during management of medical out-of-hospital cardiac arrest (OHCA). The primary outcome was the percentage of adequate POCUS video acquisition and accurate video interpretation during OHCA resuscitations. Secondary outcomes included the impact of POCUS on patient management and resuscitation protocol adherence.
Methods:
A prospective, observational cohort study of paramedics was performed following a four-hour training session, which included a didactic lecture and hands-on POCUS instruction. The Prehospital Echocardiogram in Cardiac Arrest (PECA) protocol was developed and integrated into the resuscitation algorithm for medical non-shockable OHCA. The ultrasound (US) images were reviewed by a single POCUS expert investigator to determine the adequacy of the POCUS video acquisition and accuracy of the video interpretation. Change in patient management and resuscitation protocol adherence data, including end-tidal carbon dioxide (EtCO2) monitoring following advanced airway placement, adrenaline administration, and compression pauses under ten seconds, were queried from the prehospital electronic health record (EHR).
Results:
Captured images were deemed adequate in 42/49 (85.7%) scans, and paramedic interpretation of sonography was accurate in 43/49 (87.8%) scans. The POCUS results altered patient management in 14/49 (28.6%) cases. Paramedics adhered to EtCO2 monitoring in 36/36 (100.0%) patients with an advanced airway, adrenaline administration in 38/38 (100.0%) patients, and compression pauses under ten seconds in 36/38 (94.7%) patients.
Conclusion:
Paramedics were able to accurately obtain and interpret cardiac POCUS videos during medical OHCA while adhering to a resuscitation protocol. These findings suggest that POCUS can be effectively integrated into paramedic protocols for medical OHCA.
A multi-disciplinary expert group met to discuss vitamin D deficiency in the UK and strategies for improving population intakes and status. Changes to UK Government advice since the 1st Rank Forum on Vitamin D (2009) were discussed, including the rationale for setting a reference nutrient intake (10 µg/d; 400 IU/d) for adults and children (4+ years). Current UK data show inadequate intakes among all age groups and high prevalence of low vitamin D status among specific groups (e.g. pregnant women and adolescent males/females). Evidence of widespread deficiency within some minority ethnic groups, resulting in nutritional rickets (particularly among Black and South Asian infants), raised particular concern. Latest data indicate that UK population vitamin D intakes and status remain relatively unchanged since Government recommendations changed in 2016. Vitamin D food fortification was discussed as a potential strategy to increase population intakes. Data from dose–response and dietary modelling studies indicate dairy products, bread, hens’ eggs and some meats as potential fortification vehicles. Vitamin D3 appears more effective than vitamin D2 for raising serum 25-hydroxyvitamin D concentration, which has implications for choice of fortificant. Other considerations for successful fortification strategies include: (i) the need for ‘real-world’ cost information for use in modelling work; (ii) supportive food legislation; (iii) improved consumer and health professional understanding of vitamin D’s importance; (iv) clinical consequences of inadequate vitamin D status and (v) consistent communication of Government advice across health/social care professions, and via the food industry. These areas urgently require further research to enable universal improvement in vitamin D intakes and status in the UK population.
The coronavirus disease-2019 (COVID-19) pandemic has caused myriad health, social, and economic stressors. To date, however, no known study has examined changes in mental health during the pandemic in the U.S. military veteran population.
Methods
Data were analyzed from the 2019–2020 National Health and Resilience in Veterans Study, a nationally representative, prospective cohort survey of 3078 veterans. Pre-to-peri-pandemic changes in psychiatric symptoms were evaluated, as well as pre-pandemic risk and protective factors and pandemic-related correlates of increased psychiatric distress.
Results
The prevalence of generalized anxiety disorder (GAD) positive screens increased from pre- to peri-pandemic (7.1% to 9.4%; p < 0.001) and was driven by an increase among veterans aged 45–64 years (8.2% to 13.5%; p < 0.001), but the prevalence of major depressive disorder and posttraumatic stress disorder positive screens remained stable. Using a continuous measure of psychiatric distress, an estimated 13.2% of veterans reported a clinically meaningful pre-to-peri-pandemic increase in distress (mean increase = 1.1 standard deviations). Veterans with a larger pre-pandemic social network size and a secure attachment style were less likely to experience increased distress, whereas veterans reporting more pre-pandemic loneliness were more likely to experience increased distress. Concerns about pandemic-related social losses, the mental health effects of COVID-19, and housing stability during the pandemic were associated with increased distress, over and above pre-pandemic factors.
Conclusions
Although most U.S. veterans showed resilience to mental health problems nearly 1 year into the pandemic, the prevalence of GAD positive screens increased, particularly among middle-aged veterans, and one in seven veterans experienced increased distress. Clinical implications of these findings are discussed.
Antisaccade tasks can be used to index cognitive control processes, e.g. attention, behavioral inhibition, working memory, and goal maintenance, in people with brain disorders. Though diagnoses of schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BDP) are typically considered to be distinct entities, previous work shows patterns of cognitive deficits that differ in degree, rather than in kind, across these syndromes.
Methods
Large samples of individuals with psychotic disorders were recruited through the Bipolar-Schizophrenia Network on Intermediate Phenotypes 2 (B-SNIP2) study. Anti- and pro-saccade task performances were evaluated in 189 people with SZ, 185 people with SAD, 96 people with BDP, and 279 healthy comparison participants. Logistic functions were fitted to each group's antisaccade speed-performance tradeoff patterns.
Results
Psychosis groups had higher antisaccade error rates than the healthy group, with SZ and SAD participants committing 2 times as many errors, and BDP participants committing 1.5 times as many errors. Latencies on correctly performed antisaccade trials in SZ and SAD were longer than in healthy participants, although error trial latencies were preserved. Parameters of speed-performance tradeoff functions indicated that compared to the healthy group, SZ and SAD groups had optimal performance characterized by more errors, as well as less benefit from prolonged response latencies. Prosaccade metrics did not differ between groups.
Conclusions
With basic prosaccade mechanisms intact, the higher speed-performance tradeoff cost for antisaccade performance in psychosis cases indicates a deficit that is specific to the higher-order cognitive aspects of saccade generation.
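The speed-performance tradeoff functions described above can be sketched as a logistic curve relating antisaccade accuracy to response latency. The example below fits such a curve to synthetic data; it is illustrative only and does not reproduce the B-SNIP2 fitting procedure.

```python
# Sketch: fit a logistic speed-performance tradeoff function where accuracy
# rises with response latency toward an asymptote. Synthetic data only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, asymptote, slope, midpoint):
    """Probability correct as a logistic function of latency t (ms)."""
    return asymptote / (1.0 + np.exp(-slope * (t - midpoint)))

# Synthetic latency bins (ms) and per-bin accuracy with small noise.
latency = np.linspace(150, 600, 20)
accuracy = logistic(latency, 0.9, 0.02, 300) \
    + np.random.default_rng(1).normal(0, 0.03, 20)

params, _ = curve_fit(logistic, latency, accuracy, p0=[0.9, 0.01, 300])
print("asymptote, slope, midpoint:", np.round(params, 3))
# A lower fitted asymptote = more errors even at optimal (slow) performance;
# a shallower slope = less benefit from prolonged response latencies.
```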
During the Randomized Assessment of Rapid Endovascular Treatment (EVT) of Ischemic Stroke (ESCAPE) trial, patient-level micro-costing data were collected. We report a cost-effectiveness analysis of EVT, using ESCAPE trial data and Markov simulation, conducted within a universal, single-payer system from a societal perspective over a patient’s lifetime.
Methods:
Primary data collection alongside the ESCAPE trial provided a 3-month, trial-specific, non-model-based cost per quality-adjusted life year (QALY). A Markov model utilizing ongoing lifetime costs and life expectancy from the literature was built to simulate the cost per QALY over a lifetime horizon. Health states were defined using modified Rankin Scale (mRS) scores. Uncertainty was explored using scenario analysis and probabilistic sensitivity analysis.
Results:
The 3-month trial-based analysis resulted in a cost per QALY of $201,243 for EVT compared to the best standard of care. In the model-based analysis, using a societal perspective and a lifetime horizon, EVT dominated the standard of care: EVT was both more effective and less costly than the standard of care (−$91). When the time horizon was shortened to 1 year, EVT remained cost saving compared to the standard of care (∼$15,376 per QALY gained with EVT). However, if the estimate of clinical effectiveness is 4% lower than that demonstrated in ESCAPE, EVT is no longer cost saving compared to the standard of care.
Conclusions:
Results support the adoption of EVT as a treatment option for acute ischemic stroke, as the increase in costs associated with caring for EVT patients was recouped within the first year of stroke, and continued to provide cost savings over a patient’s lifetime.
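The Markov-model logic behind the lifetime cost-per-QALY results can be sketched as follows. All transition probabilities, costs, utilities, starting distributions, and the upfront procedure cost are placeholders chosen for illustration, not the published model inputs.

```python
# Sketch of a Markov cohort model: mRS-based health states, annual
# transitions, discounted costs and QALYs. All numbers are placeholders.
import numpy as np

states = ["mRS 0-2", "mRS 3-5", "dead"]
P = np.array([[0.92, 0.05, 0.03],   # annual transition probabilities
              [0.02, 0.88, 0.10],
              [0.00, 0.00, 1.00]])
annual_cost = np.array([2_000.0, 30_000.0, 0.0])  # per state, per year
utility = np.array([0.80, 0.35, 0.0])             # QALY weight per state
discount = 0.03

def run_cohort(start, years=30):
    """Propagate a cohort distribution and accumulate discounted cost/QALYs."""
    dist, cost, qaly = np.asarray(start, float), 0.0, 0.0
    for year in range(years):
        d = (1 + discount) ** -year
        cost += d * dist @ annual_cost
        qaly += d * dist @ utility
        dist = dist @ P
    return cost, qaly

# EVT shifts more of the cohort into the independent (mRS 0-2) state.
cost_evt, qaly_evt = run_cohort([0.53, 0.37, 0.10])
cost_evt += 20_000.0  # placeholder upfront cost of the EVT procedure
cost_soc, qaly_soc = run_cohort([0.29, 0.52, 0.19])

d_cost, d_qaly = cost_evt - cost_soc, qaly_evt - qaly_soc
print(f"incremental cost: ${d_cost:,.0f}; incremental QALYs: {d_qaly:.2f}")
# Negative incremental cost with positive incremental QALYs = EVT dominates.
```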
Selenium is an essential micronutrient with biochemical and cellular effects mediated through the activities of 25 selenocysteine-containing selenoproteins. Selenoproteins are anti-inflammatory and have antioxidant properties. Severe selenium deficiency causes muscle weakness and atrophy in humans; however, the effects of moderate selenium deficiency are unclear. The aims of this study were twofold: 1) to determine dietary selenium intakes and contributing food sources in very old adults; and 2) to determine whether dietary selenium intakes are associated with 5-year trajectories of muscle function: hand-grip strength (HGS) and Timed-Up-and-Go (TUG).
In cross-sectional (baseline) and prospective (1.5-, 3- and 5-year follow-up) analyses, 845 participants aged 85 years from the Newcastle 85+ Study were assessed for HGS and TUG performance using standardized protocols (Antoneta et al. 2016). Baseline dietary intakes were assessed using 24-hour multiple-pass recall methods on two separate days (Mendonça et al. 2016). The top selenium food contributors (~90% of intake) were identified, and the adequacy of intakes was determined, i.e. the proportions with intakes below the lower reference nutrient intake (LRNI), between the LRNI and the reference nutrient intake (RNI), and above the RNI. Linear mixed models explored the associations of selenium intake category and time with the prospective 5-year change in HGS and TUG in all participants, and in males and females separately.
Median selenium intakes were 39, 48 and 35 μg for all participants, males and females, respectively. Selenium intakes were below the LRNI in 51% of participants (median 27 μg), whilst 15% had intakes at or above the RNI (median 85 μg). Only 13.3% of females and 16.9% of males met the RNI. The top selenium contributors were cereals (46%), meat (22%), fish (10%), milk (6%), eggs (4%) and potatoes (3%), together making up 91% of selenium intake. Those with the lowest intakes had 2.72 kg lower HGS and 2.36 s slower TUG times compared with those with higher intakes (P < 0.005). There was no association between selenium intake and the 5-year change in HGS or TUG, but time had a significant effect on the rate of change in both parameters (P < 0.001).
Overall, these results show that poor dietary selenium intakes are common in very old adults and that cereals and cereal products are major sources of selenium in this population. Whilst low selenium intakes were associated with worse HGS and TUG performance in the cross-sectional analysis, no significant associations were observed in the prospective analyses.
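For concreteness, below is a minimal sketch of the kind of linear mixed model described in the Methods, with selenium intake category interacting with time. The column names and input file are hypothetical, not the study's actual data structure.

```python
# Sketch: linear mixed model of hand-grip strength (HGS) trajectories by
# selenium intake category (below LRNI / LRNI-RNI / above RNI).
# Illustrative only; column names and the input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("newcastle85_long.csv")
# Expected columns: id, years (0, 1.5, 3, 5), hgs, selenium_cat, sex

model = smf.mixedlm("hgs ~ C(selenium_cat) * years + C(sex)",
                    data=df, groups=df["id"])
print(model.fit(reml=False).summary())
# The C(selenium_cat):years terms test whether the 5-year rate of change in
# grip strength differs by selenium intake category; an analogous model
# would be fitted for TUG.
```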
John Stuart Mill is remembered today primarily for his views on freedom both in its negative form (privacy) and in its affirmative form (self-expression). But there is a third important aspect of his thought for which he is, unjustly, seldom remembered at all. Mill was a true egalitarian – and in the modern sense of that word “equality.” It was Mill who began to harmonize liberty and equality in a way that led to the modern progressive synthesis of the two values. It was Mill, moreover, who was an impassioned critic of slavery, racism and gender discrimination – at a time when these views were neither popular (even among the intelligentsia) nor considered part of the panoply of “liberal” positions. It was Mill who began to make equality an integral part of liberalism rather than a value in tension with liberty, as classical liberals had seen it.
When the Supreme Court overruled Lochner and its progeny, it did not – for long, at least – discontinue using the Due Process clause and other constitutional provisions to protect un-enumerated constitutional rights. Within a few years it began to work out the contours of a new conception of personal freedom, one with very different social and political ramifications than the older, classical liberal ideal exemplified by Lochner. Central to the more recent idea of liberty is the notion of expressive self-individuation. When expressed in constitutional terms, this ideal has three dimensions: First, there must be an individual realm which is, as much as possible, free from governmental and social influences – a realm of privacy in which the individual can “become himself” and live the kind of life he or she wishes. Second, the individual must be free to express his ideas, opinions, thoughts and – in the deepest sense – himself in the non-private sphere.
If Mill is insufficiently credited for the influence he has had on the right to privacy, he is generally acknowledged as one of the premier architects of our modern understanding of freedom of speech. Chapter 2 of On Liberty, entitled “Of the Liberty of Thought and Discussion,” is widely acknowledged as the single most influential defense of a capacious theory of freedom of speech and press anywhere in modern liberal political thought. No one has influenced the development of modern First Amendment freedom of speech as deeply or as broadly as has Mill.
This chapter is intended as an introduction to constitutional history for those readers who are not lawyers, though it may also serve as a review for those who are. It seeks to explain the evolution of our current understanding of constitutional rights and to place our constitutional tradition in the broader framework of the liberal tradition.
The cemetery Saint Veran in Avignon, France is a twenty-minute walk outside the walls of the old city, a short distance from the palace of the fourteenth-century popes and the river Rhone. Toward the back of the cemetery, inauspiciously nestled among the markers and mausoleums, stands a sepulchre of flawless white Carrara marble – the only one in sight without a trace of religious symbolism. It was here that John Stuart Mill buried his wife of seven years, Harriet Taylor Mill, after she succumbed to what Mill called “the family disease” – tuberculosis – in November 1858. There is an old legend that the cottage Mill purchased after her death overlooked the cemetery and that Mill could look upon Harriet’s grave from his window. The legend is, as the cemetery caretaker described to me, “finely formed, but not fully true.” The cottage was actually about a ten-minute walk from the cemetery.