This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
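As a concrete illustration of this mediation logic, the following is a minimal product-of-coefficients sketch in Python on simulated data; the variable names (passenger, peritraumatic, dep_8wk) are hypothetical stand-ins, and the actual AURORA analyses use considerably richer multi-mediator models.

```python
# Minimal product-of-coefficients mediation sketch (hypothetical variable
# names and simulated data; not the AURORA model specification).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"passenger": rng.integers(0, 2, n)})
df["peritraumatic"] = 0.4 * df["passenger"] + rng.normal(size=n)
df["dep_8wk"] = 0.3 * df["peritraumatic"] + 0.2 * df["passenger"] + rng.normal(size=n)

# Path a: predictor -> mediator (peritraumatic symptoms)
a = smf.ols("peritraumatic ~ passenger", df).fit().params["passenger"]
# Paths b and c': mediator and predictor -> 8-week depression
out = smf.ols("dep_8wk ~ peritraumatic + passenger", df).fit()
b, c_prime = out.params["peritraumatic"], out.params["passenger"]

print(f"indirect (a*b) = {a*b:.3f}, direct (c') = {c_prime:.3f}")
```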
Results
Eight-week depression prevalence was relatively high (27.8%) and was associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, for depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These processes remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of identifying new targets for intervention and new tools for risk-based stratification following trauma exposure.
Unit cohesion may protect service member mental health by mitigating effects of combat exposure; however, questions remain about the origins of potential stress-buffering effects. We examined buffering effects associated with two forms of unit cohesion (peer-oriented horizontal cohesion and subordinate-leader vertical cohesion) defined as either individual-level or aggregated unit-level variables.
Methods
Longitudinal survey data from US Army soldiers who deployed to Afghanistan in 2012 were analyzed using mixed-effects regression. Models evaluated individual- and unit-level interaction effects of combat exposure and cohesion during deployment on symptoms of post-traumatic stress disorder (PTSD), depression, and suicidal ideation reported at 3 months post-deployment (model n's = 6684 to 6826). Given the small effective sample size (k = 89), the significance of unit-level interactions was evaluated at a 90% confidence level.
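For illustration, a minimal sketch of an individual-level buffering (interaction) test with a random intercept for unit is given below, using simulated data and hypothetical variable names; the published models additionally include covariates and unit-level aggregated cohesion terms.

```python
# Sketch of an individual-level cohesion x combat-exposure buffering test with
# a random intercept for unit (simulated data, hypothetical variable names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_units, per_unit = 89, 75
unit = np.repeat(np.arange(n_units), per_unit)
combat = rng.normal(size=unit.size)
cohesion = rng.normal(size=unit.size)
ptsd = 0.5 * combat - 0.1 * combat * cohesion + rng.normal(size=unit.size)
df = pd.DataFrame({"unit": unit, "combat": combat, "cohesion": cohesion, "ptsd": ptsd})

# A negative combat:cohesion coefficient indicates a stress-buffering effect.
model = smf.mixedlm("ptsd ~ combat * cohesion", df, groups=df["unit"]).fit()
print(model.summary())
```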
Results
At the individual level, buffering effects of horizontal cohesion were found for PTSD symptoms [B = −0.11, 95% CI (−0.18 to −0.04), p < 0.01] and depressive symptoms [B = −0.06, 95% CI (−0.10 to −0.01), p < 0.05], while a buffering effect of vertical cohesion was observed for PTSD symptoms only [B = −0.03, 95% CI (−0.06 to −0.0001), p < 0.05]. At the unit level, buffering effects of horizontal (but not vertical) cohesion were observed for PTSD symptoms [B = −0.91, 90% CI (−1.70 to −0.11), p = 0.06], depressive symptoms [B = −0.83, 90% CI (−1.24 to −0.41), p < 0.01], and suicidal ideation [B = −0.32, 90% CI (−0.62 to −0.01), p = 0.08].
Conclusions
Policies and interventions that enhance horizontal cohesion may protect combat-exposed units against post-deployment mental health problems. Efforts to support individual soldiers who report low levels of horizontal or vertical cohesion may also yield mental health benefits.
Frascati international research criteria for HIV-associated neurocognitive disorders (HAND) are controversial; some investigators have argued that Frascati criteria are too liberal, resulting in a high false positive rate. Meyer et al. recommended more conservative revisions to HAND criteria, including exploration of other commonly used methods for defining neurocognitive impairment (NCI) in HIV, such as the global deficit score (GDS). This study compares NCI classifications by Frascati, Meyer, and GDS methods in relation to neuroimaging markers of brain integrity in HIV.
Method:
Two hundred forty-one people living with HIV (PLWH) without current substance use disorder or severe (confounding) comorbid conditions underwent comprehensive neurocognitive testing and brain structural magnetic resonance imaging and magnetic resonance spectroscopy. Participants were classified using Frascati versus Meyer criteria as concordant unimpaired [Frascati(Un)/Meyer(Un)], concordant impaired [Frascati(Imp)/Meyer(Imp)], or discordant [Frascati(Imp)/Meyer(Un)], i.e., impaired by Frascati criteria but unimpaired by Meyer criteria. To compare the GDS with the Meyer criteria, the same groupings were formed using GDS criteria in place of Frascati criteria.
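A small sketch of the concordant/discordant grouping logic, assuming boolean impairment flags for each criterion (column names are hypothetical):

```python
# Sketch of the concordant/discordant grouping (hypothetical column names).
import pandas as pd

def concordance_group(frascati_imp: bool, meyer_imp: bool) -> str:
    if not frascati_imp and not meyer_imp:
        return "concordant unimpaired"
    if frascati_imp and meyer_imp:
        return "concordant impaired"
    if frascati_imp and not meyer_imp:
        return "discordant"
    return "other"  # Meyer-impaired/Frascati-unimpaired (rare, since Meyer criteria are stricter)

df = pd.DataFrame({"frascati_imp": [False, True, True], "meyer_imp": [False, True, False]})
df["group"] = [concordance_group(f, m) for f, m in zip(df.frascati_imp, df.meyer_imp)]
print(df)
```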
Results:
When examining Frascati versus Meyer criteria, discordant Frascati(Imp)/Meyer(Un) individuals had less cortical gray matter, greater sulcal cerebrospinal fluid volume, and greater evidence of neuroinflammation (i.e., choline) than concordant Frascati(Un)/Meyer(Un) individuals. GDS versus Meyer comparisons indicated that discordant GDS(Imp)/Meyer(Un) individuals had less cortical gray matter and lower levels of energy metabolism (i.e., creatine) than concordant GDS(Un)/Meyer(Un) individuals. In both sets of analyses, the discordant group did not differ from the concordant impaired group on any neuroimaging measure.
Conclusions:
The Meyer criteria failed to capture a substantial portion of PLWH with brain abnormalities. These findings support continued use of Frascati or GDS criteria to detect HIV-associated CNS dysfunction.
Whereas genetic susceptibility increases the risk for major depressive disorder (MDD), non-genetic protective factors may mitigate this risk. In a large-scale prospective study of US Army soldiers, we examined whether trait resilience and/or unit cohesion could protect against the onset of MDD following combat deployment, even in soldiers at high polygenic risk.
Methods
Data were analyzed from 3079 soldiers of European ancestry assessed before and after their deployment to Afghanistan. Incident MDD was defined as no MDD episode at pre-deployment, followed by an MDD episode after deployment. Polygenic risk scores (PRS) were constructed from a large-scale genome-wide association study of major depression. We first examined the main effects of the MDD PRS and each protective factor on incident MDD. We then tested the effects of each protective factor on incident MDD across strata of polygenic risk.
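To make the stratified analysis concrete, here is a minimal Python sketch on simulated data that fits a logistic model of incident MDD on the protective factor within strata of polygenic risk; the variable names and the tertile split are illustrative assumptions, not the study's exact specification.

```python
# Sketch of testing a protective factor within polygenic-risk strata
# (simulated data, hypothetical variable names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3000
df = pd.DataFrame({
    "prs": rng.normal(size=n),            # standardized polygenic risk score
    "unit_cohesion": rng.normal(size=n),  # standardized protective factor
})
logit_p = -2.0 + 0.3 * df["prs"] - 0.3 * df["unit_cohesion"]
df["incident_mdd"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Stratify by PRS tertile and fit a logistic model within each stratum.
df["prs_tertile"] = pd.qcut(df["prs"], 3, labels=["low", "mid", "high"])
for tert, sub in df.groupby("prs_tertile", observed=True):
    fit = smf.logit("incident_mdd ~ unit_cohesion", sub).fit(disp=False)
    or_ = np.exp(fit.params["unit_cohesion"])
    print(f"{tert}: OR per SD of cohesion = {or_:.2f}")
```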
Results
Polygenic risk showed a dose–response relationship with depression, such that soldiers at high polygenic risk had the greatest odds of incident MDD. Both unit cohesion and trait resilience were prospectively associated with reduced risk for incident MDD. Notably, the protective effect of unit cohesion persisted even in soldiers at the highest polygenic risk.
Conclusions
Polygenic risk was associated with new-onset MDD in deployed soldiers. However, unit cohesion – an index of perceived support and morale – was protective against incident MDD even among those at highest genetic risk, and may represent a potent target for promoting resilience in vulnerable soldiers. Findings illustrate the value of combining genomic and environmental data in a prospective design to identify robust protective factors for mental health.
Objectives:
Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA.
Methods:
734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences in neurocognitive status and demographics. Within PLWH, neurocognitive status groups were compared on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status.
Results:
Neurocognitive status rates and demographic characteristics differed between PLWH (SA = 17%; CN = 38%; CI = 45%) and HIV-uninfected participants (SA = 35%; CN = 55%; CI = 11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, more frequent employment, and better health-related quality of life than non-SA participants.
Conclusions:
Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
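As an illustration of the SA/CN/CI labeling described in the Methods above, a minimal sketch follows; the T-score cut-off of 40 and the variable names are illustrative assumptions, not the study's actual normative procedure.

```python
# Sketch of the three-way neurocognitive labeling, assuming demographically
# corrected global T-scores against 25-year-old norms and against actual-age
# norms are available (cut-offs are illustrative, not the study's).
def classify(global_t_vs_age25: float, global_t_vs_own_age: float) -> str:
    if global_t_vs_age25 >= 40:       # within normal range for a 25-year-old
        return "SuperAger (SA)"
    if global_t_vs_own_age >= 40:     # normal only for one's actual age
        return "Cognitively Normal (CN)"
    return "Cognitively Impaired (CI)"

print(classify(45, 50))  # SA
print(classify(35, 45))  # CN
print(classify(32, 36))  # CI
```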
In a large and comprehensively assessed sample of patients with bipolar disorder type I (BDI), we investigated the prevalence of psychotic features and their relationship with life course, demographic, clinical, and cognitive characteristics. We hypothesized that groups of psychotic symptoms (Schneiderian, mood incongruent, thought disorder, delusions, and hallucinations) have distinct relations to risk factors.
Methods
In a cross-sectional study of 1342 BDI patients, comprehensive demographic and clinical characteristics were assessed using the Structured Clinical Interview for DSM-IV (SCID-I). In addition, levels of childhood maltreatment and intelligence quotient (IQ) were assessed. The relationships between these characteristics and psychotic symptoms were analyzed using multiple general linear models.
Results
A lifetime history of psychotic symptoms was present in 73.8% of BDI patients and included delusions in 68.9% of patients and hallucinations in 42.6%. Patients with psychotic symptoms had a significantly younger age at disease onset (β = −0.09, t = −3.38, p = 0.001) and a higher number of hospitalizations for manic episodes (F1,1338 = 56.53, p < 0.001). Total IQ was comparable between groups. Patients with hallucinations had significantly higher levels of childhood maltreatment (β = 0.09, t = 3.04, p = 0.002).
Conclusions
In this large cohort of BDI patients, the vast majority had experienced psychotic symptoms. Psychotic symptoms in BDI were associated with an earlier disease onset and more frequent hospitalizations, particularly for manic episodes. The study underlines the strength of the relation between childhood maltreatment and hallucinations, but did not identify distinct subgroups based on psychotic features, instead revealing large heterogeneity of psychotic symptoms in BD.
Little is known about the association of cortical Aβ with depression and anxiety among cognitively normal (CN) elderly persons.
Methods:
We conducted a cross-sectional study derived from the population-based Mayo Clinic Study of Aging in Olmsted County, Minnesota, involving CN persons aged ≥60 years who underwent PiB-PET scans and completed the Beck Depression Inventory-II (BDI-II) and Beck Anxiety Inventory (BAI). Cognitive diagnosis was made by an expert consensus panel. Participants were classified as having abnormal (≥1.4; PiB+) or normal (<1.4; PiB−) PiB-PET using a global cortical-to-cerebellar ratio. Multivariable logistic regression analyses were performed to calculate odds ratios (OR) and 95% confidence intervals (95% CI) after adjusting for age and sex.
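A minimal sketch of the PiB-PET dichotomization and the age- and sex-adjusted logistic model follows, using simulated data and hypothetical column names; only the 1.4 cut-off is taken from the abstract.

```python
# Sketch of the PiB-PET dichotomization and an age/sex-adjusted logistic model
# relating symptom scores to amyloid status (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "pib_ratio": rng.normal(1.35, 0.2, n),  # global cortical-to-cerebellar ratio
    "bdi": rng.poisson(3, n),               # BDI-II total score
    "age": rng.integers(60, 90, n),
    "male": rng.integers(0, 2, n),
})
df["pib_pos"] = (df["pib_ratio"] >= 1.4).astype(int)   # 1.4 cut-off from the abstract

fit = smf.logit("pib_pos ~ bdi + age + male", df).fit(disp=False)
print(f"OR per 1-point BDI increase: {np.exp(fit.params['bdi']):.2f}")
```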
Results:
Of 1,038 CN participants (53.1% males), 379 were PiB+. Each one-point increase in BDI (OR = 1.03; 1.00–1.06) and BAI (OR = 1.04; 1.01–1.08) score was associated with increased odds of being PiB+. The number of participants with BDI > 13 (clinical depression) was greater in the PiB+ than in the PiB− group, but the difference was not significant (OR = 1.42; 0.83–2.43). Similarly, the number of participants with BAI > 10 (clinical anxiety) was greater in the PiB+ than in the PiB− group, but the difference was not significant (OR = 1.77; 0.97–3.22).
Conclusions:
As expected, depression and anxiety levels were low in this community-dwelling sample, which likely reduced our statistical power. However, we observed an informative albeit weak association between increased BDI and BAI scores and elevated cortical amyloid deposition. This observation needs to be tested in a longitudinal cohort study.
Investigations of drinking behavior across military deployment cycles are scarce, and few prospective studies have examined risk factors for post-deployment alcohol misuse.
Methods
Prevalence of alcohol misuse was estimated among 4645 US Army soldiers who participated in a longitudinal survey. Assessment occurred 1–2 months before soldiers deployed to Afghanistan in 2012 (T0), upon their return to the USA (T1), 3 months later (T2), and 9 months later (T3). Weights-adjusted logistic regression was used to evaluate associations of hypothesized risk factors with post-deployment incidence and persistence of heavy drinking (HD) (consuming 5+ alcoholic drinks at least 1–2×/week) and alcohol or substance use disorder (AUD/SUD).
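For illustration, a minimal sketch of a weights-adjusted logistic regression for new-onset heavy drinking follows, with simulated data, hypothetical variable names, and frequency weights standing in for the study's survey weighting.

```python
# Sketch of a weights-adjusted logistic regression for new-onset heavy drinking
# (simulated data; freq_weights is a stand-in for full survey weighting).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 4000
df = pd.DataFrame({
    "life_stress": rng.normal(size=n),    # standardized personal life stress
    "combat_stress": rng.normal(size=n),
    "weight": rng.uniform(0.5, 2.0, n),   # survey adjustment weight
})
logit_p = -1.5 + 0.2 * df["life_stress"]
df["new_hd"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["life_stress", "combat_stress"]])
fit = sm.GLM(df["new_hd"], X, family=sm.families.Binomial(),
             freq_weights=df["weight"]).fit()
print(np.exp(fit.params))   # adjusted odds ratios
```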
Results
Prevalence of past-month HD at T0, T2, and T3 was 23.3% (s.e. = 0.7%), 26.1% (s.e. = 0.8%), and 22.3% (s.e. = 0.7%); corresponding estimates for any binge drinking (BD) were 52.5% (s.e. = 1.0%), 52.5% (s.e. = 1.0%), and 41.3% (s.e. = 0.9%). Greater personal life stress during deployment (e.g., relationship, family, or financial problems) – but not combat stress – was associated with new onset of HD at T2 [per standard score increase: adjusted odds ratio (AOR) = 1.20, 95% CI 1.06–1.35, p = 0.003]; incidence of AUD/SUD at T2 (AOR = 1.54, 95% CI 1.25–1.89, p < 0.0005); and persistence of AUD/SUD at T2 and T3 (AOR = 1.30, 95% CI 1.08–1.56, p = 0.005). Any BD pre-deployment was associated with post-deployment onset of HD (AOR = 3.21, 95% CI 2.57–4.02, p < 0.0005) and AUD/SUD (AOR = 1.85, 95% CI 1.27–2.70, p = 0.001).
Conclusions
Alcohol misuse is common during the months preceding and following deployment. Timely intervention aimed at alleviating/managing personal stressors or curbing risky drinking might reduce risk of alcohol-related problems post-deployment.
Anxiety disorders are common, and cognitive–behavioural therapy (CBT) is a first-line treatment. Candidate gene studies have suggested a genetic basis to treatment response, but findings have been inconsistent.
Aims
To perform the first genome-wide association study (GWAS) of psychological treatment response in children with anxiety disorders (n = 980).
Method
Presence and severity of anxiety were assessed using a semi-structured interview at baseline, on completion of treatment (post-treatment), and 3 to 12 months after treatment completion (follow-up). DNA was genotyped using the Illumina HumanCoreExome-12 v1.0 array. Linear mixed models were used to test associations between genetic variants and response (change in symptom severity) immediately post-treatment and at 6-month follow-up.
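As an illustration of one step of such a GWAS, the following sketch tests a single variant's association with symptom change using a linear mixed model on simulated repeated measures; the variable names and data-generating model are assumptions for demonstration only.

```python
# Sketch of a single-variant association test with repeated symptom-severity
# measures (simulated data, hypothetical variable names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 300
dosage = rng.integers(0, 3, n)                  # allele dosage for one SNP
subject = np.repeat(np.arange(n), 3)
time = np.tile([0.0, 1.0, 2.0], n)              # baseline, post-treatment, follow-up
severity = 20 - 4 * time - 0.5 * np.repeat(dosage, 3) * time + rng.normal(size=3 * n)
df = pd.DataFrame({"subject": subject, "time": time,
                   "dosage": np.repeat(dosage, 3), "severity": severity})

# The dosage:time term tests whether the variant modifies symptom change.
fit = smf.mixedlm("severity ~ time * dosage", df, groups=df["subject"]).fit()
p = fit.pvalues["time:dosage"]
print(f"p = {p:.3g}; genome-wide significant: {p < 5e-8}")
```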
Results
No variants passed the genome-wide significance threshold (P = 5 × 10⁻⁸) in either analysis. Four variants met criteria for suggestive significance (P < 5 × 10⁻⁶) in association with response post-treatment, and three variants in the 6-month follow-up analysis.
Conclusions
This is the first genome-wide therapygenetic study. It suggests no common variants of very high effect underlie response to CBT. Future investigations should maximise power to detect single-variant and polygenic effects by using larger, more homogeneous cohorts.
Obesity in pets is a frustrating, major health problem. Obesity in human children is similar. Prevailing theories accounting for the rising obesity rates – for example, poor nutrition and sedentary activity – are being challenged. Obesity interventions in both pets and children have produced modest short-term but poor long-term results. New strategies are needed. A novel theory posits that obesity in pets and children is due to ‘treats’ and excessive meal amounts given by the ‘pet–parent’ and child–parent to obtain affection from the pet/child, which enables ‘eating addiction’ in the pet/child and results in parental ‘co-dependence’. Pet–parents and child–parents may even become hostage to the treats/food to avoid the ire of the pet/child. Eating addiction in the pet/child also may be brought about by emotional factors such as stress, independent of parental co-dependence. An applicable treatment for child obesity has been trialled using classic addiction withdrawal/abstinence techniques, as well as behavioural addiction methods, with significant results. Both the child and the parent progress through withdrawal from specific ‘problem foods’, next from snacking (non-specific foods) and finally from excessive portions at meals (gradual reductions). This approach should adapt well for pets and pet–parents. Pet obesity is more ‘pure’ than child obesity, in that contributing factors and treatment points are essentially under the control of the pet–parent. Pet obesity might thus serve as an ideal test bed for the treatment and prevention of child obesity, with focus primarily on parental behaviours. Sharing information between the fields of pet and child obesity would be mutually beneficial.
The Amsterdam glacial basin was a major sedimentary sink from late Saalian until late Eemian (Picea zone, E6) times. The basin’s exemplary record makes it a potential reference area for the last interglacial stage. The cored Amsterdam-Terminal borehole was drilled in 1997 to provide a record throughout the Eemian interglacial. Integrated facies analysis has resulted in a detailed reconstruction of the sedimentary history.
After the Saalian ice mass had disappeared from the area, a large, deep lake came into being, fed by the Rhine river. At the end of the glacial, the lake became smaller because it was cut off from the river-water supply, and eventually only a number of shallow pools remained in the Amsterdam basin. During the early Eemian (Betula zone, E1), a seepage lake existed at the site. The lake deepened under the influence of a steadily rising sea level and finally evolved into a silled lagoon (late Quercus zone, E3). Initially, the lagoon water had fairly stable stratification, but as the sea level continued to rise the sill lost its significance, and the lagoon became well mixed by the middle of the Corylus/Taxus zone (E4b). The phase of free exchange with the open sea ended in the early Carpinus zone (E5), when barriers developed in the sill area, causing the lagoon to become stratified again. During the late Eemian (late E5), a more dynamic system developed. The sandy barriers that had obstructed exchange with the open sea were no longer effective, and a tidally influenced coastal lagoon formed.
The Eemian sedimentary history shown in the Amsterdam-Terminal borehole is intimately connected with the sea-level history. Because the site includes both a high-resolution pollen signal and a record of sea-level change, it has potential for correlation on various scales. Palaeomagnetic results show that the sediments predate the Blake Event, which confirms that this reversal excursion is relatively young. The U/Th age of the uppermost part of the Eemian sequence is 118.2±6.3 ka.
We previously reported an association between 5HTTLPR genotype and outcome following cognitive–behavioural therapy (CBT) in child anxiety (Cohort 1). Children homozygous for the low-expression short allele showed more positive outcomes. Other similar studies have produced mixed results, with most reporting no association between genotype and CBT outcome.
Aims
To replicate the association between 5HTTLPR and CBT outcome in child anxiety from the Genes for Treatment study (GxT Cohort 2, n = 829).
Method
Logistic and linear mixed effects models were used to examine the relationship between 5HTTLPR and CBT outcomes. Mega-analyses using both cohorts were performed.
Results
There was no significant effect of 5HTTLPR on CBT outcomes in Cohort 2. Mega-analyses identified a significant association between 5HTTLPR and remission from all anxiety disorders at follow-up (odds ratio 0.45, P = 0.014), but not primary anxiety disorder outcomes.
Conclusions
The association between 5HTTLPR genotype and CBT outcome did not replicate. Short-allele homozygotes showed more positive treatment outcomes, but with small, non-significant effects. Future studies would benefit from utilising whole genome approaches and large, homogeneous samples.
HIV-associated cognitive impairments are prevalent, and are consistent with injury to both frontal cortical and subcortical regions of the brain. The current study aimed to assess the association of HIV infection with functional connections within the frontostriatal network, circuitry hypothesized to be highly vulnerable to HIV infection. Fifteen HIV-positive and 15 demographically matched control participants underwent 6 min of resting-state functional magnetic resonance imaging (RS-fMRI). Multivariate group comparisons of age-adjusted estimates of connectivity within the frontostriatal network were derived from BOLD data for dorsolateral prefrontal cortex (DLPFC), dorsal caudate and mediodorsal thalamic regions of interest. Whole-brain comparisons of group differences in frontostriatal connectivity were conducted, as were pairwise tests of connectivity associations with measures of global cognitive functioning and clinical and immunological characteristics (nadir and current CD4 count, duration of HIV infection, plasma HIV RNA). HIV-associated reductions in connectivity were observed between the DLPFC and the dorsal caudate, particularly in younger participants (<50 years, N=9). Seropositive participants also demonstrated reductions in dorsal caudate connectivity to frontal and parietal brain regions previously demonstrated to be functionally connected to the DLPFC. Cognitive impairment, but none of the assessed clinical/immunological variables, was also associated with reduced frontostriatal connectivity. In conclusion, our data indicate that HIV is associated with attenuated intrinsic frontostriatal connectivity. Intrinsic connectivity of this network may therefore serve as a marker of the deleterious effects of HIV infection on the brain, possibly via HIV-associated dopaminergic abnormalities. These findings warrant independent replication in larger studies. (JINS, 2015, 21, 1–11)
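For illustration, a minimal sketch of seed-based ROI-to-ROI connectivity (correlation, Fisher z-transform, group comparison) on simulated time series follows; the study's multivariate, whole-brain pipeline is considerably more involved.

```python
# Sketch of seed-based ROI connectivity with a simple group comparison
# (simulated time series; not the study's actual pipeline).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

def roi_connectivity(ts_seed: np.ndarray, ts_target: np.ndarray) -> float:
    r = np.corrcoef(ts_seed, ts_target)[0, 1]
    return np.arctanh(r)            # Fisher z-transform

# Simulate DLPFC-to-caudate connectivity for 15 HIV+ and 15 control participants.
def simulate_group(coupling: float, n: int = 15, t: int = 180) -> np.ndarray:
    z = []
    for _ in range(n):
        seed = rng.normal(size=t)
        target = coupling * seed + np.sqrt(1 - coupling**2) * rng.normal(size=t)
        z.append(roi_connectivity(seed, target))
    return np.array(z)

z_hiv, z_ctl = simulate_group(0.2), simulate_group(0.4)
t_stat, p_val = stats.ttest_ind(z_hiv, z_ctl)
print(f"t = {t_stat:.2f}, p = {p_val:.3f}")
```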
The redshifted 21-cm line of neutral hydrogen (Hi), potentially observable at low radio frequencies (~50–200 MHz), should be a powerful probe of the physical conditions of the intergalactic medium during Cosmic Dawn and the Epoch of Reionisation (EoR). The sky-averaged Hi signal is expected to be extremely weak (~100 mK) in comparison to the foreground of up to 10⁴ K at the lowest frequencies of interest. The detection of such a weak signal requires an extremely stable, well characterised system and a good understanding of the foregrounds. Development of a nearly perfectly (~mK accuracy) calibrated total power radiometer system is essential for this type of experiment. We present the BIGHORNS (Broadband Instrument for Global HydrOgen ReioNisation Signal) experiment, which was designed and built to detect the sky-averaged Hi signal from the EoR at low radio frequencies. The BIGHORNS system is a mobile total power radiometer, which can be deployed in any remote location in order to collect radio frequency interference (RFI) free data. The system was deployed in remote, radio-quiet locations in Western Australia, and low-RFI sky data have been collected. We present a description of the system, its characteristics, details of data analysis, and calibration. We have identified multiple challenges to achieving the required measurement precision, which triggered two major improvements for the future system.
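As a simplified illustration of total-power radiometer calibration, the sketch below applies a two-reference (hot/cold load) linear calibration to convert raw power to antenna temperature; BIGHORNS' actual receiver model and calibration scheme are more detailed, and all numbers here are made up.

```python
# Two-point (hot/cold load) radiometer calibration sketch (illustrative values).
import numpy as np

def calibrate(p_ant: np.ndarray, p_cold: float, p_hot: float,
              t_cold: float, t_hot: float) -> np.ndarray:
    """Linear two-point calibration: assumes measured power is proportional to
    system temperature, P = g * (T_ant + T_rcv)."""
    gain = (p_hot - p_cold) / (t_hot - t_cold)
    return t_cold + (p_ant - p_cold) / gain

# Example: references at 300 K and 400 K, sky measurements in raw power units.
p_sky = np.array([1.05e-6, 1.10e-6])
print(calibrate(p_sky, p_cold=1.0e-6, p_hot=1.5e-6, t_cold=300.0, t_hot=400.0))
```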
Efficacy of depression treatments, including adjunctive antipsychotic treatment, has not been explored for patients with worsening symptoms after antidepressant therapy (ADT).
Methods
This post-hoc analysis utilized pooled data from 3 similarly designed, randomized, double-blind, placebo-controlled trials that assessed the efficacy, safety, and tolerability of adjunctive aripiprazole in patients with major depressive disorder with inadequate response to ADT. The studies had 2 phases: an 8-week prospective ADT phase and a 6-week adjunctive (aripiprazole or placebo) treatment phase. This analysis focused on patients whose symptoms worsened during the prospective 8-week ADT phase (worsening defined as >0% increase in Montgomery–Åsberg Depression Rating Scale [MADRS] Total score). During the 6-week, double-blind, adjunctive phase, response was defined as ≥50% reduction in MADRS Total score and remission as ≥50% reduction in MADRS Total score plus a MADRS score ≤10.
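A small sketch of the response and remission definitions given above (MADRS Total scores; names and example values are illustrative):

```python
# Sketch of the response/remission classification from baseline and endpoint
# MADRS Total scores (illustrative function and values).
def classify_outcome(madrs_baseline: float, madrs_endpoint: float) -> str:
    reduction = (madrs_baseline - madrs_endpoint) / madrs_baseline
    if reduction >= 0.5 and madrs_endpoint <= 10:
        return "remission"
    if reduction >= 0.5:
        return "response"
    return "non-response"

print(classify_outcome(30, 8))    # remission
print(classify_outcome(30, 14))   # response
print(classify_outcome(30, 25))   # non-response
```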
Results
Of 1065 patients who failed to achieve a response during the prospective phase, 160 exhibited worsening of symptoms (ADT-Worseners), and 905 exhibited no change/reduction in MADRS scores (ADT-Non-worseners). Response rates for ADT-Worseners at endpoint were 36.6% (adjunctive aripiprazole) and 22.5% (placebo). Similarly, response rates at endpoint for ADT-Non-worseners were 37.5% (adjunctive aripiprazole) and 22.5% (placebo). Remission rates at endpoint for ADT-Worseners were 25.4% (adjunctive aripiprazole) and 12.4% (placebo). For ADT-Non-worseners, remission rates were 29.9% (adjunctive aripiprazole) and 17.4% (placebo).
Conclusion
These results suggest that adjunctive aripiprazole is an effective intervention for patients whose symptoms worsen during antidepressant monotherapy. The results challenge the view that benefits of adjunctive therapy with aripiprazole are limited to partial responders to ADT.