While previous studies have reported high rates of documented suicide attempts (SAs) in the U.S. Army, the extent to which soldiers make SAs that are not identified in the healthcare system is unknown. Understanding undetected suicidal behavior is important in broadening prevention and intervention efforts.
Methods
Representative survey of U.S. Regular Army enlisted soldiers (n = 24 475). Reported SAs during service were compared with SAs documented in administrative medical records. Logistic regression analyses examined sociodemographic characteristics differentiating soldiers with an undetected SA v. documented SA. Among those with an undetected SA, chi-square tests examined characteristics associated with receiving a mental health diagnosis (MH-Dx) prior to SA. Discrete-time survival analysis estimated risk of undetected SA by time in service.
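The discrete-time survival step described above can be sketched with a minimal illustration (hypothetical counts, not the study's data or code): the hazard for each interval of service is the number of first attempts divided by the number of soldiers still at risk, and cumulative risk follows from the product of the interval survival probabilities.

```python
# Minimal discrete-time survival sketch (hypothetical counts, not study data):
# hazard in interval t is h_t = events_t / at_risk_t, and cumulative risk
# over all intervals is 1 - prod(1 - h_t).

def discrete_time_risk(events, at_risk):
    """events[t], at_risk[t]: counts per interval (e.g. year of service)."""
    cumulative_survival = 1.0
    hazards = []
    for e, n in zip(events, at_risk):
        h = e / n if n else 0.0
        hazards.append(h)
        cumulative_survival *= (1.0 - h)
    return hazards, 1.0 - cumulative_survival

# Hypothetical attempt counts over four service years
hazards, risk = discrete_time_risk([30, 25, 15, 10], [10000, 9000, 8000, 7000])
```

In the full analysis this per-interval hazard would be modeled with covariates (e.g. via logistic regression on a person-period dataset), but the counting logic is the same.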
Results
Prevalence of undetected SA (unweighted n = 259) was 1.3%. Annual incidence was 255.6 per 100 000 soldiers, suggesting that one in three SAs goes undetected. In multivariable analysis, rank ⩾E5 (OR = 3.1 [95% CI 1.6–5.7]) was associated with increased odds of undetected v. documented SA. Females were more likely to have a MH-Dx prior to their undetected SA (Rao-Scott χ2(1) = 6.1, p = .01). Over one-fifth of undetected SAs resulted in at least moderate injury. Risk of undetected SA was greater during the first four years of service.
Conclusions
Findings suggest that substantially more soldiers make SAs than indicated by estimates based on documented attempts. A sizable minority of undetected SAs result in significant injury. Soldiers reporting an undetected SA tend to be higher ranking than those with documented SAs. Undetected SAs require additional approaches to identifying individuals at risk.
Changing practice patterns caused by the pandemic have created an urgent need for guidance in prescribing stimulants using telepsychiatry for attention-deficit hyperactivity disorder (ADHD). A notable spike in the prescribing of stimulants accompanied the suspension of the Ryan Haight Act, allowing the prescribing of stimulants without a face-to-face meeting. Competing forces both for and against prescribing ADHD stimulants by telepsychiatry have emerged, requiring guidelines to balance these factors. On the one hand, factors weighing in favor of increasing the availability of treatment for ADHD via telepsychiatry include enhanced access to care, reduction in the large number of untreated cases, and prevention of the known adverse outcomes of untreated ADHD. On the other hand, factors in favor of limiting telepsychiatry for ADHD include mitigating the possibility of exploiting telepsychiatry for profit or for misuse, abuse, and diversion of stimulants. This Expert Consensus Group has developed numerous specific guidelines and advocates for some flexibility in allowing telepsychiatry evaluations and treatment without an in-person evaluation to continue. These guidelines also recognize the need to give greater scrutiny to certain subpopulations, such as young adults without a prior diagnosis or treatment of ADHD who request immediate-release stimulants, which should increase the suspicion of possible medication diversion, misuse, or abuse. In such cases, nonstimulants, controlled-release stimulants, or psychosocial interventions should be prioritized. We encourage the use of outside informants to support the history, the use of rating scales, and having access to a hybrid model of both in-person and remote treatment.
Blood-based biomarkers represent a scalable and accessible approach for the detection and monitoring of Alzheimer’s disease (AD). Plasma phosphorylated tau (p-tau) and neurofilament light (NfL) are validated biomarkers for the detection of tau and neurodegenerative brain changes in AD, respectively. There is now emphasis to expand beyond these markers to detect and provide insight into the pathophysiological processes of AD. To this end, a reactive astrocytic marker, namely plasma glial fibrillary acidic protein (GFAP), has been of interest. Yet, little is known about the relationship between plasma GFAP and AD. Here, we examined the association between plasma GFAP, diagnostic status, and neuropsychological test performance. Diagnostic accuracy of plasma GFAP was compared with plasma measures of p-tau181 and NfL.
Participants and Methods:
This sample included 567 participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC) Longitudinal Clinical Core Registry, including individuals with normal cognition (n=234), mild cognitive impairment (MCI) (n=180), and AD dementia (n=153). The sample included all participants who had a blood draw. Participants completed a comprehensive neuropsychological battery (sample sizes across tests varied due to missingness). Diagnoses were adjudicated during multidisciplinary diagnostic consensus conferences. Plasma samples were analyzed using the Simoa platform. Binary logistic regression analyses tested the association between GFAP levels and diagnostic status (i.e., cognitively impaired due to AD versus unimpaired), controlling for age, sex, race, education, and APOE e4 status. Area under the curve (AUC) statistics from receiver operating characteristics (ROC) using predicted probabilities from binary logistic regression examined the ability of plasma GFAP to discriminate diagnostic groups compared with plasma p-tau181 and NfL. Linear regression models tested the association between plasma GFAP and neuropsychological test performance, accounting for the above covariates.
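The AUC comparison described above can be illustrated with a small sketch (hypothetical scores, not the study's data or code): the AUC of a continuous marker equals the probability that a randomly chosen impaired case has a higher marker value than a randomly chosen unimpaired control, so it can be computed directly from the two groups' scores.

```python
# Illustrative AUC sketch (hypothetical data, not the study's code): the
# Mann-Whitney interpretation of AUC counts, over all case/control pairs,
# how often the case scores higher (ties count one half).

def auc(case_scores, control_scores):
    wins = ties = 0
    for x in case_scores:
        for y in control_scores:
            if x > y:
                wins += 1
            elif x == y:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Hypothetical plasma-marker z-scores
impaired = [1.2, 0.8, 1.5, 0.3, 2.0]
unimpaired = [0.1, -0.4, 0.5, 0.9, -0.2]
print(round(auc(impaired, unimpaired), 2))  # 0.88
```

In practice the study computed AUCs from predicted probabilities of a covariate-adjusted logistic regression rather than raw marker values, but the discrimination measure itself is the same quantity.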
Results:
The mean (SD) age of the sample was 74.34 (7.54) years, 319 (56.3%) were female, 75 (13.2%) were Black, and 223 (39.3%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds of cognitive impairment (GFAP z-score transformed: OR=2.233, 95% CI [1.609, 3.099], p<0.001; non-z-transformed: OR=1.004, 95% CI [1.002, 1.006], p<0.001). ROC analyses, comprising GFAP and the above covariates, showed plasma GFAP discriminated the cognitively impaired from the unimpaired (AUC=0.75) and was similar, but slightly superior, to plasma p-tau181 (AUC=0.74) and plasma NfL (AUC=0.74). A joint panel of the plasma markers had the greatest discrimination accuracy (AUC=0.76). Linear regression analyses showed that higher GFAP levels were associated with worse performance on neuropsychological tests assessing global cognition, attention, executive functioning, episodic memory, and language abilities (ps<0.001), as well as higher CDR Sum of Boxes scores (p<0.001).
Conclusions:
Higher plasma GFAP levels differentiated participants with cognitive impairment from those with normal cognition and were associated with worse performance on all neuropsychological tests assessed. GFAP had accuracy similar to p-tau181 and NfL in detecting cognitive impairment; however, a panel of all three biomarkers was optimal. These results support the utility of plasma GFAP in AD detection and suggest that the pathological processes it represents might play an integral role in the pathogenesis of AD.
Blood-based biomarkers offer a more feasible approach to Alzheimer’s disease (AD) detection, management, and study of disease mechanisms than current in vivo measures. Given their novelty, these plasma biomarkers must be assessed against postmortem neuropathological outcomes for validation. Research has shown utility in plasma markers of the proposed AT(N) framework; however, recent studies have stressed the importance of expanding this framework to include other pathways. There are promising data supporting the usefulness of plasma glial fibrillary acidic protein (GFAP) in AD, but GFAP-to-autopsy studies are limited. Here, we tested the association between plasma GFAP and AD-related neuropathological outcomes in participants from the Boston University (BU) Alzheimer’s Disease Research Center (ADRC).
Participants and Methods:
This sample included 45 participants from the BU ADRC who had a plasma sample within 5 years of death and donated their brain for neuropathological examination. Most recent plasma samples were analyzed using the Simoa platform. Neuropathological examinations followed the National Alzheimer’s Coordinating Center procedures and diagnostic criteria. The NIA-Reagan Institute criteria were used for the neuropathological diagnosis of AD. Measures of GFAP were log-transformed. Binary logistic regression analyses tested the association between GFAP and autopsy-confirmed AD status, as well as with semi-quantitative ratings of regional atrophy (none/mild versus moderate/severe). Ordinal logistic regression analyses tested the association between plasma GFAP and Braak stage and CERAD neuritic plaque score. Area under the curve (AUC) statistics from receiver operating characteristics (ROC) using predicted probabilities from binary logistic regression examined the ability of plasma GFAP to discriminate autopsy-confirmed AD status. All analyses controlled for sex, age at death, years between last blood draw and death, and APOE e4 status.
Results:
Of the 45 brain donors, 29 (64.4%) had autopsy-confirmed AD. The mean (SD) age of the sample at the time of blood draw was 80.76 (8.58) and there were 2.80 (1.16) years between the last blood draw and death. The sample included 20 (44.4%) females, 41 (91.1%) were White, and 20 (44.4%) were APOE e4 carriers. Higher GFAP concentrations were associated with increased odds for having autopsy-confirmed AD (OR=14.12, 95% CI [2.00, 99.88], p=0.008). ROC analysis showed plasma GFAP accurately discriminated those with and without autopsy-confirmed AD on its own (AUC=0.75) and strengthened as the above covariates were added to the model (AUC=0.81). Increases in GFAP levels corresponded to increases in Braak stage (OR=2.39, 95% CI [0.71, 4.07], p=0.005), but not CERAD ratings (OR=1.24, 95% CI [0.004, 2.49], p=0.051). Higher GFAP levels were associated with greater temporal lobe atrophy (OR=10.27, 95% CI [1.53, 69.15], p=0.017), but this was not observed with any other regions.
Conclusions:
The current results show that antemortem plasma GFAP is associated with non-specific AD neuropathological changes at autopsy. Plasma GFAP could be a useful and practical biomarker for assisting in the detection of AD-related changes, as well as for study of disease mechanisms.
Background: Cerebrovascular reactivity is an important contributor to secondary injury following traumatic brain injury (TBI). The poor spatial resolution and invasive nature of “Gold-standard” intracranial pressure (ICP) based indices of cerebrovascular reactivity limit their use. Near-infrared spectroscopy (NIRS) based indices of cerebrovascular reactivity are minimally invasive and have improved spatial resolution. In this study, the relationship between NIRS- and ICP-based indices is quantified using time series analysis and advanced statistical techniques. Methods: High-resolution physiologic data were collected in a cohort of adult moderate to severe TBI patients at a single quaternary care site. From these data, both ICP- and NIRS-based indices of cerebrovascular reactivity were derived. The time series structure of these indices was determined and used to correct for autocorrelation in a linear mixed-effects model predicting ICP-based indices from NIRS-based indices of cerebrovascular reactivity. Results: A total of 83 patients were included in this study. Time series analysis coupled with mixed-effects modeling was used to examine the relationship between ICP- and NIRS-based indices of cerebrovascular reactivity. Conclusions: Time series analysis coupled with mixed-effects modeling allows for a more complete understanding of the relationship between ICP- and NIRS-based indices of cerebrovascular reactivity in the setting of TBI.
Background: Cerebrovascular reactivity has been identified as an important contributor to secondary injury following moderate to severe traumatic brain injury (TBI). “Gold-standard” intracranial pressure (ICP) based indices of cerebrovascular reactivity are limited by their invasive nature and poor spatial resolution. Near-infrared spectroscopy (NIRS) based indices of cerebrovascular reactivity are minimally invasive and have improved spatial resolution. In this study, classical machine-learning algorithms are leveraged to better characterize the relationship between these indices. Methods: High-resolution physiologic data were collected in a cohort of adult moderate to severe TBI patients. From these data, both ICP- and NIRS-based indices of cerebrovascular reactivity were derived. Utilizing Agglomerative Hierarchical Clustering (AHC) and Principal Component Analysis (PCA), the relationship between these indices in higher-dimensional physiologic space was examined. Results: A total of 83 patients with 314,395 minutes of unique and complete physiologic data were obtained. AHC and PCA revealed higher-order clustering of NIRS- and ICP-based indices, separate from other physiologic parameters. Conclusions: NIRS- and ICP-based indices of cerebrovascular reactivity relate to one another in higher-dimensional physiologic space. NIRS-based indices of cerebrovascular reactivity may be a viable alternative to ICP-based indices.
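The PCA component of the analysis above can be illustrated with a minimal sketch (synthetic data standing in for the physiologic indices; not the study's code): correlated indices load onto the same principal component, which is how clustering of NIRS- and ICP-based measures in higher-dimensional space would manifest.

```python
# Illustrative PCA sketch (synthetic data, not the study's code or data):
# project standardized variables onto the eigenvectors of their covariance
# matrix; two correlated indices share most of their variance on the first
# component, while an unrelated parameter does not.
import numpy as np

rng = np.random.default_rng(0)
n = 500
icp_index = rng.normal(size=n)                     # stand-in for an ICP-based index
nirs_index = icp_index + 0.3 * rng.normal(size=n)  # correlated NIRS-based index
other = rng.normal(size=n)                         # unrelated physiologic parameter

X = np.column_stack([icp_index, nirs_index, other])
X = (X - X.mean(axis=0)) / X.std(axis=0)           # standardize each variable
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
order = np.argsort(eigvals)[::-1]                  # sort components by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The first component should carry the shared ICP/NIRS variance.
loadings = eigvecs[:, 0]
```

Agglomerative hierarchical clustering would then group variables by the similarity of such loadings (or by pairwise correlation distance); that step is omitted here for brevity.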
Background: Burst suppression (BS) is an EEG pattern in which there are isoelectric periods interspersed with bursts of cortical activity. Targeting BS through anesthetic administration is used as a tool in the neuro-ICU, but its relationship with cerebral blood flow (CBF) and cerebral autoregulation (CA) is unclear. We performed a systematic review investigating the effect of BS on CBF and CA in animals and humans. Methods: We searched MEDLINE, BIOSIS, EMBASE, SCOPUS, and the Cochrane library from inception to July 2022. The data that were collected included study population, methods to induce and measure BS, and the effect on CBF and CA. Results: In total, 45 animal and 26 human studies were included in the final review. In almost all the studies, BS was induced using an anesthetic. In most of the animal and human studies, BS was associated with a decrease in CBF and cerebral metabolism, even if the mean arterial pressure remained constant. The effect on CA during periods of stress (hypercapnia, hypothermia, etc.) was variable. Conclusions: BS is associated with a reduction in cerebral metabolic demand and CBF, which may explain its usefulness in patients with brain injury. More evidence is needed to elucidate the connection between BS and CA.
Obesity is highly prevalent and disabling, especially in individuals with severe mental illness including bipolar disorders (BD). The brain is a target organ for both obesity and BD. Yet, we do not understand how cortical brain alterations in BD and obesity interact.
Methods:
We obtained body mass index (BMI) and MRI-derived regional cortical thickness and surface area from 1231 BD and 1601 control individuals from 13 countries within the ENIGMA-BD Working Group. We jointly modeled the statistical effects of BD and BMI on brain structure using mixed-effects models and tested for interaction and mediation. We also investigated the impact of medications on the BMI-related associations.
Results:
BMI and BD additively impacted the structure of many of the same brain regions. Both BMI and BD were negatively associated with cortical thickness, but not with surface area. In most regions, the number of jointly used psychiatric medication classes remained associated with lower cortical thickness when controlling for BMI. In a single region, the fusiform gyrus, about a third of the negative association between the number of jointly used psychiatric medications and cortical thickness was mediated by the association between the number of medications and higher BMI.
Conclusions:
We confirmed consistent associations between higher BMI and lower cortical thickness, but not surface area, across the cerebral mantle, in regions which were also associated with BD. Higher BMI in people with BD indicated more pronounced brain alterations. BMI is important for understanding the neuroanatomical changes in BD and the effects of psychiatric medications on the brain.
Opera was produced only rarely in the otherwise vibrant theatrical culture of seventeenth-century Spain and her American dominions, though Italian operas and occasional Spanish ones became a mainstay of public life in the Spanish-held territories in Italy, especially Naples and Milan. At the royal court in Madrid and the principal administrative centres of the overseas colonies (Lima and Mexico), opera was inextricably bound to dynastic politics and constrained by conventions about the gender of onstage singers. Several other kinds of plays with music were produced at theatres both public and private, however, and commercial theatres known as corrales were among the busiest sites of musical performance and cultural transmission. Some 10,000 plays were performed in Madrid in the course of the seventeenth century, although only about 2,000 such texts have been preserved. The principal theatrical genre was the comedia nueva, a three-act play in poly-metric verse in which the tragic and the comic were mingled to recreate the natural balance of human existence with varying degrees of verisimilitude.
Research on the mental health impact of the COVID-19 pandemic has focused on three groups: (1) the general population, (2) so-called vulnerable groups (e.g. individuals with mental disorders), and (3) individuals suffering from COVID-19, including Long-COVID syndromes.
Objectives
We investigate whether individuals with a history of depression react to the COVID-19 pandemic with increased depressive symptoms.
Methods
Longitudinal data stem from the NAKO baseline assessment (2014–2019; 18 study centers in Germany; representatively sampled individuals aged 20 to 74 years) and the subsequent NAKO COVID assessment (May–November 2020). The analysis sample comprises 115,519 individuals. History of psychiatric disorder was operationalized as lifetime self-report of physician-diagnosed depression. Depressive symptoms were measured with the PHQ-9.
Results
Mean age of the sample at baseline was 49.95 years (SD 12.53); 51.70% were women, and 14% of the individuals had a history of physician-diagnosed depression. Using a PHQ-9 cut-off of 10 to indicate clinically relevant depression, 3.65% of the individuals without a history of depression and 24.19% of those with a history of depression were depressed at baseline. The NAKO COVID assessment revealed 6.53% depressed individuals without any history of depression and a similar rate of 23.29% in those with a history of depression.
Conclusions
Contrary to our expectations, individuals with a history of physician-diagnosed depression did not react with increased depressive symptoms during the first phase of the pandemic in Germany. Several reasons could be discussed. Whether there is a medium- or long-term impact remains open.
Personality traits (e.g. neuroticism) and the social environment predict risk for internalizing disorders and suicidal behavior. Studying these characteristics together and prospectively within a population confronted with high stressor exposure (e.g. U.S. Army soldiers) has not been done, yet could uncover unique and interactive predictive effects that may inform prevention and early intervention efforts.
Methods
Five broad personality traits and social network size were assessed via self-administered questionnaires among experienced soldiers preparing for deployment (N = 4645) and new soldiers reporting for basic training (N = 6216). Predictive models examined associations of baseline personality and social network variables with recent distress disorders or suicidal behaviors assessed 3- and 9-months post-deployment and approximately 5 years following enlistment.
Results
Among the personality traits, elevated neuroticism was consistently associated with increased mental health risk following deployment. Small social networks were also associated with increased mental health risk following deployment, beyond the variance accounted for by personality. Limited support was found for social network size moderating the association between personality and mental health outcomes. Small social networks also predicted distress disorders and suicidal behavior 5 years following enlistment, whereas unique effects of personality traits on these more distal outcomes were rare.
Conclusions
Heightened neuroticism and small social networks predict a greater risk for negative mental health sequelae, especially following deployment. Social ties may mitigate adverse impacts of personality traits on psychopathology in some contexts. Early identification and targeted intervention for these distinct, modifiable factors may decrease the risk of distress disorders and suicidal behavior.
It is not known to what extent psychotic symptoms in first-episode psychosis (FEP) with a history of childhood interpersonal trauma (CIT) are less responsive to antipsychotic medication. In this longitudinal study, we compare symptom trajectories and remission over the first 2 years of treatment in FEP with and without CIT and examine whether differences are linked to the use of antipsychotics.
Methods
FEP (N = 191) were recruited from in- and outpatient services 1997–2000, and assessed at baseline, 3 months, 1 and 2 years. Inclusion criteria were 15–65 years, actively psychotic with a DSM-IV diagnosis of psychotic disorder and no previous adequate treatment for psychosis. Antipsychotic medication is reported as defined daily dosage (DDD). CIT (<18) was assessed with the Brief Betrayal Trauma Survey, and symptomatic remission based on scores from the Positive and Negative Syndrome Scale.
Results
CIT (n = 63, 33%) was not associated with symptomatic remission at 2-year follow-up (71% in remission, 14% in relapse), or with time to first remission (CIT 12 v. no-CIT 9 weeks, p = 0.51). Those with CIT had significantly more severe positive, depressive, and excited symptoms. FEP patients with physical (N = 39, 20%) or emotional abuse (N = 22, 14.7%) had higher DDD at 1 year (p < 0.05). Mean DDD did not exert a significant between-group effect on symptom trajectories of positive symptoms.
Conclusion
Results indicate that antipsychotic medication is equally beneficial in the achievement of symptomatic remission in FEP after 2 years independent of CIT. Still, FEP patients with CIT had more severe positive, depressive, and excited symptoms throughout.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
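The polygenic score analysis described above can be sketched in miniature (toy effect sizes and genotypes, not the study's weights or data): a PGS is a weighted sum of a person's risk-allele dosages, with weights taken from GWAS effect sizes.

```python
# Minimal polygenic score (PGS) sketch (hypothetical numbers, not the
# study's GWAS weights): score each person as the dot product of their
# risk-allele dosages (0, 1 or 2 copies per SNP) with per-SNP effect sizes.
import numpy as np

gwas_betas = np.array([0.12, -0.05, 0.08, 0.20])  # hypothetical per-SNP effects
dosages = np.array([
    [2, 1, 0, 1],   # person 1: risk-allele counts at each SNP
    [0, 2, 1, 0],   # person 2
])
pgs = dosages @ gwas_betas

# Scores are typically standardized before use as predictors of, e.g., AAO.
pgs_z = (pgs - pgs.mean()) / pgs.std()
```

Real analyses use thousands to millions of variants with weights adjusted for linkage disequilibrium, but the scoring arithmetic is this weighted sum.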
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
The aim of the current study was to explore the effect of gender, age at onset, and duration on the long-term course of schizophrenia.
Methods
Twenty-nine centers from 25 countries representing all continents participated in the study that included 2358 patients aged 37.21 ± 11.87 years with a DSM-IV or DSM-5 diagnosis of schizophrenia; the Positive and Negative Syndrome Scale as well as relevant clinicodemographic data were gathered. Analysis of variance and analysis of covariance were used, and the methodology corrected for the presence of potentially confounding effects.
Results
Age at onset was 3 years later in females (P < .001), who also showed lower rates of negative symptoms (P < .01) and higher depression/anxiety measures (P < .05) at some stages. Age at onset showed a single-peak distribution in both genders, with a tendency for patients with younger onset to advance more slowly through illness stages (P = .001). No significant effects were found concerning duration of illness.
Discussion
Our results confirmed a later onset and a possibly more benign course and outcome in females. Age at onset manifested a single peak in both genders, and, surprisingly, earlier onset was related to slower progression of the illness. No effect of duration was detected. These results are partially in accord with the literature, but they also differ as a consequence of the different starting point of our methodology (a novel staging model), which in our opinion precluded the impact of confounding effects. Future research should focus on the therapeutic policy and implications of these results in more representative samples.
Although non-suicidal self-injury (NSSI) is an issue of major concern to colleges worldwide, we lack detailed information about the epidemiology of NSSI among college students. The objectives of this study were to present the first cross-national data on the prevalence of NSSI and NSSI disorder among first-year college students and its association with mental disorders.
Methods
Data come from a survey of the entering class in 24 colleges across nine countries participating in the World Mental Health International College Student (WMH-ICS) initiative assessed in web-based self-report surveys (20 842 first-year students). Using retrospective age-of-onset reports, we investigated time-ordered associations between NSSI and Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) mood (major depressive and bipolar disorder), anxiety (generalized anxiety and panic disorder), and substance use disorders (alcohol and drug use disorder).
Results
NSSI lifetime and 12-month prevalence were 17.7% and 8.4%, respectively. The prevalence of a positive screen for 12-month DSM-5 NSSI disorder was 2.3%. Of those with lifetime NSSI, 59.6% met the criteria for at least one mental disorder. Temporally primary lifetime mental disorders predicted subsequent onset of NSSI [median odds ratio (OR) 2.4], but these primary lifetime disorders did not consistently predict 12-month NSSI among respondents with lifetime NSSI. Conversely, even after controlling for pre-existing mental disorders, NSSI consistently predicted later onset of mental disorders (median OR 1.8) as well as 12-month persistence of mental disorders among students with a generalized anxiety disorder (OR 1.6) and bipolar disorder (OR 4.6).
Conclusions
NSSI is common among first-year college students and is a behavioral marker of various common mental disorders.
Despite the progress made in HIV treatment and prevention, HIV remains a major cause of adolescent morbidity and mortality in sub-Saharan Africa. As perinatally infected children increasingly survive into adulthood, the quality of life and mental health of this population has increased in importance. This review provides a synthesis of the prevalence of mental health problems in this population and explores associated factors. A systematic database search (Medline, PsycINFO, Scopus) with an additional hand search was conducted. Peer-reviewed studies on adolescents (aged 10–19), published between 2008 and 2019, assessing mental health symptoms or psychiatric disorders, either by standardized questionnaires or by diagnostic interviews, were included. The search identified 1461 articles, of which 301 were eligible for full-text analysis. Fourteen of these, concerning HIV-positive adolescents, met the inclusion criteria and were critically appraised. Mental health problems were highly prevalent among this group, with around 25% scoring positive for any psychiatric disorder and 30–50% showing emotional or behavioral difficulties or significant psychological distress. Associated factors found by regression analysis were older age, not being in school, impaired family functioning, HIV-related stigma and bullying, and poverty. Social support and parental competence were protective factors. Mental health problems among HIV-positive adolescents are highly prevalent and should be addressed as part of regular HIV care.
Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
Methods
The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively-reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally-ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
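The product-limit method named above can be sketched with a minimal implementation (hypothetical observation times, not the survey data): survival is multiplied at each event time by (1 − d/n), where d is the number of onsets and n the number still at risk, with censored observations leaving the risk set without contributing an event.

```python
# Minimal product-limit (Kaplan-Meier) sketch (hypothetical data, not the
# WMH survey data): at each distinct event time, survival is multiplied by
# (1 - d / n), where d is events and n is the number still at risk.

def kaplan_meier(times, events):
    """times: observation times; events: 1 = onset observed, 0 = censored.
    Returns [(t, S(t))] at each distinct event time."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival, curve = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = c = 0
        # aggregate events (d) and censorings (c) sharing this time
        while i < len(pairs) and pairs[i][0] == t:
            if pairs[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            survival *= 1.0 - d / n_at_risk
            curve.append((t, survival))
        n_at_risk -= d + c
    return curve

curve = kaplan_meier([2, 3, 3, 5, 7, 8], [1, 1, 0, 1, 0, 1])
```

Absolute risk of a secondary disorder by time t is then 1 − S(t); the Cox models reported alongside estimate relative hazards rather than these absolute curves.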
Results
Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely-related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Conclusions
Survey data from a range of sites confirm that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for the primary prevention of secondary disorders.
Intermittent explosive disorder (IED) is characterised by impulsive anger attacks that vary greatly across individuals in severity and consequence. Understanding IED subtypes has been limited by lack of large, general population datasets including assessment of IED. Using the 17-country World Mental Health surveys dataset, this study examined whether behavioural subtypes of IED are associated with differing patterns of comorbidity, suicidality and functional impairment.
Methods
IED was assessed using the Composite International Diagnostic Interview in the World Mental Health surveys (n = 45 266). Five behavioural subtypes were created based on type of anger attack. Logistic regression assessed association of these subtypes with lifetime comorbidity, lifetime suicidality and 12-month functional impairment.
Results
The lifetime prevalence of IED in all countries was 0.8% (s.e.: 0.0). The two subtypes involving anger attacks that harmed people (‘hurt people only’ and ‘destroy property and hurt people’), collectively comprising 73% of those with IED, were characterised by high rates of externalising comorbid disorders. The remaining three subtypes involving anger attacks that destroyed property only, destroyed property and threatened people, and threatened people only, were characterised by higher rates of internalising than externalising comorbid disorders. Suicidal behaviour did not vary across the five behavioural subtypes but was higher among those with (v. those without) comorbid disorders, and among those who perpetrated more violent assaults.
Conclusions
The most common IED behavioural subtypes in these general population samples are associated with high rates of externalising disorders. This contrasts with the findings from clinical studies of IED, which observe a preponderance of internalising disorder comorbidity. This disparity in findings across population and clinical studies, together with the marked heterogeneity that characterises the diagnostic entity of IED, suggests that it is a disorder that requires much greater research.
Humans are contributing to large carnivore declines around the globe, and conservation interventions should focus on increasing local stakeholder tolerance of carnivores and be informed by both biological and social considerations. In the Okavango Delta (Botswana), we tested new conservation strategies alongside a pre-existing government compensation programme. The new strategies included the construction of predator-proof livestock enclosures, the establishment of an early warning system linked to GPS satellite lion collars, depredation event investigations and educational programmes. We conducted pre- and post-assessments of villagers’ livestock management practices, attitudes towards carnivores and conservation, perceptions of human–carnivore coexistence and attitudes towards established conservation programmes. Livestock management levels were low and 50% of farmers lost livestock to carnivores, while 5–10% of owned stock was lost. Respondents had strong negative attitudes towards lions, which kill most depredated livestock. Following new management interventions, tolerance of carnivores significantly increased, although tolerance of lions near villages did not. The number of respondents who believed that coexistence with carnivores was possible significantly increased. Respondents had negative attitudes towards the government-run compensation programme, citing low and late payments, but were supportive of the new management interventions. These efforts show that targeted, intensive management can increase stakeholder tolerance of carnivores.