Our purpose was to describe the clinical, epidemiological and laboratory characteristics of patients hospitalised with acute Q fever in an endemic area of Israel. We conducted a historical cohort study of all patients hospitalised with a definitive diagnosis of acute Q fever and compared them with patients in whom acute Q fever was suspected but ruled out. A total of 38 patients had a definitive diagnosis; 47% of cases occurred during the autumn and winter seasons, and only 18% of patients lived in rural regions. Leucopaenia and thrombocytopaenia were uncommon (16% and 18%, respectively), but mild hepatitis was common (mean aspartate aminotransferase 76 U/l, mean alanine aminotransferase 81 U/l). None of these parameters differed significantly from the 74 patients in whom acute Q fever was ruled out. Patients with acute Q fever had shorter hospitalisations (6.4 vs. 14 days, P = 0.007) and were treated with doxycycline more often (71% vs. 38%, P = 0.001) than those without acute Q fever. In conclusion, acute Q fever can manifest as an unspecified febrile illness, with no seasonality. We suggest that in endemic areas, Q fever be considered in the differential diagnosis of any febrile patient with risk factors for persistent infection.
Many studies have identified changes in the brain associated with obsessive–compulsive disorder (OCD), but few have examined the relationship between genetic determinants of OCD and brain variation.
We present the first genome-wide investigation of overlapping genetic risk for OCD and genetic influences on subcortical brain structures.
Using single nucleotide polymorphism effect concordance analysis, we measured genetic overlap between the first genome-wide association study (GWAS) of OCD (1465 participants with OCD, 5557 controls) and recent GWASs of eight subcortical brain volumes (13 171 participants).
We found evidence of significant positive concordance between OCD risk variants and variants associated with greater nucleus accumbens and putamen volumes. When conditioning OCD risk variants on brain volume, variants influencing putamen, amygdala and thalamus volumes were associated with risk for OCD.
These results are consistent with current OCD neurocircuitry models. Further evidence will clarify the relationship between putamen volume and OCD risk, and the roles of the detected variants in this disorder.
Declaration of interest
The authors have declared that no competing interests exist.
Research on the course of post-traumatic stress disorder (PTSD) finds that a substantial proportion of cases remit within 6 months, a majority within 2 years, and a substantial minority persist for many years. Results are inconsistent about pre-trauma predictors.
The WHO World Mental Health surveys assessed the presence and course of lifetime DSM-IV PTSD after one randomly selected trauma, allowing retrospective estimates of PTSD duration. Prior traumas, childhood adversities (CAs), and other lifetime DSM-IV mental disorders were examined as predictors using discrete-time person-month survival analysis among the 1575 respondents with lifetime PTSD.
20%, 27%, and 50% of cases recovered within 3, 6, and 24 months, respectively, and 77% within 10 years (the longest duration allowing stable estimates). Time-related recall bias was found largely for recoveries after 24 months. Recovery was weakly related to most trauma types; the exceptions were very low odds [odds ratio (OR) 0.2–0.3] of early recovery (within 24 months) associated with purposefully injuring/torturing/killing and witnessing atrocities, and very low odds of later recovery (25+ months) associated with being kidnapped. The significant ORs for prior traumas, CAs, and mental disorders were generally inconsistent between the early- and later-recovery models. Cross-validated versions of the final models nonetheless discriminated significantly between the 50% of respondents with the highest and lowest predicted probabilities of both early recovery (66–55% v. 43%) and later recovery (75–68% v. 39%).
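A discrete-time person-month survival setup of the kind described above can be sketched as follows: each case contributes one record per month at risk, and a life-table product over monthly hazards yields the recovery curve. This is a toy illustration with hypothetical durations; the study's models also included covariates.

```python
def person_month_rows(durations, recovered):
    """Expand each case into one row per month at risk (discrete-time format).
    durations[i] = months observed; recovered[i] = True if the case recovered
    in its final month, False if censored at that point."""
    rows = []  # (case_id, month, event_indicator)
    for i, (dur, rec) in enumerate(zip(durations, recovered)):
        for m in range(1, dur + 1):
            rows.append((i, m, 1 if (rec and m == dur) else 0))
    return rows

def survival_curve(rows, horizon):
    """Life-table estimate: S(t) = prod over m <= t of (1 - events_m / at_risk_m)."""
    surv, s = [], 1.0
    for m in range(1, horizon + 1):
        month_rows = [r for r in rows if r[1] == m]
        at_risk = len(month_rows)
        events = sum(r[2] for r in month_rows)
        if at_risk:
            s *= 1.0 - events / at_risk
        surv.append(s)
    return surv

# Hypothetical cohort: months to recovery (or censoring) for 4 cases
rows = person_month_rows([3, 6, 6, 12], [True, True, False, True])
curve = survival_curve(rows, 12)
```

In a real analysis, a logistic regression on the expanded person-month rows (event indicator as outcome, month and predictors as covariates) approximates the discrete-time hazard model.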
We found PTSD recovery trajectories similar to those in previous studies. The weak associations of pre-trauma factors with recovery, also consistent with previous studies, presumably are due to stronger influences of post-trauma factors.
Sexual assault is a global concern, and post-traumatic stress disorder (PTSD) is one of its common sequelae. Early intervention can help prevent PTSD, making identification of those at high risk for the disorder a priority. Lack of representative sampling of both sexual assault survivors and sexual assaults in prior studies may have reduced the ability to develop accurate prediction models for early identification of high-risk survivors.
Data come from 12 face-to-face, cross-sectional surveys of community-dwelling adults conducted in 11 countries. Analysis was based on the data from the 411 women from these surveys for whom sexual assault was the randomly selected lifetime traumatic event (TE). Seven classes of predictors were assessed: socio-demographics, characteristics of the assault, the respondent's retrospective perception that she could have prevented the assault, other prior lifetime TEs, exposure to childhood family adversities and prior mental disorders.
Prevalence of Diagnostic and Statistical Manual of Mental Disorders IV (DSM-IV) PTSD associated with randomly selected sexual assaults was 20.2%. PTSD was more common for repeated than single-occurrence victimization and positively associated with prior TEs and childhood adversities. Respondent's perception that she could have prevented the assault interacted with history of mental disorder such that it reduced odds of PTSD, but only among women without prior disorders (odds ratio 0.2, 95% confidence interval 0.1–0.9). The final model estimated that 40.3% of women with PTSD would be found among the 10% with the highest predicted risk.
Whether counterfactual preventability cognitions are adaptive may depend on mental health history. Predictive modelling may be useful in targeting high-risk women for preventive interventions.
Background: The surgical risk factors and neuro-imaging characteristics associated with cerebellar mutism (CM) remain unclear and require further investigation. We aimed to examine surgical and MRI findings associated with CM in children following posterior fossa tumor resection.
Methods: Using our data registry, we retrospectively collected data from pediatric patients who developed CM after posterior fossa surgery, matched on age and pathology type with patients who did not. The strength of association between surgical and MRI variables and CM was examined using odds ratios (ORs) with corresponding 95% confidence intervals (CIs).
Results: A total of 22 patients were included. Medulloblastoma was the most common pathology among CM patients (91%). Tumor attachment to the floor of the fourth ventricle (OR 6; 95% CI 0.7–276), calcification/hemosiderin deposition (OR 7; 95% CI 0.9–315.5), and post-operative peri-ventricular ischemia on MRI (OR 5; 95% CI 0.5–236.5) showed the strongest associations with CM.
Conclusions: Our results suggest that tumor attachment to the floor of the fourth ventricle, pathological calcification, and post-operative ischemia are relatively more prevalent in patients with CM. Collectively, our work calls for a larger multi-institutional study of CM patients to further investigate the determinants and management of CM, with the goal of minimizing its development and predicting its onset.
The suicide rate has increased significantly among US Army soldiers over the past decade. Here we report the first results from a large psychological autopsy study using two control groups designed to reveal risk factors for suicide death among soldiers beyond known sociodemographic factors and the presence of suicide ideation.
Informants were next-of-kin and Army supervisors for: 135 suicide cases, 137 control soldiers propensity-score-matched on known sociodemographic risk factors for suicide and Army history variables, and 118 control soldiers who reported suicide ideation in the past year.
Results revealed that most (79.3%) soldiers who died by suicide had a prior mental disorder; mental disorders in the prior 30 days were especially strong risk factors for suicide death. Approximately half of suicide decedents had told someone that they were considering suicide. Virtually all of the risk factors identified in this study differed between suicide cases and propensity-score-matched controls, but did not significantly differ between suicide cases and suicide ideators. The most striking difference between suicides and ideators was the presence in the former of an internalizing disorder (especially depression) and multi-morbidity (i.e. 3+ disorders) in the past 30 days.
Most soldiers who die by suicide have identifiable mental disorders shortly before their death and tell others about their suicidal thinking, suggesting that there are opportunities for prevention and intervention. However, few risk factors distinguish between suicide ideators and decedents, pointing to an important direction for future research.
Introduction: The Canadian Triage and Acuity Scale (CTAS) is the standard used in all Canadian (and many international) emergency departments (EDs) for establishing the priority by which patients should be assessed. In addition to its clinical utility, CTAS has become an important administrative metric used by governments to estimate patient care requirements, ED funding and workload models. Despite its importance, the process by which CTAS scores are derived is highly variable. Emphasis on ED wait times has also drawn attention to the length of time the triage process takes. The primary objective of this study was to determine the interrater agreement of CTAS in current clinical practice. The secondary objective was to determine the time it takes to triage in a variety of ED settings. Methods: This was a prospective, observational study conducted in 7 hospital EDs, selected to represent a mix of triage processes (electronic vs. manual), documentation practices (electronic vs. paper), hospital types (rural, community and teaching) and patient volumes (annual ED census ranged from 38,000 to 136,000). An expert CTAS auditor observed on-duty triage nurses in the ED and assigned independent CTAS in real time. Research assistants not involved in the triage process independently recorded the triage time. Interrater agreement was estimated using unweighted and quadratic-weighted kappa statistics with 95% confidence intervals (CIs). Results: 738 consecutive patient CTAS assessments were audited over 21 seven-hour triage shifts. Exact modal agreement was achieved for 554 (75.0%) patients. Using the auditor’s CTAS score as the reference standard, on-duty triage nurses over-triaged 89 (12.1%) and under-triaged 95 (12.9%) patients. Interrater agreement was “good” with an unweighted kappa of 0.63 (95% CI: 0.58, 0.67) and quadratic-weighted kappa of 0.79 (95% CI: 0.67, 0.90). Research assistants captured triage time for 3808 patients over 69 shifts at 7 different EDs. 
Median (IQR) triage time was 5.2 (3.8, 7.3) minutes, ranging across sites from 3.9 (3.1, 4.8) to 7.5 (5.8, 10.8) minutes. Conclusion: Variability in the accuracy and length of time taken to perform CTAS assessments suggests that a standardized approach to performing CTAS assessments would improve both clinical decision making and administrative data accuracy.
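The quadratic-weighted kappa reported above can be computed with the standard weights w_ij = ((i - j)/(k - 1))². A minimal pure-Python sketch on hypothetical CTAS ratings (not the study's software) follows:

```python
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b, categories=5):
    """Quadratic-weighted Cohen's kappa for two raters scoring 1..categories.
    kappa = 1 - (weighted observed disagreement) / (weighted expected disagreement)."""
    n = len(rater_a)
    # Observed joint distribution of the two raters' scores
    obs = [[0.0] * categories for _ in range(categories)]
    for a, b in zip(rater_a, rater_b):
        obs[a - 1][b - 1] += 1.0 / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    num = den = 0.0
    for i in range(categories):
        for j in range(categories):
            w = ((i - j) / (categories - 1)) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * (pa[i + 1] / n) * (pb[j + 1] / n)  # expected under independence
    return 1.0 - num / den

# Hypothetical auditor vs. on-duty nurse CTAS levels (1 = most acute, 5 = least)
auditor = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5]
nurse   = [1, 2, 3, 3, 3, 4, 4, 4, 5, 4]
kappa = quadratic_weighted_kappa(auditor, nurse)
```

Quadratic weights penalize large disagreements more than adjacent-level ones, which is why the weighted kappa (0.79) exceeds the unweighted kappa (0.63) in the study: most triage disagreements are off by a single level.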
The stress sensitization theory hypothesizes that individuals exposed to childhood adversity will be more vulnerable to mental disorders from proximal stressors. We aimed to test this theory with respect to risk of 30-day major depressive episode (MDE) and generalized anxiety disorder (GAD) among new US Army soldiers.
The sample consisted of 30 436 new soldier recruits in the Army Study to Assess Risk and Resilience (Army STARRS). Generalized linear models were constructed, and additive interactions between childhood maltreatment profiles and level of 12-month stressful experiences on the risk of 30-day MDE and GAD were analyzed.
Stress sensitization was observed in models of past 30-day MDE (χ²₈ = 17.6, p = 0.025) and GAD (χ²₈ = 26.8, p = 0.001). This sensitization only occurred at high (3+) levels of reported 12-month stressful experiences. In pairwise comparisons for the risk of 30-day MDE, the risk difference between 3+ stressful experiences and no stressful experiences was significantly greater for all maltreatment profiles relative to No Maltreatment. Similar results were found with the risk for 30-day GAD, with the exception of the risk difference for Episodic Emotional and Sexual Abuse, which did not differ statistically from No Maltreatment.
New soldiers exposed to childhood maltreatment are at an increased risk of 30-day MDE or GAD following recent stressful experiences. Particularly in the military, with its abundance of unique stressors, efforts to identify this population and improve stress management may help reduce the risk of mental disorders.
Traumatic events are common globally; however, comprehensive population-based cross-national data on the epidemiology of posttraumatic stress disorder (PTSD), the paradigmatic trauma-related mental disorder, are lacking.
Data were analyzed from 26 population surveys in the World Health Organization World Mental Health Surveys. A total of 71 083 respondents ages 18+ participated. The Composite International Diagnostic Interview assessed exposure to traumatic events as well as 30-day, 12-month, and lifetime PTSD. Respondents were also assessed for treatment in the 12 months preceding the survey. Age of onset distributions were examined by country income level. Associations of PTSD were examined with country income, world region, and respondent demographics.
The cross-national lifetime prevalence of PTSD was 3.9% in the total sample and 5.6% among the trauma-exposed. Half of respondents with PTSD reported persistent symptoms. Treatment seeking in high-income countries (53.5%) was roughly double that in low/lower-middle-income (22.8%) and upper-middle-income (28.7%) countries. Social disadvantage, including younger age, female sex, being unmarried, being less educated, having lower household income, and being unemployed, was associated with increased risk of lifetime PTSD among the trauma-exposed.
PTSD is prevalent cross-nationally, with half of all global cases being persistent. Only half of those with severe PTSD report receiving any treatment and only a minority receive specialty mental health care. Striking disparities in PTSD treatment exist by country income level. Increasing access to effective treatment, especially in low- and middle-income countries, remains critical for reducing the population burden of PTSD.
This is the first cross-national study of intermittent explosive disorder (IED).
A total of 17 face-to-face cross-sectional household surveys of adults were conducted in 16 countries (n = 88 063) as part of the World Mental Health Surveys initiative. The World Health Organization Composite International Diagnostic Interview (CIDI 3.0) assessed DSM-IV IED, using a conservative definition.
Lifetime prevalence of IED ranged across countries from 0.1% to 2.7%, with a weighted average of 0.8%; 12-month and 30-day prevalence were 0.4% and 0.3%, respectively. Sociodemographic correlates of lifetime risk of IED were being male, young, unemployed, divorced or separated, and having less education. The median age of onset of IED was 17 years, with an interquartile range across countries of 13–23 years. The vast majority (81.7%) of those with lifetime IED met criteria for at least one other lifetime disorder; co-morbidity was highest with alcohol abuse and depression. Of those with 12-month IED, 39% reported severe impairment in at least one domain, most commonly social or relationship functioning. Prior traumatic experiences involving physical (non-combat) or sexual violence were associated with increased risk of IED onset.
Conservatively defined, IED is a low prevalence disorder but this belies the true societal costs of IED in terms of the effects of explosive anger attacks on families and relationships. IED is more common among males, the young, the socially disadvantaged and among those with prior exposure to violence, especially in childhood.
Most systematic tables of data associated to ranks of elliptic curves order the curves by conductor. Recent developments, led by work of Bhargava and Shankar studying the average sizes of n-Selmer groups, have given new upper bounds on the average algebraic rank in families of elliptic curves over ℚ, ordered by height. We describe databases of elliptic curves over ℚ, ordered by height, in which we compute ranks and 2-Selmer group sizes, the distributions of which may also be compared to these theoretical results. A striking new phenomenon that we observe in our database is that the average rank eventually decreases as height increases.
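The idea of ordering curves by height, rather than conductor, can be illustrated with a short enumeration sketch. This assumes one common naive-height convention, H(E) = max(4|A|³, 27B²) for E: y² = x³ + Ax + B; conventions differ across databases, so treat the constant factors as an assumption.

```python
def naive_height(A, B):
    # One common convention: H(E) = max(4|A|^3, 27*B^2) for E: y^2 = x^3 + A*x + B
    return max(4 * abs(A) ** 3, 27 * B ** 2)

def curves_up_to_height(H):
    """Enumerate integral short-Weierstrass curves y^2 = x^3 + A*x + B with
    naive height <= H, skipping singular curves (discriminant zero),
    ordered by increasing height."""
    curves = []
    a_max = int((H / 4) ** (1 / 3)) + 1   # |A| bound implied by 4|A|^3 <= H
    b_max = int((H / 27) ** 0.5) + 1      # |B| bound implied by 27*B^2 <= H
    for A in range(-a_max, a_max + 1):
        for B in range(-b_max, b_max + 1):
            disc = -16 * (4 * A ** 3 + 27 * B ** 2)
            if disc != 0 and naive_height(A, B) <= H:
                curves.append((A, B))
    return sorted(curves, key=lambda ab: naive_height(*ab))

sample = curves_up_to_height(100)
```

Unlike conductor-ordering, height-ordering makes it trivial to enumerate every curve below a bound, which is what allows rank and Selmer statistics to be compared directly against the Bhargava–Shankar averages. Computing the ranks and 2-Selmer sizes themselves requires a computer algebra system such as Sage or Magma.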
Childhood emotional maltreatment (CEM) increases the likelihood of developing an anxiety disorder in adulthood, but the neural processes underlying conferment of this risk have not been established. Here, we test the potential for neuroimaging the adult brain to inform understanding of the mechanism linking CEM to adult anxiety symptoms.
One hundred eighty-two adults (148 females, 34 males) with a normal-to-clinical range of anxiety symptoms underwent structural and functional magnetic resonance imaging while completing an emotion-processing paradigm with facial expressions of fear, anger, and happiness. Participants completed self-report measures of CEM and current anxiety symptoms. Voxelwise mediation analyses on gray-matter volumes and activation to each emotion condition were used to identify candidate brain mechanisms relating CEM to anxiety in adulthood.
During processing of fear and anger faces, greater amygdala and less right dorsolateral prefrontal (dlPFC) activation partially mediated the positive relationship between CEM and anxiety symptoms. Greater right posterior insula activation to fear also partially mediated this relationship, as did greater ventral anterior cingulate (ACC) and less dorsal ACC activation to anger. Responses to happy faces in these regions did not mediate the CEM-anxiety relationship. Smaller right dlPFC gray-matter volumes also partially mediated the CEM-anxiety relationship.
Activation patterns of the adult brain demonstrate the potential to inform mechanistic accounts of the CEM conferment of anxiety symptoms. Results support the hypothesis that exaggerated limbic activation to negative valence facial emotions links CEM to anxiety symptoms, which may be consequent to a breakdown of cortical regulatory processes.
We assessed prevalence of and risk factors for candidaemia following Clostridium difficile infection (CDI) using longitudinal population-based surveillance. Of 13 615 adults with CDI, 113 (0·8%) developed candidaemia in the 120 days following CDI. In a matched case-control analysis, severe CDI and CDI treatment with vancomycin + metronidazole were associated with development of candidaemia following CDI.
To determine the association between household food security and infant complementary feeding practices in rural Bangladesh.
Prospective cohort study using structured home interviews during pregnancy and at 3 and 9 months after delivery. We used two indicators of household food security (HHFS) at the 3-month follow-up: maternal Food Consumption Score (FCS), calculated via the World Food Programme method, and an HHFS index created from an eleven-item food security questionnaire. Infant feeding practices were characterized using WHO definitions.
Two rural sub-districts of Kishoreganj, Bangladesh.
Mother–child dyads (n 2073) who completed the 9-month follow-up.
Complementary feeding was initiated at age ≤4 months for 7 %, at 5–6 months for 49 % and at ≥7 months for 44 % of infants. Based on 24 h dietary recall, 98 % of infants were still breast-feeding at age 9 months, and 16 % received ≥4 food groups and ≥4 meals (minimally acceptable diet) in addition to breast milk. Mothers’ diet was more diverse than infants’. The odds of receiving a minimally acceptable diet for infants living in most food-secure households were three times those for infants living in least food-secure households (adjusted OR=3·0; 95 % CI 2·1, 4·3). Socio-economic status, maternal age, literacy, parity and infant sex were not associated with infant diet.
HHFS and maternal FCS were significant predictors of subsequent infant feeding practices. Nevertheless, even the more food-secure households had poor infant diet. Interventions aimed at improving infant nutritional status need to focus on both complementary food provision and education.
Considerable research has documented that exposure to traumatic events has negative effects on physical and mental health. Much less research has examined the predictors of traumatic event exposure. Increased understanding of risk factors for exposure to traumatic events could be of considerable value in targeting preventive interventions and anticipating service needs.
General population surveys in 24 countries with a combined sample of 68 894 adult respondents across six continents assessed exposure to 29 traumatic event types. Differences in prevalence were examined with cross-tabulations. Exploratory factor analysis was conducted to determine whether traumatic event types clustered into interpretable factors. Survival analysis was carried out to examine associations of sociodemographic characteristics and prior traumatic events with subsequent exposure.
Over 70% of respondents reported a traumatic event; 30.5% were exposed to four or more. Five types – witnessing death or serious injury, the unexpected death of a loved one, being mugged, being in a life-threatening automobile accident, and experiencing a life-threatening illness or injury – accounted for over half of all exposures. Exposure varied by country, sociodemographics and history of prior traumatic events. Being married was the most consistent protective factor. Exposure to interpersonal violence had the strongest associations with subsequent traumatic events.
Given the near ubiquity of exposure, limited resources may best be dedicated to those most likely to experience further exposure, such as victims of interpersonal violence. Identifying mechanisms that account for the associations of prior interpersonal violence with subsequent trauma is critical to developing interventions that prevent revictimization.
Although interventions exist to reduce violent crime, optimal implementation requires accurate targeting. We report the results of an attempt to develop an actuarial model using machine learning methods to predict future violent crimes among US Army soldiers.
A consolidated administrative database for all 975 057 soldiers in the US Army in 2004–2009 was created in the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). Of these soldiers, 5771 committed a first founded major physical violent crime (murder-manslaughter, kidnapping, aggravated arson, aggravated assault, robbery) over that time period. Temporally prior administrative records measuring socio-demographic, Army career, criminal justice, medical/pharmacy, and contextual variables were used to build an actuarial model for these crimes separately among men and women using machine learning methods (cross-validated stepwise regression, random forests, penalized regressions). The model was then validated in an independent 2011–2013 sample.
Key predictors were indicators of disadvantaged social/socioeconomic status, early career stage, prior crime, and mental disorder treatment. Area under the receiver-operating characteristic curve was 0.80–0.82 in 2004–2009 and 0.77 in the 2011–2013 validation sample. Of all administratively recorded crimes, 36.2–33.1% (male-female) were committed by the 5% of soldiers having the highest predicted risk in 2004–2009 and an even higher proportion (50.5%) in the 2011–2013 validation sample.
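The general recipe reported above, fitting a penalized model on historical records and checking discrimination via the area under the ROC curve (AUC), can be sketched on synthetic data. The ridge-penalized logistic fit below is a stand-in for the penalized regressions named in the abstract; the features, data, and tuning are all hypothetical.

```python
import math
import random

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_l2_logistic(X, y, lam=0.1, lr=0.5, steps=300):
    """L2-penalized (ridge) logistic regression fit by plain gradient descent."""
    d, n = len(X[0]), len(X)
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        gw, gb = [lam * wi / n for wi in w], 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(d):
                gw[j] += err * xi[j] / n
            gb += err / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

# Synthetic records: one informative feature, one pure-noise feature
random.seed(0)
X = [[random.random(), random.random()] for _ in range(400)]
y = [1 if xi[0] + 0.3 * random.random() > 0.8 else 0 for xi in X]

w, b = fit_l2_logistic(X, y)
scores = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) for xi in X]
model_auc = auc(scores, y)
```

In the study, the same discrimination check was done out-of-sample: the model built on 2004–2009 records was scored against an independent 2011–2013 cohort, which is the stronger test of whether predicted risk concentrates actual crimes in the top 5%.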
Although these results suggest that the models could be used to target soldiers at high risk of violent crime perpetration for preventive interventions, final implementation decisions would require further validation and weighing of predicted effectiveness against intervention costs and competing risks.
The contribution of subsidized food commodities to total food consumption is unknown. We estimated the proportion of individual energy intake from food commodities receiving the largest subsidies from 1995 to 2010 (corn, soyabeans, wheat, rice, sorghum, dairy and livestock).
Integrating information from three federal databases (MyPyramid Equivalents, Food Intakes Converted to Retail Commodities, and What We Eat in America) with data from the 2001–2006 National Health and Nutrition Examination Surveys, we computed a Subsidy Score representing the percentage of total energy intake from subsidized commodities. We examined the score’s distribution and the probability of having a ‘high’ (≥70th percentile) v. ‘low’ (≤30th percentile) score, across the population and subgroups, using multivariate logistic regression.
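The Subsidy Score construction, the percentage of energy from subsidized commodities followed by classification at the 30th/70th percentiles, can be sketched as follows. The food-to-commodity mapping here is purely illustrative; the study linked intakes through the federal commodity databases named above.

```python
def subsidy_score(energy_by_food, subsidized):
    """Percentage of a day's energy intake from foods linked to subsidized
    commodities. The `subsidized` set is an assumed, illustrative mapping."""
    total = sum(energy_by_food.values())
    sub = sum(kcal for food, kcal in energy_by_food.items() if food in subsidized)
    return 100.0 * sub / total

def percentile_flags(scores, low_pct=30, high_pct=70):
    """Label each score 'high' (>= high_pct-th percentile), 'low' (<= low_pct-th),
    or 'mid', mirroring the study's >=70th vs <=30th percentile contrast."""
    ranked = sorted(scores)
    def pct_value(p):
        return ranked[int(round(p / 100 * (len(ranked) - 1)))]
    lo, hi = pct_value(low_pct), pct_value(high_pct)
    return ["high" if s >= hi else "low" if s <= lo else "mid" for s in scores]

# Hypothetical one-day intake (kcal) with an assumed subsidized-commodity mapping
day = {"corn tortilla": 300, "beef": 400, "apple": 100, "soda (corn syrup)": 200}
score = subsidy_score(day, {"corn tortilla", "beef", "soda (corn syrup)"})
```

The logistic regression step then models the probability of landing in the 'high' group as a function of age, education and income, which is how the reported probability contrasts were obtained.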
Community-dwelling adults in the USA.
Participants (n 11 811) aged 18–64 years.
Median Subsidy Score was 56·7 % (interquartile range 47·2–65·4 %). Younger, less-educated, and poorer individuals, as well as Mexican Americans, had higher scores. After controlling for covariates, age, education and income remained independently associated with the score: compared with individuals aged 55–64 years, individuals aged 18–24 years had a 50 % higher probability of having a high score (P<0·0001). Individuals reporting less than high-school education had a 21 % higher probability of having a high score than individuals reporting college completion or higher (P=0·003); individuals in the lowest tertile of income had an 11 % higher probability of having a high score compared with individuals in the highest tertile (P=0·02).
Over 50 % of energy in US diets is derived from federally subsidized commodities.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupation stress widely considered responsible for the recent rise in the military suicide rate.
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
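The rates quoted above are crude incidence rates per 100 000 person-years, and the calculation is simple enough to sketch. The person-year denominator below is back-calculated from the reported overall rate (the abstract does not state it directly), and the odds-ratio helper shows the standard 2×2 formula behind the reported ORs.

```python
def rate_per_100k(events, person_years):
    """Crude incidence rate per 100 000 person-years."""
    return events / person_years * 100_000

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

# 496 suicides over roughly 2.214 million person-years reproduces the reported
# overall rate of 22.4/100 000; the denominator is illustrative, not from the study.
overall = rate_per_100k(496, 2_214_000)
```

The study's adjusted ORs additionally condition on deployment status and other covariates, so the crude 2×2 formula is only the starting point for the comparisons reported above.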
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
There is limited evidence on the acceptability, feasibility and cost-effectiveness of task-sharing interventions to narrow the treatment gap for mental disorders in sub-Saharan Africa. The purpose of this article is to describe the rationale, aims and methods of the Africa Focus on Intervention Research for Mental health (AFFIRM) collaborative research hub. AFFIRM is investigating strategies for narrowing the treatment gap for mental disorders in sub-Saharan Africa in four areas. First, it is assessing the feasibility, acceptability and cost-effectiveness of task-sharing interventions by conducting randomised controlled trials in Ethiopia and South Africa. The AFFIRM Task-sharing for the Care of Severe mental disorders (TaSCS) trial in Ethiopia aims to determine the acceptability, affordability, effectiveness and sustainability of mental health care for people with severe mental disorder delivered by trained and supervised non-specialist, primary health care workers compared with an existing psychiatric nurse-led service. The AFFIRM trial in South Africa aims to determine the cost-effectiveness of a task-sharing counselling intervention for maternal depression, delivered by non-specialist community health workers, and to examine factors influencing the implementation of the intervention and future scale up. Second, AFFIRM is building individual and institutional capacity for intervention research in sub-Saharan Africa by providing fellowship and mentorship programmes for candidates in Ethiopia, Ghana, Malawi, Uganda and Zimbabwe. Each year five Fellowships are awarded (one to each country) to attend the MPhil in Public Mental Health, a joint postgraduate programme at the University of Cape Town and Stellenbosch University. AFFIRM also offers short courses in intervention research, and supports PhD students attached to the trials in Ethiopia and South Africa. 
Third, AFFIRM is collaborating with other regional National Institute of Mental Health funded hubs in Latin America, sub-Saharan Africa and south Asia, by designing and executing shared research projects related to task-sharing and narrowing the treatment gap. Finally, it is establishing a network of collaboration between researchers, non-governmental organisations and government agencies that facilitates the translation of research knowledge into policy and practice. This article describes the developmental process of this multi-site approach, and provides a narrative of challenges and opportunities that have arisen during the early phases. Crucial to the long-term sustainability of this work is the nurturing and sustaining of partnerships between African mental health researchers, policy makers, practitioners and international collaborators.