We describe the design and deployment of GREENBURST, a commensal Fast Radio Burst (FRB) search system at the Green Bank Telescope. GREENBURST uses the dedicated L-band receiver tap to search over the 960–1 920 MHz frequency range for pulses across a wide range of dispersion measures. Due to its unique design, GREENBURST is capable of conducting searches for FRBs when the L-band receiver is not being used for scheduled observing. This makes it a sensitive single-pixel detector capable of probing deeper into the radio sky. While single pulses from Galactic pulsars and rotating radio transients will be detectable in our observations, and will form part of the database we archive, the primary goal is to detect and study FRBs. Based on recent determinations of the all-sky rate, we predict that the system will detect approximately one FRB for every 2–3 months of continuous operation. The high sensitivity of GREENBURST means that it will also be able to probe the slope of the FRB fluence distribution, which is currently uncertain in this observing band.
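The predicted yield above implies a simple expectation calculation; a sketch, assuming FRB detections follow a Poisson process at the quoted rate (the Poisson framing and the exact figures below are illustrative assumptions, not results from the paper):

```python
import math

# If FRBs arrive as a Poisson process at the predicted rate of roughly one
# detection per 2-3 months of continuous operation, the expected yearly
# yield and the chance of at least one detection follow directly.
results = {}
for months_per_event in (2, 3):
    rate = 1.0 / months_per_event            # detections per month
    expected = rate * 12                     # expected detections per year
    p_at_least_one = 1 - math.exp(-expected) # Poisson P(N >= 1)
    results[months_per_event] = (expected, p_at_least_one)
    print(f"{months_per_event} mo/event: {expected:.1f} FRBs/yr expected, "
          f"P(>=1) = {p_at_least_one:.3f}")
# → 2 mo/event: 6.0 FRBs/yr expected, P(>=1) = 0.998
# → 3 mo/event: 4.0 FRBs/yr expected, P(>=1) = 0.982
```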
Grains rich in starch constitute the primary source of energy for both pigs and humans, but there is incomplete understanding of physiological mechanisms that determine the extent of digestion of grain starch in monogastric animals including pigs and humans. Slow digestion of starch to produce glucose in the small intestine (SI) leads to undigested starch escaping to the large intestine where it is fermented to produce short-chain fatty acids. Glucose generated from starch provides more energy than short-chain fatty acids for normal metabolism and growth in monogastrics. While incomplete digestion of starch leads to underutilised feed in pigs and economic losses, it is desirable in human nutrition to maintain consistent body weight in adults. Undigested nutrients reaching the ileum may trigger the ileal brake, and fermentation of undigested nutrients or fibre in the large intestine triggers the colonic brake. These intestinal brakes reduce the passage rate in an attempt to maximise nutrient utilisation, and lead to increased satiety that may reduce feed intake. The three physiological mechanisms that control grain digestion and feed intake are: (1) gastric emptying rate; (2) interplay of grain digestion and passage rate in the SI controlling the activation of the ileal brake; and (3) fermentation of undigested nutrients or fibre in the large intestine activating the colonic brake. Fibre plays an important role in influencing these mechanisms and the extent of their effects. In this review, an account of the physiological mechanisms controlling the passage rate, feed intake and enzymatic digestion of grains is presented: (1) to evaluate the merits of recently developed methods of grain/starch digestion for application purposes; and (2) to identify opportunities for future research to advance our understanding of how the combination of controlled grain digestion and fibre content can be manipulated to physiologically influence satiety and food intake.
Surgery for CHD has been slow to develop in parts of the former Soviet Union. The impact of an 8-year surgical assistance programme between an emerging centre and a multi-disciplinary international team that comprised healthcare professionals from developed cardiac programmes is analysed and presented.
Material and methods
The international paediatric assistance programme included five main components – intermittent clinical visits to the site annually, medical education, biomedical engineering support, nurse empowerment, and team-based practice development. Data were analysed from visiting teams and local databases before and since commencement of assistance in 2007 (era A: 2000–2007; era B: 2008–2015). The following variables were compared between periods: annual case volume, operative mortality, case complexity based on Risk Adjustment for Congenital Heart Surgery (RACHS-1), and RACHS-adjusted standardised mortality ratio.
A total of 154 RACHS-classifiable operations were performed during era A, with a mean annual case volume by local surgeons of 19.3 (95% confidence interval 14.3–24.2), an operative mortality of 4.6%, and a standardised mortality ratio of 2.1. In era B, surgical volume increased to a mean of 103.1 annual cases (95% confidence interval 69.1–137.2, p<0.0001). There was a non-significant (p=0.84) increase in operative mortality (5.7%), but a decrease in standardised mortality ratio (1.2) owing to an increase in case complexity. In era B, the proportion of local surgeon-led surgeries during visits from the international team increased from 0% (0/27) in 2008 to 98% (58/59) in the final year of analysis.
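The standardised mortality ratio quoted above is the ratio of observed to expected deaths, where expected deaths are built from category-specific benchmark mortality rates (here, RACHS-1 categories). A minimal sketch of that calculation; every per-category count and benchmark rate below is hypothetical, not study data:

```python
# Standardised mortality ratio (SMR) = observed deaths / expected deaths.
# Expected deaths are the sum over risk categories of (cases in category
# x benchmark mortality rate for that category). All figures hypothetical.
observed = {1: 0, 2: 2, 3: 4, 4: 1}           # deaths by RACHS-1 category
cases    = {1: 40, 2: 60, 3: 40, 4: 14}       # operations by category
benchmark_rate = {1: 0.004, 2: 0.017, 3: 0.037, 4: 0.085}  # reference rates

expected = sum(cases[c] * benchmark_rate[c] for c in cases)
smr = sum(observed.values()) / expected
print(f"expected deaths = {expected:.2f}, SMR = {smr:.2f}")
# → expected deaths = 3.85, SMR = 1.82
```

An SMR above 1 indicates more deaths than the benchmark predicts for the observed case mix, which is why a rising case complexity can lower the SMR even while crude mortality rises, as in era B.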
The model of assistance described in this report led to improved adjusted mortality, increased case volume and complexity, and greater independent operating skills among local surgeons.
Vitamin B12 is synthesised in the rumen from cobalt (Co) and has a major role in metabolism in the peri-parturient period, although few studies have evaluated the effect of the dietary inclusion of Co, vitamin B12 or injecting vitamin B12 on the metabolism, health and performance of high yielding dairy cows. A total of 56 Holstein-Friesian dairy cows received one of four treatments from 8 weeks before calving to 8 weeks post-calving: C, no added Co; DC, additional 0.2 mg Co/kg dry matter (DM); DB, additional 0.68 mg vitamin B12/kg DM; IB, intra-muscular injection of vitamin B12 to supply 0.71 mg/cow per day pre-partum and 1.42 mg/cow per day post-partum. The basal and lactation rations both contained 0.21 mg Co/kg DM. Cows were weighed and condition scored at drying off, 4 weeks before calving, within 24 h of calving and at 2, 4 and 8 weeks post-calving, with blood samples collected at drying off, 2 weeks pre-calving, calving and 2, 4 and 8 weeks post-calving. Liver biopsy samples were collected from all animals at drying off and 4 weeks post-calving. Live weight changed with time, but there was no effect of treatment (P>0.05), whereas cows receiving IB had the lowest mean body condition score and DB the highest (P<0.05). There was no effect of treatment on post-partum DM intake, milk yield or milk fat concentration (P>0.05), with mean values of 21.6 kg/day, 39.6 kg/day and 40.4 g/kg, respectively. Cows receiving IB had a higher plasma vitamin B12 concentration than those receiving any of the other treatments (P<0.001), but there was no effect (P>0.05) of treatment on homocysteine or succinate concentrations, although mean plasma methylmalonic acid concentrations were lower (P=0.019) for cows receiving IB than for control (C) cows. Plasma β-hydroxybutyrate concentrations increased sharply at calving followed by a decline, but there was no effect of treatment. Similarly, there was no effect (P>0.05) of treatment on plasma non-esterified fatty acids or glucose.
Whole tract digestibility of DM and fibre measured at week 7 of lactation were similar between treatments, and there was little effect of treatment on the milk fatty acid profile except for C15:0, which was lower in cows receiving DC than IB (P<0.05). It is concluded that a basal dietary concentration of 0.21 mg Co/kg DM is sufficient to meet the requirements of high yielding dairy cows during the transition period, and there is little benefit from additional Co or vitamin B12.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Cardiomyopathy develops in >90% of Duchenne muscular dystrophy (DMD) patients by the second decade of life. We assessed the associations between DMD gene mutations, as well as Latent transforming growth factor-beta-binding protein 4 (LTBP4) haplotypes, and age at onset of myocardial dysfunction in DMD. DMD patients with baseline normal left ventricular systolic function and genotyping between 2004 and 2013 were included. Patients were grouped in multiple ways: specific DMD mutation domains, true loss-of-function mutations (group A) versus possible residual gene expression (group B), and LTBP4 haplotype. Age at onset of myocardial dysfunction was the first echocardiogram with an ejection fraction <55% and/or shortening fraction <28%. Of 101 DMD patients, 40 developed cardiomyopathy. There was no difference in age at onset of myocardial dysfunction among DMD genotype mutation domains (13.7±4.8 versus 14.3±1.0 versus 14.3±2.9 versus 13.8±2.5, p=0.97), groups A and B (14.4±2.8 versus 12.1±4.4, p=0.09), or LTBP4 haplotypes (14.5±3.2 versus 13.1±3.2 versus 11.0±2.8, p=0.18). DMD gene mutations involving the hinge 3 region, actin-binding domain, and exons 45–49, as well as the LTBP4 IAAM haplotype, were not associated with age of left ventricular dysfunction onset in DMD.
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
A prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Over the study period, 794 patients met inclusion criteria. A MACE at 30 days was present in 10.7% (85/794) of patients, with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE were 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
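The reported sensitivity and NPV follow directly from the counts above, treating a low-risk modified HEART score as a negative test and 30-day MACE as the outcome; a minimal sketch of the arithmetic:

```python
# Counts reported in the abstract: 794 transported patients, 85 with
# 30-day MACE, 264 scored low risk, 5 of whom nonetheless had MACE.
total, mace, low_risk, low_risk_mace = 794, 85, 264, 5

tp = mace - low_risk_mace      # MACE patients flagged high risk: 80
fn = low_risk_mace             # MACE patients missed as low risk: 5
tn = low_risk - low_risk_mace  # low-risk patients free of MACE: 259

sensitivity = tp / (tp + fn)   # fraction of MACE caught by the score
npv = tn / (tn + fn)           # fraction of low-risk patients without MACE
print(f"sensitivity = {sensitivity:.1%}, NPV = {npv:.1%}")
# → sensitivity = 94.1%, NPV = 98.1%
```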
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138 801 participants aged 18–100, derived from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity levels. Satisfaction with conventional care was also compared with CAM contact satisfaction.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact; the rate was twice as high in high-income countries (4.6%; standard error 0.3%) as in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable for different disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% for severe mood disorders, 16.2% for severe anxiety disorders and 22.5% for severe behavioural disorders. Satisfaction with care was comparable with respect to CAM contacts (78.3%) and conventional care (75.6%) in persons that received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary, but are in contrast to suggestions that CAM contact concerns only persons with mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss the care patients are receiving, whether conventional or not, and their reasons for doing so.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest, but nonetheless stronger, association of education than of income with treatment raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
The 9th meeting of the African Society of Human Genetics, in partnership with the Senegalese Cancer Research and Study Group and the Human Heredity and Health in Africa (H3Africa) Consortium, was held in Dakar, Senegal. The theme was Strengthening Human Genetics Research in Africa. The 210 delegates came from 21 African countries and from France, Switzerland, UK, UAE, Canada and the USA. The goal was to highlight genetic and genomic science across the African continent with the ultimate goal of improving the health of Africans and those across the globe, and to promote the careers of young African scientists in the field. A session on the sustainability of genomic research in Africa brought to light innovative and practical approaches to supporting research in resource-limited settings and the importance of promoting genetics in academic, research funding, governmental and private sectors. This meeting led to the formation of the Senegalese Society for Human Genetics.
The Endangered snow leopard Panthera uncia is a flagship species of mountainous Asia and a conservation priority. China is the most important country for the species’ conservation because it has the most potential habitat and the largest population of snow leopards. North-west Yunnan province in south-west China is at the edge of the snow leopard's range, and a biodiversity hotspot, where three major Asian rivers, the Yangtze, Mekong and Salween, flow off the Tibetan plateau and cut deep valleys through the Hengduan Mountains. The snow leopard's status in north-west Yunnan is uncertain. We conducted interviews and camera-trapping surveys to assess the species’ status at multiple sites: two east of the Yangtze River and two between the Yangtze and Mekong Rivers. Thirty-eight herders/nature reserve officials interviewed claimed that snow leopards were present, but in 6,300 camera-trap days we did not obtain any photographs of snow leopards, so if the species is present, it is rare. However, we obtained many photographs of potential prey, such as blue sheep Pseudois nayaur, as well as photographs of common leopards Panthera pardus, at high elevations (3,000–4,500 m). More study is necessary in Yunnan and other areas of south-west China to investigate the status and resource overlap of snow leopards and common leopards, especially as climate change is resulting in increases in common leopard habitat and decreases in snow leopard habitat.
Planktonic foraminifera first evolved in the middle Jurassic but did not experience a major radiation until the mid-Cretaceous. The mid-Barremian to late Aptian was characterized by a steady increase in species richness and by the appearance of new morphological forms, including planispiral coiling and clavate and radially elongate chambers, culminating in the first appearance of taxa with complex apertural structures and the keeled morphotype in late Aptian time. This broad interval of radiation was abruptly ended by evolutionary turnover and low diversification rates in the latest Aptian and early Albian, prior to a second explosive episode of radiation in the middle and late Albian. The evolutionary history of mid-Cretaceous calcareous nannofossils generally parallels the trends observed in planktonic foraminifera, although the latest Aptian-early Albian turnover event is not as pronounced. Reef communities in the Caribbean/Gulf of Mexico and Mediterranean provinces show a change in dominance from coral-algal-rudist reefs in the Barremian-early Albian to rudist dominance by late Albian time. These changes in calcareous plankton and reef communities are related to complex oceanographic changes of the mid-Cretaceous, including the structure of the upper water column, productivity, sea level, atmospheric and oceanographic circulation, and changes in the chemistry of the ocean.
Changes in eustatic sea level influenced many of these factors, including nutrient delivery to the oceans, climate, sites and rates of deep water formation, and ocean chemistry. What is the relationship between changes in sea level, as expressed by major seismic sequence boundaries, and the changes observed in marine biota? We have compared major changes of eustatic sea level within this interval of generally rising global sea level (Scott et al., 1988) with equivalent sequence boundaries (Haq et al., 1988) and the records of calcareous plankton (Roth, 1987; Leckie, 1989) and reef communities (Scott, 1988). What is most striking about these relationships is the apparent lack of direct correlation between sequence boundaries and turnover events in the marine biota. The calcareous plankton alternate between phases of relatively high and low rates of diversification, with the major sequence boundaries falling within intervals of change rather than marking their onset. However, we acknowledge the potential of missing or condensed intervals in deep sea settings, which may influence the record of evolutionary rates (e.g., Loutit et al., 1988). Only the basal Albian sequence boundary appears to correlate with a major turnover event in the planktonic foraminifera, and the rapid change in Gulf Coast reef communities between the middle and upper Albian may correlate with a eustatic sea level change and a major sequence boundary. Based on high-resolution calcareous nannofossil, planktonic foraminiferal, sedimentologic, and geochemical data of Bralower et al. (submitted), the lower Aptian, basal Albian, and lower upper Albian sequence boundaries appear to correlate more closely with widespread oceanic dysoxic/anoxic events OAE1a, OAE1b, and OAE1c, respectively.
The correlations between evolutionary events, anoxic events, and sequence boundaries must be considered tentative at this time because major disparities exist between the correlation of calcareous plankton zones and mid-Cretaceous chronostratigraphic units used by Haq et al. (1988) and Bralower et al. (submitted).
The house mouse (Mus musculus) and the black rat (Rattus rattus) are reservoir hosts for zoonotic pathogens, several of which cause neglected tropical diseases (NTDs). Studies of the prevalence of these NTD-causing zoonotic pathogens in house mice and black rats from tropical residential areas are scarce. Three hundred and two house mice and 161 black rats were trapped in 2013 from two urban neighbourhoods and a rural village in Yucatan, Mexico, and subsequently tested for Trypanosoma cruzi, Hymenolepis diminuta and Leptospira interrogans. Using the polymerase chain reaction, we detected T. cruzi DNA in the hearts of 4·9% (8/165) and 6·2% (7/113) of house mice and black rats, respectively. We applied the sedimentation technique to detect eggs of H. diminuta in 0·5% (1/182) and 14·2% (15/106) of house mice and black rats, respectively. Through the immunofluorescent imprint method, L. interrogans was identified in 0·9% (1/106) of rat kidney impressions. Our results suggest that the black rat could be an important reservoir for T. cruzi and H. diminuta in the studied sites. Further studies examining seasonal and geographical patterns could increase our knowledge of the epidemiology of these pathogens in Mexico and the risk to public health posed by rodents.
Background: The degree of overlap between schizophrenia (SCZ) and affective psychosis (AFF) has been a recurring question since Kraepelin’s subdivision of the major psychoses. Studying nonpsychotic relatives allows a comparison of disorder-associated phenotypes without potential confounds that can obscure distinctive features of the disorder. Because attention and working memory have been proposed as potential endophenotypes for SCZ and AFF, we compared these cognitive features in individuals at familial high-risk (FHR) for the disorders. Methods: Young, unmedicated, first-degree relatives (ages, 13–25 years) at FHR-SCZ (n=41) and FHR-AFF (n=24) and community controls (CCs, n=54) were tested using attention and working memory versions of the Auditory Continuous Performance Test. To determine if schizotypal traits or current psychopathology accounted for cognitive deficits, we evaluated psychosis proneness using three Chapman Scales (Revised Physical Anhedonia, Perceptual Aberration, and Magical Ideation) and assessed psychopathology using the Hopkins Symptom Checklist-90 Revised. Results: Compared to controls, the FHR-AFF sample was significantly impaired in auditory vigilance, while the FHR-SCZ sample was significantly worse in working memory. Both FHR groups showed significantly higher levels of physical anhedonia and some psychopathological dimensions than controls. Adjusting for physical anhedonia, phobic anxiety, depression, psychoticism, and obsessive-compulsive symptoms eliminated the FHR-AFF vigilance effects but not the working memory deficits in FHR-SCZ. Conclusions: The working memory deficit in FHR-SCZ was the more robust of the cognitive impairments after accounting for psychopathological confounds and is supported as an endophenotype. Examination of larger samples of people at familial risk for different psychoses remains necessary to confirm these findings and to clarify the role of vigilance in FHR-AFF. (JINS, 2016, 22, 1026–1037)
Age- and sex-related patterns of association between medical conditions and major depressive episodes (MDE) are important for understanding disease burden, anticipating clinical needs and formulating etiological hypotheses. General population estimates are especially valuable because they are not distorted by help-seeking behaviours. However, even large population surveys often deliver inadequate precision to describe such patterns. In this study, data from a set of national surveys were pooled to increase precision and support a more detailed characterisation of these associations.
The data were from a series of Canadian national surveys. These surveys used comparable sampling strategies and assessment methods for MDE. Chronic medical conditions were assessed using items asking about professionally diagnosed medical conditions. Individual-level meta-analysis methods were used to generate unadjusted, stratified and adjusted prevalence odds ratios for 11 chronic medical conditions. Random effects models were used in the meta-analysis. A procedure incorporating rescaled replicate bootstrap weights was used to produce 95% confidence intervals.
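A prevalence odds ratio of the kind generated here compares the odds of MDE among respondents reporting a given condition with the odds among those without it; a minimal sketch with hypothetical 2×2 counts (not survey data):

```python
# Prevalence odds ratio (POR) for MDE given a chronic condition, from a
# 2x2 table of cross-sectional counts. All counts below are hypothetical.
mde_with_cond, no_mde_with_cond = 120, 880    # condition present
mde_without, no_mde_without = 500, 8500       # condition absent

# Odds of MDE in each group, then their ratio (the cross-product ratio).
por = (mde_with_cond / no_mde_with_cond) / (mde_without / no_mde_without)
print(f"POR = {por:.2f}")  # → POR = 2.32
```

A POR above 1 indicates that MDE is more prevalent among those with the condition; the stratified and adjusted versions in the surveys repeat this comparison within age/sex strata and with covariate adjustment.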
Overall, conditions characterised by pain and inflammation tended to show stronger associations with MDE. The meta-analysis uncovered two previously undescribed patterns of association. First, effect modification by age was observed in varying degrees for most conditions, most prominently for high blood pressure and cancer, with stronger associations found in younger age categories. Migraine was an exception: the strength of association increased with age, especially in men. Second, for conditions predominantly affecting older age groups (arthritis, diabetes, back pain, cataracts, effects of stroke and heart disease), confounding by age was evident; for each of these conditions, age adjustment strengthened the associations. In addition to migraine, two conditions displayed distinctive patterns of association. Age-adjusted odds ratios for thyroid disease reflected a weak association that was significant only in women. In epilepsy, a similar strength of association was found irrespective of age or sex.
The prevalence of MDE is elevated in association with most chronic conditions, but especially those characterised by inflammation and pain. Effect modification by age may reflect greater challenges or difficulties encountered by young people attempting to cope with these conditions. This pattern, however, does not apply to migraine or epilepsy. Neurobiological changes associated with these conditions may offset coping-related effects, such that the association does not weaken with age. Prominent confounding by age for several conditions suggests that age adjustments are necessary in order to avoid underestimating the strength of these associations.
There is consensus about the importance of ‘recovery’ in mental health services, but the link between recovery orientation of mental health teams and personal recovery of individuals has been underresearched.
To investigate differences in team leader, clinician and service user perspectives of recovery orientation of community adult mental health teams in England.
In six English mental health National Health Service (NHS) trusts, randomly chosen community adult mental health teams were surveyed. From each team, a random sample of ten patients, the team leader and a convenience sample of five clinicians were surveyed. All respondents rated the recovery orientation of their team using parallel versions of the Recovery Self-Assessment (RSA). In addition, service users also rated their own personal recovery using the Questionnaire about Processes of Recovery (QPR).
Team leaders (n = 22) rated recovery orientation higher than clinicians (n = 109) or patients (n = 120) (Wald(2) = 7.0, P = 0.03), and both NHS trust and team type influenced RSA ratings. Patient-rated recovery orientation was a predictor of personal recovery (b = 0.58, 95% CI 0.31–0.85, P<0.001). Team leaders and clinicians with experience of mental illness (39%) or supporting a family member or friend with mental illness (76%) did not differ in their RSA ratings from other team leaders or clinicians.
Compared with team leaders, frontline clinicians and service users have less positive views on recovery orientation. Increasing recovery orientation may support personal recovery.