Maintenance antipsychotic medication has a key role in the long-term management of schizophrenia, but in clinical practice its effectiveness is often reduced by poor adherence. Antipsychotic long-acting injections (LAIs) can improve clinical outcomes in patients who have adhered poorly to oral medication.
Aims and objectives
To compare patients’ attitudes towards, satisfaction with and tolerability of their currently prescribed LAI, either a first-generation antipsychotic LAI (FGA-LAI) or risperidone long-acting injection (RLAI), which was the only second-generation (SGA) LAI at the time of this research.
Cross-sectional survey of a representative sample of patients prescribed an FGA-LAI (n = 39) or RLAI (n = 28) for a minimum of 6 months. Assessments comprised attitudes measured by the Drug Attitude Inventory (DAI-30), tolerability by the Liverpool University Neuroleptic Side Effect Rating Scale (LUNSERS) and satisfaction with antipsychotic medication by the SWAM scale.
The mean DAI-30 score was 16.18 for patients on FGA depots and 14.43 for those on RLAI, indicating positive attitudes in both groups; the difference did not reach statistical significance (p = 0.491). Further analysis, based on the LUNSERS and SWAM scales, found no significant differences in tolerability or patient satisfaction.
There was no evidence of differences between FGA-LAIs and RLAI in terms of patient-rated tolerability, attitudes or satisfaction. Both groups of patients had positive attitudes to their LAI and overall tolerability was good. These data are observational rather than from a randomised design, so selection bias cannot be excluded. Randomised studies are needed to further investigate differences in tolerability and attitudes between specific LAIs.
A cross-sectional study was conducted from 2014 to 2017 on 13 organised pig farms located in eight states of India (Northern, North-Eastern and Southern regions) to identify the risk factors, pathotypes and antimicrobial resistance of Escherichia coli associated with pre- and post-weaning piglet diarrhoea. Data collected through a questionnaire survey were used to identify risk factors by univariable analysis, in which weaning status, season, altitude, ventilation in the shed, use of heaters/coolers for temperature control in the sheds, feed type, water source and use of disinfectant were potential risk factors. In the logistic regression model, weaning and source of water were the significant risk factors. The prevalence of piglet diarrhoea was similar across the regions. Of the 909 faecal samples collected (North, 310; North-East, 194; South, 405) for E. coli isolation, pathotyping and antibiotic screening, 531 E. coli were isolated on MacConkey agar supplemented with cefotaxime, of which 345 isolates were extended-spectrum β-lactamase (ESBL) producers and were positive for blaCTX-M-1 (n = 147), blaTEM (n = 151), qnrA (n = 98), qnrB (n = 116), qnrS (n = 53), tetA (n = 46), tetB (n = 48) and sul1 (n = 54) genes. The multiple antibiotic resistance (MAR) index revealed that 14 (2.64%) isolates had a MAR index of 1. On virulence screening, 174 isolates harboured stx1, stx2, eaeA or hlyA genes, alone or in combination. Isolates from diarrhoeic and post-weaning samples harboured more virulence genes than those from non-diarrhoeic and pre-weaning samples. Alleviating the risk factors might reduce piglet diarrhoea cases. The presence of multidrug-resistant, ESBL-producing pathogenic E. coli in piglets appears to be a public health concern.
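The MAR index reported above is conventionally the fraction of tested antibiotics to which an isolate is resistant. A minimal sketch, with a hypothetical antibiotic panel chosen only for illustration:

```python
# MAR index = (antibiotics the isolate is resistant to) / (antibiotics tested).
# The panel and isolates below are hypothetical examples, not the study's data.

def mar_index(resistant_to, antibiotics_tested):
    """Return the MAR index of one isolate, rounded to 2 decimals."""
    return round(len(resistant_to) / len(antibiotics_tested), 2)

panel = ["cefotaxime", "tetracycline", "ciprofloxacin", "sulfamethoxazole"]

# An isolate resistant to every drug on the panel has a MAR index of 1,
# like the 14 isolates (2.64%) reported in the study.
fully_resistant = mar_index(panel, panel)       # 1.0
partly_resistant = mar_index(panel[:2], panel)  # 0.5
```

Values above 0.2 are often read as indicating exposure to high-risk sources of antibiotic contamination, though the threshold is a convention rather than part of the index itself.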
Rice has the lowest grain protein content (GPC) among cereals. Efforts have been made to improve GPC through a modified bulk-pedigree method of selection. A total of 1780 F8 recombinant lines were derived in 2013 from five different cross combinations involving two high-GPC landraces (ARC10075 and ARC10063), three high-yielding parents (Swarna, Naveen and IR64) and one parent, Sharbati, known for superior grain quality with high micronutrient content. Near-infrared spectroscopy was used to facilitate high-throughput selection for GPC. A significant selection differential, response to selection and non-significant differences between the predicted and observed response to selection for GPC and protein yield indicated the effectiveness of this selection process, which resulted in lines with high GPC, high protein yield and desirable levels of amylose content. Further, based on high mean and stability for GPC and protein yield across environments in the wet seasons of 2013 and 2014 and the dry season of 2014, 12 elite lines were identified. Higher accumulation of the glutelin fraction and a non-significant change in the prolamin/glutelin ratio in the grain suggested safeguarding of the nutritional value of rice grain protein in most of these identified lines. Since rice is the staple food of millions, breeding for high GPC could play a significant role in alleviating protein malnutrition, especially in the developing world.
Bovine calf scours is reported to be caused by multiple aetiologies, resulting in heavy mortality in unweaned calves and huge economic losses to dairy farmers. Among these, cryptosporidiosis is an emerging waterborne zoonosis and one of the important causes of neonatal calf diarrhoea. Poor immune response coupled with primary cryptosporidial infection predisposes neonatal calves to multiple secondary infections, resulting in their death. In the present study, faecal samples from 100 diarrhoeic calves, randomly selected from 17 outbreaks of bovine calf diarrhoea in periurban Ludhiana, Punjab, in Northern India, were subjected to conventional (microscopy, modified Ziehl–Neelsen (mZN) staining), immunological and molecular techniques (faecal antigen capture ELISA and PCR) for detection of primary Cryptosporidium parvum infection as well as other frequently reported concurrent pathogens, viz. rotavirus, coronavirus, Salmonella spp., Escherichia coli, Clostridium perfringens and Eimeria spp. Faecal antigen capture ELISA and PCR revealed a 35% prevalence of C. parvum, in contrast to 25% by mZN staining, with a relatively higher prevalence (66·7%) in younger (8–14-day-old) calves. The detection rate of the other enteropathogens associated with C. parvum was 45·71% for C. perfringens, followed by Salmonella spp. (40·0%), rotavirus (36·0%), coronavirus (16·0%), E. coli (12·0%) and Eimeria spp. (4·0%). The sensitivity for detection of C. parvum by ELISA and mZN staining in comparison to PCR was 97·14% and 72·72%, respectively. An important finding of the study was that C. parvum alone was found in only 10% of the diarrhoeic faecal samples, whereas the majority (90%) showed mixed infections ranging from combinations of two to five agents. This is the first documented evidence of C. parvum and associated pathogens being responsible for severe periurban outbreaks of bovine calf diarrhoea culminating in heavy mortality in Northern India.
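Diagnostic sensitivity relative to a reference test, as reported above for ELISA and mZN against PCR, is simply true positives over all reference-positive samples. A sketch with assumed counts: 34 of 35 PCR-positive samples detected would reproduce the reported 97·14%, but the split is an inference, not stated in the abstract.

```python
# Sensitivity relative to a reference test = TP / (TP + FN), as a percentage.
# The counts are an assumption used only to illustrate the arithmetic.

def sensitivity(true_pos, false_neg):
    """Percentage of reference-positive samples the index test detects."""
    return round(100 * true_pos / (true_pos + false_neg), 2)

elisa = sensitivity(34, 1)  # 97.14, matching the reported ELISA figure
```

Specificity would be computed the same way from true negatives and false positives; the abstract reports only sensitivity.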
Background: The surgical risk factors and neuroimaging characteristics associated with cerebellar mutism (CM) remain unclear and require further investigation. We aimed to examine surgical and MRI findings associated with CM in children following posterior fossa tumor resection. Methods: Using our data registry, we retrospectively collected data from pediatric patients who acquired CM, matched on age and pathology type with patients who did not acquire CM after posterior fossa surgery. The strength of association between surgical and MRI variables and CM was examined using odds ratios (ORs) and corresponding 95% confidence intervals (CIs). Results: A total of 22 patients were included. Medulloblastoma was the most common pathology among CM patients (91%). Tumor attachment to the floor of the fourth ventricle (OR 6; 95% CI 0.7-276), calcification/hemosiderin deposition (OR 7; 95% CI 0.9-315.5) and post-operative peri-ventricular ischemia on MRI (OR 5; 95% CI 0.5-236.5) had the highest associations with CM. Conclusions: Our results may suggest that tumor attachment to the floor of the fourth ventricle, pathological calcification and post-operative ischemia are relatively more prevalent in patients with CM. Collectively, our work calls for a larger multi-institutional study of CM patients to further investigate the determinants and management of CM, to potentially minimize its development and predict onset.
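The odds ratios with wide 95% CIs reported above are characteristic of small matched samples. A minimal sketch of an OR with a Woolf (log-normal) interval; the 2×2 counts are hypothetical, not the registry data:

```python
import math

# Odds ratio from a 2x2 table with a Woolf 95% CI. With zero cells a
# Haldane correction (adding 0.5 to each cell) is commonly applied first.
# The counts below are hypothetical, chosen only to illustrate the formula.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = exposed with/without outcome; c, d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(8, 4, 3, 7)
```

An interval that straddles 1 (as several of the reported CIs do) means the association is not statistically significant at the 5% level, which is why the abstract hedges its conclusions.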
Adverse psychosocial working environments characterized by job strain (the combination of high demands and low control at work) are associated with an increased risk of depressive symptoms among employees, but evidence on clinically diagnosed depression is scarce. We examined job strain as a risk factor for clinical depression.
We identified published cohort studies from a systematic literature search in PubMed and PsycNET and obtained 14 cohort studies with unpublished individual-level data from the Individual-Participant-Data Meta-analysis in Working Populations (IPD-Work) Consortium. Summary estimates of the association were obtained using random-effects models. Individual-level data analyses were based on a pre-published study protocol.
We included six published studies with a total of 27 461 individuals and 914 incident cases of clinical depression. From unpublished datasets we included 120 221 individuals and 982 first episodes of hospital-treated clinical depression. Job strain was associated with an increased risk of clinical depression in both published [relative risk (RR) = 1.77, 95% confidence interval (CI) 1.47–2.13] and unpublished datasets (RR = 1.27, 95% CI 1.04–1.55). Further individual participant analyses showed a similar association across sociodemographic subgroups and after excluding individuals with baseline somatic disease. The association was unchanged when excluding individuals with baseline depressive symptoms (RR = 1.25, 95% CI 0.94–1.65), but attenuated on adjustment for a continuous depressive symptoms score (RR = 1.03, 95% CI 0.81–1.32).
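The pooling described above can be sketched with a DerSimonian–Laird random-effects model over study-level relative risks. This is a generic illustration of the method named in the abstract, using the two summary RRs quoted there as example inputs rather than the actual study-level data:

```python
import math

# Minimal DerSimonian-Laird random-effects pooling of relative risks.
# Inputs are (RR, CI_low, CI_high) per study; the two tuples below reuse
# the summary figures quoted in the abstract purely as example inputs.

def pooled_rr(studies, z=1.96):
    logs = [math.log(rr) for rr, lo, hi in studies]
    # back out the SE of log(RR) from the reported CI width
    ses = [(math.log(hi) - math.log(lo)) / (2 * z) for rr, lo, hi in studies]
    w = [1 / s**2 for s in ses]                      # fixed-effect weights
    mean = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - mean)**2 for wi, li in zip(w, logs))
    df = len(studies) - 1
    tau2 = 0.0                                       # between-study variance
    if df > 0:
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - df) / c)
    w_re = [1 / (s**2 + tau2) for s in ses]          # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se))

rr, lo, hi = pooled_rr([(1.77, 1.47, 2.13), (1.27, 1.04, 1.55)])
```

The random-effects weights shrink toward equality as between-study heterogeneity (tau²) grows, so a heterogeneous pair like these two estimates yields a pooled RR between them with a wider interval than either input.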
Job strain may precipitate clinical depression among employees. Future intervention studies should test whether job strain is a modifiable risk factor for depression.
Medulloblastoma (MB) is the most common malignant pediatric brain tumour, and is categorized into four molecular subgroups, with Group 3 MB having the worst prognosis due to the highest rate of metastatic dissemination and relapse. In this work, we describe the epigenetic regulator Bmi1 as a novel therapeutic target for treatment of recurrent Group 3 MB. Through comparative profiling of primary and recurrent MB, we show that Bmi1 defines a treatment-refractory cell population that is uniquely targetable by a novel class of small molecule inhibitors. We have optimized an in vivo mouse-adapted therapy model that has the advantage of generating recurrent, human, treatment-refractory MBs. Our preliminary studies showed that although chemoradiotherapy administered to mice engrafted with human MB reduced tumour size, Bmi1 expression was enriched in the post-treatment residual tumour. Furthermore, we found that knockdown of Bmi1 in human recurrent MB cells decreases the proliferation and self-renewing capacities of MB cells in vitro as well as both tumour size and the extent of spinal leptomeningeal metastases in vivo. Oral administration of a potent Bmi1 inhibitor, PTC 028, resulted in a marked reduction in tumour burden and increased survival in the treatment cohort. Bmi1 inhibitors showed high specificity for MB cells and spared normal human neural stem cells when used at doses relevant for MB cells. As Group 3 medulloblastoma is often metastatic and uniformly fatal at recurrence, with no current or planned trials of targeted therapy, an efficacious agent such as a Bmi1 inhibitor could be rapidly transitioned to clinical trials.
Medulloblastoma (MB), the most common malignant pediatric brain tumor, is categorized into four molecular subgroups. Given the high rate of metastatic dissemination at diagnosis and recurrence in Group 3 MBs, these patients have the worst clinical outcome, with a 5-year survivorship of approximately 50%. By adapting the existing Children’s Oncology Group (COG) protocol for children with newly diagnosed high-risk MB to the treatment of immunodeficient mice intracranially engrafted with human MB brain tumour-initiating cells, we aim to identify and characterize the treatment-refractory cell population in Group 3 MBs. Mice were sacrificed at multiple time points during the course of tumor development and therapy: (i) at engraftment; (ii) post-radiation; (iii) post-radiation and chemotherapy; and (iv) at MB recurrence. MB cell populations recovered separately from brains and spines were comprehensively profiled for gene expression, stem-cell and molecular features to generate a global, comparative profile of MB cells through therapy. We report higher expression of CD133, Sox2 and Bmi1, in addition to increased self-renewal capacity, following chemoradiotherapy. The enrichment map constructed from global gene expression analysis showed an increase in pathways regulating self-renewal, DNA repair and chemoresistance post-therapy, despite the apparent decrease in tumour size and vascularity. Additionally, from gene expression at MB recurrence, we identified a list of genes that negatively correlate with survival in patients diagnosed with Group 3 MB. A differential genomic profile of the “treatment-responsive” tumors against those that fail therapy may contribute to the discovery of novel therapeutic approaches for the most aggressive subgroup of MB.
Brain metastases (BM) represent a leading cause of cancer mortality. While metastatic lesions contain subclones derived from their primary lesion, their functional characterization has been limited by a paucity of preclinical models accurately recapitulating the stages of metastasis. This work describes the isolation of a unique subset of metastatic stem-like cells from primary human patient samples of BM, termed brain metastasis initiating cells (BMICs). Utilizing these BMICs we have established a novel patient-derived xenograft (PDX) model of BM that recapitulates the entire metastatic cascade, from primary tumor initiation to micro-metastasis and macro-metastasis formation in the brain. We then comprehensively interrogated human BM to identify genetic regulators of BMICs using in vitro and in vivo RNA interference screens, and validated hits using both our novel PDX model and primary clinical BM specimens. We identified SPOCK1 and TWIST2 as novel BMIC regulators: in our model SPOCK1 regulated BMIC self-renewal and tumor initiation, and TWIST2 specifically regulated cell migration from lung to brain. A prospective cohort of primary lung cancer specimens was used to establish that SPOCK1 and TWIST2 were expressed only in patients who ultimately developed BM, thus establishing both clinical and functional utility for these gene products. This work offers the first comprehensive preclinical model of human brain metastasis for further characterization of therapeutic targets, identification of predictive biomarkers, and subsequent prophylactic treatment of the patients most likely to develop BM. By blocking this process, metastatic lung cancer would effectively become a localized, more manageable disease.
The ITRACK study explored the process and predictors of transition between Child and Adolescent Mental Health Services (CAMHS) and Adult Mental Health Services (AMHS) in the Republic of Ireland.
Following ethical approval, clinicians in each of Ireland’s four Health Service Executive (HSE) areas were contacted, informed about the study and were invited to participate. Clinicians identified all cases who had reached the transition boundary (i.e. upper age limit for that CAMHS team) between January and December 2010. Data were collected on clinical and socio-demographic details and factors that informed the decision to refer or not refer to the AMHS, and case notes were scrutinised to ascertain the extent of information exchanged between services during transition.
A total of 62 service users were identified as having crossed the transition boundary from nine CAMHS [HSE Dublin Mid-Leinster (n=40, 66%), HSE South (n=18, 30%), HSE West (n=2, 3%), HSE Dublin North (n=1, 2%)]. The most common diagnoses were attention deficit hyperactivity disorder (ADHD; n=19, 32%), mood disorders (n=16, 27%), psychosis (n=6, 10%) and eating disorders (n=5, 8%). Forty-seven (76%) of those identified were perceived by the CAMHS clinician to have an ‘on-going mental health service need’, and of these 15 (32%) were referred, 11 (23%) young people refused and 21 (45%) were not referred, with the majority (12, 57%) continuing with the CAMHS for more than a year beyond the transition boundary. Young people with psychosis were more likely to be referred [χ2 (2, 46)=8.96, p=0.02], and those with ADHD were less likely to be referred [χ2 (2, 45)=8.89, p=0.01]. Being prescribed medication was not associated with referral [χ2 (2, 45)=4.515, p=0.11]. In referred cases (n=15), there was documented evidence of consent in two cases (13.3%), inferred in another four (26.7%) and documented preparation for transition in eight (53.3%). Excellent written communication (100%) was not supported by face-to-face planning meetings (n=2, 13.3%), joint appointments (n=1, 6.7%) or telephone conversations (n=1, 6.7%) between corresponding clinicians.
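The χ² statistics reported above test whether referral status is independent of diagnosis. A stdlib-only sketch of the test statistic for a contingency table; the counts below are hypothetical, not the ITRACK data:

```python
# Chi-square test of independence: X^2 = sum over cells of
# (observed - expected)^2 / expected, where expected counts come from
# the row and column totals. The 2x2 counts below are hypothetical.

def chi_square(table):
    """table: list of rows of observed counts; returns the X^2 statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    x2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            x2 += (obs - expected) ** 2 / expected
    return x2

# e.g. rows = diagnosis present/absent, columns = referred/not referred
x2 = chi_square([[5, 1], [10, 30]])
```

With one degree of freedom, values above 3.84 are significant at p < 0.05; note that with expected cell counts this small, Fisher's exact test or a continuity correction would ordinarily be preferred.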
Despite perceived on-going mental health (MH) service need, many young people are not being referred, or are refusing referral, to the AMHS, with those with ADHD most affected. CAMHS continue to offer on-going care past the transition boundary, which has resource implications. Further qualitative research is warranted to understand the reasons for non-referral by CAMHS clinicians and refusal by young people in spite of perceived MH service need.
The choice of an appropriate cropping system is critical to maintaining or enhancing agricultural sustainability. Yield, profitability and water use efficiency (WUE) are important factors for determining the suitability of cropping systems in a hot arid region. In a two-year field experiment (2009/10–2010/11) on loamy sand soils of Bikaner, India, the production potential, profitability and WUE of five cropping systems (groundnut–wheat, groundnut–isabgol, groundnut–chickpea, cluster bean–wheat and mung bean–wheat), each at six nutrient application rates (NARs), i.e. 0, 25, 50, 75 and 100% of the recommended dose of N and P (NP), and 100% NP + S, were evaluated. The cropping systems varied significantly in productivity, profitability and WUE. Averaged across nutrient application regimes, the groundnut–wheat rotation gave 300–1620 kg/ha and 957–3365 kg/ha higher grain and biomass yields, respectively, than the other cropping systems. Mean annual net returns were highest for the mung bean–wheat system, which returned 32–57% more than the other cropping systems. The mung bean–wheat and cluster bean–wheat systems had higher WUE in terms of yield than the other cropping systems, and the mung bean–wheat system recorded 35–63% higher WUE in monetary terms. Nutrient application improved the yields, profit and WUE of the cropping systems: averaged across years and cropping systems, application of 100% NP improved grain yields, returns and WUE by 1.7, 3.9 and 1.6 times relative to no nutrient application. The results suggest that the profitability and WUE of crop production in this hot arid environment can be improved, compared with groundnut–wheat cropping, by substituting mung bean for groundnut and applying nutrients.
Dengue is regarded as the most important arboviral disease. Although sporadic cases have been reported, serotypes responsible for outbreaks have not been identified from central India over the last 20 years. We investigated two outbreaks of febrile illness, in August and November 2012, from Korea district (Chhattisgarh) and Narsinghpur district (Madhya Pradesh), respectively. Fever and entomological surveys were conducted in the affected regions. Molecular and serological tests were conducted on collected serum samples. Dengue-specific amplicons were sequenced and phylogenetic analyses were performed. In Korea and Narsinghpur districts 37·3% and 59% of cases were positive, respectively, for dengue infection, with adults being the worst affected. RT–PCR confirmed dengue virus serotype 1 genotype III as the aetiology. Ninety-six percent of infections were primary. This is the first time that dengue virus 1 outbreaks have been documented from central India. Introduction of the virus into the population and a conducive mosquitogenic environment favouring increased vector density caused the outbreak. Timely diagnosis and strengthening vector control measures are essential to avoid future outbreaks.
Identification of climate-smart nutrient management practices will overcome the ill effects of extreme climate variability on agricultural production under projected climate change scenarios. The rice–wheat cropping system is the major system used in India: using long-term yield data from Integrated Nutrient Management experiments on this system, the present study analysed trends in weather parameters and grain yield under different nutrient management practices. Twelve treatments with different combinations of inorganic (chemical fertilizer) and organic (farmyard manure (FYM), green manure (GM) and crop residue) sources of nutrients were compared with farmers’ conventional practices. A significant increasing trend was noticed for rainfall during the rice season at Kalyani and Navsari, of the order of 137·7 and 154·2 mm/decade, respectively. The highest increase in maximum temperature was seen at Palampur (1·62 °C/decade) followed by Ludhiana (1·14 °C/decade). At all the sites except Ludhiana and Kanpur, the yield of the rice–wheat system showed an increasing trend ranging from 0·08 t/ha/year in Jabalpur to 0·011 t/ha/year in Navsari, under the recommended dose of inorganic fertilizer application. A significant decreasing trend of 0·055 t/ha was found in Ludhiana. For most of the sites, a combination of half the recommended dose of inorganic fertilizer and either FYM or GM to provide the remainder of the N required was sufficient to maintain productivity. The top three climate-resilient integrated nutrient management practices were identified for all the study sites. Thus, the present study highlights the adaptive capacity of different integrated nutrient management practices to rainfall and temperature extremes under rice–wheat cropping system in distinctive agro-ecological zones of India.
Smallholder dairy production represents a promising income generating activity for poor farmers in the developing world. Because of the perishable nature of milk, marketing arrangements for collection, distribution and sale are important for enhanced livelihoods in the smallholder dairy sector. In this study we examined the relationship between market quality and basic feeding and breeding practices at farm level. We define market quality as the attractiveness and reliability of procurement channels and associated input supply arrangements. We took as our study countries, India with its well-developed smallholder dairy sector, and Ethiopia where the smallholder dairy industry has remained relatively undeveloped despite decades of development effort. We conducted village surveys among producer groups in 90 villages across three States in India and two Regions in Ethiopia. Producer groups were stratified according to three levels of market quality – high, medium and low. Data showed that diet composition was relatively similar in India and Ethiopia with crop residues forming the major share of the diet. Concentrate feeding tended to be more prominent in high market quality sites. Herd composition changed with market quality with more dairy (exotic) cross-bred animals in high market quality sites in both India and Ethiopia. Cross-bred animals were generally more prominent in India than Ethiopia. Herd performance within breed did not change a great deal along the market quality gradient. Parameters such as calving interval and milk yield were relatively insensitive to market quality. Insemination of cross-bred cows was predominantly by artificial insemination (AI) in India and accounted for around half of cross-bred cow inseminations in Ethiopia. Data on perceptions of change over the last decade indicated that per herd and per cow productivity are both increasing in high market quality sites with a more mixed picture in medium and low-quality sites. 
Similarly, dairy-derived income is increasing in high market quality sites. This is accompanied by a strong increase in stall feeding at the expense of grazing. The study indicates that the primary constraint to intensification of dairy production in Ethiopia is the genetic quality of the herd. There is less scope for improved AI provision in India, since the cross-bred herd is already serviced mainly by AI. However, as for Ethiopia, there is considerable scope for closing yield gaps in India through improved feed use and supply. Results strongly show that well-developed markets with good procurement arrangements are key to sustainable dairy intensification.
In India, rotavirus infections cause the deaths of 98 621 children each year. In urban neighbourhoods in Delhi, children were followed up for 1 year to estimate the incidence of rotavirus gastroenteritis and the common genotypes. Infants aged ⩽1 week were enrolled in cohort 1 and infants aged 12 months (up to +14 days) in cohort 2. Fourteen percent (30/210) of gastroenteritis episodes were positive for rotavirus. Incidence rates of rotavirus gastroenteritis in the first and second years were 0·18 [95% confidence interval (CI) 0·10–0·27] and 0·14 (95% CI 0·07–0·21) episodes/child-year, respectively. The incidence rate of severe rotavirus gastroenteritis in the first year of life was 0·05 (95% CI 0·01–0·10) episodes/child-year; there were no cases in the second year. The common genotypes detected were G1P (27%) and G9P (23%). That severe rotavirus gastroenteritis is common in the first year of life is relevant for planning efficacy trials.
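The incidence rates above are episodes per child-year of follow-up with a 95% CI. A sketch using a normal approximation for the Poisson count; the event and follow-up numbers are hypothetical (chosen so the point estimate matches the reported first-year rate, not taken from the study):

```python
import math

# Incidence rate = events / person-time, with a normal-approximation 95% CI
# based on the Poisson variance of the event count (SE = sqrt(events) / T).
# The counts below are hypothetical illustrations, not the cohort's data.

def incidence_rate(events, child_years, z=1.96):
    rate = events / child_years
    se = math.sqrt(events) / child_years
    return rate, max(0.0, rate - z * se), rate + z * se

rate, lo, hi = incidence_rate(18, 100.0)  # 0.18 episodes/child-year
```

For small event counts (such as the 0·05 episodes/child-year severe-disease rate), an exact Poisson interval would be more appropriate than this normal approximation.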
Prior to 2009, dengue fever had not been reported in the Andaman and Nicobar archipelago. In 2009, a few patients with dengue fever-like illness were reported, some of whom tested positive for dengue antibodies. In 2010, 516 suspected cases were reported, including some with dengue haemorrhagic fever (DHF) and dengue shock syndrome (DSS); 80 (15·5%) were positive for dengue antibodies. DENV RNA was detected in five patients, and PCR-based typing showed that three of these belonged to serotype 1 and two to serotype 2. This was confirmed by sequence typing. Two clones of dengue virus, one belonging to serotype 1 and the other to serotype 2, appeared to be circulating in Andaman. The emergence of severe disease such as DHF and DSS might be due to the recent introduction of a more virulent strain, or to the enhancing effect of sub-neutralizing levels of antibodies developed during prior infections. There is a need to revise the vector-borne disease surveillance system in the islands.
The present cross-sectional study was conducted to determine the vitamin D status of pregnant Indian women and their breast-fed infants. Subjects were recruited from the Department of Obstetrics, Armed Forces Clinic and Army Hospital (Research and Referral), Delhi. A total of 541 apparently healthy women with uncomplicated, single, intra-uterine gestation reporting in any trimester were consecutively recruited. Of these 541 women, 299 (first trimester, ninety-seven; second trimester, 125; third trimester, seventy-seven) were recruited in summer (April–October) and 242 (first trimester, fifty-nine; second trimester, ninety-three; third trimester, ninety) were recruited in winter (November–March) to study seasonal variations in vitamin D status. Clinical, dietary, biochemical and hormonal evaluations for the Ca–vitamin D–parathormone axis were performed. A subset of 342 mother–infant pairs was re-evaluated 6 weeks postpartum. Mean serum 25-hydroxyvitamin D (25(OH)D) of pregnant women was 23·2 (sd 12·2) nmol/l. Hypovitaminosis D (25(OH)D < 50 nmol/l) was observed in 96·3 % of the subjects. Serum 25(OH)D levels were significantly lower in winter in the second and third trimesters, while serum intact parathormone (iPTH) and alkaline phosphatase levels were significantly higher in winter in all three trimesters. A significant negative correlation was found between serum 25(OH)D and iPTH in mothers (r = −0·367, P = 0·0001) and infants (r = −0·56, P = 0·0001). A strong positive correlation was observed between 25(OH)D levels of mother–infant pairs (r = 0·779, P = 0·0001). A high prevalence of hypovitaminosis D was observed in pregnancy, lactation and infancy, with no significant inter-trimester differences in serum 25(OH)D levels.
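The correlations above (e.g. r = 0·779 between maternal and infant 25(OH)D) are Pearson coefficients. A minimal stdlib sketch; the paired values below are hypothetical, chosen only to show the computation:

```python
import math

# Pearson correlation: covariance of the pairs divided by the product of
# the standard deviations. The paired values are hypothetical examples.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical maternal and infant 25(OH)D values (nmol/l)
mothers = [20.0, 25.0, 30.0, 15.0, 40.0]
infants = [18.0, 22.0, 28.0, 16.0, 35.0]
r = pearson_r(mothers, infants)
```

r lies in [−1, 1]: values near +1 (as here) indicate that infant levels track maternal levels closely, consistent with the strong mother–infant correlation the study reports.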
Associations between parental depression and offspring affective and disruptive disorders are well documented. Few genetically informed studies have explored the processes underlying intergenerational associations.
A semi-structured interview assessing DSM-III-R psychiatric disorders was administered to twins (n=1296) from the Australian Twin Register (ATR), their spouses (n=1046) and offspring (n=2555). We used the Children of Twins (CoT) design to delineate the extent to which intergenerational associations were consistent with a causal influence or due to genetic confounds.
In between-family analyses, parental depression was associated significantly with offspring depression [hazard ratio (HR) 1.52, 95% confidence interval (CI) 1.20–1.93] and conduct disorder (CD; HR 2.27, CI 1.31–3.93). Survival analysis indicated that the intergenerational transmission of depression is consistent with a causal (environmental) inference, with a significant intergenerational association in offspring of discordant monozygotic (MZ) twin pairs (HR 1.39, CI 1.00–1.94). Logistic regression analysis suggested that the parental depression–offspring CD association was due to shared genetic liability in the parents and offspring. No intergenerational association was found when comparing the offspring of discordant MZ twins [odds ratio (OR) 1.41, CI 0.63–3.14], but offspring of discordant dizygotic (DZ) twins differed in their rates of CD (OR 2.53, CI 0.95–6.76). All findings remained after controlling for several measured covariates, including history of depression and CD in the twins' spouses.
The mechanisms underlying associations between parental depression and offspring psychopathology appear to differ depending on the outcome. The results are consistent with a causal environmental role of parental depression in offspring depression, whereas common genetic factors account for the association between parental depression and offspring CD.