The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Overweight and obesity may increase risk of disease progression in men with prostate cancer, but there have been few studies of weight loss interventions in this patient group. Based on existing literature and patient preferences, we designed a self-help diet and physical activity intervention with telephone-based dietitian support. Men treated for prostate cancer who were overweight or obese were randomised to intervention or wait-list mini-intervention groups. The intervention group had an initial group meeting, a supporting letter from their urological consultant, three telephone dietitian consultations at 4-week intervals, a pedometer and access to web-based diet and physical activity resources. At 12 weeks, men in both groups were given digital scales for providing follow-up weight measurements, and the wait-list group received a mini-intervention of the supporting letter, a pedometer and access to the web-based resources. Sixty-two men were randomised; fifty-four completed baseline and 12-week measurements, and fifty-one and twenty-seven provided measurements at 6 and 12 months, respectively. In a repeated measures model, mean (95 % CI) difference in weight change between groups (wait-list mini-intervention minus intervention) at 12 weeks was −2·13 (−3·44, −0·82) kg (P = 0·002). At 12 months the corresponding value was −2·43 (−4·50, −0·37) kg (P = 0·022). Mean (95 % CI) difference in global QoL score change between groups at 12 weeks was 12·3 (4·93, 19·7) (P = 0·002); at 12 months there were no significant differences between groups. Results suggest the potential of a self-help diet and physical activity intervention with trained support for modest but sustained weight loss in this patient group.
Maternal mental health during pregnancy and postpartum predicts later emotional and behavioural problems in children. Even though most perinatal mental health problems begin before pregnancy, the consequences of preconception maternal mental health for children's early emotional development have not been prospectively studied.
We used data from two prospective Australian intergenerational cohorts, with 756 women assessed repeatedly for mental health problems before pregnancy between age 13 and 29 years, and during pregnancy and at 1 year postpartum for 1231 subsequent pregnancies. Offspring infant emotional reactivity, an early indicator of differential sensitivity denoting increased risk of emotional problems under adversity, was assessed at 1 year postpartum.
Thirty-seven percent of infants born to mothers with persistent preconception mental health problems were categorised as high in emotional reactivity, compared to 23% born to mothers without preconception history (adjusted OR 2.1, 95% CI 1.4–3.1). Ante- and postnatal maternal depressive symptoms were similarly associated with infant emotional reactivity, but these perinatal associations reduced somewhat after adjustment for prior exposure. Causal mediation analysis further showed that 88% of the preconception risk was a direct effect, not mediated by perinatal exposure.
Maternal preconception mental health problems predict infant emotional reactivity, independently of maternal perinatal mental health; while associations between perinatal depressive symptoms and infant reactivity are partially explained by prior exposure. Findings suggest that processes shaping early vulnerability for later mental disorders arise well before conception. There is an emerging case for expanding developmental theories and trialling preventive interventions in the years before pregnancy.
Shunt-related adverse events are frequent in infants after modified Blalock–Taussig despite use of acetylsalicylic acid prophylaxis. A higher incidence of acetylsalicylic acid-resistance and sub-therapeutic acetylsalicylic acid levels has been reported in infants. We evaluated whether using high-dose acetylsalicylic acid can decrease shunt-related adverse events in infants after modified Blalock–Taussig.
In this single-centre retrospective cohort study, we included infants ⩽1-year-old who underwent modified Blalock–Taussig placement and received acetylsalicylic acid in the ICU. We defined acetylsalicylic acid treatment groups as standard dose (⩽7 mg/kg/day) and high dose (⩾8 mg/kg/day) based on the initiating dose.
There were 34 infants in each group. Both groups were similar in age, gender, cardiac defect type, ICU length of stay, and time interval to second stage or definitive repair. Shunt interventions (18 versus 32%, p=0.16), shunt thrombosis (14 versus 17%, p=0.74), and mortality (9 versus 12%, p=0.65) were not significantly different between groups. On multiple logistic regression analysis, single-ventricle morphology (odds ratio 5.2, 95% confidence interval 1.2–23, p=0.03) and post-operative red blood cell transfusion ⩾24 hours (odds ratio 15, 95% confidence interval 3–71, p<0.01) were associated with shunt-related adverse events. High-dose acetylsalicylic acid treatment (odds ratio 2.6, 95% confidence interval 0.7–10, p=0.16) was not associated with a decrease in these events.
High-dose acetylsalicylic acid may not be sufficient to reduce shunt-related adverse events in infants after modified Blalock–Taussig. Post-operative red blood cell transfusion may be a modifiable risk factor for these events. A randomised trial is needed to determine appropriate acetylsalicylic acid dosing in infants with modified Blalock–Taussig.
Self-harm in young people is associated with later problems in social and emotional development. However, it is unknown whether self-harm in young women continues to be a marker of vulnerability on becoming a parent. This study prospectively describes the associations between pre-conception self-harm, maternal depressive symptoms and mother–infant bonding problems.
The Victorian Intergenerational Health Cohort Study (VIHCS) is a follow-up to the Victorian Adolescent Health Cohort Study (VAHCS) in Australia. Socio-demographic and health variables were assessed at 10 time-points (waves) from ages 14 to 35, including self-reported self-harm at waves 3–9. VIHCS enrolment began in 2006 (when participants were aged 28–29 years), by contacting VAHCS women every 6 months to identify pregnancies over a 7-year period. Perinatal depressive symptoms were assessed with the Edinburgh Postnatal Depression Scale during the third trimester, and 2 and 12 months postpartum. Mother–infant bonding problems were assessed with the Postpartum Bonding Questionnaire at 2 and 12 months postpartum.
Five hundred sixty-four pregnancies from 384 women were included. One in 10 women (9.7%) reported pre-conception self-harm. Women who reported self-harming in young adulthood (ages 20–29) reported higher levels of perinatal depressive symptoms and mother–infant bonding problems at all perinatal time points [perinatal depressive symptoms adjusted β = 5.40, 95% confidence interval (CI) 3.42–7.39; mother–infant bonding problems adjusted β = 7.51, 95% CI 3.09–11.92]. There was no evidence that self-harm in adolescence (ages 15–17) was associated with either perinatal outcome.
Self-harm during young adulthood may be an indicator of future vulnerability to perinatal mental health and mother–infant bonding problems.
The “Stop the Bleed” campaign advocates for non-medical personnel to be trained in basic hemorrhage control. However, it is not clear what type or duration of instruction is needed to meet that requirement. The objective of this study was to determine the impact of a brief hemorrhage control educational curriculum on the willingness of laypersons to respond during a traumatic emergency.
This “Stop the Bleed” education initiative was conducted by the University of Texas Health San Antonio Office of the Medical Director (San Antonio, Texas USA) between September 2016 and March 2017. Individuals with formal medical certification were excluded from this analysis. Trainers used a pre-event questionnaire to assess participants’ knowledge and attitudes about tourniquets and responding to traumatic emergencies. Each training course included an individual evaluation of tourniquet placement, 20 minutes of didactic instruction on hemorrhage control techniques, and hands-on instruction with tourniquet application on both adult and child mannequins. The primary outcome in this study was the willingness to use a tourniquet in response to a traumatic medical emergency.
Of 236 participants, 218 met the eligibility criteria. When initially asked if they would use a tourniquet in real life, 64.2% (140/218) responded “Yes.” Following training, 95.6% (194/203) of participants responded that they would use a tourniquet in real life. When participants were asked about their comfort level with using a tourniquet in real life, there was a statistically significant improvement between their initial response and their response post training (2.5 versus 4.0, based on 5-point Likert scale; P<.001).
In this hemorrhage control education study, it was found that a short educational intervention can improve laypersons’ self-efficacy and reported willingness to use a tourniquet in an emergency. Identified barriers to act should be addressed when designing future hemorrhage control public health education campaigns. Community education should continue to be a priority of the “Stop the Bleed” campaign.
Ross EM, Redman TT, Mapp JG, Brown DJ, Tanaka K, Cooley CW, Kharod CU, Wampler DA. Stop the Bleed: The Effect of Hemorrhage Control Education on Laypersons’ Willingness to Respond During a Traumatic Medical Emergency. Prehosp Disaster Med. 2018;33(2):127–132.
The initiation and propagation of the 1993–95 surge of Bering Glacier, Alaska, U.S.A., was observed using ERS-1 synthetic aperture radar (SAR) imagery. Images were acquired before and during the surge, between November 1992 and October 1993. Terrain-corrected and co-registered imagery was used to measure the propagation of the surge front. Surface undulations interpreted to be evidence of accelerated flow, indicating surge initiation in late winter, were observed in the 26 March 1993 image. From 19 May to 25 August 1993, the mean propagation velocity of the surge front was 90 m d−1. The surge reached the terminus shortly after 25 August 1993. The central area of the calving terminus then advanced into proglacial Vitus Lake at a mean rate of 19 m d−1 between 9 August and 18 October 1993. Feature matching was used to measure discrete velocity vectors between 9 August and 13 September; the vectors were kriged onto a uniform grid and used to compute the principal strain rates. Shattering of the calving front and dramatically increased iceberg calving were accompanied by high compressive strain rates immediately up-glacier from the calving front.
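Computing principal strain rates from a gridded velocity field, as described above, follows standard small-strain kinematics: the principal strain rates are the eigenvalues of the symmetric strain-rate tensor built from velocity gradients. A minimal sketch of the generic method (not the paper's exact processing chain; the grid spacing and function names are illustrative):

```python
import numpy as np

def principal_strain_rates(u, v, dx, dy):
    """Principal strain rates from a gridded velocity field.

    u, v : 2-D arrays of velocity components (axis 0 = y, axis 1 = x).
    dx, dy : grid spacing in x and y.
    Returns (e1, e2), the maximum and minimum principal strain rates.
    """
    # Velocity gradients via centred finite differences.
    du_dy, du_dx = np.gradient(u, dy, dx)
    dv_dy, dv_dx = np.gradient(v, dy, dx)
    # Components of the symmetric strain-rate tensor.
    e_xx = du_dx
    e_yy = dv_dy
    e_xy = 0.5 * (du_dy + dv_dx)
    # Eigenvalues of a symmetric 2x2 tensor: mean +/- radius (Mohr's circle).
    mean = 0.5 * (e_xx + e_yy)
    radius = np.sqrt((0.5 * (e_xx - e_yy)) ** 2 + e_xy ** 2)
    return mean + radius, mean - radius
```

For a uniform extensional flow (u increasing linearly in x, v = 0) this returns a constant maximum principal strain rate equal to du/dx and a minimum of zero, as expected.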
As chemical management options for weeds become increasingly limited due to selection for herbicide resistance, investigation of additional nonchemical tools becomes necessary. Harvest weed seed control (HWSC) is a methodology of weed management that targets and destroys weed seeds that are otherwise dispersed by harvesters following threshing. It is not known whether problem weeds in western Canada retain their seeds in sufficient quantities until harvest at a height suitable for collection. A study was conducted at three sites over 2 yr to determine whether retention and height criteria were met by wild oat, false cleavers, and volunteer canola. Wild oat consistently shed seeds early, but seed retention was variable, averaging 56% at the time of wheat swathing, with continued losses until direct harvest of wheat and fababean. The majority of retained seeds were >45 cm above ground level, suitable for collection. Cleavers seed retention was highly variable by site-year, but generally greater than wild oat. The majority of seed was retained >15 cm above ground level and would be considered collectable. Canola seed typically had >95% retention, with the majority of seed retained >15 cm above ground level. The suitability ranking of the species for management with HWSC was canola > cleavers > wild oat. Efficacy of HWSC systems in western Canada will depend on the target species and site- and year-specific environmental conditions.
Developmental origins of health and disease (DOHaD) is the study of how the early life environment can impact the risk of chronic diseases from childhood to adulthood and the mechanisms involved. Epigenetic modifications such as DNA methylation, histone modifications and non-coding RNAs are involved in mediating how early life environment impacts later health. This review is a summary of the Epigenetics and DOHaD workshop held at the 2016 DOHaD Society of Australia and New Zealand Conference. Our extensive knowledge of how the early life environment impacts later risk for chronic disease would not have been possible without animal models. In this review we highlight some animal model examples that demonstrate how an adverse early life exposure results in epigenetic and gene expression changes that may contribute to increased risk of chronic disease later in life. Type 2 diabetes and cardiovascular disease are chronic diseases with an increasing incidence due to the increased number of children and adults that are obese. Epigenetic changes such as DNA methylation have been shown to be associated with metabolic health measures and potentially predict future metabolic health status. Although more difficult to elucidate in humans, recent studies suggest that DNA methylation may be one of the epigenetic mechanisms that mediates the effects of early life exposures on later life risk of obesity and obesity related diseases. Finally, we discuss the role of the microbiome and how it is a new player in developmental programming and mediating early life exposures on later risk of chronic disease.
Accurate weed emergence models are valuable tools for scheduling planting, cultivation, and herbicide applications. Multiple models predicting giant ragweed emergence have been developed, but none have been validated in diverse crop rotation and tillage systems, which have the potential to influence weed emergence patterns. This study evaluated the performance of published giant ragweed emergence models across various crop rotations and spring tillage dates in southern Minnesota. Across experiments, the most robust model was a mixed-effects Weibull (flexible sigmoidal function) model predicting emergence in relation to hydrothermal time accumulation with a base temperature of 4.4 C, a base soil matric potential of −2.5 MPa, and two random effects determined by overwinter growing degree days (GDD) (10 C) and precipitation accumulated during seedling recruitment. The deviations in emergence between individual plots and the fixed-effects model were distinguished by the positive association between the lower horizontal asymptote (Drop) and maximum daily soil temperature during seedling recruitment. This finding indicates that crops and management practices that increase soil temperature will have a shorter lag phase at the start of giant ragweed emergence compared with practices promoting cool soil temperatures. Thus, crops with early-season crop canopies such as perennial crops and crops planted in early spring and in narrow rows will likely have a slower progression of giant ragweed emergence. This research provides a valuable assessment of published giant ragweed emergence models and illustrates that accurate emergence models can be used to time field operations and improve giant ragweed control across diverse cropping systems.
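The emergence model described above can be sketched as a Weibull function of hydrothermal time. The base temperature (4.4 C) and base soil matric potential (−2.5 MPa) come from the abstract; the accumulation rule, the scale and shape parameters, and all function names are illustrative assumptions, not the fitted mixed-effects model:

```python
import math

T_BASE = 4.4      # base temperature (C), from the study
PSI_BASE = -2.5   # base soil matric potential (MPa), from the study

def hydrothermal_time(daily_temp, daily_psi):
    """Accumulate hydrothermal time: thermal time above the base
    temperature counts only on days when soil moisture exceeds the
    base matric potential. A simplified accumulation rule."""
    htt = 0.0
    for t, psi in zip(daily_temp, daily_psi):
        if psi > PSI_BASE and t > T_BASE:
            htt += t - T_BASE
    return htt

def weibull_emergence(htt, asymptote=1.0, drop=0.0, scale=300.0, shape=2.0):
    """Cumulative emergence fraction as a Weibull function of
    hydrothermal time. `drop` plays the role of a lower asymptote;
    scale and shape values here are placeholders, not fitted values."""
    return drop + (asymptote - drop) * (1.0 - math.exp(-((htt / scale) ** shape)))
```

The sigmoidal shape means emergence lags at low hydrothermal time, rises steeply, then saturates, which is why warmer soils (faster hydrothermal-time accumulation) shorten the lag phase at the start of emergence.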
The APM QSO survey is a quantitative survey aimed at finding a large sample (∼ 1000) of QSOs using broadly-based selection criteria applied to machine-scanned UK Schmidt Telescope direct and objective-prism plates. The survey is currently entering its third year and, as of August 1988, the sample consists of ∼ 700 QSOs with mJ ≥ 18.75 in the range 0.2 ≤ z ≤ 3.3. Preliminary analysis suggests that the sample is relatively free of the selection effects endemic to most QSO surveys based on slitless spectroscopy.
Depression and obesity are highly prevalent, have major impacts on public health and frequently co-occur. Recently, we reported that having depression moderates the effect of the FTO gene, suggesting its implication in the association between depression and obesity.
To confirm these findings by investigating the FTO polymorphism rs9939609 in new cohorts, and subsequently in a meta-analysis.
The sample consists of 6902 individuals with depression and 6799 controls from three replication cohorts and two original discovery cohorts. Linear regression models were performed to test for association between rs9939609 and body mass index (BMI), and for the interaction between rs9939609 and depression status for an effect on BMI. Fixed and random effects meta-analyses were performed using METASOFT.
In the replication cohorts, we observed a significant interaction between FTO, BMI and depression with fixed effects meta-analysis (β = 0.12, P = 2.7 × 10−4) and with the Han/Eskin random effects method (P = 1.4 × 10−7) but not with traditional random effects (β = 0.1, P = 0.35). When combined with the discovery cohorts, random effects meta-analysis also supports the interaction (β = 0.12, P = 0.027), being highly significant based on the Han/Eskin model (P = 6.9 × 10−8). On average, carriers of the risk allele who have depression have a 2.2% higher BMI for each risk allele, over and above the main effect of FTO.
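The interaction test described above corresponds to a linear model of BMI on genotype, depression status, and their product, with the interaction coefficient capturing how depression modifies the genotype effect. A minimal sketch on simulated data (sample size, effect sizes, and variable names are illustrative, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# Simulated data: risk-allele count (0/1/2) and depression status (0/1).
allele = rng.integers(0, 3, n).astype(float)
depressed = rng.integers(0, 2, n).astype(float)
# BMI generated with a true interaction of 0.3 (arbitrary illustrative value).
bmi = (25.0 + 0.2 * allele + 0.5 * depressed
       + 0.3 * allele * depressed + rng.normal(0.0, 2.0, n))

# Design matrix: intercept, main effects, and the gene-by-depression product.
X = np.column_stack([np.ones(n), allele, depressed, allele * depressed])
beta, *_ = np.linalg.lstsq(X, bmi, rcond=None)
# beta[3] estimates the gene-by-depression interaction on BMI.
```

With enough data, the fitted `beta[3]` recovers the simulated interaction; in the study itself this coefficient was then combined across cohorts by fixed and random effects meta-analysis.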
This meta-analysis provides additional support for a significant interaction between FTO, depression and BMI, indicating that depression increases the effect of FTO on BMI. The findings provide a useful starting point in understanding the biological mechanism involved in the association between obesity and depression.
Over the last two decades, there has been a rapid increase of studies testing the efficacy and acceptability of virtual reality in the assessment and treatment of mental health problems. This systematic review was carried out to investigate the use of virtual reality in the assessment and the treatment of psychosis. Web of Science, PsycINFO, EMBASE, Scopus, ProQuest and PubMed databases were searched, resulting in the identification of 638 articles potentially eligible for inclusion; of these, 50 studies were included in the review. The main fields of research in virtual reality and psychosis are: safety and acceptability of the technology; neurocognitive evaluation; functional capacity and performance evaluation; assessment of paranoid ideation and auditory hallucinations; and interventions. The studies reviewed indicate that virtual reality offers a valuable method of assessing the presence of symptoms in ecologically valid environments, with the potential to facilitate learning new emotional and behavioural responses. Virtual reality is a promising method to be used in the assessment of neurocognitive deficits and the study of relevant clinical symptoms. Furthermore, preliminary findings suggest that it can be applied to the delivery of cognitive rehabilitation, social skills training interventions and virtual reality-assisted therapies for psychosis. The potential benefits for enhancing treatment are highlighted. Recommendations for future research include demonstrating generalisability to real-life settings, examining potential negative effects, larger sample sizes and long-term follow-up studies. The present review has been registered in the PROSPERO register: CRD 4201507776.
The fertility and soil health of organic agroecosystems are determined in part by the size and turnover rate of soil carbon (C) and nitrogen (N) pools. Our research contrasts the effects of best management practices (BMP; reduction in soil disturbance, addition of organic amendments) on C and N cycling in soils from two field sites representing five organic agroecosystems. Total soil organic C (SOC), a standard measure of soil health, contains equal amounts of biologically and non-biologically active C that is not associated with release of mineral N. A three-pool first-order model can be used to estimate the size and turnover rates of C pools but requires data from a long-term incubation. Our research highlights the use of two rapid C fractions, hydrolysable and permanganate (0.02 M) oxidizable C, to assess shifts in biologically active C. Adoption of BMP in organic management systems reduced the partitioning of C to the active pool while augmenting the slow pool C. These pools are associated with potentially mineralizable N supplied by residues, amendments and soil organic matter affecting the concentration and release of mineral N to crops. Our data show that minimizing disturbance (no tillage, pasture) and mixed compost additions have the potential to reduce carbon dioxide emissions while enhancing slow-pool C and/or its turnover, a reservoir of nutrients available to the soil biota. Use of these rapid, sensitive indicators of biological C activity will aid growers in determining whether a BMP fosters nutrient loss or retention prior to shifts in total SOC.
In the midwestern United States, biotypes of giant ragweed resistant to multiple herbicide biochemical sites of action have been identified. Weeds with resistance to multiple herbicides reduce the utility of existing herbicides and necessitate the development of alternative weed control strategies. In two experiments in southeastern Minnesota, we determined the effect of six 3 yr crop-rotation systems containing corn, soybean, wheat, and alfalfa on giant ragweed seedbank depletion and emergence patterns. The six crop-rotation systems included continuous corn, soybean–corn–corn, corn–soybean–corn, soybean–wheat–corn, soybean–alfalfa–corn, and alfalfa–alfalfa–corn. The crop-rotation system had no effect on the amount of seedbank depletion when a zero-weed threshold was maintained, with an average of 96% of the giant ragweed seedbank being depleted within 2 yr. Seedbank depletion occurred primarily through seedling emergence in all crop-rotation systems. However, seedling emergence tended to account for more of the seedbank depletion in rotations containing only corn or soybean compared with rotations with wheat or alfalfa. Giant ragweed emerged early across all treatments, with on average 90% emergence occurring by June 4. Duration of emergence was slightly longer in established alfalfa compared with other cropping systems. These results indicate that corn and soybean rotations are more conducive to giant ragweed emergence than rotations including wheat and alfalfa, and that adopting a zero-weed threshold is a viable approach to depleting the weed seedbank in all crop-rotation systems.
There is now expert consensus that directly observing the work of trainee therapists, rather than relying upon self-report of sessions, is critical to providing the accurate feedback required to attain a range of competencies. Despite this consensus, and the broadly positive attitudes towards video review among supervisees, video feedback methods remain under-utilized in clinical supervision. This paper outlines some of the weaknesses that affect feedback based solely on self-report methods, before introducing some of the specific benefits that video feedback methods can offer the training and supervision context. It is argued that video feedback methods fit seamlessly into CBT supervision, providing direct, accessible, effective, efficient and accurate observation of the learning situation, and optimizing the chances for accurate self-reflection and planning of further improvements in performance. To demonstrate the utility of video feedback techniques in CBT supervision, two specific techniques are introduced and described: the Give-me-5 technique and the I-spy technique. Case examples of CBT supervision using the two techniques are provided and explored, and guidance is given as to the supervision contexts in which each of the two techniques is suitable, individually and in tandem. Finally, best practice guidelines for the use of video feedback techniques in supervision are outlined.