OBJECTIVES/GOALS: The overall goal of this study was to determine the effect of early life stress (ELS) on the intestinal CD4+ T cell immune compartment, at homeostasis and after induction of experimental inflammatory bowel disease (IBD). METHODS/STUDY POPULATION: We used a mouse model of ELS, maternal separation with early weaning (MSEW), together with IL-10 reporter mice to enable analysis of IL-10-producing cells. Mice were examined on postnatal day 28 to determine the impact of ELS on gut regulatory T cells. Plasma levels of corticosterone (the rodent stress response hormone) were determined by ELISA. Colitis was induced in MSEW and normally reared (NR) mice via intraperitoneal injection of α-IL-10R every 5 days until day 15. Mice were euthanized on days 20 and 30. Colonic tissue sections were stained for histological analysis. Remaining tissue was further processed for flow cytometric analysis of CD4+ T cells and innate lymphoid cells. RESULTS/ANTICIPATED RESULTS: Plasma corticosterone was elevated in MSEW mice compared to their NR counterparts at 4 weeks of age. We observed that the MSEW stress protocol did not affect the baseline colonic CD4+ T cell or innate lymphoid cell populations. There was a reduction in the intestinal CD4+ T cells and regulatory T cells on day 20 in α-IL-10R-treated MSEW mice compared to NR counterparts. This difference disappeared by day 30. Histological scoring showed no difference in disease severity between α-IL-10R-treated MSEW and NR mice on day 20. However, on day 30, when α-IL-10R-treated NR mice were recovering from colitis, MSEW mice showed persistent histological inflammation, mainly attributable to sustained epithelial damage. DISCUSSION/SIGNIFICANCE OF IMPACT: Our results suggest that ELS prolongs intestinal inflammation and impairs epithelial repair. Future studies will focus on elucidating the mechanisms responsible for ELS-dependent impairment of mucosal repair in experimental colitis.
Organismal metabolic rates reflect the interaction of environmental and physiological factors. Thus, calcifying organisms that record growth history can provide insight into both the ancient environments in which they lived and their own physiology and life history. However, interpreting them requires understanding which environmental factors have the greatest influence on growth rate and the extent to which evolutionary history constrains growth rates across lineages. We integrated satellite measurements of sea-surface temperature and chlorophyll-a concentration with a database of growth coefficients, body sizes, and life spans for 692 populations of living marine bivalves in 195 species, set within the context of a new maximum-likelihood phylogeny of bivalves. We find that environmental predictors overall explain only a small proportion of variation in growth coefficient across all species; temperature is a better predictor of growth coefficient than food supply, and growth coefficient is somewhat more variable at higher summer temperatures. Growth coefficients exhibit moderate phylogenetic signal, and taxonomic membership is a stronger predictor of growth coefficient than any environmental predictor, but phylogenetic inertia cannot fully explain the disjunction between our findings and the extensive body of work demonstrating strong environmental control on growth rates within taxa. Accounting for evolutionary history is critical when considering shells as historical archives. The weak relationship between variation in food supply and variation in growth coefficient in our data set is inconsistent with the hypothesis that the increase in mean body size through the Phanerozoic was driven by increasing productivity enabling faster growth rates.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
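As a rough illustration of how twin data yield the heritability estimates mentioned above, the following sketch applies Falconer's classical approximation, h² = 2(rMZ − rDZ), to twin-pair correlations; the function names and example correlation values are illustrative assumptions, not CODATwins results, and the project's actual analyses use more formal variance-component (ACE-type) models, although the logic of contrasting MZ and DZ resemblance is the same.

import numpy as np

def falconer_estimates(r_mz, r_dz):
    """Classical Falconer decomposition from MZ and DZ twin-pair correlations.
    Returns (h2, c2, e2): additive genetic, shared-environment, and
    non-shared-environment (plus error) variance proportions."""
    h2 = 2.0 * (r_mz - r_dz)      # heritability
    c2 = 2.0 * r_dz - r_mz        # shared environment
    e2 = 1.0 - r_mz               # non-shared environment + measurement error
    return h2, c2, e2

def pair_correlation(twin1, twin2):
    """Pearson correlation between co-twins (a simple stand-in for the
    intraclass correlation usually reported in twin studies)."""
    return np.corrcoef(twin1, twin2)[0, 1]

# Hypothetical correlations for adult height (placeholder values only).
h2, c2, e2 = falconer_estimates(r_mz=0.86, r_dz=0.52)
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")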
Overweight and obesity may increase the risk of disease progression in men with prostate cancer, but there have been few studies of weight loss interventions in this patient group. In this study, overweight or obese men treated for prostate cancer were randomised to a self-help diet and activity intervention with telephone-based dietitian support or a wait-list mini-intervention group. The intervention group had an initial group meeting, a supporting letter from their urological consultant, three telephone dietitian consultations at 4-week intervals, a pedometer and access to web-based diet and physical activity resources. At 12 weeks, men in both groups were given digital scales for providing follow-up weight measurements, and the wait-list group received a mini-intervention of the supporting letter, a pedometer and access to the web-based resources. Sixty-two men were randomised; fifty-four completed baseline and 12-week measurements, and fifty-one and twenty-seven provided measurements at 6 and 12 months, respectively. In a repeated-measures model, the mean difference in weight change between groups (wait-list mini-intervention minus intervention) at 12 weeks was −2·13 (95 % CI −3·44, −0·82) kg (P = 0·002). At 12 months, the corresponding value was −2·43 (95 % CI −4·50, −0·37) kg (P = 0·022). The mean difference in global quality of life score change between groups at 12 weeks was 12·3 (95 % CI 4·93, 19·7) (P = 0·002); at 12 months there were no significant differences between groups. Results suggest the potential of a self-help diet and physical activity intervention with trained support for modest but sustained weight loss in this patient group.
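For readers unfamiliar with this type of analysis, here is a minimal sketch of a repeated-measures comparison of weight change between arms, assuming a long-format table with one row per participant per visit; the column names, the synthetic data, and the use of statsmodels' MixedLM are illustrative assumptions, not the authors' actual model.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical long-format trial data: 30 participants per arm, three visits each.
rows = []
for group, change in [("intervention", -2.5), ("waitlist", -0.4)]:
    for i in range(30):
        baseline = rng.normal(100, 12)                      # baseline weight, kg
        for visit, post in [("baseline", 0), ("week12", 1), ("month12", 1)]:
            rows.append({"id": f"{group}-{i}", "group": group, "visit": visit,
                         "weight_kg": baseline + change * post + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Random intercept per participant; the group:visit interaction terms estimate the
# between-group difference in weight change at each follow-up visit.
fit = smf.mixedlm("weight_kg ~ group * visit", data=df, groups=df["id"]).fit()
print(fit.summary())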
We evaluated provider adherence to practice guidelines for inpatients diagnosed with Clostridioides difficile infection (CDI) before and after implementation of a best practice alert (BPA) linking a positive test result to guideline-based orders. After implementation of the BPA, guideline-based prescribing increased from 39.4% in 2013 to 67.7% in 2016 (P = .014).
Due to concerns over increasing fluoroquinolone (FQ) resistance among gram-negative organisms, our stewardship program implemented a preauthorization use policy. The goal of this study was to assess the relationship between hospital FQ use and antibiotic resistance.
The study was conducted at a large academic medical center.
We performed a retrospective analysis of FQ susceptibility of hospital isolates for 5 common gram-negative bacteria: Acinetobacter spp., Enterobacter cloacae, Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa. The primary endpoint was the change in FQ susceptibility. A Poisson regression model was used to calculate the rate of change between the preintervention period (1998–2005) and the postimplementation period (2006–2016).
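As a sketch of how such a segmented Poisson regression could be set up, the snippet below models yearly counts of FQ-susceptible isolates with the number of isolates tested as the exposure; the column names, placeholder counts, and the statsmodels call are illustrative assumptions, not the authors' actual analysis.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical yearly surveillance data for one organism.
years = np.arange(1998, 2017)
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "year": years - years.min(),                 # years since 1998
    "post": (years >= 2006).astype(int),         # 1 after the preauthorization policy
    "tested": rng.integers(200, 400, size=years.size),
})
df["susceptible"] = (df["tested"] * 0.7).astype(int)   # placeholder counts

# Susceptible count ~ Poisson with log(isolates tested) as exposure. exp() of the
# 'year' coefficient is the pre-period yearly rate ratio; adding the 'year:post'
# coefficient gives the post-period yearly rate ratio, analogous to the RRs
# reported in the results that follow.
fit = smf.glm("susceptible ~ year * post", data=df,
              family=sm.families.Poisson(), exposure=df["tested"]).fit()
print(np.exp(fit.params))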
FQ susceptibility declined steeply beginning in 1998, particularly among P. aeruginosa, Acinetobacter spp., and E. cloacae. Our FQ restriction policy reduced FQ use from 173 days of therapy (DOT) per 1,000 patient days to <60 DOT per 1,000 patient days. Fluoroquinolone susceptibility increased for Acinetobacter spp. (rate ratio [RR], 1.038; 95% confidence interval [CI], 1.005–1.072), E. cloacae (RR, 1.028; 95% CI, 1.013–1.044), and P. aeruginosa (RR, 1.013; 95% CI, 1.006–1.020). No significant change in susceptibility was detected for K. pneumoniae (RR, 1.002; 95% CI, 0.996–1.008), and the susceptibility for E. coli continued to decline, although the decline was not as steep (RR, 0.981; 95% CI, 0.975–0.987).
A stewardship-driven FQ restriction program halted the overall decline in FQ susceptibility for all species studied except E. coli. For 3 species (ie, Acinetobacter spp, E. cloacae, and P. aeruginosa), susceptibility rates improved after implementation, and this improvement has been sustained over a 10-year period.
The value of the nosological distinction between non-affective and affective psychosis has frequently been challenged. We aimed to investigate the transdiagnostic dimensional structure and associated characteristics of psychopathology at first-episode psychosis (FEP). Regardless of diagnostic categories, we expected that positive symptoms would occur more frequently in ethnic minority groups and in more densely populated environments, and that negative symptoms would be associated with indices of neurodevelopmental impairment.
This study included 2182 FEP individuals recruited across six countries, as part of the EUropean network of national schizophrenia networks studying Gene–Environment Interactions (EU-GEI) study. Symptom ratings were analysed using multidimensional item response modelling in Mplus to estimate five theory-based models of psychosis. We used multiple regression models to examine demographic and context factors associated with symptom dimensions.
A bifactor model, composed of one general factor and five specific dimensions of positive, negative, disorganization, manic and depressive symptoms, best represented the associations among ratings of psychotic symptoms. Positive symptoms were more common in ethnic minority groups. Urbanicity was associated with a higher score on the general factor. Men presented with more negative and fewer depressive symptoms than women. Early age at first contact with psychiatric services was associated with higher scores on the negative, disorganized, and manic symptom dimensions.
Our results suggest that the bifactor model of psychopathology holds across diagnostic categories of non-affective and affective psychosis at FEP, and demographic and context determinants map onto general and specific symptom dimensions. These findings have implications for tailoring symptom-specific treatments and inform research into the mood-psychosis spectrum.
Intermediate wheatgrass (Thinopyrum intermedium; IWG) is a perennial cereal crop undergoing development for grain production; however, grain yield declines of >75% are often observed after year 2 of the perennial stand and may be linked to soil nutrient depletion. Intercropping IWG with a perennial legume such as alfalfa (Medicago sativa) could benefit nutrient cycling while increasing agroecological diversity. Intermediate wheatgrass was established at five environmentally diverse sites in Minnesota, USA, in (1) bi-culture with alfalfa, (2) non-fertilized monoculture and (3) monoculture fertilized annually in the spring with 80 kg N/ha. At northern sites where alfalfa growth was favoured, IWG grain yields were reduced in year 2 by growing IWG in bi-culture with alfalfa, relative to the monoculture systems. Across all sites, IWG grain yield decreased by 90% in the non-fertilized monoculture, 80% in the fertilized monoculture and 65% in the bi-culture from year 2 to 4, and plant macronutrient concentrations decreased by 25–70%. In year 4, IWG grain yield was similar or greater in the bi-culture than in the fertilized monoculture at three of the five sites, and alfalfa biomass was correlated positively with grain yield, harvest index and nutrient uptake in the year 4 bi-culture. Chemical nitrogen fertilization increased grain yields in year 2 but did not mitigate the decline in yields as stands aged. Intermediate wheatgrass in the bi-culture had similar yields and nutrient uptake and lower yield declines than the chemically fertilized stand at sites where alfalfa growth was maintained throughout the life of the stand.
Accurate weed emergence models are valuable tools for scheduling planting, cultivation, and herbicide applications. Multiple models predicting giant ragweed emergence have been developed, but none have been validated in diverse crop rotation and tillage systems, which have the potential to influence weed emergence patterns. This study evaluated the performance of published giant ragweed emergence models across various crop rotations and spring tillage dates in southern Minnesota. Across experiments, the most robust model was a mixed-effects Weibull (flexible sigmoidal function) model predicting emergence in relation to hydrothermal time accumulation with a base temperature of 4.4 C, a base soil matric potential of −2.5 MPa, and two random effects determined by overwinter growing degree days (GDD) (10 C) and precipitation accumulated during seedling recruitment. The deviations in emergence between individual plots and the fixed-effects model were distinguished by the positive association between the lower horizontal asymptote (Drop) and maximum daily soil temperature during seedling recruitment. This finding indicates that crops and management practices that increase soil temperature will shorten the lag phase at the start of giant ragweed emergence compared with practices promoting cool soil temperatures. Thus, crops with early-season canopies, such as perennial crops and crops planted in early spring and in narrow rows, will likely slow the progression of giant ragweed emergence. This research provides a valuable assessment of published giant ragweed emergence models and illustrates that accurate emergence models can be used to time field operations and improve giant ragweed control across diverse cropping systems.
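To make the shape of such a model concrete, here is a small sketch of a Weibull-type cumulative emergence curve driven by accumulated hydrothermal time, using the base temperature and base matric potential quoted above; the parameter values, function names, and the exact parameterization (including how the lower asymptote 'Drop' enters) are illustrative assumptions rather than the published fitted model.

import numpy as np

BASE_TEMP_C = 4.4        # base soil temperature from the abstract
BASE_PSI_MPA = -2.5      # base soil matric potential from the abstract

def hydrothermal_time(soil_temp_c, soil_psi_mpa):
    """Accumulate daily hydrothermal time: degrees above the base temperature,
    counted only on days when soil moisture exceeds the base matric potential."""
    temp_c = np.asarray(soil_temp_c, dtype=float)
    psi = np.asarray(soil_psi_mpa, dtype=float)
    daily = np.where((temp_c > BASE_TEMP_C) & (psi > BASE_PSI_MPA),
                     temp_c - BASE_TEMP_C, 0.0)
    return np.cumsum(daily)

def weibull_emergence(htt, upper=100.0, drop=5.0, scale=150.0, shape=2.5):
    """Illustrative Weibull-type sigmoid: cumulative % emergence versus
    hydrothermal time, rising from a lower asymptote ('drop') toward the upper
    asymptote. Parameter values are placeholders, not fitted coefficients."""
    return drop + (upper - drop) * (1.0 - np.exp(-(htt / scale) ** shape))

# Hypothetical daily soil conditions over a 60-day recruitment window.
rng = np.random.default_rng(1)
soil_temp = 10 + 8 * np.sin(np.linspace(0, np.pi, 60)) + rng.normal(0, 1, 60)
soil_psi = rng.uniform(-1.5, -0.1, 60)   # MPa, mostly wetter than the base

htt = hydrothermal_time(soil_temp, soil_psi)
print(weibull_emergence(htt)[-1])   # predicted cumulative emergence (%) at day 60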
Alfalfa is recommended as a rotational crop in corn production, due to its ability to contribute to soil nitrogen (N) and carbon (C) stocks through atmospheric N2 fixation and above- and belowground biomass production. However, there is little information on how alfalfa management practices affect contributions to soil and subsequent corn crop yields, and research has not been targeted to organic systems. A study was conducted to determine the effects of alfalfa stand age, cutting frequency and biomass removal on soil C and N status and corn yields at three organically managed Minnesota locations. In one experiment, five cutting treatments were applied in nine environments: two, three and four cuts with biomass removal; three cuts with biomass remaining in place; and a no-cut control. In the other experiment, corn was planted following 1-, 2-, 3- or 4-year-old alfalfa stands and a no-alfalfa control. Yield was measured in the subsequent corn crop. In the cutting experiment, the two- and three-cut treatments with biomass removal reduced soil mineral N by 12.6 and 11.5%, respectively, compared with the control. Potentially mineralizable N (PMN) was not generally affected by cutting treatments. The three-cut no-removal treatment increased potentially mineralizable C by 17% compared with the other treatments, but lowered soil total C in two environments, suggesting a priming effect in which addition of alfalfa biomass stimulated microbial mineralization of native soil C. Although both yields and soil mineral N tended to be higher in treatments where biomass remained in place, this advantage was small and inconsistent, indicating that farmers need not forgo hay harvest to obtain the rotational benefits of an alfalfa stand. The lack of overall correlation between corn grain yields and mineral and potentially mineralizable N suggests that alfalfa N contribution was not the driver of the yield increase in the no-removal treatments. Alfalfa stand age had inconsistent effects on fall-incorporated N and soil N and C parameters. Beyond the first year, increased alfalfa stand age did not increase soil mineral N or PMN. However, corn yield increased following older stands. Yields were 29, 77 and 90% higher following first-, second- and third-year alfalfa stands than the no-alfalfa control, respectively. This indicates that alfalfa may benefit succeeding corn through mechanisms other than N contribution, potentially including P solubilization and weed suppression. These effects have been less studied than N credits, but are of high value in organic cropping systems.
Efforts to address health disparities and achieve health equity are critically dependent on the development of a diverse research workforce. However, many researchers from underrepresented backgrounds face challenges in advancing their careers, securing independent funding, and finding the mentorship needed to expand their research.
Faculty from the University of Maryland at College Park and the University of Wisconsin-Madison developed and evaluated an intensive week-long research and career-development institute—the Health Equity Leadership Institute (HELI)—with the goal of increasing the number of underrepresented scholars who can sustain their ongoing commitment to health equity research.
Between 2010 and 2016, HELI brought 145 diverse scholars (78% from an underrepresented background; 81% female) together to engage with each other and learn from supportive faculty. Overall, scholar feedback was highly positive on all survey items, with average agreement ratings of 4.45–4.84 on a 5-point Likert scale. Eighty-five percent of scholars remain in academic positions. In the first three cohorts, 73% of HELI participants have been promoted and 23% have secured independent federal funding.
HELI includes an evidence-based curriculum to develop a diverse workforce for health equity research. For institutions interested in implementing such an institute to develop and support underrepresented early-stage investigators, a resource toolbox is provided.
In this case study, we evaluated a point-mapping method for simultaneously collecting data while controlling three invasive woody plant species: black locust, Chinese privet, and hardy orange. The study in Arkansas Post National Memorial included seven project areas ranging in size from 2.7 to 27.3 ha and spanned six field seasons (2010 to 2015). The control techniques varied depending on plant size and always included the application of herbicide, which also varied over the course of the study to include glyphosate, imazapyr, and triclopyr. Each person responsible for controlling plants simultaneously collected global positioning system point data to estimate the foliar cover of the plants treated. The resulting data demonstrated evidence of decreases in all three plant species in most project areas during the 6-yr period. Initial increases in area treated for some species–area combinations reflected differences in the preliminary efforts required to control invasive plants in entire project areas, but by 2012 six of seven project areas were treated in their entirety. Despite a high level of reduction, the plants in some cases persisted at low levels even during the sixth year of the project. Our findings support the ability of this method to detect fine-scale changes in plant abundance while simultaneously controlling invasive plants. With several acknowledged limitations, this streamlined, project-based monitoring approach provides data that allow managers to assess the effectiveness of weed control treatments.
In the midwestern United States, biotypes of giant ragweed resistant to multiple herbicide biochemical sites of action have been identified. Weeds with resistance to multiple herbicides reduce the utility of existing herbicides and necessitate the development of alternative weed control strategies. In two experiments in southeastern Minnesota, we determined the effect of six 3 yr crop-rotation systems containing corn, soybean, wheat, and alfalfa on giant ragweed seedbank depletion and emergence patterns. The six crop-rotation systems included continuous corn, soybean–corn–corn, corn–soybean–corn, soybean–wheat–corn, soybean–alfalfa–corn, and alfalfa–alfalfa–corn. The crop-rotation system had no effect on the amount of seedbank depletion when a zero-weed threshold was maintained, with an average of 96% of the giant ragweed seedbank being depleted within 2 yr. Seedbank depletion occurred primarily through seedling emergence in all crop-rotation systems. However, seedling emergence tended to account for more of the seedbank depletion in rotations containing only corn or soybean compared with rotations with wheat or alfalfa. Giant ragweed emerged early across all treatments, with on average 90% emergence occurring by June 4. Duration of emergence was slightly longer in established alfalfa compared with other cropping systems. These results indicate that corn and soybean rotations are more conducive to giant ragweed emergence than rotations including wheat and alfalfa, and that adopting a zero-weed threshold is a viable approach to depleting the weed seedbank in all crop-rotation systems.
Evidence has accumulated that implicates childhood trauma in the aetiology of psychosis, but our understanding of the putative psychological processes and mechanisms through which childhood trauma impacts on individuals and contributes to the development of psychosis remains limited. We aimed to investigate whether stress sensitivity and threat anticipation underlie the association between childhood abuse and psychosis.
We used the Experience Sampling Method to measure stress, threat anticipation, negative affect, and psychotic experiences in 50 first-episode psychosis (FEP) patients, 44 At-Risk Mental State (ARMS) participants, and 52 controls. Childhood abuse was assessed using the Childhood Trauma Questionnaire.
Associations of minor socio-environmental stress in daily life with negative affect and psychotic experiences were modified by sexual abuse and group (all pFWE < 0.05). While there was strong evidence that these associations were greater in FEP exposed to high levels of sexual abuse, and some evidence of greater associations in ARMS exposed to high levels of sexual abuse, controls exposed to high levels of sexual abuse were more resilient and reported less intense negative emotional reactions to socio-environmental stress. A similar pattern was evident for threat anticipation.
Elevated sensitivity and lack of resilience to socio-environmental stress and enhanced threat anticipation in daily life may be important psychological processes underlying the association between childhood sexual abuse and psychosis.
Twins can help researchers disentangle the roles of genes from those of the environment on human traits, health, and diseases. To realize this potential, the Australian Twin Registry (ATR), University of Melbourne, and the Charles Perkins Centre (CPC), University of Sydney, established a collaboration to form the Twins Research Node, a highly interconnected research facility dedicated specifically to research involving twins. This collaboration aims to foster the adoption of twin designs as important tools for research in a range of health-related domains. The CPC hosted the Twins Research Node's launch seminar, entitled ‘Double the power of your research with twin studies’, in which experienced twin researchers described how twin studies are supporting scientific discoveries and careers. The launch also featured twin pairs who have actively participated in research through the ATR. Researchers at the CPC were surveyed before the event to gauge their level of understanding and interest in utilizing twin research. This article describes the new Twins Research Node, discusses the survey's main results and reports on the launch seminar.
The evidence underpinning the developmental origins of health and disease (DOHaD) is overwhelming. As the emphasis shifts more towards interventions and the translational strategies for disease prevention, it is important to capitalize on collaboration and knowledge sharing to maximize opportunities for discovery and replication. DOHaD meetings are facilitating this interaction. However, strategies to perpetuate focussed discussions and collaborations around and between conferences are more likely to facilitate the development of DOHaD research. For this reason, the DOHaD Society of Australia and New Zealand (DOHaD ANZ) has initiated themed Working Groups, which convened at the 2014–2015 conferences. This report introduces the DOHaD ANZ Working Groups and summarizes their plans and activities. One of the first Working Groups to form was the ActEarly birth cohort group, which is moving towards more translational goals. Reflecting growing emphasis on the impact of early life biodiversity – even before birth – we also have a Working Group titled Infection, inflammation and the microbiome. We have several Working Groups exploring other major non-cancerous disease outcomes over the lifespan, including Brain, behaviour and development and Obesity, cardiovascular and metabolic health. The Epigenetics and Animal Models Working Groups cut across all these areas and seek to ensure interaction between researchers. Finally, we have a group focussed on ‘Translation, policy and communication’, which considers how we can best take the evidence we produce into the community to effect change. By coordinating and perpetuating DOHaD discussions in this way, we aim to enhance DOHaD research in our region.
We analyzed birth order differences in means and variances of height and body mass index (BMI) in monozygotic (MZ) and dizygotic (DZ) twins from infancy to old age. The data were derived from the international CODATwins database. The total number of height and BMI measures from 0.5 to 79.5 years of age was 397,466. As expected, first-born twins had greater birth weight than second-born twins. With respect to height, first-born twins were slightly taller than second-born twins in childhood. After adjusting the results for birth weight, the birth order differences decreased and were no longer statistically significant. First-born twins had greater BMI than the second-born twins over childhood and adolescence. After adjusting the results for birth weight, birth order was still associated with BMI until 12 years of age. No interaction effect between birth order and zygosity was found. Only limited evidence was found that birth order influenced variances of height or BMI. The results were similar among boys and girls and also in MZ and DZ twins. Overall, the differences in height and BMI between first- and second-born twins were modest even in early childhood, while adjustment for birth weight reduced the birth order differences but did not remove them for BMI.
As herbicide-resistant weed populations become increasingly problematic in crop production, alternative strategies of weed control are necessary. Giant ragweed, one of the most competitive agricultural weeds in row crops, has evolved resistance to multiple herbicide biochemical sites of action within the plant, necessitating the development of new and integrated methods of weed control. This study assessed the quantity and duration of seed retention of giant ragweed grown in soybean fields and adjacent field margins. Seed retention of giant ragweed was monitored weekly during the 2012 to 2014 harvest seasons using seed collection traps. Giant ragweed plants produced an average of 1,818 seeds per plant, with 66% being potentially viable. On average, giant ragweed began shattering hard (potentially viable) and soft (nonviable) seeds on September 12 and continued through October, at an average rate of 0.75 and 0.44% of total seeds per day during September and October, respectively. Giant ragweed seeds remained on the plants well into the Minnesota soybean harvest season, with an average of 80% of the total seeds being retained on October 11, when Minnesota soybean harvest was approximately 75% complete in the years of the study. These results suggest that there is sufficient time to remove escaped giant ragweed from production fields and field margins before the seeds shatter, by managing weed seed dispersal before or at crop harvest. Controlling weed seed dispersal has the potential to manage herbicide-resistant giant ragweed by limiting replenishment of the weed seed bank.