ABSTRACT IMPACT: Understanding dietary patterns and nutrient intakes of the aging population may help address concerns and dietary guidelines regarding their nutritional needs. OBJECTIVES/GOALS: The objective of this study is to test the hypothesis that a healthy dietary pattern in the oldest old (aged 80 years and older) is related to greater compliance with dietary recommendations and better nutrient intake profiles. METHODS/STUDY POPULATION: We conducted a cross-sectional study of 122 participants aged 82 to 97 years from the Geisinger Rural Aging Study (GRAS) cohort in rural Pennsylvania (n = 56 men and 66 women). The main outcome measures of the investigation were the daily nutrient intakes and food group intakes evaluated from the average of three 24-hour dietary recalls. The dietary patterns were determined by cluster analysis from 28 food groups. Diet quality and adherence to the Dietary Guidelines for Americans were assessed by the Healthy Eating Index (HEI)-2015 and the Dietary Screening Tool (DST). Recommended intakes were determined by the Recommended Dietary Allowances (RDAs) or Adequate Intakes (AIs). RESULTS/ANTICIPATED RESULTS: Less than 50% of participants met the dietary recommended intakes for vitamins D, E, K, B6, dietary fiber, zinc, potassium, and calcium. The more-nutrient-dense cluster was characterized by higher intakes of fruits and vegetables. The less-nutrient-dense cluster was characterized by higher intakes of foods including desserts and sweets. After adjusting for age, sex, and energy intake, participants in the more-nutrient-dense dietary pattern had a higher intake of vitamins A, D, K, C, fiber, and potassium (p < 0.05 for all). After adjusting for age and sex, participants in the more-nutrient-dense pattern had better diet quality as assessed by the HEI-2015 (p < 0.001) and DST (p = 0.006).
DISCUSSION/SIGNIFICANCE OF FINDINGS: Among the oldest old, many participants were found to have nutrient intakes lower than the recommended levels for fundamental nutrients, suggesting that dietary guidance, in addition to a dietary pattern more aligned with dietary guidelines, may be beneficial for supporting healthy aging.
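The dietary patterns above were derived by cluster analysis of 28 food groups. As a rough illustration of the idea only (not the study's actual algorithm, software, or data; the food-group names and intake values below are hypothetical), a k-means-style clustering of standardized food-group intakes might look like:

```python
import math

def kmeans(points, k, iters=50):
    """Plain k-means with deterministic farthest-point initialization."""
    centroids = [points[0]]
    while len(centroids) < k:
        # Next seed: the point farthest from every centroid chosen so far.
        centroids.append(max(points, key=lambda p: min(math.dist(p, c) for c in centroids)))
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: math.dist(p, centroids[i]))].append(p)
        # Recompute centroids as cluster means (keep old centroid if a cluster empties).
        new = [[sum(d) / len(c) for d in zip(*c)] if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    return centroids, clusters

# Hypothetical standardized intakes: [fruit, vegetables, desserts/sweets]
intakes = [
    [1.2, 1.0, -0.8], [0.9, 1.3, -1.0], [1.1, 0.8, -0.7],     # more nutrient-dense
    [-0.9, -1.1, 1.2], [-1.0, -0.8, 0.9], [-1.2, -1.0, 1.1],  # less nutrient-dense
]
centroids, clusters = kmeans(intakes, k=2)
```

With well-separated intake profiles, the two recovered clusters correspond to the more- and less-nutrient-dense patterns described in the abstract.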
Lacustrine sedimentary records and the proxies contained within them are valuable archives of past climate. However, the resolution of these records is frequently coarse or contains a high degree of uncertainty, making it difficult to resolve how climatic variability impacts the ecosystems on which humans depend. The goal of this study is to couple recent sediment cores sampled at centimeter-scale resolution with paleo- and historical information about lake levels to document how changes in the paleoenvironment impact the paleoecology of a rift basin lake. We present multiproxy data from three short cores collected from Ferguson's Gulf (FG), a shallow embayment connected to the western shore of Lake Turkana, Kenya. Five distinct biozones were interpreted on the basis of ostracods and geochemistry (δ18O, δ13C, and major elements), spanning the Little Ice Age (LIA) to the modern. Overall, ostracod total abundance and assemblage diversity decreased up-core, with the largest total abundance and genera diversity occurring during the LIA. This fits with regional datasets that indicate the Eastern Branch of the East African Rift System was wetter during the LIA than it is today. This also suggests that human impact in and around Lake Turkana has weakened the resiliency of the ecosystems in FG.
Mass asymptomatic SARS-CoV-2 nucleic acid amplified testing of healthcare personnel (HCP) was performed at a large tertiary health system. A low period-prevalence of positive HCP was observed. Of those who tested positive, half had mild symptoms in retrospect. HCP with even mild symptoms should be isolated and tested.
Animal-derived dietary protein ingestion and physical activity stimulate myofibrillar protein synthesis rates in older adults. We determined whether a non-animal-derived diet can support daily myofibrillar protein synthesis rates to the same extent as an omnivorous diet. Nineteen healthy older adults (aged 66 (sem 1) years; BMI 24 (sem 1) kg/m2; twelve males, seven females) participated in a randomised, parallel-group, controlled trial during which they consumed a 3-d isoenergetic high-protein (1·8 g/kg body mass per d) diet, where the protein was provided from predominantly (71 %) animal (OMNI; n 9; six males, three females) or exclusively vegan (VEG; n 10; six males, four females; mycoprotein providing 57 % of daily protein intake) sources. During the dietary control period, participants conducted a daily bout of unilateral resistance-type leg extension exercise. Before the dietary control period, participants ingested 400 ml of deuterated water, with 50-ml doses consumed daily thereafter. Saliva samples were collected throughout to determine body water 2H enrichments, and muscle samples were collected from rested and exercised muscle to determine daily myofibrillar protein synthesis rates. Deuterated water dosing resulted in body water 2H enrichments of approximately 0·78 (sem 0·03) %. Daily myofibrillar protein synthesis rates were 13 (sem 8) (P = 0·169) and 12 (sem 4) % (P = 0·016) greater in the exercised compared with rested leg (1·59 (sem 0·12) v. 1·77 (sem 0·12) and 1·76 (sem 0·14) v. 1·93 (sem 0·12) %/d) in OMNI and VEG groups, respectively. Daily myofibrillar protein synthesis rates did not differ between OMNI and VEG in either rested or exercised muscle (P > 0·05). Over the course of a 3-d intervention, omnivorous- or vegan-derived dietary protein sources can support equivalent rested and exercised daily myofibrillar protein synthesis rates in healthy older adults consuming a high-protein diet.
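The deuterated-water method used here estimates daily myofibrillar protein synthesis from the rise in protein-bound 2H enrichment relative to the body-water precursor pool. A simplified sketch of the generic precursor-product calculation (illustrative numbers only; the study's exact equation, including any correction factor for precursor labelling, is not given in the abstract):

```python
def fsr_percent_per_day(e_bound, e_precursor, days):
    """Precursor-product fractional synthesis rate (%/day).

    e_bound: rise in protein-bound 2H enrichment over the period
    e_precursor: mean precursor (body-water-derived) 2H enrichment
    """
    return (e_bound / (e_precursor * days)) * 100

# Illustrative values only (not the study's raw data): body-water
# enrichment ~0.78% as reported; the protein-bound rise is assumed.
rate = fsr_percent_per_day(e_bound=0.0414, e_precursor=0.78, days=3)
```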
This study examined the differential impact of Hurricane Harvey on adolescent standardized Body Mass Index (zBMI), physical activity, diet, and perceived stress.
Prior to Hurricane Harvey, 175 ethnic minority adolescents were recruited from an independent school district in Houston. Height and weight were directly measured. The School Physical Activity and Nutrition Questionnaire assessed diet and physical activity. Stress was assessed with the Perceived Stress Scale. High hurricane impact was classified as at least 1 affirmative response to house damage, rescue, displacement, or going without food, water, or medicine. Repeated-measures ANCOVA models were developed to assess differences in zBMI, physical activity, diet, and stress between the hurricane impact groups. Regression models were used to assess stress as a mediator of the hurricane impact and zBMI change relationship.
Students who were highly impacted by the hurricane had a greater decrease in zBMI from pre-hurricane to 15 weeks post-hurricane than those less impacted (95% CI 0.02 to 0.25, p<0.05). Physical activity and diet did not differ by impact. Perceived stress at 3 weeks post-hurricane mediated the impact and zBMI change relationship (β = -0.04, 95% CI -0.12 to -0.002).
The decrease in zBMI among highly impacted students warrants further monitoring. Perceived stress, immediately following the hurricane, impacted student growth months later.
OBJECTIVES/GOALS: The overall goal of this study was to determine the effect of early life stress (ELS) on the intestinal CD4+ T cell immune compartment, at homeostasis and after induction of experimental Inflammatory Bowel Disease (IBD). METHODS/STUDY POPULATION: We used a mouse model of ELS, maternal separation with early weaning (MSEW). We used IL-10 reporter mice to enable analysis of IL-10-producing cells. Mice were examined on postnatal day 28 to determine the impact of ELS on gut regulatory T cells. Plasma levels of corticosterone (the rodent stress response hormone) were determined by ELISA. Colitis was induced in MSEW and normally reared (NR) mice via intraperitoneal injection of α-IL-10R every 5 days until day 15. Mice were euthanized on days 20 and 30. Colonic tissue sections were stained for histological analysis. Remaining tissue was further processed for flow cytometric analysis of CD4+ T cells and innate lymphoid cells. RESULTS/ANTICIPATED RESULTS: Plasma corticosterone was elevated in MSEW mice compared to their NR counterparts at 4 weeks of age. We observed that the MSEW stress protocol does not affect the baseline colonic CD4+ T cell or innate lymphoid cell populations. There was a reduction in the intestinal CD4+ T cells and regulatory T cells on day 20 in α-IL-10R MSEW mice compared to NR counterparts. This difference disappeared by day 30. Histological scoring showed no difference in disease severity between α-IL-10R-treated MSEW and NR mice on day 20. However, on day 30, when α-IL-10R NR mice are recovering from colitis, MSEW mice showed persistent histological inflammation, mainly attributable to sustained epithelial damage. DISCUSSION/SIGNIFICANCE OF IMPACT: Our results suggest that ELS prolongs intestinal inflammation and impairs epithelial repair. Future studies will focus on elucidating the mechanisms responsible for ELS-dependent impairment of mucosal repair in experimental colitis.
Organismal metabolic rates reflect the interaction of environmental and physiological factors. Thus, calcifying organisms that record growth history can provide insight into both the ancient environments in which they lived and their own physiology and life history. However, interpreting them requires understanding which environmental factors have the greatest influence on growth rate and the extent to which evolutionary history constrains growth rates across lineages. We integrated satellite measurements of sea-surface temperature and chlorophyll-a concentration with a database of growth coefficients, body sizes, and life spans for 692 populations of living marine bivalves in 195 species, set within the context of a new maximum-likelihood phylogeny of bivalves. We find that environmental predictors overall explain only a small proportion of variation in growth coefficient across all species; temperature is a better predictor of growth coefficient than food supply, and growth coefficient is somewhat more variable at higher summer temperatures. Growth coefficients exhibit moderate phylogenetic signal, and taxonomic membership is a stronger predictor of growth coefficient than any environmental predictor, but phylogenetic inertia cannot fully explain the disjunction between our findings and the extensive body of work demonstrating strong environmental control on growth rates within taxa. Accounting for evolutionary history is critical when considering shells as historical archives. The weak relationship between variation in food supply and variation in growth coefficient in our data set is inconsistent with the hypothesis that the increase in mean body size through the Phanerozoic was driven by increasing productivity enabling faster growth rates.
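The "growth coefficient" compiled for bivalve populations in databases like this is conventionally the k parameter of the von Bertalanffy growth equation (the abstract does not name the model, so this is an assumption). Under that convention, the growth curve can be sketched as:

```python
import math

def von_bertalanffy(t, l_inf, k, t0=0.0):
    """Length at age t under the von Bertalanffy growth equation.

    l_inf: asymptotic size; k: growth coefficient (1/yr);
    t0: theoretical age at zero size.
    """
    return l_inf * (1 - math.exp(-k * (t - t0)))

# Illustrative only: two hypothetical bivalves with the same asymptotic
# size but different growth coefficients, compared at age 5.
fast = von_bertalanffy(t=5, l_inf=100, k=0.5)
slow = von_bertalanffy(t=5, l_inf=100, k=0.1)
```

A higher k means the shell approaches its asymptotic size sooner, which is why k (rather than final body size) is the quantity compared against temperature and chlorophyll-a in the study.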
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Overweight and obesity may increase risk of disease progression in men with prostate cancer, but there have been few studies of weight loss interventions in this patient group. In this study overweight or obese men treated for prostate cancer were randomised to a self-help diet and activity intervention with telephone-based dietitian support or a wait-list mini-intervention group. The intervention group had an initial group meeting, a supporting letter from their urological consultant, three telephone dietitian consultations at 4-week intervals, a pedometer and access to web-based diet and physical activity resources. At 12 weeks, men in both groups were given digital scales for providing follow-up weight measurements, and the wait-list group received a mini-intervention of the supporting letter, a pedometer and access to the web-based resources. Sixty-two men were randomised; fifty-four completed baseline and 12-week measurements, and fifty-one and twenty-seven provided measurements at 6 and 12 months, respectively. In a repeated-measures model, mean difference in weight change between groups (wait-list mini-intervention minus intervention) at 12 weeks was −2·13 (95 % CI −3·44, −0·82) kg (P = 0·002). At 12 months the corresponding value was −2·43 (95 % CI −4·50, −0·37) kg (P = 0·022). Mean difference in global quality of life score change between groups at 12 weeks was 12·3 (95 % CI 4·93, 19·7) (P = 0·002); at 12 months there were no significant differences between groups. Results suggest the potential of self-help diet and physical activity intervention with trained support for modest but sustained weight loss in this patient group.
We evaluated provider adherence to practice guidelines for inpatients diagnosed with Clostridioides difficile infection (CDI) before and after implementation of a best practice alert (BPA) linking a positive test result to guideline-based orders. After implementation of the BPA, guideline-based prescribing increased from 39.4% in 2013 to 67.7% in 2016 (P = .014).
Due to concerns over increasing fluoroquinolone (FQ) resistance among gram-negative organisms, our stewardship program implemented a preauthorization use policy. The goal of this study was to assess the relationship between hospital FQ use and antibiotic resistance.
Large academic medical center.
We performed a retrospective analysis of FQ susceptibility of hospital isolates for 5 common gram-negative bacteria: Acinetobacter spp., Enterobacter cloacae, Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa. The primary endpoint was the change in FQ susceptibility. A Poisson regression model was used to calculate the rate of change between the preintervention period (1998–2005) and the postimplementation period (2006–2016).
FQ susceptibility declined sharply beginning in 1998, particularly among P. aeruginosa, Acinetobacter spp., and E. cloacae. Our FQ restriction policy reduced FQ use from 173 days of therapy (DOT) per 1,000 patient days to <60 DOT per 1,000 patient days. Fluoroquinolone susceptibility increased for Acinetobacter spp. (rate ratio [RR], 1.038; 95% confidence interval [CI], 1.005–1.072), E. cloacae (RR, 1.028; 95% CI, 1.013–1.044), and P. aeruginosa (RR, 1.013; 95% CI, 1.006–1.020). No significant change in susceptibility was detected for K. pneumoniae (RR, 1.002; 95% CI, 0.996–1.008), and the susceptibility for E. coli continued to decline, although the decline was not as steep (RR, 0.981; 95% CI, 0.975–0.987).
A stewardship-driven FQ restriction program stopped overall declining FQ susceptibility rates for all species except E. coli. For 3 species (ie, Acinetobacter spp, E. cloacae, and P. aeruginosa), susceptibility rates improved after implementation, and this improvement has been sustained over a 10-year period.
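The study's rate ratios come from Poisson regression over yearly susceptibility data. A much-reduced two-period analogue (a rate ratio with a Wald confidence interval on the log scale, using hypothetical counts rather than the study's isolate data) conveys the core calculation:

```python
import math

def rate_ratio_ci(events1, denom1, events2, denom2, z=1.96):
    """Rate ratio (period 2 vs period 1) with a Wald CI on the log scale.

    Assumes event counts are Poisson-distributed, so
    se(log RR) = sqrt(1/events1 + 1/events2).
    """
    rr = (events2 / denom2) / (events1 / denom1)
    se = math.sqrt(1 / events1 + 1 / events2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts only: susceptible isolates / isolates tested per period
rr, lo, hi = rate_ratio_ci(events1=600, denom1=1000, events2=720, denom2=1000)
```

An RR above 1 with a CI excluding 1 (as here) indicates improving susceptibility, as reported for Acinetobacter spp., E. cloacae, and P. aeruginosa.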
The value of the nosological distinction between non-affective and affective psychosis has frequently been challenged. We aimed to investigate the transdiagnostic dimensional structure and associated characteristics of psychopathology at First Episode Psychosis (FEP). Regardless of diagnostic categories, we expected that positive symptoms would occur more frequently in ethnic minority groups and in more densely populated environments, and that negative symptoms would be associated with indices of neurodevelopmental impairment.
This study included 2182 FEP individuals recruited across six countries, as part of the EUropean network of national schizophrenia networks studying Gene–Environment Interactions (EU-GEI) study. Symptom ratings were analysed using multidimensional item response modelling in Mplus to estimate five theory-based models of psychosis. We used multiple regression models to examine demographic and context factors associated with symptom dimensions.
A bifactor model, composed of one general factor and five specific dimensions of positive, negative, disorganization, manic and depressive symptoms, best represented associations among ratings of psychotic symptoms. Positive symptoms were more common in ethnic minority groups. Urbanicity was associated with a higher score on the general factor. Men presented with more negative and less depressive symptoms than women. Early age-at-first-contact with psychiatric services was associated with higher scores on negative, disorganized, and manic symptom dimensions.
Our results suggest that the bifactor model of psychopathology holds across diagnostic categories of non-affective and affective psychosis at FEP, and demographic and context determinants map onto general and specific symptom dimensions. These findings have implications for tailoring symptom-specific treatments and inform research into the mood-psychosis spectrum.
Intermediate wheatgrass (Thinopyrum intermedium; IWG) is a perennial cereal crop undergoing development for grain production; however, grain yield declines of >75% are often observed after year 2 of the perennial stand and may be linked to soil nutrient depletion. Intercropping IWG with a perennial legume such as alfalfa (Medicago sativa) could benefit nutrient cycling while increasing agroecological diversity. Intermediate wheatgrass was established at five environmentally diverse sites in Minnesota, USA in (1) bi-culture with alfalfa, (2) non-fertilized monoculture and (3) monoculture fertilized annually in the spring with 80 kg N/ha. At northern sites where alfalfa growth was favoured, IWG grain yields were reduced in year 2 by growing IWG in bi-culture with alfalfa, relative to the monoculture systems. Across all sites IWG grain yield decreased by 90% in the non-fertilized monoculture, 80% in the fertilized monoculture and 65% in the bi-culture from year 2 to 4 and plant macronutrient concentrations decreased by 25–70%. In year 4, IWG grain yield was similar or greater in the bi-culture than the fertilized monoculture at three of the five sites and alfalfa biomass was correlated positively with grain yield, harvest index and nutrient uptake in the year 4 bi-culture. Chemical-nitrogen fertilization increased grain yields in year 2 but did not mitigate the decline in yields as stands aged. Intermediate wheatgrass in the bi-culture had similar yields and nutrient uptake and lower yield declines than the chemically fertilized stand at sites where alfalfa growth was maintained throughout the life of the stand.
Accurate weed emergence models are valuable tools for scheduling planting, cultivation, and herbicide applications. Multiple models predicting giant ragweed emergence have been developed, but none have been validated in diverse crop rotation and tillage systems, which have the potential to influence weed emergence patterns. This study evaluated the performance of published giant ragweed emergence models across various crop rotations and spring tillage dates in southern Minnesota. Across experiments, the most robust model was a mixed-effects Weibull (flexible sigmoidal function) model predicting emergence in relation to hydrothermal time accumulation with a base temperature of 4.4 C, a base soil matric potential of −2.5 MPa, and two random effects determined by overwinter growing degree days (GDD) (10 C) and precipitation accumulated during seedling recruitment. The deviations in emergence between individual plots and the fixed-effects model were distinguished by the positive association between the lower horizontal asymptote (Drop) and maximum daily soil temperature during seedling recruitment. This finding indicates that crops and management practices that increase soil temperature will have a shorter lag phase at the start of giant ragweed emergence compared with practices promoting cool soil temperatures. Thus, crops with early-season crop canopies such as perennial crops and crops planted in early spring and in narrow rows will likely have a slower progression of giant ragweed emergence. This research provides a valuable assessment of published giant ragweed emergence models and illustrates that accurate emergence models can be used to time field operations and improve giant ragweed control across diverse cropping systems.
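The selected model accumulates hydrothermal time only on days when soil temperature exceeds the 4.4 C base and matric potential exceeds the −2.5 MPa base, then maps that accumulation through a sigmoidal Weibull function. A sketch under those thresholds with hypothetical Weibull parameters (the published model's fitted values, random effects, and exact parameterization are not reproduced here):

```python
import math

T_BASE = 4.4      # base soil temperature, deg C (from the study)
PSI_BASE = -2.5   # base soil matric potential, MPa (from the study)

def hydrothermal_time(temps, psis):
    """Accumulate daily hydrothermal time: degree-days count only on days
    when both soil temperature and matric potential exceed their bases."""
    return sum(
        t - T_BASE
        for t, psi in zip(temps, psis)
        if t > T_BASE and psi > PSI_BASE
    )

def weibull_emergence(htt, asym=100.0, lrc=-7.0, pwr=2.0):
    """Cumulative % emergence as a sigmoidal Weibull function of
    hydrothermal time (asym, lrc, pwr are hypothetical values)."""
    return asym * (1 - math.exp(-math.exp(lrc) * htt ** pwr))

daily_temp = [3.0, 6.0, 10.0, 12.0, 15.0]   # deg C, illustrative
daily_psi = [-0.1, -0.1, -3.0, -0.2, -0.3]  # MPa, illustrative
htt = hydrothermal_time(daily_temp, daily_psi)
pct = weibull_emergence(htt)
```

Days 1 and 3 contribute nothing (too cold, too dry, respectively), illustrating why practices that warm or moisten the seedbed shift predicted emergence earlier.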
Alfalfa is recommended as a rotational crop in corn production, due to its ability to contribute to soil nitrogen (N) and carbon (C) stocks through atmospheric N2 fixation and above- and belowground biomass production. However, there is little information on how alfalfa management practices affect contributions to soil and subsequent corn crop yields, and research has not been targeted to organic systems. A study was conducted to determine the effects of alfalfa stand age, cutting frequency and biomass removal on soil C and N status and corn yields at three organically managed Minnesota locations. In one experiment, five cutting treatments were applied in nine environments: two, three and four cuts with biomass removal; three cuts with biomass remaining in place; and a no-cut control. In the other experiment, corn was planted following 1-, 2-, 3- or 4-year-old alfalfa stands and a no-alfalfa control. Yield was measured in the subsequent corn crop. In the cutting experiment, the two- and three-cut treatments with biomass removal reduced soil mineral N by 12.6 and 11.5%, respectively, compared with the control. Potentially mineralizable N (PMN) was not generally affected by cutting treatments. The three-cut no-removal treatment increased potentially mineralizable C by 17% compared with the other treatments, but lowered soil total C in two environments, suggesting a priming effect in which addition of alfalfa biomass stimulated microbial mineralization of native soil C. Although both yields and soil mineral N tended to be higher in treatments where biomass remained in place, this advantage was small and inconsistent, indicating that farmers need not forgo hay harvest to obtain the rotational benefits of an alfalfa stand. The lack of overall correlation between corn grain yields and mineral and potentially mineralizable N suggests that alfalfa N contribution was not the driver of the yield increase in the no-removal treatments.
Alfalfa stand age had inconsistent effects on fall-incorporated N and soil N and C parameters. Beyond the first year, increased alfalfa stand age did not increase soil mineral N or PMN. However, corn yield increased following older stands. Yields were 29, 77 and 90% higher following first-, second- and third-year alfalfa stands than the no-alfalfa control, respectively. This indicates that alfalfa may benefit succeeding corn through mechanisms other than N contribution, potentially including P solubilization and weed suppression. These effects have been less studied than N credits, but are of high value in organic cropping systems.
Efforts to address health disparities and achieve health equity are critically dependent on the development of a diverse research workforce. However, many researchers from underrepresented backgrounds face challenges in advancing their careers, securing independent funding, and finding the mentorship needed to expand their research.
Faculty from the University of Maryland at College Park and the University of Wisconsin-Madison developed and evaluated an intensive week-long research and career-development institute—the Health Equity Leadership Institute (HELI)—with the goal of increasing the number of underrepresented scholars who can sustain their ongoing commitment to health equity research.
From 2010 to 2016, HELI brought 145 diverse scholars (78% from an underrepresented background; 81% female) together to engage with each other and learn from supportive faculty. Overall, scholar feedback was highly positive on all survey items, with average agreement ratings of 4.45–4.84 on a 5-point Likert scale. Eighty-five percent of scholars remain in academic positions. In the first three cohorts, 73% of HELI participants have been promoted and 23% have secured independent federal funding.
HELI includes an evidence-based curriculum to develop a diverse workforce for health equity research. For those institutions interested in implementing such an institute to develop and support underrepresented early stage investigators, a resource toolbox is provided.
In this case study, we evaluated a point-mapping method for simultaneously collecting data while controlling three invasive woody plant species: black locust, Chinese privet, and hardy orange. The study in Arkansas Post National Memorial included seven project areas ranging in size from 2.7 to 27.3 ha and spanned six field seasons (2010 to 2015). The control techniques varied depending on plant size and always included the application of herbicide, which also varied over the course of the study to include glyphosate, imazapyr, and triclopyr. Each person responsible for controlling plants simultaneously collected global positioning system point data to estimate the foliar cover of the plants treated. The resulting data demonstrated evidence of decreases in all three plant species in most project areas during the 6-yr period. Initial increases in area treated for some species–area combinations reflected differences in the preliminary efforts required to control invasive plants in entire project areas, but by 2012 six of seven project areas were treated in their entirety. Despite a high level of reduction, in some cases, the plants persisted at low levels even during the sixth year of the project. Our findings support the ability of this method to granularly detect changes in plant abundance while simultaneously controlling invasive plants. With several acknowledged limitations, this streamlined project-based monitoring approach provides data that allow managers to assess the effectiveness of weed control treatments.