High-resolution digital elevation models of Finland and Sweden based on LiDAR (Light Detection and Ranging) reveal subglacial landforms in great detail. We describe the ice-sheet-scale distribution and morphometric characteristics of a glacial landform that is distinctive in morphology and occurs commonly in the central parts of the former Scandinavian Ice Sheet, especially up-ice of the Younger Dryas end moraine zone. We refer to these triangular or V-shaped landforms as murtoos (singular, ‘murtoo’). Murtoos are typically 30–200 m in length and 30–200 m in width, with relief commonly <5 m. They have straight and steep edges, a triangular tip oriented parallel to the ice-flow direction, and an asymmetric longitudinal profile with a shorter but steeper down-ice slope. The spatial distribution of murtoos and their geomorphic relation to other landforms indicate that they formed subglacially during times of climate warming and rapid retreat of the Scandinavian Ice Sheet, when large amounts of meltwater were delivered to the bed. Murtoos formed under warm-based ice and may be associated with a non-channelized subglacial hydraulic system that evacuated large discharges of subglacial water.
Pigweed is difficult to manage in grain sorghum because of widespread herbicide resistance, a limited number of registered effective herbicides, and the synchronous emergence of pigweed with grain sorghum in Kansas. The combination of cultural and mechanical control tactics with an herbicide program is commonly recognized as a best management strategy; however, limited information is available to adapt these strategies to dryland systems. Our objective for this research was to assess the influence of four components on pigweed control in a dryland system: a winter wheat cover crop (CC), row-crop cultivation, three row widths, and the presence or absence of an herbicide program. Field trials were implemented during 2017 and 2018 at three locations for a total of 6 site-years. The herbicide program component resulted in excellent control (>97%) in all treatments at 3 and 8 weeks after planting (WAP). The CC provided approximately 50% reductions in pigweed density and biomass at both timings in half of the site-years; however, mixed results were observed in the remaining site-years, ranging from no attributable difference to a 170% increase in weed density at 8 WAP in 1 site-year. Treatments including row-crop cultivation reduced pigweed biomass and density in most site-years at 3 and 8 WAP. An herbicide program is required to achieve pigweed control and should be integrated with row-crop cultivation or narrow row widths to reduce the risk of herbicide resistance. Additional research is required to optimize the use of a CC as an integrated pigweed management strategy in dryland grain sorghum.
Successful pigweed management requires an integrated strategy to delay the development of resistance to any single control tactic. Field trials were implemented during 2017 and 2018 in three counties in Kansas in dryland (limited-rainfall, nonirrigated), glufosinate-resistant soybean. The objective was to assess pigweed control with combinations of a winter wheat cover crop (CC), three soybean row widths (76, 38, and 19 cm), row-crop cultivation 2.5 weeks after planting (WAP), and an herbicide program to develop integrated pigweed management recommendations. All combinations of the four components were assessed across 16 treatments. All treatments with the herbicide program resulted in excellent (>97%) pigweed control and were analyzed separately from the other components. Treatments containing row-crop cultivation reduced pigweed density and biomass at 3 and 8 WAP in all locations compared with the 76-cm row width plus no-CC treatment. CC impacts were mixed. In Riley County, Palmer amaranth density and biomass were reduced; in Reno County, no additional Palmer amaranth control was observed; in Franklin County, treatments containing the CC had greater waterhemp density and biomass compared with treatments containing no CC. Narrow row widths achieved the most consistent results of all cultural components when data were pooled across locations: decreasing row width from 76 to 38 cm resulted in a 23% reduction in pigweed biomass at 8 WAP, and decreasing row width from 38 to 19 cm achieved a 15% reduction. Row-crop cultivation should be incorporated where possible as a mechanical option to manage pigweed, and narrow row widths should be used to suppress late-season pigweed growth when feasible. Pigweed control from the CC was inconsistent and should be given special consideration before implementation. The integrated use of these components with an herbicide program as a system is recommended to achieve the best pigweed control and reduce the risk of developing herbicide resistance.
The Psychiatric Genomics Consortium (PGC) has made major advances in the molecular etiology of major depressive disorder (MDD), confirming that MDD is highly polygenic. Pathway enrichment results from PGC meta-analyses can also be used to help inform molecular drug targets. Prior to any knowledge of molecular biomarkers for MDD, drugs targeting molecular pathways (MPs) proved successful in treating MDD. It is possible that examining polygenicity within specific MPs implicated in MDD can further refine molecular drug targets.
Using a large case–control GWAS based on low-coverage whole-genome sequencing (N = 10 640) in Han Chinese women, we derived polygenic risk scores (PRS) for MDD and for MDD specific to each of over 300 MPs previously shown to be relevant to psychiatric diagnoses. We then identified sets of PRSs that, accounting for critical covariates, significantly predicted case status.
Over and above global MDD polygenic risk, polygenic risk within the GO:0017144 drug metabolism pathway significantly predicted recurrent depression after multiple testing correction. Secondary transcriptomic analysis suggests that among genes in this pathway, CYP2C19 (a member of the cytochrome P450 family) and CBR1 (carbonyl reductase 1) might be most relevant to MDD. Within the cases, pathway-based risk was additionally associated with age at onset of MDD.
Results indicate that pathway-based risk might inform etiology of recurrent major depression. Future research should examine whether polygenicity of the drug metabolism gene pathway has any association with clinical presentation or treatment response. We discuss limitations to the generalizability of these preliminary findings, and urge replication in future research.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Positive genetic correlation was observed between MD and AD (rg(MD-AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (rg(AD-AC quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (rg(MD-AC quantity) = +0.14, P = 2.9 × 10⁻⁷), while AC-frequency was negatively correlated with MD (rg(MD-AC frequency) = −0.17, P = 1.5 × 10⁻¹⁰) and not significantly correlated with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD-AD results reflect a mediated-pleiotropy mechanism (i.e. a causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
This study tested the effect of a behavioural economics intervention in two food pantries on the nutritional quality of the foods available at the pantries and of the foods selected by adults visiting them.
An intervention (SuperShelf) was implemented in two food pantries (Sites A and B), with two other pantries (Sites C and D) serving as a control for pantry outcomes. The intervention aimed to increase the amount and variety of healthy foods (supply), as well as the appeal of healthy foods (demand) using behavioural economics strategies. Assessments included baseline and 4-month follow-up client surveys, client cart inventories, pantry inventories and environmental assessments. A fidelity score (range 0–100) was assigned to each intervention pantry to measure the degree of implementation. A Healthy Eating Index-2010 (HEI-2010) score (range 0–100) was generated for each client cart and pantry.
The setting was four Minnesota food pantries, USA.
Participants were clients visiting the intervention pantries before (n 71) and after (n 70) the intervention.
Fidelity scores differed by intervention site (Site A=82, Site B=51). At Site A, in adjusted models, client cart HEI-2010 scores increased on average by 11·8 points (P<0·0001), whereas there was no change at Site B. HEI-2010 pantry environment scores increased in intervention pantries (Site A=8 points, Site B=19 points) and decreased slightly in control pantries (Site C=−4 points, Site D=−3 points).
When implemented as intended, SuperShelf has the potential to improve the nutritional quality of foods available to and selected by pantry clients.
Double-crop grain sorghum after winter wheat harvest is a common cropping system in the southern plains region. Palmer amaranth is a troublesome weed in double-crop grain sorghum in Kansas. Populations resistant to various herbicides (e.g., atrazine, glyphosate, metsulfuron, pyrasulfotole) have made Palmer amaranth management even more difficult for producers. To evaluate control of atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum, we assessed 14 herbicide programs, of which 8 were PRE only and 6 were PRE followed by (fb) POST applications. Visible ratings of Palmer amaranth control were taken at 3 and 8 wk after planting (WAP) grain sorghum. PRE treatments containing very-long-chain fatty acid (VLCFA)–inhibiting herbicides provided 91% control of atrazine-resistant Palmer amaranth 3 WAP and reduced weed density 8 WAP compared to atrazine-only PRE treatments. PRE fb POST treatments, especially those that included VLCFA-inhibiting herbicides, provided greater control (71% to 93%) of both atrazine-resistant and atrazine-susceptible Palmer amaranth at 8 WAP compared to PRE treatments alone (59% to 79%). These results demonstrate the utility of VLCFA-inhibiting herbicides applied PRE and in a layered PRE fb POST approach for controlling atrazine-resistant Palmer amaranth, as well as the importance of an effective POST application following residual PRE herbicides for controlling both atrazine-resistant and atrazine-susceptible Palmer amaranth in double-crop grain sorghum.
Double-crop soybean after winter wheat is a component of many cropping systems across eastern and central Kansas. Until recently, control of Palmer amaranth and common waterhemp was both easy and economical with the use of sequential applications of glyphosate in glyphosate-resistant soybean. Many populations of Palmer amaranth and common waterhemp have become resistant to glyphosate. During 2015 and 2016, a total of five field experiments were conducted near Manhattan, Hutchinson, and Ottawa, KS, to assess various non-glyphosate herbicide programs at three different application timings for the control of Palmer amaranth and waterhemp in double-crop soybean after winter wheat. Spring-POST treatments of pyroxasulfone (119 g ai ha⁻¹) and pendimethalin (1065 g ai ha⁻¹) were applied to winter wheat to evaluate residual control of Palmer amaranth and waterhemp. Less than 40% control of Palmer amaranth and waterhemp was observed in both treatments 2 wk after planting (WAP) double-crop soybean. Preharvest treatments of 2,4-D (561 g ae ha⁻¹) and flumioxazin (107 g ai ha⁻¹) were also applied to the winter wheat to assess control of emerged Palmer amaranth and waterhemp. 2,4-D resulted in highly variable Palmer amaranth and waterhemp control, whereas flumioxazin resulted in control similar to PRE treatments that contained paraquat (841 g ai ha⁻¹) plus residual herbicide(s). Excellent control of both species was observed 2 WAP with a PRE paraquat application; however, reduced control of Palmer amaranth and waterhemp was noted 8 WAP due to subsequent emergence. Results indicate that Palmer amaranth and waterhemp control was 85% or greater 8 WAP for PRE treatments that included a combination of paraquat plus residual herbicide(s). PRE treatments that did not include both paraquat and residual herbicide(s) did not provide acceptable control.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimized time-to-detection while maximizing positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
Synchrotron X-ray diffraction was used to monitor the hydrothermal precipitation of akaganeite (β-FeOOH) and its transformation to hematite (Fe₂O₃) in situ. Akaganeite was the first phase to form and hematite was the final phase in our experiments at temperatures between 150 and 200 °C. Akaganeite was the only phase that formed at 100 °C. Rietveld analyses revealed that the akaganeite unit-cell volume contracted until the onset of dissolution, and subsequently expanded. This reversal at the onset of dissolution was associated with a substantial and rapid increase in occupancy of the Cl site, perhaps by OH⁻ or Fe³⁺. Rietveld analyses supported the incipient formation of an OH-rich, Fe-deficient hematite phase in experiments between 150 and 200 °C. The inferred H concentrations of the first crystals were consistent with “hydrohematite.” With continued crystal growth, the Fe occupancies increased. Contraction along both the a- and c-axes signaled the loss of hydroxyl groups and the formation of a nearly stoichiometric hematite.
Plant growth stage and temperature influence the activity of glyphosate on common lambsquarters. A biotype of common lambsquarters from Dickinson County, KS (DK) was not controlled upon treatment with glyphosate in the field. In a greenhouse dose–response study, the DK biotype was 1.5-fold less sensitive to glyphosate than a known susceptible biotype from Riley County, KS (RL). Common lambsquarters plants were treated at different growth stages (5 to 7, 10 to 12, 15 to 17, or 19 to 21 cm tall) with glyphosate at the field rate (840 g ae ha⁻¹), and, regardless of biotype, plants were more susceptible to glyphosate when they were 5 to 7 cm tall. Common lambsquarters plants were treated with glyphosate (840 g ae ha⁻¹) after growing at different temperatures (25/15, 32.5/22.5, or 40/30 C day/night), and regardless of biotype, plants were more susceptible to glyphosate when grown at 25/15 C. The results suggest that the DK biotype exhibits reduced sensitivity to glyphosate compared to the RL biotype, and that glyphosate applied at the field rate would be more effective on smaller common lambsquarters plants and at cooler temperatures. Common lambsquarters seedlings tend to emerge when temperatures are cooler, early in the spring relative to other summer annual weeds. Therefore, plants should be identified and treated earlier in the growing season for best efficacy with glyphosate.
Animal and cross-sectional epidemiological studies suggest that prenatal lead exposure is related to delayed menarche, but this has not been confirmed in longitudinal studies. We analyzed this association among 200 girls from Mexico City who had been followed since the first trimester of gestation. Maternal blood lead levels were analyzed once during each trimester of pregnancy, and daughters were asked about their first menstrual cycle at a visit between the ages of 9.8 and 18.1 years. We estimated hazard ratios (HRs) and 95% confidence intervals (CIs) for the probability of menarche over the follow-up period using interval-censored Cox models, comparing those with prenatal blood lead levels ⩾5 µg/dl to those with prenatal blood lead <5 µg/dl. We also estimated HRs and 95% CIs with conventional Cox regression models, which utilized the self-reported age at menarche. In adjusted analyses, we accounted for maternal age, maternal parity, maternal education, and prenatal calcium treatment status. Across trimesters, 36–47% of mothers had blood lead levels ⩾5 µg/dl. Using interval-censored models, we found that during the second trimester only, girls with prenatal blood lead ⩾5 µg/dl had a later age at menarche compared with girls with prenatal blood lead levels <5 µg/dl (confounder-adjusted HR=0.59, 95% CI 0.28–0.90; P=0.05). Associations were in a similar direction, although not statistically significant, in the conventional Cox regression models, potentially indicating measurement error in the self-recalled age at menarche. In summary, higher prenatal lead exposure during the second trimester could be related to later onset of sexual maturation.
This study investigates relations of maternal N-3 and N-6 polyunsaturated fatty acids (PUFA) intake during pregnancy with offspring body mass index (BMI), height z-score and metabolic risk (fasting glucose, C-peptide, leptin, lipid profile) during peripuberty (8–14 years) among 236 mother–child pairs in Mexico. We used food frequency questionnaire data to quantify trimester-specific intake of N-3 alpha-linolenic acid, eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA); N-6 linoleic acid and arachidonic acid (AA); and N-6:N-3 (AA:EPA+DHA), which accounts for the fact that the two PUFA families have opposing effects on physiology. Next, we used multivariable linear regression models that accounted for maternal education and parity, and child’s age, sex and pubertal status, to examine associations of PUFA intake with the offspring outcomes. In models where BMI z-score was the outcome, we also adjusted for height z-score. We found that higher second trimester intake of EPA, DHA and AA were associated with lower offspring BMI and height z-score. For example, each 1-s.d. increment in second trimester EPA intake corresponded with 0.25 (95% CI: 0.03, 0.47) z-scores lower BMI and 0.20 (0.05, 0.36) z-scores lower height. Accounting for height z-score in models where BMI z-score was the outcome attenuated estimates [e.g., EPA: −0.16 (−0.37, 0.05)], suggesting that this relationship was driven by slower linear growth rather than excess adiposity. Maternal PUFA intake was not associated with the offspring metabolic biomarkers. Our findings suggest that higher PUFA intake during mid-pregnancy is associated with lower attained height in offspring during peripuberty. Additional research is needed to elucidate mechanisms and to confirm findings in other populations.
Exotic annual grasses such as medusahead [Taeniatherum caput-medusae (L.) Nevski] and downy brome (Bromus tectorum L.) dominate millions of hectares of grasslands in the western United States. Applying picloram, aminopyralid, and other growth regulator herbicides at late growth stages reduces seed production of most exotic annual grasses. In this study, we applied aminopyralid to T. caput-medusae to determine how reducing seed production in the current growing season influenced cover in the subsequent growing season. At eight annual grassland sites, we applied aminopyralid at 55, 123, and 245 g ae ha⁻¹ in spring just before T. caput-medusae heading. The two higher rates were also applied pre-emergence (PRE) in fall to allow comparisons with this previously tested timing. When applied in spring during the roughly 10-d period between the flag leaf and the inflorescence first becoming visible, just 55 g ae ha⁻¹ of aminopyralid greatly limited seed production and subsequently reduced T. caput-medusae cover to nearly zero. Fall aminopyralid applications were less effective against T. caput-medusae, even at a rate of 245 g ae ha⁻¹. In the growing season of application, fall treatments, but not spring treatments, sometimes reduced cover of desirable winter annual forage grasses. In the growing season after application, both spring and fall treatments tended to increase forage grasses, though spring treatments generally caused larger increases. Compared with other herbicide treatment options, preheading aminopyralid treatments are a relatively inexpensive, effective approach for controlling T. caput-medusae and increasing forage production.
Habits are behavioral routines that are automatic and frequent, relatively independent of any desired outcome, and have potent antecedent cues. Among individuals with anorexia nervosa (AN), behaviors that promote the starved state appear habitual, and this is the foundation of a recent neurobiological model of AN. In this proof-of-concept study, we tested the habit model of AN by examining the impact of an intervention focused on antecedent cues for eating disorder routines.
The primary intervention target was habit strength; we also measured clinical impact via eating disorder psychopathology and actual eating. Twenty-two hospitalized patients with AN were randomly assigned to 12 sessions of either Supportive Psychotherapy or a behavioral intervention aimed at cues for maladaptive behavioral routines, Regulating Emotions and Changing Habits (REaCH).
Covarying for baseline, REaCH was associated with a significantly lower Self-Report Habit Index (SRHI) score and a significantly lower Eating Disorder Examination-Questionnaire (EDE-Q) global score at the end of treatment. The end-of-treatment effect size for the SRHI was d = 1.28, for the EDE-Q d = 0.81, and for caloric intake d = 1.16.
REaCH changed habit strength of maladaptive routines more than an active control therapy, and targeting habit strength yielded improvement in clinically meaningful measures. These findings support a habit-based model of AN, and suggest habit strength as a mechanism-based target for intervention.
Post-patent ductus arteriosus ligation syndrome is common, but hypertension following ductal ligation has rarely been described, and its mechanism is unclear. We report a case of an infant who exhibited features of post-patent ductus arteriosus ligation syndrome and hypertension but was found to have bilateral renal artery stenosis. Increased systemic vascular resistance can be masked by the parallel circuit physiology of a patent ductus arteriosus.
Identifying genetic relationships between complex traits in emerging adulthood can provide useful etiological insights into risk for psychopathology. College-age individuals are under-represented in genomic analyses thus far, and the majority of work has focused on clinical disorders or cognitive abilities rather than normal-range behavioral outcomes.
This study examined a sample of emerging adults 18–22 years of age (N = 5947) to construct an atlas of polygenic risk for 33 traits predicting relevant phenotypic outcomes. Twenty-eight hypotheses were tested based on the previous literature on samples of European ancestry, and the availability of rich assessment data allowed for polygenic predictions across 55 psychological and medical phenotypes.
Polygenic risk for schizophrenia (SZ) in emerging adults predicted anxiety, depression, nicotine use, trauma, and family history of psychological disorders. Polygenic risk for neuroticism predicted anxiety, depression, phobia, panic, neuroticism, and was correlated with polygenic risk for cardiovascular disease.
These results demonstrate the extensive impact of genetic risk for SZ, neuroticism, and major depression on a range of health outcomes in early adulthood. Minimal cross-ancestry replication of these phenomic patterns of polygenic influence underscores the need for more genome-wide association studies of non-European populations.