Weed management during spring crop production in eastern Washington presents many challenges. Many spring crops are weak competitors with weeds. In May of 2010 and 2011, two spring crop trials were initiated near Pullman, WA, to compare the relative competitiveness of barley (Hordeum vulgare L.), wheat (Triticum aestivum L.), lentil (Lens culinaris Medik.), and pea (Pisum sativum L.) using cultivated oat (Avena sativa L.) as a surrogate for wild oat (Avena fatua L.) competition. The experiment was arranged as a split-block split-plot design with four replications. One set of main plots included three oat density treatments (0, 63, and 127 plants m−2), while a second set included each crop species. Crop species main plots were then split into subplots of two different seeding rates (recommended and doubled). Crop populations decreased as oat density increased and increased as crop seeding rate increased. As oat density increased, preharvest crop biomass decreased for all crops, while oat biomass and yield increased. Oat biomass and yield were greater in legume plots compared with cereal plots. Increasing oat density decreased yields for all crops, whereas doubling crop seeding rate increased yields for barley and wheat in 2010 and barley in 2011. Compared with legumes, cereals were taller, produced more biomass, and were more competitive with oat.
Quantifying the rate of wave attenuation in sea ice is key to understanding trends in the Antarctic marginal ice zone extent. However, a paucity of observations of waves in sea ice limits progress on this front. We deployed 14 waves-in-ice observation systems (WIIOS) on Antarctic sea ice during the Polynyas, Ice Production, and seasonal Evolution in the Ross Sea expedition (PIPERS) in 2017. The WIIOS provide in situ measurement of surface wave characteristics. Two experiments were conducted, one while the ship was inbound and one outbound. The sea ice throughout the experiments generally consisted of pancake and young ice <0.5 m thick. The WIIOS survived a minimum of 4 d and a maximum of 6 weeks. Several large-wave events were captured, with the largest recorded significant wave height over 9 m. We find that the total wave energy measured by the WIIOS generally decays exponentially in the ice and the rate of decay depends on ice concentration.
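The exponential decay model implied here, E(x) = E0·exp(−αx), can be illustrated with an ordinary least-squares fit of log wave energy against distance into the ice. This is a hypothetical sketch with synthetic numbers (the function name, units and data are invented), not the PIPERS analysis code:

```python
import math

def fit_attenuation(distances_km, energies):
    """Least-squares fit of ln(E) = ln(E0) - alpha * x for the model
    E(x) = E0 * exp(-alpha * x). Returns (alpha, E0).
    Illustrative only; the study's actual decay-rate estimation is not shown."""
    ys = [math.log(e) for e in energies]
    n = len(distances_km)
    mx = sum(distances_km) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in distances_km)
    sxy = sum((x - mx) * (y - my) for x, y in zip(distances_km, ys))
    slope = sxy / sxx
    return -slope, math.exp(my - slope * mx)

# Synthetic wave-energy record decaying with distance into the ice
dists = [0.0, 5.0, 10.0, 20.0]                        # km from the ice edge
energies = [4.0 * math.exp(-0.1 * d) for d in dists]  # alpha = 0.1 per km
alpha, e0 = fit_attenuation(dists, energies)          # recovers alpha and E0
```

In practice the decay rate would be estimated per frequency band and then related to ice concentration, as the abstract indicates.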
Australian conservation cropping systems are practiced on very large farms (approximately 3,000 ha) where herbicides are relied on for effective and timely weed control. In many fields, though, there are low weed densities (e.g., <1.0 plant 10 m−2) and whole-field herbicide treatments are wasteful. For fallow weed control, commercially available weed detection systems provide the opportunity for site-specific herbicide treatments, removing the need for whole-field treatment of fallow fields with low weed densities. Concerns about the sustainability of herbicide-reliant weed management systems remain, and there has not been interest in the use of weed detection systems for alternative weed control technologies, such as targeted tillage. In this paper, we discuss the use of a targeted tillage technique for site-specific weed control in large-scale crop production systems. Three small-scale prototypes were used for engineering and weed control efficacy testing across a range of species and growth stages. With confidence established in the design approach and a demonstrated 100% weed-control potential, a 6-m wide pre-commercial prototype, the “Weed Chipper,” was built incorporating commercially available weed-detection cameras for practical field-scale evaluation. This testing confirmed very high (90%) weed control efficacies and associated low levels (1.8%) of soil disturbance where the weed density was fewer than 1.0 plant 10 m−2 in a commercial fallow. These data established the suitability of this mechanical approach to weed control for conservation cropping systems. The development of targeted tillage for fallow weed control represents the introduction of site-specific, nonchemical weed control for conservation cropping systems.
Preliminary evidence has suggested that high-fat diets (HFD) enriched with SFA, but not MUFA, promote hyperinsulinaemia and pancreatic hypertrophy with insulin resistance. The objective of this study was to determine whether the substitution of dietary MUFA within a HFD could attenuate the progression of pancreatic islet dysfunction seen with prolonged SFA-HFD. For 32 weeks, C57BL/6J mice were fed either: (1) low-fat diet, (2) SFA-HFD or (3) SFA-HFD for 16 weeks, then switched to MUFA-HFD for 16 weeks (SFA-to-MUFA-HFD). Fasting insulin was assessed throughout the study; islets were isolated following the intervention. Substituting SFA with MUFA-HFD prevented the progression of hyperinsulinaemia observed in SFA-HFD mice (P < 0·001). Glucose-stimulated insulin secretion from isolated islets was reduced by SFA-HFD, yet not fully restored by SFA-to-MUFA-HFD. Markers of β-cell identity (Ins2, Nkx6.1, Ngn3, Rfx6, Pdx1 and Pax6) were reduced, and islet inflammation was increased (IL-1β, 3·0-fold, P = 0·007; CD68, 2·9-fold, P = 0·001; Il-6, 1·1-fold, P = 0·437) in SFA-HFD – effects not seen with SFA-to-MUFA-HFD. Switching to MUFA-HFD can partly attenuate the progression of SFA-HFD-induced hyperinsulinaemia, pancreatic inflammation and impairments in β-cell function. While further work is required from a mechanistic perspective, dietary fat may mediate its effect in an IL-1β–AMP-activated protein kinase α1-dependent fashion. Future work should assess the potential translation of the modulation of metabolic inflammation in man.
The production and use of masks at multiple scales and in diverse contexts is a millennia-long tradition in Mesoamerica. In this paper, we explore some implications of Mesoamerican masking practices in light of materiality studies and the archaeology of the senses. We also discuss a collection of 22 masks, miniature masks and representations of masks from the lower Río Verde valley of coastal Oaxaca, Mexico. The iconography of these artefacts as well as their recovery from well-documented archaeological contexts inform our interpretations of masking practices during an approximately 2000-year span of the Formative period (2000 bc–ad 250). Specifically, we argue that these masking-related artefacts index sociocultural changes in the region, from the first villages and the advent of ceramic technology during the Early Formative period (2000–1000 bc) to a time of increasing consolidation of iconographic influence in the hands of the elite in the final centuries before the Classic period. As indicated by their continued use today, masks have long been intimates of communal activities in Oaxaca.
There is increasing evidence that smoking is a risk factor for severe mental illness, including bipolar disorder. Conversely, patients with bipolar disorder might smoke more (often) as a result of the psychiatric disorder.
We conducted a bidirectional Mendelian randomisation (MR) study to investigate the direction and evidence for a causal nature of the relationship between smoking and bipolar disorder.
We used publicly available summary statistics from genome-wide association studies on bipolar disorder, smoking initiation, smoking heaviness, smoking cessation and lifetime smoking (i.e. a compound measure of heaviness, duration and cessation). We applied analytical methods with different, orthogonal assumptions to triangulate results, including inverse-variance weighted (IVW), MR-Egger, MR-Egger SIMEX, weighted-median, weighted-mode and Steiger-filtered analyses.
Across different methods of MR, consistent evidence was found for a positive effect of smoking on the odds of bipolar disorder (smoking initiation: IVW OR = 1.46, 95% CI 1.28–1.66, P = 1.44 × 10−8; lifetime smoking: IVW OR = 1.72, 95% CI 1.29–2.28, P = 1.8 × 10−4). The MR analyses of the effect of liability to bipolar disorder on smoking provided no clear evidence of a strong causal effect (smoking heaviness: IVW beta = 0.028, 95% CI 0.003–0.053, P = 2.9 × 10−2).
These findings suggest that smoking initiation and lifetime smoking are likely causal risk factors for developing bipolar disorder. We found some evidence that liability to bipolar disorder increased smoking heaviness. Given that smoking is a modifiable risk factor, these findings further support investment in smoking prevention and treatment in order to reduce mental health problems in future generations.
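The inverse-variance weighted (IVW) estimator named in the methods combines per-variant Wald ratios with weights proportional to their inverse squared standard errors. A textbook sketch with invented toy numbers, not the study's data or pipeline:

```python
import math

def ivw_estimate(wald_ratios, ses):
    """Fixed-effect inverse-variance weighted (IVW) combination of
    per-variant Wald ratios. Hypothetical illustration of the estimator,
    not the authors' analysis code."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, wald_ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se

# Toy per-variant ratio estimates (log-odds scale) and standard errors
betas = [0.35, 0.40, 0.38]
ses = [0.10, 0.08, 0.12]
beta_ivw, se_ivw = ivw_estimate(betas, ses)
odds_ratio = math.exp(beta_ivw)  # pooled effect expressed as an OR
```

The precisely estimated variants dominate the pooled estimate, which is why IVW is usually the primary method and MR-Egger, weighted-median and weighted-mode serve as sensitivity analyses with different pleiotropy assumptions.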
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies reported mixed findings for the serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (P for race difference = 0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (P for race difference = 0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
Numerous studies have demonstrated that genetic and environmental factors interact to influence alcohol problems. Yet prior research has primarily focused on samples of European descent and little is known about gene–environment interactions in relation to alcohol problems in non-European populations. In this study, we examined whether and how genetic risk for alcohol problems, peer deviance, and interpersonal traumatic events independently and interactively influence trajectories of alcohol use disorder symptoms in a sample of African American students across the college years (N = 1,119; mean age = 18.44 years). Data were drawn from the Spit for Science study where participants completed multiple online surveys throughout college and provided a saliva sample for genotyping. Multilevel growth curve analyses indicated that alcohol dependence genome-wide polygenic risk scores did not predict the trajectory of alcohol use disorder symptoms, while family history of alcohol problems was associated with alcohol use disorder symptoms at the start of college but not with the rate of change in symptoms over time. Peer deviance and interpersonal traumatic events were associated with more alcohol use disorder symptoms across college years. Neither alcohol dependence genome-wide polygenic risk scores nor family history of alcohol problems moderated the effects of these environmental risk factors on alcohol use disorder symptoms. Our findings indicated that peer deviance and experience of interpersonal traumatic events are salient risk factors that elevate risk for alcohol problems among African American college students. Family history of alcohol problems could be a useful indicator of genetic risk for alcohol problems. Gene identification efforts with much larger samples of African descent are needed to better characterize genetic risk for alcohol use disorders, in order to better understand gene–environment interaction processes in this understudied population.
Plant height and lodging resistance can affect rice yield significantly, but these traits have always conflicted in crop cultivation and breeding. The current study aimed to establish a rapid and accurate plant type evaluation mechanism to provide a basis for breeding tall but lodging-resistant super rice varieties. A comprehensive approach integrating plant anatomy and histochemistry was used to investigate variations in flexural strength (a material property, defined as the stress in a material just before it yields in a flexure test) of the rice stem and the lodging index of 15 rice accessions at different growth stages to understand trends in these parameters and the potential factors influencing them. Rice stem anatomical structure was observed and the lignin content of the cell wall was determined at different developmental stages. Three rice lodging evaluation models were established using correlation analysis, multivariate regression and artificial radial basis function (RBF) neural network analysis, and the results were compared to identify the most suitable model for predicting optimal rice plant types. Among the three evaluation methods, the mean residual and relative prediction errors were lowest using the RBF network, indicating that it was highly accurate and robust and could be used to establish a mathematical model of the morphological characteristics and lodging resistance of rice to identify optimal varieties.
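The RBF idea behind the third evaluation model can be sketched as Gaussian-kernel interpolation, where the weight vector solves Kw = y for the kernel matrix K. This is a simplified one-dimensional illustration with made-up values; the paper's actual network inputs, architecture and training are not reproduced here:

```python
import math

def rbf_fit(xs, ys, gamma=1.0):
    """Fit an exact-interpolation Gaussian RBF model by solving K w = y.
    Hypothetical sketch of the RBF approach, not the authors' model."""
    n = len(xs)
    k = [[math.exp(-gamma * (xs[i] - xs[j]) ** 2) for j in range(n)]
         for i in range(n)]
    # Solve K w = y by Gaussian elimination with partial pivoting
    a = [row[:] + [y] for row, y in zip(k, ys)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n + 1):
                a[r][c] -= f * a[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (a[r][n] - sum(a[r][c] * w[c] for c in range(r + 1, n))) / a[r][r]
    return w

def rbf_predict(xs, w, x, gamma=1.0):
    """Evaluate the fitted RBF model at a new point x."""
    return sum(wi * math.exp(-gamma * (x - xi) ** 2) for wi, xi in zip(w, xs))

xs = [0.0, 1.0, 2.0, 3.0]   # toy predictor (e.g. a stem trait, arbitrary units)
ys = [0.1, 0.8, 0.5, 0.9]   # toy "lodging index" responses
w = rbf_fit(xs, ys)
```

A practical network would use fewer centres than data points plus a regularisation term; exact interpolation is shown only because it makes the kernel-weights idea explicit.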
FFQ, food diaries and 24 h recall methods represent the most commonly used dietary assessment tools in human studies on nutrition and health, but food intake biomarkers are assumed to provide a more objective reflection of intake. Unfortunately, very few of these biomarkers are sufficiently validated. This review provides an overview of food intake biomarker research and highlights present research efforts of the Joint Programming Initiative ‘A Healthy Diet for a Healthy Life’ (JPI-HDHL) Food Biomarkers Alliance (FoodBAll). In order to identify novel food intake biomarkers, the focus is on new food metabolomics techniques that allow the quantification of up to thousands of metabolites simultaneously, which may be applied in intervention and observational studies. As biomarkers are often influenced by various factors other than the food under investigation, FoodBAll developed a food intake biomarker quality and validity score aiming to assist the systematic evaluation of novel biomarkers. Moreover, to evaluate the applicability of nutritional biomarkers, studies are presently also focusing on associations between food intake biomarkers and diet-related disease risk. In order to be successful in these metabolomics studies, knowledge about available electronic metabolomics resources is necessary and further developments of these resources are essential. Ultimately, present efforts in this research area aim to advance quality control of traditional dietary assessment methods, advance compliance evaluation in nutritional intervention studies, and increase the significance of observational studies by investigating associations between nutrition and health.
Objectives: The cardinal motor deficits seen in ideomotor limb apraxia are thought to arise from damage to internal representations for actions developed through learning and experience. However, whether apraxic patients learn to develop new representations with training is not well understood. We studied the capacity of apraxic patients for motor adaptation, a process associated with the development of a new internal representation of the relationship between movements and their sensory effects. Methods: Thirteen healthy adults and 23 patients with left hemisphere stroke (12 apraxic, 11 nonapraxic) adapted to a 30-degree visuomotor rotation. Results: While healthy and nonapraxic participants successfully adapted, apraxics did not. Rather, they showed a rapid decrease in error early but no further improvement thereafter, suggesting a deficit in the slow, but not the fast component of a dual-process model of adaptation. The magnitude of this late learning deficit was predicted by the degree of apraxia, and was correlated with the volume of damage in parietal cortex. Apraxics also demonstrated an initial after-effect similar to the other groups likely reflecting the early learning, but this after-effect was not sustained and performance returned to baseline levels more rapidly, consistent with a disrupted slow learning process. Conclusions: These findings suggest that the early phase of learning may be intact in apraxia, but this leads to the development of a fragile representation that is rapidly forgotten. The association between this deficit and left parietal damage points to a key role for this region in learning to form stable internal representations. (JINS, 2017, 23, 139–149)
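The dual-process account referenced in the Results can be illustrated with the standard two-state model of adaptation (Smith, Ghazizadeh & Shadmehr, 2006), in which a fast process learns quickly but forgets quickly while a slow process learns slowly but retains. The parameter values below are conventional illustrative choices, not values fitted to these patients:

```python
def simulate_two_state(n_trials, rotation=30.0,
                       a_f=0.59, b_f=0.21, a_s=0.992, b_s=0.02):
    """Simulate trial-by-trial error under the two-state (fast/slow)
    adaptation model for a constant visuomotor rotation.
    Illustrative parameters only; not fitted to the study's data."""
    x_fast = x_slow = 0.0
    errors = []
    for _ in range(n_trials):
        e = rotation - (x_fast + x_slow)   # residual error on this trial
        errors.append(e)
        x_fast = a_f * x_fast + b_f * e    # fast: large learning, poor retention
        x_slow = a_s * x_slow + b_s * e    # slow: small learning, high retention
    return errors

errors = simulate_two_state(200)
```

With these gains the error drops rapidly at first and then plateaus above zero; a selective deficit in the slow process, as suggested for the apraxic group, would leave the initial drop intact but prevent the later, retention-dependent reduction.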
Many variables crucial to the social sciences are not directly observed but instead are latent and measured indirectly. When an external variable of interest affects this measurement, estimates of its relationship with the latent variable will then be biased. Such violations of “measurement invariance” may, for example, confound true differences across countries in postmaterialism with measurement differences. To deal with this problem, researchers commonly aim at “partial measurement invariance”, that is, to account for those differences that may be present and important. To evaluate this importance directly through sensitivity analysis, the “EPC-interest” was recently introduced for continuous data. However, latent variable models in the social sciences often use categorical data. The current paper therefore extends the EPC-interest to latent variable models for categorical data and demonstrates its use in example analyses of U.S. Senate votes as well as respondent rankings of postmaterialism values in the World Values Study.
The origin of red supergiant mass loss remains to be unveiled. Characterising the formation loci and the dust distribution in the first stellar radii above the surface is key to understanding the initiation of the mass loss phenomenon. Polarimetric interferometry observations in the near-infrared allowed us to detect an inner dust atmosphere located only 0.5 stellar radius above the photosphere of Betelgeuse. We modelled these observations and compared them with visible polarimetric measurements to discuss the dust distribution properties.
The anticipated release of Enlist™ cotton, corn, and soybean cultivars likely will increase the use of 2,4-D, raising concerns over potential injury to susceptible cotton. An experiment was conducted at 12 locations over 2013 and 2014 to determine the impact of 2,4-D at rates simulating drift (2 g ae ha−1) and tank contamination (40 g ae ha−1) on cotton during six different growth stages. Growth stages at application included four leaf (4-lf), nine leaf (9-lf), first bloom (FB), FB + 2 wk, FB + 4 wk, and FB + 6 wk. Locations were grouped according to percent yield loss compared to the nontreated check (NTC), with group I having the least yield loss and group III having the most. Epinasty from 2,4-D was more pronounced with applications during vegetative growth stages. Importantly, yield loss did not correlate with visual symptomology, but more closely followed effects on boll number. The contamination rate at 9-lf, FB, or FB + 2 wk had the greatest effect across locations, reducing the number of bolls per plant when compared to the NTC, with no effect when applied at FB + 4 wk or later. A reduction of boll number was not detectable with the drift rate except in group III when applied at the FB stage. Yield was influenced by 2,4-D rate and stage of cotton growth. Over all locations, loss in yield of greater than 20% occurred at 5 of 12 locations when the drift rate was applied between 4-lf and FB + 2 wk (highest impact at FB). For the contamination rate, yield loss was observed at all 12 locations; averaged over these locations yield loss ranged from 7 to 66% across all growth stages. Results suggest the greatest yield impact from 2,4-D occurs between 9-lf and FB + 2 wk, and the level of impact is influenced by 2,4-D rate, crop growth stage, and environmental conditions.
Objective: To clarify the pathways between household livestock and child growth by assessing the relationships between consumption of animal-source foods (ASF) and child growth and evaluating the household livestock correlates of child consumption of ASF.
Design: We conducted a longitudinal cohort study of anthropometry and 3 d feeding recalls among children <5 years old between June 2014 and May 2015. In addition, we collected data on wealth, livestock ownership and livestock diseases in the same households. We used linear and negative binomial mixed models to evaluate the relationships between household livestock characteristics, reported consumption of ASF and child growth.
Setting: An 1800-household surveillance catchment area in Western Kenya within the structure of human and animal health surveillance systems.
Subjects: Children (n 874) <5 years old.
Results: Among children >6 months old, reported frequency of egg and milk consumption was associated with increased monthly height gain (for each additional report of consumption over 3 d: adjusted β (95 % CI)=0·010 (0·002, 0·019) cm/month and 0·008 (0·004, 0·013) cm/month, respectively). Poultry ownership was associated with higher reported frequency of egg, milk and chicken consumption (adjusted incidence rate ratio (95 % CI)=1·3 (1·2, 1·4), 1·4 (1·1, 1·6) and 1·3 (1·1, 1·4), respectively). Some livestock diseases were associated with lower reported frequency of ASF intake (livestock digestive diseases-adjusted incidence rate ratio (95 % CI)=0·89 (0·78, 1·00)).
Conclusions: Child height gain was associated with milk and egg consumption in this cohort. ASF consumption was related to both household livestock ownership and animal health.
We have previously demonstrated that a sharp rise in feed intake (hyperphagia) and spontaneous liver steatosis could be experimentally induced in domestic Greylag geese by combining a short photoperiod and a sequence of feed restriction followed by ad libitum corn feeding during the fall and the winter. In this previous work, however, individual feed intake could not be recorded. The present study aimed at evaluating the relationship between level and pattern of hyperphagia and liver weight with an individual control of feed intake in individually housed (IH) geese, while comparing the performances with group housed (GH) geese. A total of 300 male geese, 19 weeks of age, were provided with corn ad libitum after an initial feed restriction period. From 21 to 23 weeks of age, the daylight duration was progressively reduced from 10 to 7 h and kept as such until the end of the experiment (week 36). In all, 30 GH and 30 IH geese were slaughtered at 19, 27, 30, 32 and 36 weeks of age. Feed intake was measured per group in GH geese and individually in IH geese. During the 1st week of corn feeding, the average feed intake rose to 600 g/goose per day in GH geese but not in IH geese, where the feed intake rose gradually from 300 to 400 g/day. The liver weight increased from 93 g (week 19) to 497 g (week 32; P<0.05) in GH birds. In IH birds, liver weights were, on average, much lower (ranging from 220 to 268 g) than in GH birds (P<0.05). In GH and IH birds, the variability in the individual response to corn feeding was very high (liver weight CV ranging from 63% to 83% depending on slaughter age). A close correlation between corn consumption and liver weight was evidenced in IH birds at each slaughter age (R2 ranging from 0.62 to 0.79), except at 36 weeks of age, where this correlation was weak (R2=0.14).
The variability in the extent of liver steatosis is very high, and our results in IH birds clearly point to the major role of hyperphagia, mainly at the beginning of the ad libitum corn feeding period, in the development of spontaneous liver steatosis.
Field studies were conducted in Alabama, Arkansas, Georgia, Louisiana, Mississippi, North Carolina, and Tennessee during 2010 and 2011 to determine the effect of glufosinate application rate on LibertyLink and WideStrike cotton. Glufosinate was applied in a single application (three-leaf cotton) or sequential application (three-leaf followed by eight-leaf cotton) at 0.6, 1.2, 1.8, and 2.4 kg ai ha−1. Glufosinate application rate did not affect visual injury or growth parameters measured in LibertyLink cotton. No differences in LibertyLink cotton yield were observed because of glufosinate application rate; however, LibertyLink cotton treated with glufosinate yielded slightly more cotton than the nontreated check. Visual estimates of injury to WideStrike cotton increased with each increase in glufosinate application rate. However, the injury was transient, and by 28 d after the eight-leaf application, no differences in injury were observed. WideStrike cotton growth was adversely affected during the growing season following glufosinate application at rates of 1.2 kg ha−1 and greater; however, cotton height and total nodes were unaffected by glufosinate application rate at the end of the season. WideStrike cotton maturity was delayed, and yields were reduced following glufosinate application at rates of 1.2 kg ha−1 and above. Fiber quality of LibertyLink and WideStrike cotton was unaffected by glufosinate application rate. These data indicate that glufosinate may be applied to WideStrike cotton at rates of 0.6 kg ha−1 without inhibiting cotton growth, development, or yield. Given the lack of injury or yield reduction following glufosinate application to LibertyLink cotton, these cultivars possess robust resistance to glufosinate. Growers are urged to be cautious when increasing glufosinate application rates to increase control of glyphosate-resistant Palmer amaranth in WideStrike cotton. 
However, glufosinate application rates may be increased to maximum labeled rates when making applications to LibertyLink cotton without fear of reducing cotton growth, development, or yield.
An abattoir-based study was undertaken between January and May 2013 to estimate the prevalence of Salmonella spp. and Yersinia spp. carriage and seroprevalence of antibodies to Toxoplasma gondii and porcine reproductive and respiratory syndrome virus (PRRSv) in UK pigs at slaughter. In total, 626 pigs were sampled at 14 abattoirs that together process 80% of the annual UK pig slaughter throughput. Sampling was weighted by abattoir throughput and sampling dates and pig carcasses were randomly selected. Rectal swabs, blood samples, carcass swabs and the whole caecum, tonsils, heart and tongue were collected. Salmonella spp. was isolated from 30·5% [95% confidence interval (CI) 26·5–34·6] of caecal content samples but only 9·6% (95% CI 7·3–11·9) of carcass swabs, which was significantly lower than in a UK survey in 2006–2007. S. Typhimurium and S. 4,[5],12:i:- were the most commonly isolated serovars, followed by S. Derby and S. Bovismorbificans. The prevalence of Yersinia enterocolitica carriage in tonsils was 28·7% (95% CI 24·8–32·7) whereas carcass contamination was much lower at 1·8% (95% CI 0·7–2·8). The seroprevalence of antibodies to Toxoplasma gondii and PRRSv was 7·4% (95% CI 5·3–9·5) and 58·3% (95% CI 53·1–63·4), respectively. This study provides a comparison to previous abattoir-based prevalence surveys for Salmonella and Yersinia, and the first UK-wide seroprevalence estimates for antibodies to Toxoplasma and PRRSv in pigs at slaughter.