Ready-to-eat (RTE) cereal is an important source of nutrients in the American diet. Recent regulatory changes to labelling requirements may impact the fortification of RTE cereal. We used an evidence-based approach to optimize the fortification of RTE cereal considering current dietary patterns and nutrition policy.
A US modelling study of cross-sectional data from the National Health and Nutrition Examination Survey (NHANES) 2013–2014. The percentage of the population below the Estimated Average Requirement (EAR) and above the Tolerable Upper Intake Level (UL) was modelled under three scenarios: baseline, zero fortification and optimized fortification.
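The three-scenario comparison can be sketched as a simple simulation. Everything numeric below (the intake distribution, the EAR and UL values, and the per-scenario cereal contributions) is an invented illustration, not NHANES data or the study's actual fortification levels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented illustration (not NHANES data): lognormal usual daily intakes
# of one nutrient (mg/d) for one age group, with assumed EAR and UL.
EAR, UL = 1.1, 35.0
base_intake = rng.lognormal(mean=np.log(1.6), sigma=0.4, size=100_000)

# Assumed nutrient contribution from RTE cereal (mg/d) under each scenario.
scenarios = {"baseline": 0.5, "zero fortification": 0.0, "optimized": 0.3}

for name, cereal_mg in scenarios.items():
    total = base_intake + cereal_mg
    pct_below_ear = 100 * np.mean(total < EAR)
    pct_above_ul = 100 * np.mean(total > UL)
    print(f"{name:18s} %<EAR: {pct_below_ear:5.1f}   %>UL: {pct_above_ul:5.2f}")
```

The real analysis models usual-intake distributions from dietary recalls rather than drawing from an assumed lognormal, but the scenario logic (shift each eater's intake by the cereal contribution, then re-count the tails) is the same.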
Toddlers aged 1–3 years, n 559; children aged 4–12 years, n 1540; adolescents aged 13–18 years, n 992; and adults aged ≥19 years, n 576.
Comparing current with optimized fortification, nutrient/100 g RTE cereal decreased for vitamin A, thiamin, riboflavin, niacin, vitamin B6, folic acid, vitamin B12, Ca and Fe (by 2–82 %). The amount of vitamins C and D increased (by 13 and 50 %, respectively). Among RTE cereal eaters, these changes resulted in modest increases in the percentage of the population aged ≥1 year below the EAR (+0·5 to +11·5 percentage points). Decreases were observed in the percentage of the population above the UL.
Fortification of RTE cereal can be optimized to provide key nutrients and minimize the percentage of the population below the EAR and above the UL. Dietary intake modelling is useful to ensure that RTE cereal continues to help the population meet their nutrient needs.
Cognitive health, and prevention of its decline to dementia, has risen in prominence, with a corresponding exploration of modifiable risk factors that might prevent cognitive decline with age. This commentary discusses a new Cochrane review examining the effect of vitamin and mineral supplementation on maintaining cognitive health in cognitively healthy adults in mid- and late life. From a heterogeneous body of evidence, the quality of which ranged from very low to moderate, the review concludes that supplements confer little or no benefit.
Attention deficit hyperactivity disorder (ADHD) is highly heritable and is associated with lower educational attainment. ADHD is linked to family adversity, including hostile parenting. Questions remain regarding the role of genetic and environmental factors underlying processes through which ADHD symptoms develop and influence academic attainment.
This study employed a parent-offspring adoption design (N = 345) to examine the interplay between genetic susceptibility to child attention problems (birth mother ADHD symptoms) and adoptive parent (mother and father) hostility on child lower academic outcomes, via child ADHD symptoms. Questionnaires assessed birth mother ADHD symptoms, adoptive parent (mother and father) hostility to child, early child impulsivity/activation, and child ADHD symptoms. The Woodcock–Johnson test was used to examine child reading and math aptitude.
Building on a previous study (Harold et al., 2013, Journal of Child Psychology and Psychiatry, 54(10), 1038–1046), heritable influences were found: birth mother ADHD symptoms predicted child impulsivity/activation. In turn, child impulsivity/activation (4.5 years) evoked maternal and paternal hostility, which was associated with children's ADHD continuity (6 years). Both maternal and paternal hostility (4.5 years) contributed to impairments in math but not reading (7 years), via impacts on ADHD symptoms (6 years).
Findings highlight the importance of early child behavior dysregulation evoking parent hostility in both mothers and fathers, with maternal and paternal hostility contributing to the continuation of ADHD symptoms and lower levels of later math ability. Early interventions may be important for the promotion of child math skills in those with ADHD symptoms, especially where children have high levels of early behavior dysregulation.
This study compares the frequency and severity of influenza A/H1N1pdm09 (A/H1), influenza A/H3N2 (A/H3) and other respiratory virus infections in hospitalised patients. Data from 17 332 adult patients admitted to Sir Charles Gairdner Hospital, Perth, Western Australia, with a respiratory illness between 2012 and 2015 were linked with reverse transcription polymerase chain reaction results for respiratory viruses including A/H1, A/H3, influenza B, human metapneumovirus, respiratory syncytial virus and parainfluenza. Of these, 1753 (10.1%) had test results. Multivariable regression analyses compared the viruses on clinical outcomes including ICU admission, ventilation, pneumonia, length of stay and death. Patients with A/H1 were more likely than patients with A/H3 to experience severe outcomes such as ICU admission (OR 2.5, 95% CI 1.2–5.5, P = 0.016) and pneumonia (OR 3.0, 95% CI 1.6–5.7, P < 0.001), and had a lower likelihood of discharge from hospital, indicating longer hospitalisation (HR 0.64, 95% CI 0.47–0.88, P = 0.005). Patients with a non-influenza respiratory virus were less likely to experience severe clinical outcomes than patients with A/H1, but had a similar likelihood to patients with A/H3. Overall, patients hospitalised with A/H1 had higher odds of severe outcomes than patients with A/H3 or other respiratory viruses. Knowledge of circulating influenza strains is important for healthcare preparedness.
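The odds ratios above come from multivariable regression, but the underlying quantity can be illustrated with an unadjusted 2×2 calculation. The counts below are hypothetical, not the study's data.

```python
import math

# Hypothetical 2x2 table (not the study's data):
# rows = virus subtype, columns = ICU admission yes/no.
a, b = 20, 180   # A/H1: ICU yes, ICU no
c, d = 10, 220   # A/H3: ICU yes, ICU no

# Unadjusted odds ratio for ICU admission, A/H1 vs A/H3.
odds_ratio = (a * d) / (b * c)

# 95% CI from the standard error of the log odds ratio (Woolf's method).
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```

A multivariable model additionally adjusts this estimate for covariates such as age and comorbidity, which is why the published ORs cannot be reproduced from raw counts alone.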
We have detected 27 new supernova remnants (SNRs) using a new data release of the GLEAM survey from the Murchison Widefield Array telescope, including the lowest surface brightness SNR ever detected, G 0.1–9.7. Our method uses spectral fitting to the radio continuum to derive spectral indices for 26 of the 27 candidates, and our low-frequency observations probe a steeper-spectrum population than previously discovered. None of the candidates have coincident WISE mid-IR emission, further showing that the emission is non-thermal. Using pulsar associations, we derive physical properties for six candidate SNRs, finding that G 0.1–9.7 may be younger than 10 kyr. Sixty per cent of the candidates subtend areas larger than 0.2 deg² on the sky, compared to <25% of previously detected SNRs. We also make the first detection of two SNRs in the Galactic longitude range 220°–240°.
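The spectral indices referred to here come from fitting a power law, $S_\nu \propto \nu^{\alpha}$, to flux densities across the radio continuum. A minimal sketch of such a fit, using synthetic flux densities rather than real GLEAM measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic flux densities following S = S0 * (nu / nu0)^alpha plus noise;
# alpha = -0.7 is a typical non-thermal (synchrotron) spectral index.
nu = np.linspace(72.0, 231.0, 20)        # MHz, like GLEAM's 20 sub-bands
true_alpha, s0 = -0.7, 5.0               # invented: 5 Jy at 200 MHz
s_true = s0 * (nu / 200.0) ** true_alpha
s_obs = s_true * rng.normal(1.0, 0.05, nu.size)   # 5% flux errors

# Fit in log-log space: ln S = ln S0 + alpha * ln(nu / 200),
# so the slope of the straight-line fit is the spectral index.
alpha_fit, log_s0_fit = np.polyfit(np.log(nu / 200.0), np.log(s_obs), 1)
print(f"fitted spectral index: {alpha_fit:.2f}")
```

A steep (more negative) fitted index supports a non-thermal origin; thermal H ii regions instead show flat or rising low-frequency spectra, which is one way the survey separates SNRs from look-alikes.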
This work makes available a further portion of the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering half of the accessible Galactic plane across 20 frequency bands sampling 72–231 MHz. Unlike previous GLEAM data releases, we used multi-scale CLEAN to better deconvolve large-scale Galactic structure. For the Galactic longitude ranges $345^\circ < l < 67^\circ$ and $180^\circ < l < 240^\circ$, we provide a compact source catalogue of 22 037 components selected from a 60-MHz bandwidth image centred at 200 MHz, with position accuracy better than 2 arcsec and a reliability of 99.86%, reaching 50% completeness at the survey's faint limit. The catalogue covers a range of Galactic latitudes towards the Galactic centre and in other regions, and is available from VizieR; images for all longitudes are made available on the GLEAM Virtual Observatory (VO) server and SkyView.
We examined the latest data release from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey, covering 345° < l < 60° and 180° < l < 240°, using these data and those of the Widefield Infrared Survey Explorer to follow up candidate supernova remnants (SNRs) proposed by other sources. Of the 101 candidates proposed in the region, we are able to definitively confirm ten as SNRs, tentatively confirm two as SNRs, and reclassify five as H ii regions. A further two are detectable in our images but difficult to classify; the remaining 82 are undetectable in these data. We also investigated the 18 unclassified Multi-Array Galactic Plane Imaging Survey (MAGPIS) candidate SNRs, newly confirming three as SNRs, reclassifying two as H ii regions, and exploring the unusual spectra and morphology of two others.
Increasing weed control costs and limited herbicide options threaten vegetable crop profitability. Traditional interrow mechanical cultivation is very effective at removing weeds between crop rows, but weed control within the crop rows is necessary to establish the crop and prevent yield loss. Currently, many vegetable crops require hand weeding to remove weeds within the row that remain after traditional cultivation and herbicide use. Intelligent cultivators have come into commercial use to remove intrarow weeds and reduce the cost of hand weeding. Intelligent cultivators currently on the market, such as the Robovator, use pattern recognition to detect the crop row; they do not differentiate crops from weeds and do not work well among high weed populations. One approach to differentiating weeds is to place a machine-detectable mark or signal on the crop (i.e., the crop has the mark and the weed does not), thereby facilitating weed/crop differentiation. Lettuce and tomato plants were marked with labels and topical markers, then cultivated with an intelligent cultivator programmed to identify the markers. Field trials in marked tomato and lettuce found that the intelligent cultivator removed 90% more weeds from tomato and 66% more weeds from lettuce than standard cultivators, without reducing yields. Accurate crop and weed differentiation as described here resulted in a 45% to 48% reduction in hand-weeding time per hectare.
Data preservation, reuse, and synthesis are important goals in contemporary archaeological research that have been addressed by the recent collaboration of the Eastern Archaic Faunal Working Group (EAFWG). We used the Digital Archaeological Record (tDAR) to preserve 60 significant legacy faunal databases from 23 Archaic period archaeological sites located in several contiguous subregions of the interior North American Eastern Woodlands. In order to resolve the problem of synthesizing non-standardized databases, we used the ontology and integration tools available in tDAR to explore comparability and combine datasets so that our research questions about aquatic resource use during the Archaic could be addressed at multiple scales. The challenges of making digital databases accessible for reuse, including the addition of metadata, and of linking disparate data in queryable datasets are significant but worth the effort. Our experience provides one example of how collaborative research may productively resolve problems in making legacy data accessible and usable for synthetic archaeological research.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined if time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log-binomial and adjacent-categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
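As a rough illustration of the pattern-derivation step, the sketch below runs a principal components analysis on synthetic time-of-day intake data. The bin names, sample size, and intake distribution are invented, not the study's variables.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic example (not the study's data): energy intake (kJ) in six
# hypothetical time-of-day bins for 200 participants.
bins = ["early_am", "late_am", "midday", "afternoon", "evening", "night"]
x = rng.gamma(shape=2.0, scale=500.0, size=(200, len(bins)))

# Standardize each bin, then run PCA via SVD of the centred/scaled matrix.
z = (x - x.mean(axis=0)) / x.std(axis=0)
u, s_vals, vt = np.linalg.svd(z, full_matrices=False)

loadings = vt                      # rows = components, cols = time bins
scores = z @ vt.T                  # each participant's score per pattern
explained = s_vals**2 / np.sum(s_vals**2)
print("variance explained by first 3 components:", explained[:3].round(3))
```

In the study, components are interpreted from their loadings (e.g. a pattern loading heavily on evening bins and negatively on breakfast would resemble the "Late" pattern), and participants' component scores are then split into thirds for the regression analyses.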
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
The study of bureaucratic behavior—focusing on control, decision-making, and institutional arrangements—has historically leaned heavily on theories of rational choice and bounded rationality. Notably absent from this research, however, is attention to the growing literature on biological and especially evolutionary human behavior. This article addresses this gap by closely examining the extant economic and psychological frameworks—which we refer to as “Adam Smith’s bureaucrat” and “Herbert Simon’s bureaucrat”—for their shortcomings in terms of explanatory and predictive theory, and by positing a different framework, which we call “Charles Darwin’s bureaucrat.” This model incorporates new insights from an expanding multidisciplinary research framework and has the potential to address some of the long-noted weaknesses of classic theories of bureaucratic behavior.
In patients with β-lactam allergies, administration of non–β-lactam surgical prophylaxis is associated with increased risk of infection. Although many patients self-report β-lactam allergies, most are unconfirmed or mislabeled. A quality improvement process, utilizing a structured β-lactam allergy tool, was implemented to improve the utilization of preferred β-lactam surgical prophylaxis.
Introduction and regular application of multiplex polymerase chain reaction analysis of bronchoalveolar specimens for community-acquired respiratory viruses in January 2017 led to the identification of adenovirus in multiple patients in a surgical intensive care unit in July 2017, a cluster that was ultimately attributed to a pseudo-outbreak.
Weed management is a major challenge in organic crop production, and organic farms generally harbor larger weed populations and more diverse communities compared with conventional farms. However, little research has been conducted on the effects of different organic management practices on weed communities and crop yields. In 2014 and 2015, we measured weed community structure and soybean [Glycine max (L.) Merr.] yield in a long-term experiment that compared four organic cropping systems that differed in nutrient inputs, tillage, and weed management intensity: (1) high fertility (HF), (2) low fertility (LF), (3) enhanced weed management (EWM), and (4) reduced tillage (RT). In addition, we created weed-free subplots within each system to assess the impact of weeds on soybean yield. Weed density was greater in the LF and RT systems compared with the EWM system, but weed biomass did not differ among systems. Weed species richness was greater in the RT system compared with the EWM system, and weed community composition differed between RT and other systems. Our results show that differences in weed community structure were primarily related to differences in tillage intensity, rather than nutrient inputs. Soybean yield was lower in the EWM system compared with the HF and RT systems. When averaged across all four cropping systems and both years, soybean yield in weed-free subplots was 10% greater than soybean yield in the ambient weed subplots that received standard management practices for the systems in which they were located. Although weed competition limited soybean yield across all systems, the EWM system, which had the lowest weed density, also had the lowest soybean yield. Future research should aim to overcome such trade-offs between weed control and yield potential, while conserving weed species richness and the ecosystem services associated with increased weed diversity.