Research on complex behavior change interventions has largely focused on developing interventions and testing their effects in feasibility trials, pilot studies, and randomized controlled trials. However, a significant gap exists in translating theory-informed behavior change interventions into real-world practice. This chapter describes how engaging stakeholders can improve the likelihood that effective behavior change interventions are put into practice. It begins with an overview of implementation science and of normalization process theory, which outlines how effective interventions become routinely embedded in practice. The roles of stakeholders as research partners and as research participants are then differentiated using research in health contexts. The process of stakeholder involvement is illustrated using digital health interventions for people with long-term physical health conditions, with reference to UK Medical Research Council guidelines on complex interventions. The examples illustrate (1) how stakeholder support in the co-design of complex interventions can improve their utility, usability, accessibility, and acceptability and (2) how stakeholder perspectives elicited using mixed methods during the feasibility and pilot phases can help inform subsequent stages of intervention development. Finally, the evaluation and implementation phase is explored, using a case study to illustrate the need to engage additional stakeholders to translate effective interventions into routine practice.
A diet rich in fruits and vegetables may reduce the risk of chronic diseases. However, in many countries, the majority of children do not eat the recommended quantities of fruits and vegetables. The present study aimed to understand associations between feeding practices in infancy (breast-feeding and first complementary food) and fruit and vegetable consumption in childhood (frequency and variety). Data were from the national, observational, cross-sectional Mothers and their Children’s Health study conducted in 2016/2017, a sub-study of the national Australian Longitudinal Study on Women’s Health. Mothers completed a written survey on feeding practices in infancy (breast-feeding duration, use of formula, first complementary food) and children’s fruit and vegetable frequency (number of times eaten) and variety (number of different types eaten) in the past 24 h, using the Children’s Dietary Questionnaire. Children (n 4981, mean 7·36 (sd 2·90) years) ate vegetables 2·10 (sd 1·11) times and fruits 2·35 (sd 1·14) times and ate 3·21 (sd 1·35) different vegetables and 2·40 (sd 1·18) different fruits, on average. Compared with breast-feeding for <6 months, breast-feeding for ≥6 months was associated with higher vegetable variety. Compared with cereal as the first complementary food, fruits or vegetables were associated with higher vegetable frequency and variety, and higher fruit frequency. Overall, infancy is a window of opportunity for dietary intervention. Guidance to parents should encourage the use of fruits and vegetables at the beginning of complementary feeding.
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
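The quoted system temperature sets the achievable sensitivity through the standard radiometer equation. A rough illustrative sketch follows; the sub-band width and integration time are assumptions chosen for the example, not values from the paper:

```python
import math

def radiometer_noise(t_sys_k: float, bandwidth_hz: float, integration_s: float) -> float:
    """Ideal radiometer equation: RMS noise dT = T_sys / sqrt(B * tau)."""
    return t_sys_k / math.sqrt(bandwidth_hz * integration_s)

# T_sys ~ 22 K comes from the text; the 128 MHz sub-band and 60 s
# integration below are illustrative assumptions only.
dT = radiometer_noise(22.0, 128e6, 60.0)  # roughly 0.25 mK
```

Halving the system temperature, or quadrupling the bandwidth-time product, halves the RMS noise.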
There is a continual need for invasive plant science to develop approaches for cost-effectively benefiting native over nonnative species in dynamic management and biophysical contexts, including within predominantly nonnative plant landscapes containing only small patches of native plants. Our objective was to test the effectiveness of a minimal-input strategy for enlarging native species patches within a nonnative plant matrix. In Pecos National Historical Park, New Mexico, USA, we identified 40 native perennial grass patches within a matrix of the nonnative annual forb kochia [Bassia scoparia (L.) A.J. Scott]. We mechanically cut B. scoparia in a 2-m-wide ring surrounding the perimeters of half the native grass patches (with the other half as uncut controls) and measured change in native grass patch size (relative to pretreatment) for 3 yr. Native grass patches around which B. scoparia was cut grew quickly the first posttreatment year and by the third year had increased in size four times more than control patches. Treated native grass patches expanded by an average of 25 m2, from 4 m2 in October 2015 before treatment to 29 m2 in October 2018. The experiment occurred during a dry period, conditions that should favor B. scoparia and contraction of the native grasses, suggesting that the observed increase in native grasses occurred despite suboptimal climatic conditions. Strategically treating around native patches to enlarge them over time showed promise as a minimal-input technique for increasing the proportion of the landscape dominated by native plants.
Limited data exist for the management of hyperuricemia in non-oncologic patients, particularly in paediatric cardiac patients. Hyperuricemia is a risk factor for acute kidney injury and may prompt treatment in critically ill patients. The primary objective was to determine whether rasburicase use was associated with a greater probability of normalisation of serum uric acid compared with allopurinol. Secondary outcomes included percent reduction in uric acid, changes in serum creatinine, and cost of therapy.
Design: A single-centre retrospective chart review.
Setting: A 20-bed quaternary cardiovascular ICU in a university-based paediatric hospital in California.
Patients: Patients admitted to the cardiovascular ICU who received rasburicase or intravenous allopurinol between 2015 and 2016.
Measurements and main results:
Data from a cohort of 14 patients receiving rasburicase were compared to 7 patients receiving IV allopurinol. Patients who were administered rasburicase for hyperuricemia were more likely to have a post-treatment uric acid level less than 8 mg/dl than patients who received IV allopurinol (100% versus 43%; p = 0.0058). Patients who received rasburicase had a greater absolute reduction in post-treatment day 1 uric acid (−9 mg/dl versus −1.9 mg/dl; p = 0.002). There were no differences in post-treatment day 3 or day 7 serum creatinine or time to normalisation of serum creatinine. The cost of therapy normalised to a 20 kg patient was greater in the allopurinol group ($18,720 versus $1928; p = 0.001).
Conclusions: In a limited paediatric cardiac cohort, rasburicase use was associated with a greater reduction in uric acid levels and a lower cost of therapy compared with IV allopurinol.
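The cost comparison normalises drug cost to a 20 kg patient; the underlying arithmetic is simply weight × dose × number of doses × unit price. A sketch with placeholder numbers (the doses and prices below are invented for illustration and are not the study's values):

```python
def course_cost(weight_kg: float, dose_mg_per_kg: float, n_doses: int, price_per_mg: float) -> float:
    """Drug cost for one treatment course, normalised to patient weight."""
    return weight_kg * dose_mg_per_kg * n_doses * price_per_mg

# All doses and unit prices here are invented placeholders, NOT the
# study's values; only the normalisation arithmetic is the point.
rasburicase_cost = course_cost(20.0, 0.15, 1, 10.0)  # one weight-based dose
allopurinol_cost = course_cost(20.0, 10.0, 9, 1.0)   # repeated dosing over several days
```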
Organismal metabolic rates reflect the interaction of environmental and physiological factors. Thus, calcifying organisms that record growth history can provide insight into both the ancient environments in which they lived and their own physiology and life history. However, interpreting them requires understanding which environmental factors have the greatest influence on growth rate and the extent to which evolutionary history constrains growth rates across lineages. We integrated satellite measurements of sea-surface temperature and chlorophyll-a concentration with a database of growth coefficients, body sizes, and life spans for 692 populations of living marine bivalves in 195 species, set within the context of a new maximum-likelihood phylogeny of bivalves. We find that environmental predictors overall explain only a small proportion of variation in growth coefficient across all species; temperature is a better predictor of growth coefficient than food supply, and growth coefficient is somewhat more variable at higher summer temperatures. Growth coefficients exhibit moderate phylogenetic signal, and taxonomic membership is a stronger predictor of growth coefficient than any environmental predictor, but phylogenetic inertia cannot fully explain the disjunction between our findings and the extensive body of work demonstrating strong environmental control on growth rates within taxa. Accounting for evolutionary history is critical when considering shells as historical archives. The weak relationship between variation in food supply and variation in growth coefficient in our data set is inconsistent with the hypothesis that the increase in mean body size through the Phanerozoic was driven by increasing productivity enabling faster growth rates.
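Growth coefficients in such compilations are typically the k parameter of the von Bertalanffy growth function. Assuming that parameterization, a minimal sketch with invented parameter values:

```python
import math

def von_bertalanffy(t: float, l_inf: float, k: float, t0: float = 0.0) -> float:
    """Size at age t: L(t) = L_inf * (1 - exp(-k * (t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

# Invented parameters: asymptotic shell size 80 mm, growth coefficient k = 0.3/yr.
size_at_10 = von_bertalanffy(10.0, l_inf=80.0, k=0.3)  # ~76 mm, about 95% of L_inf
```

A larger k means the organism approaches its asymptotic size faster; k and asymptotic size together, not final size alone, summarize a shell's growth history.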
An experiment was carried out to examine the effects of offering beef cattle five silage diets. These were perennial ryegrass silage (PRGS) as the sole forage, tall fescue/perennial ryegrass silage (FGS) as the sole forage, PRGS in a 50:50 ratio on a dry matter (DM) basis with lupin/triticale silage (LTS), lupin/wheat silage (LWS) and pea/oat silage (POS). Each of the five silage diets was supplemented with 4 and 7 kg of concentrates/head/day in a five silages × two concentrate intakes factorial design. A total of 90 cattle were used in the 121-day experiment. The grass silages were of medium digestibility and were well preserved. The legume/cereal silages had high ammonia N, high acetic acid, low lactic acid, low butyric acid and low digestible organic matter concentrations (542, 562 and 502 g/kg DM for LTS, LWS and POS, respectively). Silage treatment did not significantly affect liveweight gain, carcass gain, carcass characteristics, the instrumental assessment of meat quality or fatty acid composition of the M. longissimus dorsi muscle. In view of the low yields of the legume/cereal crops, it is concluded that the inclusion of spring-sown legume/cereal silages in the diets of beef cattle is unlikely to be advantageous.
An experiment was carried out to examine the effects of offering beef steers five silage diets: grass silage (GS) as the sole forage, lupins/triticale silage (LTS) as the sole forage, a mixture of LTS and GS at a ratio of 70:30 on a dry matter (DM) basis, vetch/barley silage (VBS) as the sole forage, and a mixture of VBS and GS at a ratio of 70:30 on a DM basis. Each of the five silage diets was supplemented with 2 and 5 kg of concentrates/head/day in a 5 × 2 factorial design to evaluate the five silages at two levels of concentrate intake and to examine possible interactions between silage type and concentrate intake. A total of 80 beef steers were used in the 122-day experiment. The GS was well preserved, while the whole-crop cereal/legume silages had high ammonia-nitrogen (N) concentrations, low lactic acid concentrations and low butyric acid concentrations. For GS, LTS, LTS/GS, VBS and VBS/GS, respectively, silage DM intakes were 6.5, 7.0, 7.2, 6.1 and 6.6 (s.e.d. 0.55) kg/day and live weight gains were 0.94, 0.72, 0.63, 0.65 and 0.73 (s.e.d. 0.076) kg/day. Silage type did not affect carcass fatness, the colour or tenderness of meat or the fatty acid composition of the intramuscular fat in the longissimus dorsi muscle.
Prenatal adversity shapes child neurodevelopment and risk for later mental health problems. The quality of the early care environment can buffer some of the negative effects of prenatal adversity on child development. Retrospective studies, in adult samples, highlight epigenetic modifications as sentinel markers of the quality of the early care environment; however, comparable data from pediatric cohorts are lacking. Participants were drawn from the Maternal Adversity Vulnerability and Neurodevelopment (MAVAN) study, a longitudinal cohort with measures of infant attachment, infant development, and child mental health. Children provided buccal epithelial samples (mean age = 6.99, SD = 1.33 years, n = 226), which were used for analyses of genome-wide DNA methylation and genetic variation. We used a series of linear models to describe the association between infant attachment and (a) measures of child outcome and (b) DNA methylation across the genome. Paired genetic data was used to determine the genetic contribution to DNA methylation at attachment-associated sites. Infant attachment style was associated with infant cognitive development (Mental Development Index) and behavior (Behavior Rating Scale) assessed with the Bayley Scales of Infant Development at 36 months. Infant attachment style moderated the effects of prenatal adversity on Behavior Rating Scale scores at 36 months. Infant attachment was also significantly associated with a principal component that accounted for 11.9% of the variation in genome-wide DNA methylation. These effects were most apparent when comparing children with a secure versus a disorganized attachment style and most pronounced in females. The availability of paired genetic data revealed that DNA methylation at approximately half of all infant attachment-associated sites was best explained by considering both infant attachment and child genetic variation. 
This study provides further evidence that infant attachment can buffer some of the negative effects of early adversity on measures of infant behavior. We also highlight the interplay between infant attachment and child genotype in shaping variation in DNA methylation. Such findings provide preliminary evidence for a molecular signature of infant attachment and may help inform attachment-focused early intervention programs.
The most important factors known to influence the eating quality of beef are well established and include both pre- and post-slaughter events, with many of the determinants interacting with each other. A substantial programme of work has been conducted by the Agri-Food and Biosciences Institute in Northern Ireland aimed at quantifying those factors of most importance to the local beef industry. In the Northern Ireland studies, post-slaughter effects such as carcase chilling and electrical stimulation, ageing, carcase hanging and cooking method were shown to have a greater impact on eating quality than pre-slaughter activities such as animal handling and lairage time. Among pre-slaughter factors, however, animal breed, particularly the use of dairy-breed animals, was shown to significantly improve eating quality. Many of these factors were found to interact with each other.
Epidemiology formed the basis of ‘the Barker hypothesis’, the concept of ‘developmental programming’ and today’s discipline of the Developmental Origins of Health and Disease (DOHaD). Animal experimentation provided proof of the underlying concepts, and continues to generate knowledge of underlying mechanisms. Interventions in humans, based on DOHaD principles, will be informed by experiments in animals. As knowledge in this discipline has accumulated, from studies of humans and other animals, the complexity of interactions between genome, environment and epigenetics, has been revealed. The vast nature of programming stimuli and breadth of effects is becoming known. As a result of our accumulating knowledge we now appreciate the impact of many variables that contribute to programmed outcomes. To guide further animal research in this field, the Australia and New Zealand DOHaD society (ANZ DOHaD) Animals Models of DOHaD Research Working Group convened at the 2nd Annual ANZ DOHaD Congress in Melbourne, Australia in April 2015. This review summarizes the contributions of animal research to the understanding of DOHaD, and makes recommendations for the design and conduct of animal experiments to maximize relevance, reproducibility and translation of knowledge into improving health and well-being.
Toxoplasma gondii is a globally distributed parasitic protozoan that infects most warm-blooded animals. We incorporated a bead coupled with recombinant SAG2A protein into our Neglected Tropical Disease (NTD) multiplex bead assay (MBA) panel and used it to determine Toxoplasma infection rates in two studies in Haiti. In a longitudinal cohort study of children aged 0–11 years, the infection rate varied with age reaching a maximum of 0·131 infections/year in children aged 3 years [95% confidence interval (CI) 0·065–0·204]. The median time to seroconversion was estimated to be 9·7 years (95% CI 7·6–∞). In a cross-sectional, community-wide survey of residents of all ages, we determined an overall seroprevalence of 28·2%. The seroprevalence age curve from the cross-sectional study also suggested that the force of infection varied with age and peaked at 0·057 infections/year (95% CI 0·033–0·080) at age 2·6 years. Integration of the Toxoplasma MBA into NTD surveys may allow for better estimates of the potential burden of congenital toxoplasmosis in underserved regions.
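Under the simplest assumption of a constant force of infection, the catalytic model gives seroprevalence p(a) = 1 − exp(−λa), and λ can be recovered from age-stratified serosurvey counts by maximum likelihood. A stdlib-only sketch on synthetic data follows; the study itself fitted age-varying rates, which this simplification does not capture:

```python
import math

def catalytic_prev(age: float, lam: float) -> float:
    """Seroprevalence at a given age under a constant force of infection lam."""
    return 1.0 - math.exp(-lam * age)

def fit_lambda(data):
    """Grid-search MLE of lambda from (age, n_tested, n_positive) records."""
    def loglik(lam):
        ll = 0.0
        for age, n, k in data:
            p = min(max(catalytic_prev(age, lam), 1e-12), 1.0 - 1e-12)
            ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
        return ll
    grid = [i / 1000.0 for i in range(1, 501)]  # candidate lambdas in (0, 0.5]
    return max(grid, key=loglik)

# Synthetic survey generated from a true lambda of 0.06 infections/year.
true_lam = 0.06
data = [(a, 200, round(200 * catalytic_prev(a, true_lam))) for a in (2, 5, 10, 20, 40)]
lam_hat = fit_lambda(data)
# Under a constant lambda, the median time to seroconversion is ln(2)/lambda.
```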
Studies report widely variable prevalence estimates of attention deficit hyperactivity disorder (ADHD) in incarcerated populations. The aim of this meta-analysis was to determine the prevalence of ADHD in these populations.
Primary research studies reporting the prevalence (lifetime/current) of ADHD in incarcerated populations were identified. The meta-analysis used a mixed log-binomial model, including fixed effects for each covariate and a random study effect, to estimate the significance of various risk factors.
Forty-two studies were included in the analysis. ADHD prevalence estimates were higher when based on screening tools than on diagnostic interviews, and higher for retrospective youth diagnoses than for current diagnoses. Using diagnostic interview data, the estimated prevalence was 25.5%, with no significant differences by gender or age. Significant differences between countries were noted.
Compared with published general population prevalence, there is a fivefold increase in prevalence of ADHD in youth prison populations (30.1%) and a 10-fold increase in adult prison populations (26.2%).
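The pooling step behind such prevalence estimates can be illustrated with fixed-effect inverse-variance averaging on the logit scale; this is a simplification of the mixed log-binomial model the review actually used, and the study counts below are invented for the example:

```python
import math

def logit(p: float) -> float:
    return math.log(p / (1.0 - p))

def inv_logit(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def pooled_prevalence(studies):
    """Fixed-effect inverse-variance pooling of (n_total, n_cases) on the logit scale."""
    num = den = 0.0
    for n, k in studies:
        p = k / n
        var = 1.0 / k + 1.0 / (n - k)  # approximate variance of logit(p)
        w = 1.0 / var
        num += w * logit(p)
        den += w
    return inv_logit(num / den)

# Invented example: three studies of prison samples (total n, ADHD cases).
studies = [(120, 30), (200, 55), (90, 20)]
pooled = pooled_prevalence(studies)
```

Working on the logit scale keeps the pooled estimate inside (0, 1) and weights larger, more precise studies more heavily.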
Antenatal corticosteroids are used to augment fetal lung maturity in human pregnancy. Dexamethasone (DEX) is also used to treat congenital adrenal hyperplasia of the fetus in early pregnancy. We previously reported that synthetic corticosteroids given to sheep in early or late gestation affect pregnancy length and fetal cortisol levels, and that glucocorticoids alter plasma insulin-like growth factor (IGF) and insulin-like growth factor binding protein (IGFBP) concentrations in late pregnancy and reduce fetal weight. The effects of administering DEX in early pregnancy on fetal organ weights, and of betamethasone (BET) given in late gestation on the weights of fetal brain regions and organ development, have not been reported. We hypothesized that BET or DEX administration at either stage of pregnancy would have deleterious effects on fetal development and associated hormones. In early pregnancy, DEX was administered as four injections at 12-hourly intervals over 48 h, commencing at 40–42 days of gestation (dG). There was no consistent effect on total fetal weight or individual fetal organ weights, except in females at 7 months of postnatal age. When BET was administered at 104, 111 and 118 dG, the previously reported reduction in total fetal weight was associated with significant reductions in the weights of the fetal brain, cerebellum, heart, kidney and liver. Fetal plasma insulin, leptin and triiodothyronine were also reduced at different times in fetal and postnatal life. We conclude that, at the amounts given, the sheep fetus is sensitive to maternal administration of synthetic glucocorticoids in late gestation, with effects on growth and metabolic hormones that may persist into postnatal life.
In this study, we determined the gene and/or protein expression of hypothalamic–pituitary–adrenal (HPA) axis regulatory molecules following synthetic glucocorticoid exposures. Pregnant sheep received intramuscular saline or betamethasone (BET) injections at 104 (BET-1), 104 and 111 (BET-2), or 104, 111 and 118 (BET-3) days of gestation (dG). Samples were collected at numerous time-points between 75 dG and 12 weeks of postnatal age. In the BET-3 treatment group, fetal plasma cortisol levels were lower at 145 dG than in controls and gestational length was significantly prolonged. The cortisol:adrenocorticotropic hormone (ACTH) ratio in fetal plasma of control and BET-3 fetuses rose significantly between 132 and 145 dG and remained elevated in lambs at 6 and 12 weeks of age; this rise was truncated at day 145 in fetuses of BET-3-treated mothers. After BET treatment, fetal and postnatal pituitary proopiomelanocortin mRNA levels were reduced from 109 dG to 12 weeks of postnatal age; pituitary prohormone convertase 1 and 2 mRNA levels were reduced at 145 dG and postnatally; hypothalamic arginine vasopressin mRNA levels were lowered at all time-points, but corticotrophin-releasing hormone mRNA levels were reduced only in postnatal lambs. Maternal BET increased late fetal and/or postnatal adrenal mRNA levels of the ACTH receptor and 3β-hydroxysteroid dehydrogenase but decreased those of steroidogenic acute regulatory protein and P450 17α-hydroxylase. The altered mRNA levels of key HPA axis regulatory proteins after maternal BET injections suggest processes that may subserve long-term changes in HPA activity in later life after prenatal exposure to synthetic glucocorticoids.
We investigate to what extent the current helicity distribution observed in solar active regions is compatible with solar dynamo models. We use an advanced 2D mean-field dynamo model with dynamo action largely concentrated near the bottom of the convective zone, and dynamo saturation based on the evolution of the magnetic helicity and algebraic quenching. For comparison, we also studied a more basic 2D mean-field dynamo model with simple algebraic alpha quenching only. Using these numerical models we obtain butterfly diagrams for both the small-scale current helicity and the large-scale magnetic helicity, and compare them with the butterfly diagram for the current helicity in active regions obtained from observations. This comparison shows that the current helicity of active regions, as estimated by −A·B evaluated at the depth from which the active region arises, resembles the observational data much better than the small-scale current helicity calculated directly from the helicity evolution equation. Here B and A are, respectively, the dynamo-generated mean magnetic field and its vector potential.