Animal-derived dietary protein ingestion and physical activity stimulate myofibrillar protein synthesis rates in older adults. We determined whether a non-animal-derived diet can support daily myofibrillar protein synthesis rates to the same extent as an omnivorous diet. Nineteen healthy older adults (aged 66 (sem 1) years; BMI 24 (sem 1) kg/m²; twelve males, seven females) participated in a randomised, parallel-group, controlled trial during which they consumed a 3-d isoenergetic high-protein (1·8 g/kg body mass per d) diet, where the protein was provided from predominantly (71 %) animal (OMNI; n 9; six males, three females) or exclusively vegan (VEG; n 10; six males, four females; mycoprotein providing 57 % of daily protein intake) sources. During the dietary control period, participants conducted a daily bout of unilateral resistance-type leg extension exercise. Before the dietary control period, participants ingested 400 ml of deuterated water, with 50-ml doses consumed daily thereafter. Saliva samples were collected throughout to determine body water ²H enrichments, and muscle samples were collected from rested and exercised muscle to determine daily myofibrillar protein synthesis rates. Deuterated water dosing resulted in body water ²H enrichments of approximately 0·78 (sem 0·03) %. Daily myofibrillar protein synthesis rates were 13 (sem 8) % (P = 0·169) and 12 (sem 4) % (P = 0·016) greater in the exercised compared with rested leg (1·59 (sem 0·12) v. 1·77 (sem 0·12) and 1·76 (sem 0·14) v. 1·93 (sem 0·12) %/d) in OMNI and VEG groups, respectively. Daily myofibrillar protein synthesis rates did not differ between OMNI and VEG in either rested or exercised muscle (P > 0·05). Over the course of a 3-d intervention, omnivorous- or vegan-derived dietary protein sources can support equivalent rested and exercised daily myofibrillar protein synthesis rates in healthy older adults consuming a high-protein diet.
Maintaining nutritional adequacy contributes to successful ageing. B vitamins involved in one-carbon metabolism regulation (folate, riboflavin, vitamins B6 and B12) are critical nutrients contributing to homocysteine and epigenetic regulation. Although cross-sectional B vitamin intake in ageing populations is characterised, longitudinal changes are infrequently reported. This systematic review explores age-related changes in dietary adequacy of folate, riboflavin, vitamins B6 and B12 in community-dwelling older adults (≥65 years at follow-up). Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, databases (MEDLINE, Embase, BIOSIS, CINAHL) were systematically screened, yielding 1579 records; eight studies were included (n 3119 participants, 2–25 years of follow-up). Quality assessment (modified Newcastle–Ottawa quality scale) rated all studies as being of moderate–high quality. The estimated average requirement cut-point method estimated the baseline and follow-up population prevalence of dietary inadequacy. Riboflavin (seven studies, n 1953) inadequacy progressively increased with age; the prevalence of inadequacy increased from baseline by up to 22·6 and 9·3 % in males and females, respectively. Dietary folate adequacy (three studies, n 2321) improved in two studies (by up to 22·4 %), but the third showed increasing (8·1 %) inadequacy. Evidence was similarly limited (two studies each) and inconsistent for vitamins B6 (n 559; −9·9 to 47·9 %) and B12 (n 1410; −4·6 to 7·2 %). This review emphasises the scarcity of evidence regarding micronutrient intake changes with age, highlighting the demand for improved reporting of longitudinal changes in nutrient intake that can better direct micronutrient recommendations for older adults. This review was registered with PROSPERO (CRD42018104364).
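The estimated average requirement (EAR) cut-point method mentioned above can be sketched in a few lines: the prevalence of dietary inadequacy in a group is simply the proportion of usual intakes that fall below the EAR, computed at baseline and again at follow-up. The intake values and the riboflavin EAR used below are illustrative, not data from the review.

```python
# EAR cut-point method: prevalence of inadequacy = % of usual intakes < EAR.
# Intakes (mg/d) and the riboflavin EAR of 1.1 mg/d are illustrative only.

def prevalence_of_inadequacy(intakes, ear):
    """Percentage of individuals whose usual intake is below the EAR."""
    below = sum(1 for x in intakes if x < ear)
    return 100.0 * below / len(intakes)

RIBOFLAVIN_EAR = 1.1  # mg/d, illustrative reference value

baseline  = [1.6, 1.3, 0.9, 1.8, 1.0, 1.4]  # intakes at baseline
follow_up = [1.2, 1.0, 0.8, 1.5, 0.9, 1.2]  # intakes at follow-up

p0 = prevalence_of_inadequacy(baseline, RIBOFLAVIN_EAR)
p1 = prevalence_of_inadequacy(follow_up, RIBOFLAVIN_EAR)
print(f"Inadequacy: {p0:.1f}% -> {p1:.1f}% (change {p1 - p0:+.1f} pp)")
```

The method assumes intake and requirement are uncorrelated and requirements are symmetrically distributed; it is a population-level estimator, not a diagnosis for any individual.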
Lymphatic vessel dysplasia is associated with Fontan-associated protein-losing enteropathy. Extra nodal non-Hodgkin lymphomas including mucosa-associated lymphoid tissue (MALT lymphoma) are associated with lymphatic vessel dysplasia. Here, we describe the case of a 7-year-old with Fontan-associated protein-losing enteropathy who developed MALT lymphoma with a clinical course indicative of interaction between these pathologies and improvement in protein-losing enteropathy after MALT lymphoma treatment. This case suggests a pathophysiologic overlap which has implications for the management of Fontan-associated protein-losing enteropathy.
The Eating Assessment in Toddlers FFQ (EAT FFQ) has been shown to have good reliability and comparative validity for ranking nutrient intakes in young children. With the addition of food items (n 4), we aimed to re-assess the validity of the EAT FFQ and estimate calibration factors in a sub-sample of children (n 97) participating in the Growing Up Milk – Lite (GUMLi) randomised controlled trial (2015–2017). Participants completed the ninety-nine-item GUMLi EAT FFQ and record-assisted 24-h recalls (24HR) on two occasions. Energy and nutrient intakes were assessed at months 9 and 12 post-randomisation and calibration factors calculated to determine predicted estimates from the GUMLi EAT FFQ. Validity was assessed using Pearson correlation coefficients, weighted kappa (κ) and exact quartile categorisation. Calibration was calculated using linear regression models on 24HR, adjusted for sex and treatment group. Nutrient intakes were significantly correlated between the GUMLi EAT FFQ and 24HR at both time points. Energy-adjusted, de-attenuated Pearson correlations ranged from 0·3 (fibre) to 0·8 (Fe) at 9 months and from 0·3 (Ca) to 0·7 (Fe) at 12 months. Weighted κ for the quartiles ranged from 0·2 (Zn) to 0·6 (Fe) at 9 months and from 0·1 (total fat) to 0·5 (Fe) at 12 months. Exact agreement ranged from 30 to 74 %. Calibration factors predicted up to 56 % of the variation in the 24HR at 9 months and 44 % at 12 months. The GUMLi EAT FFQ remained a useful tool for ranking nutrient intakes with similar estimated validity compared with other FFQ used in children under 2 years.
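The calibration step described above can be illustrated with a minimal sketch: regressing 24HR intake on FFQ intake yields a calibration equation whose predictions are the calibrated FFQ estimates, and the model R² is the share of 24HR variance the FFQ predicts. The trial additionally adjusted for sex and treatment group; this univariable version and the iron intakes below are invented for illustration only.

```python
# Sketch of FFQ calibration against 24-h recalls (24HR) via simple linear
# regression (closed form). All data are invented; the study's models also
# adjusted for sex and treatment group.

def ols_calibration(ffq, h24):
    """Regress 24HR on FFQ; return (slope, intercept, R^2)."""
    n = len(ffq)
    mx, my = sum(ffq) / n, sum(h24) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ffq, h24))
    sxx = sum((x - mx) ** 2 for x in ffq)
    slope = sxy / sxx
    intercept = my - slope * mx
    pred = [intercept + slope * x for x in ffq]
    ss_res = sum((y - p) ** 2 for y, p in zip(h24, pred))
    ss_tot = sum((y - my) ** 2 for y in h24)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical iron intake (mg/d) from the FFQ and matched 24HR recalls:
ffq = [6.0, 7.5, 5.0, 8.0, 6.5, 7.0]
h24 = [5.5, 6.8, 4.9, 7.4, 6.1, 6.4]
slope, intercept, r2 = ols_calibration(ffq, h24)
print(f"calibrated 24HR = {intercept:.2f} + {slope:.2f} x FFQ (R^2 = {r2:.2f})")
```

A slope below 1 here would indicate the FFQ over-estimates intake relative to the recalls, which is the kind of systematic bias calibration factors are meant to correct.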
Prevention of Clostridioides difficile infection (CDI) is a national priority and may be facilitated by deployment of the Targeted Assessment for Prevention (TAP) Strategy, a quality improvement framework providing a focused approach to infection prevention. This article describes the process and outcomes of TAP Strategy implementation for CDI prevention in a healthcare system.
Hospital A was identified based on CDI surveillance data indicating an excess burden of infections above the national goal; hospitals B and C participated as part of systemwide deployment. TAP facility assessments were administered to staff to identify infection control gaps and inform CDI prevention interventions. Retrospective analysis was performed using negative-binomial, interrupted time series (ITS) regression to assess overall effect of targeted CDI prevention efforts. Analysis included hospital-onset, laboratory-identified C. difficile event data for 18 months before and after implementation of the TAP facility assessments.
The systemwide monthly CDI rate significantly decreased at the intervention (β2, −44%; P = .017), and the postintervention CDI rate trend showed a sustained decrease (β1 + β3; −12% per month; P = .008). At an individual hospital level, the CDI rate trend significantly decreased in the postintervention period at hospital A only (β1 + β3, −26% per month; P = .003).
This project demonstrates TAP Strategy implementation in a healthcare system, yielding a significant decrease in the laboratory-identified C. difficile rate trend in the postintervention period at the system level and in hospital A. It highlights the potential benefit of directing prevention efforts to facilities with the highest burden of excess infections to more efficiently reduce CDI rates.
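The interrupted time series (ITS) parameters reported above can be made concrete with a small sketch. The article fitted a negative-binomial segmented regression; here, noise-free simulated monthly rates on the log scale let the three quantities of interest be read off directly: β1 is the pre-intervention trend, β2 the level change at the intervention, and β1 + β3 the post-intervention trend. All parameter values below are invented for illustration.

```python
# Simplified ITS sketch: segmented log-linear model with a level drop (beta2)
# and trend change (beta3) at the intervention month T0. Values are invented;
# the article used a negative-binomial regression on observed counts.
import math

T0 = 18                                                # intervention month
BETA0, BETA1, BETA2, BETA3 = 2.0, 0.01, -0.58, -0.13   # true log-scale params

def log_rate(t):
    """Log monthly rate under the segmented model."""
    post = t >= T0
    return (BETA0 + BETA1 * t
            + (BETA2 if post else 0.0)
            + (BETA3 * (t - T0) if post else 0.0))

rates = [log_rate(t) for t in range(36)]               # 18 months pre + 18 post

pre_trend   = rates[T0 - 1] - rates[T0 - 2]            # slope before T0 (beta1)
post_trend  = rates[T0 + 2] - rates[T0 + 1]            # slope after T0 (beta1+beta3)
level_shift = rates[T0] - (rates[T0 - 1] + pre_trend)  # drop at T0 (beta2)

print(f"level change at intervention: {100 * (math.exp(level_shift) - 1):+.0f}%")
print(f"post-intervention trend:      {100 * (math.exp(post_trend) - 1):+.0f}%/month")
```

Because the model is fitted on the log scale, a coefficient β translates to a multiplicative rate change of exp(β) − 1, which is how effects such as "−44 % at the intervention" are reported.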
Sulfur-bearing monazite-(Ce) occurs in silicified carbonatite at Eureka, Namibia, forming rims up to ~0.5 mm thick on earlier-formed monazite-(Ce) megacrysts. We present X-ray photoelectron spectroscopy data demonstrating that sulfur is accommodated predominantly in monazite-(Ce) as sulfate, via a clino-anhydrite-type coupled substitution mechanism. Minor sulfide and sulfite peaks in the X-ray photoelectron spectra, however, also indicate that more complex substitution mechanisms incorporating S2– and S4+ are possible. Incorporation of S6+ through clino-anhydrite-type substitution results in an excess of M2+ cations, which previous workers have suggested is accommodated by auxiliary substitution of OH– for O2–. However, Raman data show no indication of OH–, and instead we suggest charge imbalance is accommodated through F– substituting for O2–. The accommodation of S in the monazite-(Ce) results in considerable structural distortion that may account for relatively high contents of ions with radii beyond those normally found in monazite-(Ce), such as the heavy rare earth elements, Mo, Zr and V. In contrast to S-bearing monazite-(Ce) in other carbonatites, S-bearing monazite-(Ce) at Eureka formed via a dissolution–precipitation mechanism during prolonged weathering, with S derived from an aeolian source. While large S-bearing monazite-(Ce) grains are likely to be rare in the geological record, formation of secondary S-bearing monazite-(Ce) in these conditions may be a feasible mineral for dating palaeo-weathering horizons.
Early detection and intervention strategies in patients at clinical high-risk (CHR) for syndromal psychosis have the potential to contain the morbidity of schizophrenia and similar conditions. However, research criteria that have relied on severity and number of positive symptoms are limited in their specificity and risk high false-positive rates. Our objective was to examine the degree to which measures of recency of onset or intensification of positive symptoms [a.k.a., new or worsening (NOW) symptoms] contribute to predictive capacity.
We recruited 109 help-seeking individuals whose symptoms met criteria for the Progression Subtype of the Attenuated Positive Symptom Psychosis-Risk Syndrome defined by the Structured Interview for Psychosis-Risk Syndromes and followed them every three months for two years or until the onset of syndromal psychosis.
Forty-one (40.6%) of 101 participants meeting CHR criteria developed a syndromal psychotic disorder [mostly (80.5%) schizophrenia] with half converting within 142 days (interquartile range: 69–410 days). Patients with more NOW symptoms were more likely to convert (converters: 3.63 ± 0.89; non-converters: 2.90 ± 1.27; p = 0.001). Patients with stable attenuated positive symptoms were less likely to convert than those with NOW symptoms. New, but not worsening, symptoms, in isolation, also predicted conversion.
Results suggest that the severity and number of attenuated positive symptoms are less predictive of conversion to syndromal psychosis than the timing of their emergence and intensification. These findings also suggest that the earliest phase of psychotic illness involves a rapid, dynamic process, beginning before the syndromal first episode, with potentially substantial implications for CHR research and understanding the neurobiology of psychosis.
Dung-colonizing beetles provide a range of ecosystem services in farmland pasture systems. However, such beetles are declining in Northern temperate regions. This may, in part, be due to the widespread use of macrocyclic lactones (MLs) and synthetic pyrethroids (SPs) in livestock farming. These chemicals are used to control pests and parasites of cattle, and their residues are excreted in dung at concentrations toxic to insects. While the lethal effects of such residues are well known, sublethal effects are less understood. Any effects, however, may have important consequences for beetle populations, particularly if they affect reproduction. To investigate this, the impact of ML and SP exposure on the reproductive output of Onthophagus similis (Scriba), a Northern temperate dung beetle species, was examined. In laboratory trials, field-collected adult O. similis exposed to the ML ivermectin at 1 ppm (wet weight) over a period of 3 weeks had smaller oocytes (p = 0.016), smaller fat bodies and reduced motility compared to the control. In a farm-level investigation, cattle dung-baited pitfall trapping was undertaken on 23 beef cattle farms in SW England, which either used MLs (n = 9), SPs (n = 7) or neither chemical (n = 7). On farms that used no MLs or SPs, 24.2% of females caught were gravid. However, on farms that used MLs no gravid females were caught, and only 1% of the beetles caught on farms using SPs were gravid (p < 0.001). The association between ML and SP use and impaired reproductive output suggests that the use of such chemicals is likely to be ecologically damaging.
Folic acid (FA) supplementation is recommended in the periconceptional period, for the prevention of neural tube defects. Limited data are available on the folate status of New Zealand (NZ) pregnant women and its association with FA supplementation intake. Objectives were to examine the relationship between plasma folate (PF) and reported FA supplement use at 15 weeks’ gestation and to explore socio-demographic and lifestyle factors associated with PF. We used data and blood samples from NZ participants of the Screening for Pregnancy Endpoints cohort study. Healthy nulliparous women with singleton pregnancy (n 1921) were interviewed and blood samples collected. PF was analysed via microbiological assay. Of the participants, 73 % reported taking an FA supplement at 15 weeks’ gestation – of these, 79 % were taking FA as part of/alongside a multivitamin supplement. Of FA supplement users, 56 % reported consuming a daily dose of ≥800 μg; 39 % reported taking less than 400 µg/d. Mean PF was significantly higher in women reporting FA supplementation (54·6 (se 1·5) nmol/l) v. no FA supplementation (35·1 (se 1·6) nmol/l) (P<0·0001). Reported daily FA supplement dose and PF were significantly positively correlated (r 0·41; P<0·05). Younger maternal age, Pacific and Māori ethnicity and obesity were negatively associated with PF levels; vegetarianism was positively associated with PF. Reported FA supplement dose was significantly associated with PF after adjustment for socio-demographic, lifestyle confounders and multivitamin intake. The relationship observed between FA supplementation and PF demonstrates that self-reported intake is a reliable proxy for FA supplement use in this study population.
Sheep blowfly strike (ovine cutaneous myiasis) is a widespread economic and welfare problem in sheep husbandry in many parts of the world. Strike incidence is determined by a complex interaction of fly abundance, host susceptibility and climate, combined with farmer husbandry and intervention strategies. Sheep farmers adopt a range of approaches to the type and timing of the management used for the control of blowfly strike, the rational basis for which is often not robust. Here a deterministic model, based on existing data relating to fly abundance, seasonal risk and strike incidence, is used to compare the variable costs associated with different strike management strategies. The model shows that not employing prophylactic treatment is the lowest cost strategy only where strike risk is low. In all other circumstances, prophylactic treatment incurs lower costs than not doing so, because the deaths associated with strike outweigh the costs of prophylactic treatment. Lamb treatment, in particular, has a substantial effect on strike and cost reduction, since lambs are the most abundant age-class of animals and are at the highest risk over the period when fly abundance is the greatest. Early-season treatment of ewes before shearing is also an important component of the lowest cost strategies, particularly when the blowfly season is extended. While the rational choice of the most appropriate strike management strategy is essential in the context of farm economics, welfare considerations lend added importance to treatment decisions that reduce strike incidence.
The second year of life is a period of nutritional vulnerability. We aimed to investigate the dietary patterns and nutrient intakes from 1 to 2 years of age during the 12-month follow-up period of the Growing Up Milk – Lite (GUMLi) trial. The GUMLi trial was a multi-centre, double-blinded, randomised controlled trial of 160 healthy 1-year-old children in Auckland, New Zealand and Brisbane, Australia. Dietary intakes were collected at baseline, 3, 6, 9 and 12 months post-randomisation, using a validated FFQ. Dietary patterns were identified using principal component analysis of the frequency of food item consumption per d. The effect of the intervention on dietary patterns and intake of eleven nutrients over the duration of the trial was investigated using random effects mixed models. A total of three dietary patterns were identified at baseline: ‘junk/snack foods’, ‘healthy/guideline foods’ and ‘breast milk/formula’. A significant group difference was observed in ‘breast milk/formula’ dietary pattern z scores at 12 months post-randomisation, where those in the GUMLi group loaded more positively on this pattern, suggesting more frequent consumption of breast milk. No difference was seen in the other two dietary patterns. Significant intervention effects were seen on nutrient intake between the GUMLi (intervention) and cows’ milk (control) groups, with lower protein and vitamin B12, and higher Fe, vitamin D, vitamin C and Zn intake in the GUMLi (intervention) group. The consumption of GUMLi did not affect dietary patterns; however, GUMLi participants had lower protein intake and higher Fe, vitamins D and C and Zn intake at 2 years of age.
To simulate effects of different scenarios of folic acid fortification of food on dietary folate equivalents (DFE) intake in an ethnically diverse sample of pregnant women.
A forty-four-item FFQ was used to evaluate dietary intake of the population. DFE intakes were estimated for different scenarios of food fortification with folic acid: (i) voluntary fortification; (ii) increased voluntary fortification; (iii) simulated bread mandatory fortification; and (iv) simulated grains-and-rice mandatory fortification.
Ethnically and socio-economically diverse cohort of pregnant women in New Zealand.
Pregnant women (n 5664) whose children were born in 2009–2010.
Participants identified their ethnicity as European (56·0 %), Asian (14·2 %), Māori (13·2 %), Pacific (12·8 %) or Others (3·8 %). Bread, breakfast cereals and yeast spread were main food sources of DFE in the two voluntary fortification scenarios. However, for Asian women, green leafy vegetables, bread and breakfast cereals were main contributors of DFE in these scenarios. In descending order, proportions of different ethnic groups in the lowest tertile of DFE intake for the four fortification scenarios were: Asian (39–60 %), Others (41–44 %), European (31–37 %), Pacific (23–26 %) and Māori (23–27 %). In comparisons within each ethnic group across scenarios of food fortification with folic acid, differences were observed only with DFE intake higher in the simulated grains-and-rice mandatory fortification v. other scenarios.
If grain and rice fortification with folic acid was mandatory in New Zealand, DFE intakes would be more evenly distributed among pregnant women of different ethnicities, potentially reducing ethnic group differences in risk of lower folate intakes.
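The DFE totals compared across fortification scenarios rest on the standard conversion: dietary folate equivalents combine natural food folate with added folic acid, each microgram of fortificant folic acid counting as 1.7 µg DFE because it is more bioavailable. A minimal sketch, with illustrative intake values only:

```python
# Dietary folate equivalents (DFE): food folate counts 1:1, folic acid added
# to food counts x1.7 (standard conversion). Intakes below are illustrative.

def dfe(food_folate_ug: float, folic_acid_ug: float) -> float:
    """DFE (ug) = food folate (ug) + 1.7 x folic acid added to food (ug)."""
    return food_folate_ug + 1.7 * folic_acid_ug

# Same diet under two hypothetical fortification scenarios (ug/d of added
# folic acid differs, natural food folate stays the same):
voluntary = dfe(food_folate_ug=220, folic_acid_ug=60)
mandatory = dfe(food_folate_ug=220, folic_acid_ug=140)
print(f"voluntary: {voluntary:.0f} ug DFE/d, mandatory: {mandatory:.0f} ug DFE/d")
```

This is why simulated mandatory fortification shifts the whole intake distribution upward: the food folate term is unchanged, while the weighted folic acid term grows for every bread or grain consumer.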
The authors developed a practical and clinically useful model to predict the risk of psychosis that utilizes clinical characteristics empirically demonstrated to be strong predictors of conversion to psychosis in clinical high-risk (CHR) individuals. The model is based upon the Structured Interview for Psychosis Risk Syndromes (SIPS) and accompanying clinical interview, and yields scores indicating one's risk of conversion.
Baseline data, including demographic and clinical characteristics measured by the SIPS, were obtained on 199 CHR individuals seeking evaluation in the early detection and intervention for mental disorders program at the New York State Psychiatric Institute at Columbia University Medical Center. Each patient was followed for up to 2 years or until they developed a syndromal DSM-IV disorder. A LASSO logistic fitting procedure was used to construct a model for conversion specifically to a psychotic disorder.
At 2 years, 64 patients (32.2%) converted to a psychotic disorder. The top five variables with relatively large standardized effect sizes included SIPS subscales of visual perceptual abnormalities, dysphoric mood, unusual thought content, disorganized communication, and violent ideation. The concordance index (c-index) was 0.73, indicating a moderately strong ability to discriminate between converters and non-converters.
The prediction model performed well in classifying converters and non-converters and revealed SIPS measures that are relatively strong predictors of conversion, comparable with the risk calculator published by NAPLS (c-index = 0.71), but requiring only a structured clinical interview. Future work will seek to externally validate the model and enhance its performance with the incorporation of relevant biomarkers.
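The concordance index used to summarise discrimination above has a simple definition: among all converter/non-converter pairs, it is the fraction in which the converter received the higher predicted risk (ties counting half), so 0.5 is chance and 1.0 is perfect ranking. A sketch with invented risk scores and outcomes:

```python
# c-index (concordance index) for a binary outcome: the probability that a
# randomly chosen converter is ranked above a randomly chosen non-converter.
# Risk scores and outcomes below are invented for illustration.

def c_index(risks, outcomes):
    """Pairwise concordance between predicted risks and observed outcomes."""
    pairs = concordant = 0.0
    for i in range(len(risks)):
        for j in range(len(risks)):
            if outcomes[i] == 1 and outcomes[j] == 0:  # converter vs non-converter
                pairs += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5                   # ties count half
    return concordant / pairs

risks    = [0.9, 0.7, 0.6, 0.4, 0.3, 0.2]  # model-predicted conversion risk
outcomes = [1,   1,   0,   1,   0,   0]    # 1 = converted to psychosis
print(round(c_index(risks, outcomes), 3))
```

For a binary outcome the c-index coincides with the area under the ROC curve, which is why values such as 0.73 are read as "moderately strong" discrimination.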
Between 2012 and 2016, an increased number of listeriosis cases, especially from one region of the Czech Republic, was observed. Most were caused by strains of serotype 1/2a, clonal complex 8, indistinguishable by pulsed-field gel electrophoresis. Twenty-six human cases were reported, including two neonatal cases in twins. Three cases were fatal. Typing of Listeria monocytogenes isolates from food made it possible to confirm a turkey meat delicatessen as the vehicle of infection for this local outbreak in the Moravian-Silesian Region. Food strains belonging to the identical pulsotype were isolated from ready-to-eat turkey meat products packaged by the same producer between 2012 and 2016. This confirms that the described L. monocytogenes outbreak strain probably persisted in the environment of the aforementioned food-processing plant over several years. Whole-genome sequencing confirmed a very close relationship (zero to seven allele differences) between isolates from humans, foods and swabs from the environment of the food-processing plant under investigation.
Evidence in support of the Developmental Origins of Health and Disease (DOHaD) hypothesis has reached the level where it can appropriately be used to inform practice. DOHaD-informed interventions supporting primary noncommunicable disease risk reduction should target the pre- and periconceptional periods, pregnancy, lactation, childhood and adolescence. Such interventions are dependent on a health workforce (including dietitians, nurses, midwives, doctors, and nutrition teachers) that has a deep understanding of DOHaD concepts. This study assessed the development of awareness of DOHaD concepts during undergraduate health professional training programs. Using a cross-sectional design, a standardized questionnaire was completed by Year 1–4 undergraduate students studying nutrition in Japan (n=309) and Year 1–3 nursing students in New Zealand (n=151). On entry to undergraduate study, most students had no awareness of the terms ‘DOHaD’ or ‘First 1000 Days’. While awareness reached 60% by Year 3 in courses that included DOHaD-related teaching, this remains inadequate. More than 95% of Year 1 undergraduates in both countries demonstrated an appreciation of associations between maternal nutrition and fetal health. However, awareness of associations between parental health status and/or nutritional environment and later-life health was low. While levels of awareness increased across program years, overall awareness was less than optimal. These results indicate evidence of some focus on DOHaD-related content in curricula. We argue that DOHaD principles should be one pillar around which health training curricula are built. This study indicates a need for the DOHaD community to engage with faculties in curriculum development.
To evaluate the sociodemographic and lifestyle factors associated with insufficient and excessive use of folic acid supplements (FAS) among pregnant women.
A pregnancy cohort to which multinomial logistic regression models were applied to identify factors associated with duration and dose of FAS use.
The Growing Up in New Zealand child study, which enrolled pregnant women whose children were born in 2009–2010.
Pregnant women (n 6822) enrolled into a nationally generalizable cohort.
Ninety-two per cent of pregnant women were not taking FAS according to the national recommendation (4 weeks before until 12 weeks after conception), with 69 % taking insufficient FAS and 57 % extending FAS use past 13 weeks’ gestation. The factors associated with extended use differed from those associated with insufficient use. Consistent with published literature, the relative risks of insufficient use were increased for younger women, those with less education, of non-European ethnicities, unemployed, who smoked cigarettes, whose pregnancy was unplanned or who had older children, or were living in more deprived households. In contrast, the relative risks of extended use were increased for women of higher socio-economic status or for whom this was their first pregnancy and decreased for women of Pacific v. European ethnicity.
In New Zealand, current use of FAS during pregnancy potentially exposes pregnant women and their unborn children to too little or too much folic acid. Further policy development is necessary to reduce current socio-economic inequities in the use of FAS.
Early detection of karyotype abnormalities, including aneuploidy, could aid producers in identifying animals which, for example, would not be suitable candidate parents. Genome-wide genetic marker data in the form of single nucleotide polymorphisms (SNPs) are now being routinely generated on animals. The objective of the present study was to describe the statistics that could be generated from the allele intensity values from such SNP data to diagnose karyotype abnormalities; of particular interest was whether detection of aneuploidy was possible with both commonly used genotyping platforms in agricultural species, namely the Applied Biosystems™ Axiom™ and the Illumina platform. The hypothesis was tested using a case study of a set of dizygotic X-chromosome monosomy 53,X sheep twins. Genome-wide SNP data were available from the Illumina platform (11 082 autosomal and 191 X-chromosome SNPs) on 1848 male and 8954 female sheep and available from the Axiom™ platform (11 128 autosomal and 68 X-chromosome SNPs) on 383 female sheep. Genotype allele intensity values, either as their original raw values or transformed to logarithm intensity ratio (LRR), were used to accurately diagnose two dizygotic (i.e. fraternal) twin 53,X sheep, both of which received their single X chromosome from their sire. This is the first reported case of 53,X dizygotic twins in any species. Relative to the X-chromosome SNP genotype mean allele intensity values of normal females, the mean allele intensity value of SNP genotypes on the X chromosome of the two females monosomic for the X chromosome was 7.45 to 12.4 standard deviations less, and was easily detectable using either the Axiom™ or Illumina genotype platform; the next lowest mean allele intensity value of a female was 4.71 or 3.3 standard deviations less than the population mean depending on the platform used.
Both 53,X females could also be detected based on the genotype LRR although this was more easily detectable when comparing the mean LRR of the X chromosome of each female to the mean LRR of their respective autosomes. On autopsy, the ovaries of the two sheep were small for their age and evidence of prior ovulation was not appreciated. In both sheep, the density of primordial follicles in the ovarian cortex was lower than normally found in ovine ovaries and primary follicle development was not observed. Mammary gland development was very limited. Results substantiate previous studies in other species that aneuploidy can be readily detected using SNP genotype allele intensity values generally already available, and the approach proposed in the present study was agnostic to genotype platform.
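The screening statistic described above reduces to a z-score: each animal's mean X-chromosome allele intensity is expressed in standard deviations from the normal-female reference mean, and strongly negative outliers flag X monosomy. The intensity values and the −5 SD cut-off below are illustrative, not the study's data.

```python
# Sketch of SNP-intensity aneuploidy screening: z-score each candidate's mean
# X-chromosome allele intensity against a reference of normal females and
# flag extreme negative outliers. Values and the -5 SD cut-off are invented.
import statistics

def x_monosomy_flags(candidates, reference, threshold_sd=-5.0):
    """Return (z-scores, list of animals flagged as putative X monosomy)."""
    mu = statistics.mean(reference)
    sd = statistics.stdev(reference)
    z = {animal: (v - mu) / sd for animal, v in candidates.items()}
    return z, [animal for animal, score in z.items() if score < threshold_sd]

# Mean X-chromosome allele intensity (arbitrary units) for normal ewes and
# three animals under test; the two hypothetical 53,X twins sit far below.
normal_females = [1.02, 0.99, 1.01, 1.00, 0.98, 1.03, 0.97]
candidates = {"twin_a": 0.55, "twin_b": 0.53, "ewe_x": 1.00}

z, flagged = x_monosomy_flags(candidates, normal_females)
print(flagged)
```

Because a 53,X animal carries one X chromosome instead of two, its X-probe intensities are roughly halved while autosomal intensities are unaffected, which is why the statistic works on either genotyping platform.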
Pre-school nutrition-related behaviours influence diet and development of lifelong eating habits. We examined the prevalence and congruence of recommended nutrition-related behaviours (RNB) in home and early childhood education (ECE) services, exploring differences by child and ECE characteristics.
Telephone interviews with mothers. Online survey of ECE managers/head teachers.
Children (n 1181) aged 45 months in the Growing Up in New Zealand longitudinal study.
A mean 5·3 of 8 RNB were followed at home, with statistical differences by gender and ethnic group, but not socio-economic position. ECE services followed a mean 4·8 of 8 RNB, with differences by type of service and health-promotion programme participation. No congruence between adherence at home and in ECE services was found; half of children with high adherence at home attended a service with low adherence. A greater proportion of children in deprived communities attended a service with high adherence, compared with children living in the least deprived communities (20 and 12 %, respectively).
Children, across all socio-economic positions, may not experience RNB at home. ECE settings provide an opportunity to improve or support behaviours learned at home. Targeting of health-promotion programmes in high-deprivation areas has resulted in higher adherence to RNB at these ECE services. The lack of congruence between home and ECE behaviours suggests health-promotion messages may not be effectively communicated to parents/family. Greater support is required across the ECE sector to adhere to RNB and promote wider change that can reach into homes.
A cadmium chloride activation treatment is essential for the production of high efficiency cadmium telluride (CdTe) solar cells. However, the effects of the treatment on the distributions of chlorine and sulphur within the device are not fully understood. Here, the detailed locations of chlorine and sulphur in a treated CdTe cell are determined in three dimensions by high resolution dynamic SIMS measurements. Chlorine is found to be present in grain boundaries, grain interiors, extended defects within the grain interiors, at the front interface, and in the cadmium sulphide layer. In each of these regions, the chlorine is likely to have significant effects on local electronic properties of the material, and hence overall device performance. Sulphur is found to have a U-shaped diffusion profile within CdTe grains, indicating a mixed grain boundary and lattice diffusion regime.