The association between dietary Cu intake and mortality risk remains uncertain. We aimed to investigate the relationship of dietary Cu intake with all-cause mortality among Chinese adults. A total of 17 310 participants from the China Health and Nutrition Survey, a national ongoing open cohort of Chinese participants, were included in the analysis. Dietary intake was measured by three consecutive 24-h dietary recalls in combination with a weighing inventory over the same 3 d. The average intakes of the 3-d dietary macronutrients and micronutrients were calculated. The study outcome was all-cause mortality. During a median follow-up of 9·0 years, 1324 (7·6 %) participants died. After adjusting for sex, age, BMI, ever alcohol drinking, ever smoking, education levels, occupations, urban or rural residence, systolic blood pressure, diastolic blood pressure and the intakes of fat, protein and carbohydrate, the association between dietary Cu intake and all-cause mortality followed a J-shape (P for non-linearity = 0·047). When dietary Cu intake was assessed as quartiles, compared with those in the first quartile (<1·60 mg/d), the adjusted hazard ratios for all-cause mortality were 0·87 (95 % CI (0·71, 1·07)), 0·98 (95 % CI (0·79, 1·21)) and 1·49 (95 % CI (1·19, 1·86)), respectively, in participants in the second (1·60–<1·83 mg/d), third (1·83–<2·09 mg/d) and fourth (≥2·09 mg/d) quartiles. A series of subgroup analyses and sensitivity analyses showed similar results. Overall, our findings emphasised the importance of maintaining optimal dietary Cu intake levels for the prevention of premature death.
Determination of indispensable amino acid (IAA) requirements necessitates a range of intakes of the test IAA and monitoring of the physiological response. Short-term methods are the most feasible for studying multiple intake levels in the same individual. Carbon oxidation methods measure the excretion of 13CO2 in breath from a labelled amino acid (AA) in response to varying intakes of the test AA following a period of adaptation. However, the length of adaptation to each AA intake level has been a source of debate and disagreement among researchers. The assertion of the minimally invasive indicator amino acid oxidation (IAAO) technique is that IAA requirements can be estimated after only a few hours (8 h) of adaptation to each test AA intake, suggesting that adaptation occurs rapidly in response to dietary adjustments. On the contrary, most other techniques assert that 6–7 d of adaptation is required when determining IAA needs. It has even been argued that a minimum of 2 weeks is needed to achieve complete adaptation. This review explores evidence regarding AA oxidation methods and whether long periods of adaptation to test IAA levels are necessary when estimating IAA requirements. It was found that consumption of experimental diets containing a lower test IAA intake for more than 7 d is inconsistent with a successful adaptive response. While there is some evidence that short-term (8 h) IAAO does not differ among test amino acid intakes maintained for up to 7 d, it remains unclear whether the length of adaptation affects the assessment of IAA requirements.
In growing pigs, reduced growth during heat stress (HS) is mainly related to decreased feed intake. The study aimed to determine whether the reported positive effects of live yeast (LY) supplementation in HS pigs were due to a modified feeding behaviour or energy metabolism, and whether these can be replicated by imposing an increased meal frequency. The effects of LY supplementation (0 (NS) v. 100 (LY) g/ton of feed) and of feeding window (FW) (unlimited or Unli, 2FW of 1 h each and 8FW of 15 min each) were measured in entire male finishing pigs (n 36). Ambient temperature was 22°C during the thermoneutral (TN) period (5 d) and 28°C during the HS period (5 d). Heat exposure decreased DM intake (DMI) and retained energy (RE) (−627 and −460 kJ·kg BW−0·60·d−1, respectively; P < 0·01). During HS, LY supplementation in Unli pigs decreased inter-meal intervals (P = 0·02), attenuating the HS effect on DMI, which tended to improve RE (P = 0·09). NS–8FW pigs had higher DMI and RE than NS–2FW pigs (P < 0·05), but protein deposition (PD) was similar. Supplemented pigs had higher PD during HS regardless of FW (+18 g·d−1; P = 0·03). Comparing the 2FW groups, the improved heat tolerance of LY-supplemented pigs was due to improved insulin sensitivity (P < 0·05) and latent heat loss capacity after a meal (P < 0·05), allowing them to increase their DMI (via an increased number of meals) and thus their energy efficiency. Imposing an increased meal frequency improved DMI in HS pigs but did not replicate the positive effects of LY on PD.
Type 2 diabetes mellitus (T2DM) is characterised by chronic hyperglycaemia. Despite the efficacy of conventional pharmacotherapy, some individuals do not reach glycaemic goals and require adjuvant therapies. Taurine, a semi-essential amino acid, decreases blood glucose and cholesterol levels in rodents and humans. However, glycated haemoglobin (HbA1c) has not been evaluated in randomised controlled trials after taurine treatment for more than 12 weeks. This study aims to evaluate the effect of taurine administration on glycaemic, lipid, inflammatory, anthropometric and dietary parameters in individuals with T2DM. A randomised, double-blind, placebo-controlled clinical trial will be conducted at the Clinical Research Center of a tertiary public hospital. Participants with T2DM (n 94) will be recruited and randomised to receive 3 g of taurine or placebo, twice daily, orally, for 12 weeks. Blood samples will be collected before and after 12 weeks of treatment, when HbA1c, fasting glucose, insulin, albuminuria, creatinine, total cholesterol and fractions, triglycerides, C-reactive protein, TNF-α and IL-1, -4, -5, -6, -10 and -13 will be evaluated. Anthropometric parameters and 24-h food recall will also be evaluated. The study will evaluate the effect of taurine treatment on biochemical and anthropometric parameters in individuals with T2DM. These results will guide decision-making on indicating taurine treatment as an adjunct for individuals with T2DM who have not reached their glycaemic goal.
The current trial investigates the effect of renal diet therapy and nutritional education on the estimated glomerular filtration rate (eGFR), blood pressure (BP) and depression among patients with chronic kidney disease (CKD). A total of 120 CKD patients (stages 3–4; 15 < eGFR < 60) were randomised into an intensive nutrition intervention group (individualised renal diet therapy plus nutrition counselling: 0·75 g protein/kg/d and 30–35 kcal/kg/d with Na restriction) and a control group (routine and standard care) for 24 weeks. The primary outcome was the change in the eGFR. Secondary outcomes included changes in anthropometric measures, biochemistry (serum creatinine (Cr), uric acid, albumin, electrolytes, Ca, vitamin D, ferritin, blood urea nitrogen (BUN) and Hb), BP, nutritional status, depression and quality of life. The eGFR increased significantly in the intervention group compared with the control group (P < 0·001). Moreover, serum levels of Cr and the systolic and diastolic BP decreased significantly in the intervention group relative to the control group (P < 0·001, P < 0·001 and P = 0·020, respectively). The nutrition intervention also hindered the increase in the BUN level and the depression score (P = 0·045 and P = 0·028, respectively). Furthermore, the reduction in protein and Na intake was greater in the intervention group (P = 0·003 and P < 0·001, respectively). Nutritional treatment along with supportive education and counselling contributed to improvements in renal function, BP control and adherence to protein intake recommendations. A significant difference in the mean eGFR between the groups was also confirmed at the end of the study using ANCOVA (β = −5·06; 95 % CI (−8·203, −2·999)).
Sarcopenia is more common in the elderly and causes adverse outcomes with increased morbidity and mortality. This prospective cohort study assessed the association of sarcopenia risk with the severity of COVID-19 at the time of admission and during hospitalisation, and with the length of hospital stay. Two hundred patients (aged ≥ 60 years) who were hospitalised for COVID-19 were enrolled using consecutive sampling between 29 December 2020 and 20 May 2021. The sarcopenia score of the patients was assessed using the Strength, Assistance in walking, Rising from a chair, Climbing stairs and Falls questionnaire. The severity of COVID-19 was determined using the modified National Early Warning Score (m-NEWS) system for 2019-nCoV-infected patients at admission (T1), on day three (T2) and at discharge (T3). Data were analysed using SPSS version 22 and Stata version 14. Of the 165 patients included, thirty-four (20·6 %) were at risk of sarcopenia. The length of hospital stay was slightly longer in patients with sarcopenia risk, but the difference was not significant (P = 0·600). The adjusted OR of a respiratory rate (RR) > 20/min at T1 for the sarcopenia risk group was 6·7 times that of the non-sarcopenic group (P = 0·002). According to generalised estimating equations, after adjusting for confounding factors, the m-NEWS score was 5·6 units higher in patients at risk of sarcopenia (P < 0·001). Sarcopenia risk could exacerbate COVID-19 severity and increase RR at admission, as well as the need for oxygen therapy at discharge.
Major depressive disorder (MDD) is regarded as an inflammatory disorder. Gut microbiota dysbiosis, observed in both MDD and obesity, leads to endotoxaemia and an inflammatory status, eventually exacerbating depressive symptoms. Manipulation of the gut microbiota by prebiotics might help alleviate depression. The present study aimed to investigate the effects of inulin supplementation on psychological outcomes and on biomarkers of gut permeability, endotoxaemia, inflammation and brain-derived neurotrophic factor (BDNF) in women with obesity and depression on a calorie-restricted diet. In a double-blind randomised clinical trial, forty-five women with obesity and MDD were allocated to receive 10 g/d of either inulin or maltodextrin for 8 weeks; all patients also followed a healthy calorie-restricted diet. Anthropometric measures, dietary intakes, depression and serum levels of zonulin, lipopolysaccharide (LPS), inflammatory biomarkers (TNF-α, IL-10, monocyte chemoattractant protein-1, toll-like receptor-4 and high-sensitivity C-reactive protein) and BDNF were assessed at baseline and at the end of the study. Weight and Hamilton Depression Rating Scale (HDRS) scores decreased in both groups; between-group differences were non-significant by the end of the study (P = 0·333 for body weight and P = 0·500 for HDRS). No between-group differences were observed for the other psychological outcomes and serum biomarkers (P > 0·05). In this short-term study, prebiotic supplementation had no significant beneficial effects on depressive symptoms, gut permeability or inflammatory biomarkers in women with obesity and depression.
Vitamin D seasonality has been reported in adults and children, suggesting that sunlight exposure affects 25(OH)D production. While vitamin D deficiency among infants has received significant attention, little is known about the extent to which vitamin D status during early infancy is affected by sunlight exposure. Here, we retrospectively analysed serum 25(OH)D levels of 692 samples obtained from healthy infants aged 1–2 months born at Saitama City Hospital, Japan (latitude 35·9° North) between August 2017 and September 2021. Data regarding the frequency of outdoor activities, formula intake and BMI were also collected and analysed. Month-to-month comparisons revealed significant variation in 25(OH)D levels in breastfed infants starting at 2 months, with maximal and minimal levels in September and January, respectively. An outdoor activity score of 0 was most common at 1 month (83·9 %) and a score of 3 was most common at 2 months (81·2 %), suggesting an increased amount of sunlight exposure at 2 months. Multiple linear regression analysis revealed the amount of formula intake to be significantly associated with vitamin D status at both 1 (t = 17·96) and 2 months (t = 16·30). Our results provide the first evidence that seasonal variation of vitamin D begins at 2 months among breastfed infants in East Asia, though dietary intake appears to be the major determinant of vitamin D status. These findings provide new insights into the influence of dietary and non-dietary factors on vitamin D status during early infancy.
The COVID-19 pandemic has impacted college students’ lifestyles and placed them at greater risk of obesity and food insecurity. The purpose of this systematic review was to consolidate evidence for the effect of COVID-19 on students’ dietary quality, dietary habits, body weight and food security status. A comprehensive literature search was conducted using various databases, including Google Scholar, MEDLINE, ScienceDirect, Embase and Scopus, to identify relevant studies. To be included in this review, studies had to involve higher education students, measure the prevalence of food insecurity and assess dietary and body weight changes during the COVID-19 pandemic. The studies showed that the diet quality of college students was compromised during the pandemic in many nations, owing to decreased intake of whole grains, dairy products, legumes, nuts, fruits and vegetables and increased consumption of alcohol, confectionery products and refined grains. There was an increase in the frequency of cooking, binge eating, breakfast skipping and unhealthy snacking. These modifications, in turn, were associated with body weight changes, with 20–30 % of students gaining weight during the pandemic. The pandemic also impacted the food security status of students, with over 30 % being food insecure worldwide. The COVID-19 outbreak has worsened students’ diet quality and dietary habits and placed them at high risk of weight gain and food insecurity. Higher education institutions and governments should improve students’ access to nutritious foods and incorporate nutrition education interventions into curricula.
Although previous studies have suggested a protective effect of Zn against type 2 diabetes (T2D), the causal effect of Zn alone remains inconclusive. We investigated the causal effect of Zn as a single intervention on glycaemic control in T2D, using a systematic review of randomised controlled trials and two-sample Mendelian randomisation (MR). Four primary outcomes were identified: fasting blood glucose/fasting glucose, HbA1c, homeostatic model assessment for insulin resistance (HOMA-IR) and serum insulin/fasting insulin level. In the systematic review, four databases were searched until June 2021. Studies in which participants had T2D and the intervention did not comprise another co-supplement were included. Results were synthesised through random-effects meta-analysis. In the two-sample MR, we used single-nucleotide polymorphisms (SNP) from MR-Base that were strongly related to Zn supplementation to infer causality, although the outcome samples were not restricted to T2D. In the systematic review and meta-analysis, fourteen trials were included, with 897 participants overall. Zn supplementation led to a significant reduction in the post-trial mean of fasting blood glucose (mean difference (MD): −26·52 mg/dl, 95 % CI (−35·13, −17·91)), HbA1c (MD: −0·52 %, 95 % CI (−0·90, −0·13)) and HOMA-IR (MD: −1·65, 95 % CI (−2·62, −0·68)), compared with the control group. In the two-sample MR, Zn supplementation instrumented by two SNP reduced fasting glucose (inverse-variance weighted coefficient: −2·04 mmol/l, 95 % CI (−3·26, −0·83)). Taken together, Zn supplementation alone may causally improve glycaemic control among T2D patients. The findings are limited in power by the small number of studies and SNP included in the systematic review and the two-sample MR analysis, respectively.
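The inverse-variance weighted (IVW) coefficient reported in the MR analysis above pools per-SNP causal estimates, weighting each by the inverse of its variance. As a minimal sketch of that estimator (the function name and the two SNP effect sizes below are illustrative assumptions, not the study's data):

```python
import math

def ivw_estimate(betas, ses):
    """Inverse-variance weighted pooling of per-SNP causal effect
    estimates (Wald ratios).

    betas : per-SNP causal effect estimates
    ses   : their standard errors
    Returns (pooled estimate, pooled standard error, 95 % CI).
    """
    weights = [1.0 / se ** 2 for se in ses]          # w_i = 1 / se_i^2
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
    return pooled, pooled_se, ci

# Hypothetical Wald ratios for two SNP (mmol/l per unit exposure)
est, se, ci = ivw_estimate([-1.8, -2.4], [0.7, 0.9])
```

The pooled estimate is pulled towards the more precisely estimated SNP, which is the sense in which an MR result based on only two SNP, as above, is power-limited.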
There is increasing interest in modelling longitudinal dietary data and classifying individuals into subgroups (latent classes) who follow similar trajectories over time. These trajectories could identify population groups and time points amenable to dietary interventions. This paper aimed to provide a comparison and overview of two latent class methods: group-based trajectory modelling (GBTM) and growth mixture modelling (GMM). Data from 2963 mother–child dyads from the longitudinal Southampton Women’s Survey were analysed. Continuous diet quality indices (DQI) were derived using principal component analysis from interviewer-administered FFQ collected in mothers pre-pregnancy and at 11 and 34 weeks’ gestation, and in offspring at 6 and 12 months and 3, 6–7 and 8–9 years. A forward modelling approach from 1 to 6 classes was used to identify the optimal number of DQI latent classes. Models were assessed using the Akaike and Bayesian information criteria, probability of class assignment, ratio of the odds of correct classification, group membership and entropy. Both methods suggested that five classes were optimal, with a strong correlation (Spearman’s ρ = 0·98) between class assignments from the two methods. The dietary trajectories were stable (approximately horizontal) over time and were defined as poor (GMM = 4 % and GBTM = 5 %), poor-medium (23 %, 23 %), medium (39 %, 39 %), medium-better (27 %, 28 %) and best (7 %, 6 %). Both GBTM and GMM are suitable for identifying dietary trajectories. GBTM is recommended as it is computationally less intensive, but results could be confirmed using GMM. The stability of the diet quality trajectories from pre-pregnancy underlines the importance of promotion of dietary improvements from preconception onwards.
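Among the fit statistics listed above, entropy summarises how cleanly individuals are assigned to latent classes from their posterior membership probabilities. A minimal sketch of the commonly used relative-entropy statistic (the posterior probabilities below are illustrative, not from the Southampton Women’s Survey):

```python
import math

def relative_entropy(posteriors):
    """Relative entropy E_K for a latent class model.

    posteriors : per-individual class membership probability vectors
    (each sums to 1). E_K ranges from 0 (assignment no better than
    chance) to 1 (perfect assignment); values near 1 indicate
    well-separated classes.
    """
    n = len(posteriors)
    k = len(posteriors[0])
    # Total classification uncertainty: sum of -p*log(p), with 0*log 0 := 0
    h = sum(-p * math.log(p) for row in posteriors for p in row if p > 0)
    return 1.0 - h / (n * math.log(k))

# Illustrative posteriors for three individuals and two classes
clean = [[0.99, 0.01], [0.02, 0.98], [0.95, 0.05]]  # near-certain assignment
fuzzy = [[0.55, 0.45], [0.50, 0.50], [0.60, 0.40]]  # ambiguous assignment
```

Here `relative_entropy(clean)` is close to 1 while `relative_entropy(fuzzy)` is close to 0, which is why entropy complements likelihood-based criteria such as the AIC and BIC when choosing the number of classes.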
Reducing Na intake is an urgent global challenge, especially in East Asia and high-income Asia-Pacific regions. However, the sources of Na and their effects on urinary Na excretion have not been fully studied. We sought to clarify these sources and their association with urinary Na excretion. We examined four 3-d weighed food records and five 24-h urinary collections from each of 253 participants in Japan, aged 35–80 years, between 2012 and 2013. We compared the levels of Na according to four categories: foods contributing to discretionary or non-discretionary Na intake, the situation in which dishes were cooked and consumed, food groups and types of cuisine. We also conducted regression analysis in which 24-h urinary Na excretion was a dependent variable and the amounts of food intake in the four categories were independent variables. Levels of Na were the highest in discretionary intake (60·6 %) and in home-prepared dishes (84·0 %). Of the food groups, miso soup showed the highest percentage contribution to Na intake (13·3 %) after seasonings such as soya sauce. In the regression analysis, the standardised coefficient for foods of non-discretionary Na sources was larger than that for discretionary sources, whereas that for home-prepared dishes was consistent with the levels of Na in those foods. Pickled products, followed by fresh fish and shellfish, miso soup and rice, were associated with high urinary Na excretion. Thus, discretionary foods (such as miso soup) contribute the most to Na consumption, although non-discretionary intake (such as pickled vegetables) may influence urinary Na excretion.
Many dietary guidelines recommend restricting the consumption of processed red meat (PRM) in favour of healthier foods such as fish, to reduce the risk of chronic conditions such as hypertension and diabetes. The objective of this study was to estimate the potential effect of replacing PRM with fatty fish, lean fish, unprocessed red meat, eggs, pulses or vegetables on the risk of incident hypertension and diabetes. This was a prospective study of women in the E3N cohort. Cases of diabetes and hypertension were identified from self-reports, specific questionnaires and drug reimbursements. In the main analysis, regular dietary intake was assessed with a single food history questionnaire, and food substitutions were modelled using Cox proportional hazards models; 95 % CI were generated via bootstrapping. In total, 71 081 women free of diabetes and 45 771 women free of hypertension were followed for an average of 18·7 and 18·3 years, respectively, during which 2681 incident cases of diabetes and 12 327 incident cases of hypertension were identified. Relative to PRM, fatty fish was associated with a 15 % lower risk of diabetes (HR = 0·85, 95 % CI (0·73, 0·97)) and of hypertension (HR = 0·85, 95 % CI (0·79, 0·91)). A 3–10 % lower risk of hypertension or diabetes was also observed when comparing PRM with vegetables, unprocessed red meat or pulses. Relative to PRM, alternative protein sources such as fatty fish, unprocessed red meat, vegetables or pulses were associated with a lower risk of hypertension and diabetes.
Avocado is a fruit rich in dietary fibre, K, Mg, monounsaturated fatty acids and PUFA, and bioactive phytochemicals, nutritional components that have been associated with cardiovascular health. Yet, despite the boom in avocado consumption, evidence on its association with CVD risk in the general population is lacking. To estimate the prospective association between avocado consumption and incident hypertension in Mexican women, we analysed participants from the Mexican Teachers’ Cohort who were ≥ 25 years old and free of hypertension, CVD and cancer at baseline (n 67 383). We assessed baseline avocado consumption with a semi-quantitative FFQ (never to six or more times per week). Incident hypertension cases were identified when participants self-reported a diagnosis and receiving treatment. To assess the relation between categories of avocado consumption (lowest as reference) and incident hypertension, we estimated incidence rate ratios (IRR) and 95 % CI using Poisson regression models, adjusting for confounding. We identified 4002 incident cases of hypertension during a total of 158 706 person-years, over a median follow-up of 2·2 years. The incidence rate of hypertension was 25·1 cases per 1000 person-years. Median avocado consumption was 1·0 (interquartile range: 0·23, 1·0) serving per week (half an avocado). After adjustment for confounding, consuming five or more servings of avocado per week was associated with a 17 % lower rate of hypertension compared with non- or low consumers (IRR = 0·83; 95 % CI (0·70, 0·99); P for trend = 0·01). Frequent consumption of avocado was associated with a lower incidence of hypertension.
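The incidence rate and rate ratio above follow directly from case counts and person-time; the adjusted IRR comes from the Poisson model, but the crude calculation is a useful check. A minimal sketch with a Wald confidence interval on the log scale (the stratified exposed/unexposed counts below are hypothetical, and the crude overall rate differs slightly from the published 25·1 because the abstract rounds its person-years):

```python
import math

def incidence_rate(cases, person_years, per=1000):
    """Incidence rate per `per` units of person-time."""
    return cases / person_years * per

def irr_with_ci(cases_exp, py_exp, cases_ref, py_ref):
    """Crude incidence rate ratio with a 95 % Wald CI on the log scale."""
    irr = (cases_exp / py_exp) / (cases_ref / py_ref)
    se_log = math.sqrt(1 / cases_exp + 1 / cases_ref)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, (lo, hi)

# Overall crude rate from the abstract's counts
rate = incidence_rate(4002, 158706)

# Hypothetical stratified counts: frequent v. non-/low consumers
irr, ci = irr_with_ci(120, 6000, 3882, 152706)
```

The adjusted Poisson regression the authors fitted additionally conditions on confounders, so its IRR will generally differ from this crude ratio.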
In the context of the global rise in childhood obesity, it is essential to monitor the nutritional value of commercial foods. This cross-sectional study (November 2018 to April 2019) aimed to evaluate the nutritional adequacy of processed/ultra-processed food products targeted at 0–36-month-old children in Portugal and Brazil. The nutrient profiling model developed by the Pan American Health Organization was used. A total of 171 food products were assessed (123 in Portugal and forty-eight in Brazil). Of the fifteen available meat- or fish-based meals in Brazil, 60 % exceeded the limit for Na and 100 % exceeded the target for total fat. Given the lack of specification of sugars within carbohydrates on the labels of the foods in Brazil, it was not possible to calculate free sugars. In Portugal, of the seventeen fruit and vegetable purees and the six juice/smoothie/tea drinks available, 82 % and 67 %, respectively, surpassed the level of free sugars, while total and saturated fat were excessive in all yogurt and yogurt-related products (n 21), 40 % of biscuits/wafers/crisps (two out of five) and 13 % of meat- or fish-based meals (two out of sixteen). These findings demonstrate the relevance of improving the nutritional profile of food products targeted at young children.
Various body indicators are used to predict health risks. However, controversies still exist regarding the best indicators to predict CVD. Using a large number of measurements, our aim was to assess their associations with blood pressure (BP) and to identify the most relevant parameters to be used in health surveillance studies. The population included 589 students (67·2 % women) aged 20–25 years from Constantine (Algeria). Sixteen parameters were considered, including crude body measurements, ratios and body fat indicators based on bioelectrical impedance analysis (BIA). We used multi-adjusted linear regression models to assess the associations between body measurements and BP. According to WHO definitions, underweight, overweight without obesity, obesity and hypertension (HT) were identified in 6·1, 18·0, 2·4 and 5·1 % of the subjects, respectively. Prevalence of HT was higher in men than in women (11·9 % v. 1·8 %; P < 0·001). In the whole sample, almost all indicators were positively associated with systolic and diastolic BP. The suprailiac skinfold had the strongest associations with systolic (β = 3·498; P < 0·001) and diastolic (β = 2·436; P < 0·001) BP, and arm circumferences and weight were also good candidates. The commonly used BMI, waist-to-hip ratio, waist-to-height ratio and BIA indicators also predicted BP, but they did not appear to be better determinants of BP than crude anthropometric measurements. This study showed that overweight and HT are already present in this population of young Algerian adults. Most body indicators were strongly associated with BP, but simple anthropometric measurements appeared to be particularly useful for predicting BP.
The Thumbs food classification system was developed to assist remote Australian communities to identify food healthiness. This study aimed to assess: (1) the Thumbs system’s alignment with two other food classification systems, the Health Star Rating (HSR) and the Northern Territory School Canteens Guidelines (NTSCG); (2) its accuracy in classifying ‘unhealthy’ (contributing to discretionary energy and added sugars) and ‘healthy’ products against the HSR and NTSCG; (3) areas for optimisation. Food and beverage products sold between May 2018 and May 2019 in fifty-one remote stores were classified under each system. System alignment was assessed by cross-tabulating the percentages of products, discretionary energy and added sugars sold assigned to the same healthiness levels across the systems. The system(s) capturing the highest percentage of discretionary energy and added sugars sold in ‘unhealthy’ products and the lowest levels in ‘healthy’ products were considered the best performing. Cohen’s κ was used to assess agreement between the Thumbs system and the NTSCG for classifying products as healthy. The Thumbs system classified product healthiness in line with the HSR and NTSCG, with Cohen’s κ showing moderate agreement between the Thumbs system and the NTSCG (κ = 0·60). The Thumbs system captured the most discretionary energy sold (92·2 %) and added sugars sold (90·6 %) in unhealthy products and the least discretionary energy sold (0 %) in healthy products. Modifications to optimise the Thumbs system include aligning several food categories with the NTSCG criteria and addressing core/discretionary classification discrepancies for fruit juices/drinks. The Thumbs system offers a classification algorithm that could strengthen the HSR system.
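Cohen’s κ, used above to compare the Thumbs system against the NTSCG, corrects observed agreement for the agreement expected by chance from each system’s marginal frequencies. A minimal sketch of the calculation from two systems’ healthy/unhealthy classifications (the ten product labels below are illustrative, not the study’s data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two classification sequences of equal length.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e is the chance agreement implied
    by each rater's marginal class frequencies.
    """
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Illustrative classifications of ten products by two systems
thumbs = ['healthy'] * 6 + ['unhealthy'] * 4
ntscg = ['healthy'] * 5 + ['unhealthy'] * 5
```

With 9 of 10 products matching (p_o = 0·9) and chance agreement p_e = 0·5, κ = 0·8; by the common Landis–Koch labels, the study’s κ = 0·60 sits in the 0·41–0·60 ‘moderate’ band.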
The childhood years represent a period of increased nutrient requirements during which a balanced diet is important to ensure optimal growth and development. The aim of this study was to examine food and nutrient intakes and compliance with recommendations in school-aged children in Ireland and to examine changes over time. Analyses were based on two National Children’s Food Surveys: the NCFS (2003–2004) (n 594) and the NCFS II (2017–2018) (n 600), which estimated food and nutrient intakes in nationally representative samples of children (5–12 years) using weighed food records (NCFS: 7-d; NCFS II: 4-d). This study found that nutrient intakes among school-aged children in Ireland are generally in compliance with recommendations; however, this population group has higher intakes of saturated fat, free sugars and salt, and lower intakes of dietary fibre, than recommended. Furthermore, significant proportions have inadequate intakes of vitamin D, Ca, Fe and folate. Key dietary changes since the NCFS (2003–2004) include decreased intakes of sugar-sweetened beverages, fruit juice, milk and potatoes, and increased intakes of wholemeal/brown bread, high-fibre ready-to-eat breakfast cereals, porridge, pasta and whole fruit. Future strategies to address the nutrient gaps identified in this population group could include the continued promotion of healthy food choices (including education around ‘healthy’ lifestyles and food marketing restrictions), improvements to the food supply through reformulation (fat, sugar, salt, dietary fibre), food fortification for micronutrients of concern (voluntary or mandatory) and/or nutritional supplement recommendations (for nutrients unlikely to be sufficient from food intake alone).