Selenium is found at the active centre of twenty-five selenoproteins, which have a variety of roles, including the well-characterised function of antioxidant defence; selenium is also claimed to be involved in the immune system. However, owing to limited and conflicting data for different parameters of immune function, the intakes of selenium that influence immune function are uncertain. This review covers the relationship between selenium and immune function in man, focusing on the highest level of evidence, namely that generated by randomised controlled trials (RCT), in which the effect of selective administration of selenium, in foods or as a supplement, on immune function was assessed. A total of nine RCT were identified from a systematic search of the literature. Some of these trials reported effects on T and natural killer cells that were dependent on the dose and form of selenium administered, but little effect of selenium on humoral immunity. There is clearly a need to undertake dose–response analysis of cellular immunity data in order to derive quantitative relationships between selenium intake and measures of immune function. Overall, limited effects on immunity emerged from experimental studies in human subjects, though further investigation of the potential influence of selenium status on cellular immunity appears to be warranted.
This review aims to describe approaches used to estimate bioavailability when deriving dietary reference values (DRV) for iron and zinc using the factorial approach. Various values have been applied by different expert bodies to convert absorbed iron or zinc into dietary intakes, and these are summarised in this review. The European Food Safety Authority (EFSA) derived zinc requirements from a trivariate saturation response model describing the relationship between zinc absorption and dietary zinc and phytate. The average requirement for men and women was determined as the intercept of the total absorbed zinc needed to meet physiological requirements, calculated according to body weight, with phytate intake levels of 300, 600, 900 and 1200 mg/d, which are representative of mean/median intakes observed in European populations. For iron, the method employed by EFSA was to use whole body iron losses, determined from radioisotope dilution studies, to calculate the quantity of absorbed iron required to maintain null balance. Absorption from the diet was estimated from a probability model based on measures of iron intake and status and physiological requirements for absorbed iron. Average dietary requirements were derived for men and pre- and post-menopausal women. Taking into consideration the complexity of deriving DRV for iron and zinc, mainly due to the limited knowledge on dietary bioavailability, it appears that EFSA has made maximum use of the most relevant up-to-date data to develop novel and transparent DRV for these nutrients.
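The factorial logic described above can be sketched numerically: compute the absorbed zinc needed to meet the physiological requirement, then find the dietary intake at which a saturating absorption model meets that requirement for a given phytate level. The absorption function and all parameters below are invented for illustration; this is not the EFSA trivariate model, whose coefficients are not reproduced here.

```python
# Hedged sketch of the factorial approach for deriving an average
# requirement. absorbed_zn() is a toy saturating curve with invented
# parameters, NOT the EFSA trivariate model; phytate simply lowers
# the absorption plateau in this illustration.

def absorbed_zn(intake_mg: float, phytate_mg: float) -> float:
    """Toy absorption model: rises with intake, damped by phytate (mg/d)."""
    a_max = 6.0 / (1.0 + phytate_mg / 600.0)  # invented plateau (mg/d)
    k = 8.0                                   # invented half-saturation (mg/d)
    return a_max * intake_mg / (k + intake_mg)

def dietary_requirement(physiol_req_mg: float, phytate_mg: float) -> float:
    """Bisect for the intake whose predicted absorption meets the
    physiological requirement for absorbed zinc (the 'intercept')."""
    lo, hi = 0.0, 100.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if absorbed_zn(mid, phytate_mg) < physiol_req_mg:
            lo = mid
        else:
            hi = mid
    return hi

# The derived requirement rises with phytate, mirroring the
# phytate-stratified (300-1200 mg/d) DRV approach described above.
for phytate in (300, 600, 900, 1200):
    print(phytate, round(dietary_requirement(1.5, phytate), 1))
```

The point of the sketch is the shape of the calculation, not the numbers: because absorption saturates, the dietary requirement grows faster than linearly as phytate pushes the absorption curve down.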
To examine the contribution of genetic factors to food choice, we determined dietary patterns from food frequency questionnaires in 3262 UK female twins aged 18–79 years. Five distinct dietary patterns were identified (fruit and vegetable, high alcohol, traditional English, dieting, low meat) that accounted for 22% of the total variance. These patterns are similar to those found in other singleton Western populations, and were related to body mass index, smoking status, physical activity and deprivation scores. Older subjects had higher scores on the fruit and vegetable and traditional English patterns, while lower social deprivation was associated with higher scores for fruit and vegetable, and lower scores for traditional English patterns. All five patterns were heritable, with estimates ranging from 41% to 48%. Among individual dietary components, a strongly heritable component was identified for garlic (46%), coffee (41%), fruit and vegetable sources (49%), and red meat (39%). Our results indicate that genetic factors have an important influence in determining food choice and dietary habits in Western populations. The relatively high heritability of specific dietary components implicates taste perception as a possible target for future genetic studies.
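For readers unfamiliar with how twin studies yield heritability figures like those above, the classical back-of-envelope calculation (Falconer's formula) doubles the difference between monozygotic and dizygotic twin-pair correlations. The correlations below are made-up example values, not data from this study, and the full analysis would normally use structural-equation twin modelling rather than this shortcut.

```python
# Illustrative twin-study calculation (Falconer's formula). The
# correlations passed in are invented example values, NOT study data.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Heritability estimate: h2 = 2 * (r_MZ - r_DZ).

    MZ twins share ~100% of segregating genes, DZ twins ~50%, so the
    excess MZ similarity, doubled, approximates the genetic share of
    trait variance.
    """
    return 2.0 * (r_mz - r_dz)

h2 = falconer_heritability(r_mz=0.65, r_dz=0.43)
print(round(h2, 2))  # 0.44, i.e. within the 41-48% range quoted above
```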
Dietary reference values for essential trace elements are designed to meet requirements with minimal risk of deficiency and toxicity. Risk–benefit analysis requires data on habitual dietary intakes, an estimate of variation and effects of deficiency and excess on health. For some nutrients, the range between the upper and lower limits may be extremely narrow and even overlap, which creates difficulties when setting safety margins. A new approach for estimating optimal intakes, taking into account several health biomarkers, has been developed and applied to selenium, but at present there are insufficient data to extend this technique to other micronutrients. The existing methods for deriving reference values for Cu and Fe are described. For Cu, there are no sensitive biomarkers of status or health relating to marginal deficiency or toxicity, despite the well-characterised genetic disorders of Menkes and Wilson's disease which, if untreated, lead to lethal deficiency and overload, respectively. For Fe, the wide variation in bioavailability confounds the relationship between intake and status and complicates risk–benefit analysis. As with Cu, health effects associated with deficiency or toxicity are not easy to quantify, therefore status is the most accessible variable for risk–benefit analysis. Serum ferritin reflects Fe stores but is affected by infection/inflammation, and therefore additional biomarkers are generally employed to measure and assess Fe status. Characterising the relationship between health and dietary intake is problematic for both these trace elements due to the confounding effects of bioavailability, inadequate biomarkers of status and a lack of sensitive and specific biomarkers for health outcomes.
The UK Food Standards Agency convened a workshop on 13 May 2009 to discuss recently completed research on diet and immune function. The objective of the workshop was to review this research and to establish priorities for future research. Several of the trials presented at the workshop showed some effect of nutritional interventions (e.g. vitamin D, Zn, Se) on immune parameters. One trial found that increased fruit and vegetable intake may improve the antibody response to pneumococcal vaccination in older people. The workshop highlighted the need to further clarify the potential public health relevance of observed nutrition-related changes in immune function, e.g. susceptibility to infections and infectious morbidity.
An international workshop of invited experts and partners in the EC-funded Network of Excellence, EURRECA (www.eurreca.org) was held in Norwich on 18–20 February 2008 to discuss biomarkers of micronutrient status.
Micronutrients are involved in specific biochemical pathways and have dedicated functions in the body, but they are also interconnected in complex metabolic networks, such as oxidative–reductive and inflammatory pathways and hormonal regulation, in which the overarching function is to optimise health. Post-genomic technologies, in particular metabolomics and proteomics, both of which are appropriate for plasma samples, provide a new opportunity to study the metabolic effects of micronutrients in relation to optimal health. The study of micronutrient-related health status requires a combination of data on markers of dietary exposure, markers of target function and biological response, health status metabolites, and disease parameters. When these nutrient-centred and physiology/health-centred parameters are combined and studied using a systems biology approach with bioinformatics and multivariate statistical tools, it should be possible to generate a micronutrient phenotype database. From this we can explore external factors that define the phenotype, such as life stage and lifestyle, and the impact of genotype, and the results can also be used to define micronutrient requirements and provide dietary advice. New mechanistic insights have already been developed using biological network models, for example genes and protein–protein interactions in the aetiology of type 2 diabetes mellitus. It is hoped that the challenge of applying this approach to micronutrients will, in time, result in a shift from a micronutrient-oriented to a health-oriented view and provide a more holistic understanding of the role played by multiple micronutrients in the maintenance of homeostasis and prevention of chronic disease, for example through their involvement in oxidation and inflammation.
Fe homeostasis is considered in the context of the UK diet, using information on Fe intake and status from the National Diet and Nutrition Surveys. The importance of assessing Fe availability rather than total Fe intake is discussed. Dietary and host-related factors that determine Fe bioavailability (Fe utilised for Hb production) are reviewed using information from single-meal studies. When adaptive responses are taken into consideration, foods associated with higher Fe status include meat (haem-Fe and the ‘meat factor’) and fruits and fruit juice (vitamin C). Foods that may have a negative impact include dairy products (Ca), high-fibre foods (phytate) and tea and coffee (polyphenols), but the effects are more apparent in groups with marginal Fe deficiency, such as women of childbearing age. Analysis of dietary intake data on a meal-by-meal basis is needed to predict the influence of changing dietary patterns on Fe nutrition in the UK. Current information suggests that in the UK Fe deficiency is a greater problem than Fe overload.
Hepcidin plays a major role in iron homeostasis, but understanding its role has been hampered by the absence of analytical methods for quantification in blood. A commercial ELISA has been developed for serum prohepcidin, a hepcidin precursor, and there is interest in its potential use in the clinical and research arena. We investigated the association between serum prohepcidin concentration and iron absorption in healthy men, and its relationship with iron status in men carrying HFE mutations, hereditary haemochromatosis patients, and pregnant women. Iron absorption was determined in thirty healthy men (fifteen wild-type, fifteen C282Y heterozygote) using the stable isotope red cell incorporation technique. Iron status was measured in 138 healthy men (ninety-one wild-type, forty-seven C282Y heterozygote), six hereditary haemochromatosis patients, and thirteen pregnant women. Mean serum prohepcidin concentrations were 214 (sd 118) ng/ml [208 (sd 122) ng/ml in wild-type and 225 (sd 109) ng/ml in C282Y heterozygotes] in healthy men, 177 (sd 36) ng/ml in haemochromatosis patients, and 159 (sd 59) ng/ml in pregnant women. There was no relationship between serum prohepcidin concentration and serum ferritin in any subject groups, nor was it associated with efficiency of iron absorption. Serum prohepcidin is not a useful biomarker for clinical or research purposes.
The workshop was organised to discuss the validity and limitations of existing functional markers of Se status in human subjects and to identify future research priorities in this area. Studies presented as part of this workshop investigated: the bioavailability of Se from different dietary sources; potential functional markers of Se status; individual variation in response to Se; the effect of marginal Se status on immune function. The workshop highlighted the need to define the relationship between functional markers of Se status and health outcomes.
The UK Food Standards Agency convened a group of expert scientists to review current research investigating factors affecting iron status and the bioavailability of dietary iron. Results presented at the workshop show menstrual blood loss to be the major determinant of body iron stores in premenopausal women. In the presence of abundant and varied food supplies, the health consequences of lower iron bioavailability are unclear and require further investigation.
The UK Food Standards Agency convened a group of expert scientists to review current research investigating diet and carriers of genetic mutations associated with hereditary haemochromatosis. The workshop concluded that individuals who are heterozygous for the C282Y mutation of the HFE gene do not appear to respond abnormally to dietary Fe and therefore do not need to change their diet to prevent accumulation of body Fe.
The Fe solubility test is a commonly used, easy and relatively cheap in vitro tool for predicting Fe bioavailability in food matrices. However, the outcome of a recent field trial comparing the effect on Fe status of Tanzanian infants of processed v. unprocessed complementary foods (CF), with otherwise the same composition, challenged the validity of this test for predicting Fe bioavailability. In the solubility test, significantly more soluble Fe was observed in processed compared with unprocessed foods (mean 18·8 (sem 0·21) v. 4·8 (sem 0·23) %; P<0·001). However, in the field trial, no significant difference in Fe status was seen between processed and unprocessed CF groups after 6 months' follow-up. Therefore, twenty-four samples of these CF (twelve processed and twelve unprocessed batches) were analysed in triplicate for Fe availability using an in vitro digestion–Caco-2 cell culture method and the results were compared with the solubility results. Significantly more soluble Fe was presented to Caco-2 cells in the processed compared with unprocessed samples (mean 11·5 (sem 1·16) v. 8·5 (sem 2·54) %; P=0·028), but proportionally less Fe was taken up by the cells (mean 3·0 (sem 0·40) v. 11·7 (sem 2·22) %; P=0·007). As a net result, absolute Fe uptake was lower (although not significantly) in processed compared with unprocessed CF (mean 1·3 (sem 0·16) v. 3·4 (sem 0·83) nmol/mg cell protein; P=0·052). These data clearly demonstrate that the Fe solubility test was not a good indicator of Fe bioavailability in these particular food matrices. In contrast, the results of the in vitro Caco-2 model supported the effects observed in vivo.
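The arithmetic behind the apparent paradox above is simple: the Fe that matters is the product of the soluble fraction and the fraction of soluble Fe the cells take up. Using the mean percentages quoted above (uptake index in arbitrary units, not the nmol/mg measurements themselves):

```python
# Back-of-envelope check of why higher solubility did not mean higher
# bioavailability: net uptake scales with (soluble fraction) x (fraction
# of soluble Fe taken up by Caco-2 cells). Inputs are the mean
# percentages quoted in the abstract above.

def net_uptake_index(soluble_pct: float, cell_uptake_pct: float) -> float:
    """Relative index of Fe reaching the cells (arbitrary units)."""
    return (soluble_pct / 100.0) * (cell_uptake_pct / 100.0)

processed = net_uptake_index(11.5, 3.0)     # more soluble Fe, low uptake
unprocessed = net_uptake_index(8.5, 11.7)   # less soluble Fe, high uptake
print(processed < unprocessed)  # True: processed CF deliver less Fe overall
```

The low cellular uptake fraction for the processed samples outweighs their higher solubility, which is consistent with both the lower absolute Caco-2 uptake and the null field-trial result.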
Nutrigenomics is the study of how constituents of the diet interact with genes, and their products, to alter phenotype and, conversely, how genes and their products metabolise these constituents into nutrients, antinutrients, and bioactive compounds. Results from molecular and genetic epidemiological studies indicate that dietary imbalance can alter gene–nutrient interactions in ways that increase the risk of developing chronic disease. The interplay of human genetic variation and environmental factors will make identifying causative genes and nutrients a formidable, but not intractable, challenge. We provide specific recommendations for how to best meet this challenge and discuss the need for new methodologies and the use of comprehensive analyses of nutrient–genotype interactions involving large and diverse populations. The objective of the present paper is to stimulate discourse and collaboration among nutrigenomic researchers and stakeholders, a process that will lead to an increase in global health and wellness by reducing health disparities in developed and developing countries.
Women of childbearing age are at risk of Fe deficiency if insufficient dietary Fe is available to replace menstrual and other Fe losses. Haem Fe represents 10–15 % of dietary Fe intake in meat-rich diets but may contribute 40 % of the total absorbed Fe. The aim of the present study was to determine the relative effects of type of diet and menstrual Fe loss on Fe status in women. Ninety healthy premenopausal women were recruited according to their habitual diet: red meat, poultry/fish or lacto-ovo-vegetarian. Intake of Fe was determined by analysing 7 d duplicate diets, and menstrual Fe loss was measured using the alkaline haematin method. A substantial proportion of women (60 % red meat, 40 % lacto-ovo-vegetarian, 20 % poultry/fish) had low Fe stores (serum ferritin <10 μg/l), but the median serum ferritin concentration was significantly lower in the red meat group (6·8 μg/l (interquartile range 3·3, 16·25)) than in the poultry/fish group (17·5 μg/l (interquartile range 11·3, 22·4)) (P<0·01). Dietary Fe intake differed significantly between the groups (P=0·025); the red meat group had a significantly lower intake (10·9 (sd 4·3) mg/d) than the lacto-ovo-vegetarians (14·5 (sd 5·5) mg/d), whereas that of the poultry/fish group (12·8 (sd 5·1) mg/d) was not significantly different from either of the other groups. There was no relationship between total Fe intake and Fe status, but menstrual Fe loss (P=0·001) and dietary group (P=0·040) were significant predictors of Fe status: poultry/fish diets were associated with higher Fe stores than lacto-ovo-vegetarian diets. Identifying individuals with high menstrual losses should be a key component of strategies to prevent Fe deficiency.
A computer-based dietary assessment tool, the meal-based intake assessment tool (MBIAT), is described. In the current study, dietary intakes of Fe and Zn fractions (total Fe, non-haem Fe, haem Fe, meat Fe, total Zn) and dietary components that influence Fe and Zn absorption (vitamin C, phytate, Ca, grams of meat/fish/poultry, black tea equivalents, phytate:Zn molar ratio) were assessed. The relative validity of the MBIAT was determined in forty-eight UK men aged 40 years and over by comparing its results with those from weighed diet records collected over 12 d. There was good agreement between the MBIAT and the weighed diet records for median intakes of total, non-haem, haem and meat Fe, Zn, vitamin C, phytate, grams of meat/fish/poultry and phytate:Zn molar ratio. Correlations between the two methods ranged from 0·32 (for Ca) to 0·80 (for haem Fe), with 0·76 for total Fe and 0·75 for Zn. The percentage of participants classified by the MBIAT into the same/opposite weighed diet record quartiles ranged from 56/0 for Fe and 60/0 for Zn to 33/10 for Ca. The questionnaire also showed an acceptable level of agreement between repeat administrations (e.g. a correlation for total Fe of 0·74). In conclusion, the MBIAT is appropriate for assessing group dietary intakes of total Fe and Zn and their absorption modifiers in UK men aged 40 years and over.
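The same/opposite-quartile agreement statistic used above for relative validity can be sketched as follows. The intakes below are simulated (Gaussian "weighed record" values plus noise for the "MBIAT"), not the study's data; the sample size of forty-eight matches the study, but the noise level and seed are arbitrary.

```python
# Sketch of quartile cross-classification between two dietary
# assessment methods. All data here are simulated, NOT study records.

import random

random.seed(1)
ref_intake = [random.gauss(12.0, 4.0) for _ in range(48)]        # "weighed record"
tool_intake = [x + random.gauss(0.0, 2.0) for x in ref_intake]   # "MBIAT" with noise

def quartile(values, x):
    """0-based quartile of x within values, by rank."""
    idx = sorted(values).index(x)
    return min(3, 4 * idx // len(values))

q_ref = [quartile(ref_intake, x) for x in ref_intake]
q_tool = [quartile(tool_intake, x) for x in tool_intake]

# "Same" = classified into the identical quartile by both methods;
# "opposite" = top quartile by one method but bottom by the other.
same = sum(a == b for a, b in zip(q_ref, q_tool)) / len(q_ref) * 100
opposite = sum(abs(a - b) == 3 for a, b in zip(q_ref, q_tool)) / len(q_ref) * 100
print(f"same quartile: {same:.0f}%, opposite quartile: {opposite:.0f}%")
```

A high same-quartile percentage with a near-zero opposite-quartile percentage, as reported for Fe and Zn above, indicates the tool ranks individuals consistently even if absolute intakes differ.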