Selenium is found at the active centre of twenty-five selenoproteins, which have a variety of roles, including the well-characterised function of antioxidant defence; it is also claimed to be involved in the immune system. However, because data for different parameters of immune function are limited and conflicting, the intakes of selenium that influence immune function are uncertain. This review covers the relationship between selenium and immune function in man, focusing on the highest level of evidence, namely that generated by randomised controlled trials (RCT), in which the effect of selective administration of selenium, in foods or as a supplement, on immune function was assessed. A total of nine RCT were identified from a systematic search of the literature. Some of these trials reported effects on T and natural killer cells that were dependent on the dose and form of selenium administered, but little effect of selenium on humoral immunity. There is clearly a need to undertake dose–response analysis of cellular immunity data in order to derive quantitative relationships between selenium intake and measures of immune function. Overall, limited effects on immunity emerged from experimental studies in human subjects, though further investigation of the potential influence of selenium status on cellular immunity appears to be warranted.
This review aims to describe approaches used to estimate bioavailability when deriving dietary reference values (DRV) for iron and zinc using the factorial approach. Various values have been applied by different expert bodies to convert absorbed iron or zinc into dietary intakes, and these are summarised in this review. The European Food Safety Authority (EFSA) derived zinc requirements from a trivariate saturation response model describing the relationship between zinc absorption and dietary zinc and phytate. The average requirement for men and women was determined as the intercept of the total absorbed zinc needed to meet physiological requirements, calculated according to body weight, with phytate intake levels of 300, 600, 900 and 1200 mg/d, which are representative of mean/median intakes observed in European populations. For iron, the method employed by EFSA was to use whole body iron losses, determined from radioisotope dilution studies, to calculate the quantity of absorbed iron required to maintain null balance. Absorption from the diet was estimated from a probability model based on measures of iron intake and status and physiological requirements for absorbed iron. Average dietary requirements were derived for men and pre- and post-menopausal women. Taking into consideration the complexity of deriving DRV for iron and zinc, mainly due to the limited knowledge on dietary bioavailability, it appears that EFSA has made maximum use of the most relevant up-to-date data to develop novel and transparent DRV for these nutrients.
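The trivariate saturation response model used here relates total absorbed zinc (TAZ) to total dietary zinc (TDZ) and total dietary phytate (TDP) through a saturable (Miller-type) equation. A minimal Python sketch of that functional form follows; the parameter values are illustrative placeholders, not EFSA's fitted estimates:

```python
import math

def absorbed_zinc_mmol(tdz_mmol, tdp_mmol, amax=0.091, kr=0.033, kp=0.68):
    """Miller-type trivariate saturation model: total absorbed zinc (TAZ, mmol/d)
    as a saturable function of total dietary zinc (TDZ) and phytate (TDP),
    both in mmol/d. Parameter values (amax, kr, kp) are illustrative
    placeholders, not EFSA's fitted estimates."""
    b = amax + tdz_mmol + kr * (1.0 + tdp_mmol / kp)
    # Quadratic-root form of the saturation model; b**2 >= 4*amax*tdz always holds
    return 0.5 * (b - math.sqrt(b * b - 4.0 * amax * tdz_mmol))

# Example: ~10 mg/d zinc (0.153 mmol) at ~600 mg/d phytate (0.91 mmol)
taz = absorbed_zinc_mmol(0.153, 0.91)
```

Absorbed zinc saturates as dietary zinc rises and falls as phytate rises, which is what makes the intercept with the physiological requirement usable as an average requirement at each phytate level.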
To examine the contribution of genetic factors to food choice, we determined dietary patterns from food frequency questionnaires in 3262 UK female twins aged 18 to 79 years. Five distinct dietary patterns were identified (fruit and vegetable, high alcohol, traditional English, dieting, low meat) that accounted for 22% of the total variance. These patterns are similar to those found in other singleton Western populations, and were related to body mass index, smoking status, physical activity and deprivation scores. Older subjects had higher scores on the fruit and vegetable and traditional English patterns, while lower social deprivation was associated with higher scores for fruit and vegetable, and lower scores for traditional English patterns. All five patterns were heritable, with estimates ranging from 41% to 48%. Among individual dietary components, a strongly heritable component was identified for garlic (46%), coffee (41%), fruit and vegetable sources (49%), and red meat (39%). Our results indicate that genetic factors have an important influence in determining food choice and dietary habits in Western populations. The relatively high heritability of specific dietary components implicates taste perception as a possible target for future genetic studies.
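Twin heritability estimates of this kind come from comparing monozygotic (MZ) and dizygotic (DZ) twin-pair correlations. Full ACE structural-equation modelling, as typically used in such studies, is more involved, but Falconer's formula gives the basic decomposition. A sketch with hypothetical correlations chosen to land in the reported 41–48% range (not the study's data):

```python
def falconer_ace(r_mz, r_dz):
    """Falconer decomposition from twin-pair correlations:
    A (additive genetic, the heritability h2) = 2*(rMZ - rDZ),
    C (shared environment)                    = 2*rDZ - rMZ,
    E (unique environment)                    = 1 - rMZ.
    Rough approximation only; real studies fit a full ACE model."""
    a = 2.0 * (r_mz - r_dz)
    c = 2.0 * r_dz - r_mz
    e = 1.0 - r_mz
    return a, c, e

# Hypothetical MZ/DZ correlations giving h2 = 0.44, within the 41-48% range
a, c, e = falconer_ace(r_mz=0.55, r_dz=0.33)
```

The three components partition the total phenotypic variance, so they sum to 1 by construction.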
Dietary reference values for essential trace elements are designed to meet requirements with minimal risk of deficiency and toxicity. Risk–benefit analysis requires data on habitual dietary intakes, an estimate of variation and effects of deficiency and excess on health. For some nutrients, the range between the upper and lower limits may be extremely narrow and even overlap, which creates difficulties when setting safety margins. A new approach for estimating optimal intakes, taking into account several health biomarkers, has been developed and applied to selenium, but at present there are insufficient data to extend this technique to other micronutrients. The existing methods for deriving reference values for Cu and Fe are described. For Cu, there are no sensitive biomarkers of status or health relating to marginal deficiency or toxicity, despite the well-characterised genetic disorders of Menkes and Wilson's disease which, if untreated, lead to lethal deficiency and overload, respectively. For Fe, the wide variation in bioavailability confounds the relationship between intake and status and complicates risk–benefit analysis. As with Cu, health effects associated with deficiency or toxicity are not easy to quantify, therefore status is the most accessible variable for risk–benefit analysis. Serum ferritin reflects Fe stores but is affected by infection/inflammation, and therefore additional biomarkers are generally employed to measure and assess Fe status. Characterising the relationship between health and dietary intake is problematic for both these trace elements due to the confounding effects of bioavailability, inadequate biomarkers of status and a lack of sensitive and specific biomarkers for health outcomes.
An international workshop of invited experts and partners in the EC-funded Network of Excellence, EURRECA (www.eurreca.org) was held in Norwich on 18–20 February 2008 to discuss biomarkers of micronutrient status.
Fe homeostasis is considered in the context of the UK diet, using information on Fe intake and status from the National Diet and Nutrition Surveys. The importance of assessing Fe availability rather than total Fe intake is discussed. Dietary and host-related factors that determine Fe bioavailability (Fe utilised for Hb production) are reviewed using information from single-meal studies. When adaptive responses are taken into consideration, foods associated with higher Fe status include meat (haem-Fe and the ‘meat factor’) and fruits and fruit juice (vitamin C). Foods that may have a negative impact include dairy products (Ca), high-fibre foods (phytate) and tea and coffee (polyphenols), but the effects are more apparent in groups with marginal Fe deficiency, such as women of childbearing age. Analysis of dietary intake data on a meal-by-meal basis is needed to predict the influence of changing dietary patterns on Fe nutrition in the UK. Current information suggests that in the UK Fe deficiency is a greater problem than Fe overload.
Hepcidin plays a major role in iron homeostasis, but understanding its role has been hampered by the absence of analytical methods for quantification in blood. A commercial ELISA has been developed for serum prohepcidin, a hepcidin precursor, and there is interest in its potential use in the clinical and research arena. We investigated the association between serum prohepcidin concentration and iron absorption in healthy men, and its relationship with iron status in men carrying HFE mutations, hereditary haemochromatosis patients, and pregnant women. Iron absorption was determined in thirty healthy men (fifteen wild-type, fifteen C282Y heterozygote) using the stable isotope red cell incorporation technique. Iron status was measured in 138 healthy men (ninety-one wild-type, forty-seven C282Y heterozygote), six hereditary haemochromatosis patients, and thirteen pregnant women. Mean serum prohepcidin concentrations were 214 (sd 118) ng/ml [208 (sd 122) ng/ml in wild-type and 225 (sd 109) ng/ml in C282Y heterozygotes] in healthy men, 177 (sd 36) ng/ml in haemochromatosis patients, and 159 (sd 59) ng/ml in pregnant women. There was no relationship between serum prohepcidin concentration and serum ferritin in any of the subject groups, nor was it associated with efficiency of iron absorption. Serum prohepcidin is not a useful biomarker for clinical or research purposes.
The workshop was organised to discuss the validity and limitations of existing functional markers of Se status in human subjects and to identify future research priorities in this area. Studies presented as part of this workshop investigated: the bioavailability of Se from different dietary sources; potential functional markers of Se status; individual variation in response to Se; the effect of marginal Se status on immune function. The workshop highlighted the need to define the relationship between functional markers of Se status and health outcomes.
The UK Food Standards Agency convened a group of expert scientists to review current research investigating factors affecting iron status and the bioavailability of dietary iron. Results presented at the workshop show menstrual blood loss to be the major determinant of body iron stores in premenopausal women. In the presence of abundant and varied food supplies, the health consequences of lower iron bioavailability are unclear and require further investigation.
The UK Food Standards Agency convened a group of expert scientists to review current research investigating diet and carriers of genetic mutations associated with hereditary haemochromatosis. The workshop concluded that individuals who are heterozygous for the C282Y mutation of the HFE gene do not appear to respond abnormally to dietary Fe and therefore do not need to change their diet to prevent accumulation of body Fe.
Nutrigenomics is the study of how constituents of the diet interact with genes, and their products, to alter phenotype and, conversely, how genes and their products metabolise these constituents into nutrients, antinutrients, and bioactive compounds. Results from molecular and genetic epidemiological studies indicate that dietary imbalance can alter gene–nutrient interactions in ways that increase the risk of developing chronic disease. The interplay of human genetic variation and environmental factors will make identifying causative genes and nutrients a formidable, but not intractable, challenge. We provide specific recommendations for how to best meet this challenge and discuss the need for new methodologies and the use of comprehensive analyses of nutrient–genotype interactions involving large and diverse populations. The objective of the present paper is to stimulate discourse and collaboration among nutrigenomic researchers and stakeholders, a process that will lead to an increase in global health and wellness by reducing health disparities in developed and developing countries.
Women of childbearing age are at risk of Fe deficiency if insufficient dietary Fe is available to replace menstrual and other Fe losses. Haem Fe represents 10–15 % of dietary Fe intake in meat-rich diets but may contribute 40 % of the total absorbed Fe. The aim of the present study was to determine the relative effects of type of diet and menstrual Fe loss on Fe status in women. Ninety healthy premenopausal women were recruited according to their habitual diet: red meat, poultry/fish or lacto-ovo-vegetarian. Intake of Fe was determined by analysing 7 d duplicate diets, and menstrual Fe loss was measured using the alkaline haematin method. A substantial proportion of women (60 % red meat, 40 % lacto-ovo-vegetarian, 20 % poultry/fish) had low Fe stores (serum ferritin <10 μg/l), but the median serum ferritin concentration was significantly lower in the red meat group (6·8 μg/l (interquartile range 3·3, 16·25)) than in the poultry/fish group (17·5 μg/l (interquartile range 11·3, 22·4); P<0·01). Mean dietary Fe intake differed significantly between the groups (P=0·025); the red meat group had a significantly lower intake (10·9 (sd 4·3) mg/d) than the lacto-ovo-vegetarians (14·5 (sd 5·5) mg/d), whereas that of the poultry/fish group (12·8 (sd 5·1) mg/d) was not significantly different from the other groups. There was no relationship between total Fe intake and Fe status, but menstrual Fe loss (P=0·001) and dietary group (P=0·040) were significant predictors of Fe status: poultry/fish diets were associated with higher Fe stores than lacto-ovo-vegetarian diets. Identifying individuals with high menstrual losses should be a key component of strategies to prevent Fe deficiency.
A computer-based dietary assessment tool, the meal-based intake assessment tool (MBIAT), is described. In the current study, dietary intakes of Fe and Zn fractions (total Fe, non-haem Fe, haem Fe, meat Fe, total Zn) and dietary components that influence Fe and Zn absorption (vitamin C, phytate, Ca, grams of meat/fish/poultry, black tea equivalents, phytate:Zn molar ratio) were assessed. The relative validity of the MBIAT was determined in forty-eight UK men aged 40 years and over by comparing its results with those from weighed diet records collected over 12 d. There was good agreement between the MBIAT and the weighed diet records for median intakes of total, non-haem, haem and meat Fe, Zn, vitamin C, phytate, grams of meat/fish/poultry and phytate:Zn molar ratio. Correlations between the two methods ranged from 0·32 (for Ca) to 0·80 (for haem Fe), with 0·76 for total Fe and 0·75 for Zn. The percentage of participants classified by the MBIAT into the same/opposite weighed diet record quartiles ranged from 56/0 for Fe and 60/0 for Zn to 33/10 for Ca. The questionnaire also showed an acceptable level of agreement between repeat administrations (e.g. a correlation for total Fe of 0·74). In conclusion, the MBIAT is appropriate for assessing group dietary intakes of total Fe and Zn and their absorption modifiers in UK men aged 40 years and over.
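The same/opposite-quartile classification used to validate the MBIAT against weighed diet records can be computed directly from paired intake estimates. A sketch of that calculation, using hypothetical intake values rather than the study's data:

```python
def quartile_agreement(method_a, method_b):
    """Percentage of subjects classified into the same quartile, and into
    opposite (extreme) quartiles, by two dietary assessment methods.
    Inputs are paired per-subject intake estimates (hypothetical data here)."""
    def quartiles(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        n = len(values)
        q = [0] * n
        for rank, i in enumerate(order):
            q[i] = min(rank * 4 // n, 3)  # quartile index 0..3 by rank
        return q

    qa, qb = quartiles(method_a), quartiles(method_b)
    same = sum(x == y for x, y in zip(qa, qb))
    opposite = sum(abs(x - y) == 3 for x, y in zip(qa, qb))
    n = len(method_a)
    return 100.0 * same / n, 100.0 * opposite / n

# Hypothetical paired intakes (e.g. total Fe, mg/d) for eight subjects
same_pct, opp_pct = quartile_agreement(
    [10, 12, 9, 15, 8, 20, 11, 14],
    [11, 13, 8, 16, 9, 19, 12, 15])  # → (100.0, 0.0): perfectly concordant
```

Figures such as 56/0 for Fe in the abstract are same-quartile/opposite-quartile percentages from exactly this kind of cross-classification.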
A method for measuring unlabelled Fe absorption has been investigated in a pilot study using a simple mathematical model. The metabolism of newly absorbed Fe can be approximated as a single-compartment model with the sampled compartment being the plasma pool. Five female volunteers (aged 30–55 years) were recruited to participate in the pilot study. After a 10 mg oral dose of unlabelled ferrous sulfate, the change in plasma Fe concentration over the following 6 h was used to estimate the quantity of absorbed Fe from the mathematical model. To assess the accuracy of the new technique, a 1 mg oral dose of 57Fe-labelled iron sulfate was given simultaneously with a 225 μg intravenous dose of 58Fe as iron citrate. The plasma appearance of the labelled Fe was used to estimate the absorption of the oral label from the traditional area under the curve method. There was no significant difference (P=0·61) between the geometric mean absorption of the unlabelled (19 (−1 sd 12, +1 sd 28) %) and the labelled Fe (17 (−1 sd 10, +1 sd 29) %). These initial results are encouraging, but further work needs to be undertaken with smaller doses, as typically found in meals. The effect of diurnal variation in serum Fe concentration on the estimation of unlabelled Fe absorption needs further assessment.
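The traditional area under the curve method compares the dose-normalised plasma appearance of the oral and intravenous tracers. A sketch using the trapezoidal rule, with hypothetical concentration–time data rather than the study's measurements:

```python
def trapezoid_auc(times_h, conc):
    """Area under the plasma concentration-time curve by the trapezoidal rule."""
    return sum((conc[i] + conc[i + 1]) * (times_h[i + 1] - times_h[i]) / 2.0
               for i in range(len(times_h) - 1))

def fractional_absorption(times_h, oral_conc, iv_conc, oral_dose, iv_dose):
    """Classic AUC method: fractional absorption of the oral tracer is
    (AUC_oral / oral_dose) / (AUC_iv / iv_dose), since the IV dose is
    absorbed by definition. All data here are hypothetical."""
    return (trapezoid_auc(times_h, oral_conc) / oral_dose) / \
           (trapezoid_auc(times_h, iv_conc) / iv_dose)

# Hypothetical sampling over 6 h post-dose (times in hours, conc in arbitrary units)
fa = fractional_absorption(
    times_h=[0.0, 1.0, 2.0, 4.0, 6.0],
    oral_conc=[0.0, 0.8, 1.2, 0.6, 0.2],
    iv_conc=[0.0, 4.0, 3.0, 1.5, 0.5],
    oral_dose=1.0,    # mg, as the 57Fe oral label
    iv_dose=0.225)    # mg, as the 58Fe intravenous dose
```

Normalising both curves by their doses is what lets the intravenous tracer act as the 100%-absorbed reference.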
The study of Cu metabolism is hampered by a lack of sensitive and specific biomarkers of status and suitable isotopic labels, but limited information suggests that Cu homeostasis is maintained through changes in absorption and endogenous loss. The aim of the present study was to employ stable-isotope techniques to measure Cu absorption and endogenous losses in adult men adapted to low, moderate and high Cu-supplemented diets. Twelve healthy men, aged 20–59 years, were given diets containing 0·7, 1·6 and 6·0 mg Cu/d for 8 weeks, with intervening washout periods of at least 4 weeks. After 6 weeks adaptation, apparent and true absorption of Cu were determined by measuring luminal loss and endogenous excretion of Cu following oral administration of 3 mg of highly enriched 65Cu stable-isotope label. Apparent and true absorption (41 and 48% respectively) on the low-Cu diet were not significantly different from the high-Cu diet (45 and 48% respectively). Endogenous losses were significantly reduced on the low- (0·45 mg/d; P<0·001) and medium- (0·81 mg/d; P=0·001) compared with the high-Cu diet (2·46 mg/d). No biochemical changes resulting from the dietary intervention were observed. Cu homeostasis was maintained over a wide range of intake and more rapidly at the lower intake, mainly through changes in endogenous excretion.
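Apparent and true absorption differ only in whether endogenously re-excreted label is added back into the balance. A sketch of the two calculations; the faecal and endogenous label figures are illustrative assumptions chosen to reproduce the reported 41% and 48% on the low-Cu diet, not the study's measurements:

```python
def copper_absorption(oral_dose_mg, faecal_label_mg, endogenous_label_mg):
    """Stable-isotope balance: apparent absorption treats all faecal label as
    unabsorbed; true absorption credits back label that was absorbed and then
    endogenously re-excreted into the gut. Input values are illustrative."""
    apparent = (oral_dose_mg - faecal_label_mg) / oral_dose_mg
    true_abs = (oral_dose_mg - (faecal_label_mg - endogenous_label_mg)) / oral_dose_mg
    return apparent, true_abs

# e.g. 3 mg 65Cu label; hypothetically 1.77 mg recovered in faeces,
# of which 0.21 mg is endogenously excreted (absorbed, then returned)
apparent, true_abs = copper_absorption(3.0, 1.77, 0.21)  # → (0.41, 0.48)
```

True absorption is therefore always at least as large as apparent absorption, and the gap widens as endogenous excretion rises, which is the homeostatic lever the abstract describes.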
It was nearly 120 years ago that the physiologist Claude Bernard proposed that for an organism to function optimally the component cells must be surrounded by a medium of closely regulated composition (‘La fixité du milieu intérieur est la condition de la vie libre’: the constancy of the internal environment is the condition for a free and independent life). Homeostasis is the term used today to describe this phenomenon, first coined by Walter B. Cannon in 1929 (Hardy, 1976). This is the underlying physiological principle that explains the relative chemical constancy of the body, including inorganic nutrients such as Fe.
A double-blind controlled Ca supplementation trial was conducted for 6 months in thirty-four 7-year-old Chinese children from Hong Kong and Jiangmen, China. The children were randomly allocated to the study group (n 17) or control group (n 17), and a CaCO3 tablet (300 mg Ca) or a placebo tablet was taken daily. True fractional Ca absorption (TFCA) was evaluated before and after the trial using stable isotopes: 8 mg 44Ca mixed in 100 g chocolate milk was given after an intravenous injection of 0·75 mg 42Ca. There was no significant difference in baseline TFCA between the study group (60·6 (SD 11·4) %) and the controls (58·2 (SD 9·0) %; P = 0·55). Serum 25-hydroxycholecalciferol levels were comparable between the two groups (P = 0·71). After 6 months, TFCA of the study group (55·6 (SD 12·7) %) was significantly lower than that of the controls (64·3 (SD 10·7) %; P = 0·015). Comparing the individual changes in TFCA after the trial between the two groups, there was a non-significant reduction in TFCA (5·03 (SD 12·4) %; P = 0·11, Wilcoxon signed-rank test) in the study group (60·6–55·6 %), whereas a significant increase in TFCA (6·17 (SD 7·7) %; P = 0·004, Wilcoxon signed-rank test) was observed in the controls (58·2–64·3 %). The difference in TFCA between the two groups after 6 months was significant (P = 0·001), and remained so after adjustment for baseline dietary intakes, weight and height by multiple-regression analysis (P = 0·003). If the mechanism of TFCA from chocolate milk in response to the treatment effects is similar to that from the total diet, then our results suggest that children with adequate vitamin D status can adapt to a change in Ca intake by adjusting the efficiency of TFCA. As a corollary, children on habitually low-Ca diets have a higher TFCA than their counterparts with higher Ca diets.
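In the dual stable-isotope method used above, TFCA is the dose-normalised ratio of oral (44Ca) to intravenous (42Ca) tracer recovered in the same sampled pool. A sketch of that calculation; the tracer-recovery values are hypothetical, chosen only to give a TFCA near the reported baseline of about 60%:

```python
def tfca(oral_dose_mg, iv_dose_mg, oral_tracer_mg, iv_tracer_mg):
    """Dual stable-isotope method: true fractional Ca absorption is the
    dose-normalised ratio of oral to intravenous tracer recovered in the
    same pool (urine or serum), since the IV tracer is 100% 'absorbed'.
    Recovery values passed in here are hypothetical."""
    return (oral_tracer_mg / oral_dose_mg) / (iv_tracer_mg / iv_dose_mg)

# 8 mg oral 44Ca and 0.75 mg IV 42Ca, as in the trial; recoveries hypothetical
frac = tfca(oral_dose_mg=8.0, iv_dose_mg=0.75,
            oral_tracer_mg=0.96, iv_tracer_mg=0.15)  # ≈ 0.60, i.e. ~60 %
```

Because both tracers pass through the same post-absorptive kinetics, dividing their dose-normalised recoveries cancels the distribution and excretion terms, leaving only the absorptive fraction.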