Hepatic impairment resulting from the use of conventional drugs is widely acknowledged, but there is less awareness of the potential hepatotoxicity of herbal preparations and other botanicals, many of which are believed to be harmless and are commonly used for self-medication without supervision. The aim of this paper is to examine the evidence for hepatotoxicity of botanicals and draw conclusions regarding their pathology, safety and applications.
Design
Current literature on the hepatotoxicity of herbal drugs and other botanicals is reviewed. The aetiology, clinical picture and treatment of mushroom (Amanita) poisoning are described.
Results
Hepatotoxic effects have been reported for some Chinese herbal medicines (such as Jin Bu Huan, Ma-Huang and Sho-saiko-to), pyrrolizidine alkaloid-containing plants, germander (Teucrium chamaedrys), chaparral (Larrea tridentata), Atractylis gummifera, Callilepis laureola, and others. The frequency with which botanicals cause hepatic damage is unclear. Controlled treatment trials are lacking, and the few studies published to date do not clarify the incidence of adverse effects. Many plant products do not appear to be toxic to everyone taking them, and their toxicity commonly lacks strict dose-dependency. For some products, such as Sho-saiko-to, the picture is further confused by demonstrations of hepatoprotective properties for some components. Mushroom poisoning is mostly due to the accidental consumption of Amanita species. Treatment with silymarin, thioctic acid or penicillin, as well as liver transplantation, has been shown to be effective but requires early diagnosis.
Conclusions
Severe liver injury, including acute and chronic abnormalities and even cirrhotic transformation and liver failure, has been described after the ingestion of a wide range of herbal products and other botanical ingredients, such as mushrooms. It is concluded that in certain situations herbal products may be just as harmful as conventional drugs.
While iron deficiency is regarded as the major cause of nutritional anaemia, changes in vitamins A, B12, C and E, folic acid and riboflavin status have also been linked to its development and control. This paper provides a systematic review of vitamin supplementation trials relating to the control of nutritional anaemia.
Methods
A MEDLINE search was used to find reports of vitamin supplementation trials that reported changes in anaemia or iron status.
Results
Vitamin A can improve haematological indicators and enhance the efficacy of iron supplementation. Both folate and vitamin B12 can cure and prevent megaloblastic anaemia. Riboflavin enhances the haematological response to iron, and its deficiency may account for a significant proportion of anaemia in many populations. Vitamin C enhances the absorption of dietary iron, although population-based data showing its efficacy in reducing anaemia or iron deficiency are lacking. Vitamin E supplementation given to preterm infants has not reduced the severity of the anaemia of prematurity. Vitamin B6 effectively treats sideroblastic anaemia. Multivitamin supplementation may raise haemoglobin (Hb) concentration, but few studies have isolated the effect of multivitamins from iron on haematological status.
Conclusions
In general, the public health impact of vitamin supplementation in controlling anaemia is not clear. Nor are the complex interactions involving multiple vitamins in haematopoiesis sufficiently understood to explain the observed variability in haematological responses to vitamins by age, population, vitamin mixture and dosage. Further research is needed to understand the roles of individual and combined vitamin deficiencies in anaemia, in order to design appropriate micronutrient interventions to prevent it.
Because the percentage of missing portion sizes was large in the Aerobics Center Longitudinal Study (ACLS), careful consideration of the accuracy of standard portion sizes was necessary. The purpose of the present study was to investigate the consequences of using standard portion sizes instead of reported portion sizes on subjects' nutrient intake.
Methods
In 2307 men and 411 women, nutrient intake calculated from a 3-day dietary record using reported portion sizes was compared with nutrient intake calculated from the same record in which standard portion sizes were substituted for reported portion sizes.
Results
The standard portion sizes provided significantly lower estimates (by more than 20%) of energy and nutrient intakes than the reported portion sizes. Spearman correlation coefficients between the two methods were high, ranging from 0.67 to 0.93, and the agreement between the methods was fairly good. Thus, in the ACLS the use of standard portion sizes rather than reported portion sizes did not appear suitable for assessing absolute intake at the group level, but did yield a good ranking of individuals according to nutrient intake. These results were confirmed in the Continuing Survey of Food Intakes by Individuals (CSFII), in which portion size assessment was optimal. When the standard portion sizes were adjusted using a correction factor, their ability to estimate absolute nutrient intake at the group level improved considerably.
Conclusions
This study suggests that the adjusted standard portion sizes may be able to replace missing portion sizes in the ACLS database.
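The contrast described above, between group-level bias and individual-level ranking agreement, can be sketched as follows. This is an illustrative computation with hypothetical intakes, not the ACLS data or analysis; the helper functions are minimal stand-ins for standard statistical software.

```python
# Minimal sketch (hypothetical data, not ACLS results): standard portions can
# bias group-level intake downwards while still ranking subjects well.

def ranks(xs):
    """Rank values 1..n (no tie handling; adequate for illustration)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(a, b):
    """Spearman rho via the rank-difference formula (assumes no ties)."""
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ranks(a), ranks(b)))
    return 1 - 6 * d2 / (n * (n * n - 1))

# hypothetical energy intakes (kcal/day) for six subjects
reported = [2400, 1900, 2800, 2100, 3200, 1700]   # from reported portions
standard = [1900, 1500, 2300, 1800, 2500, 1400]   # from standard portions

bias = (sum(standard) - sum(reported)) / sum(reported)  # group-level bias
rho = spearman(reported, standard)                      # ranking agreement
print(f"relative bias: {bias:+.1%}, Spearman rho: {rho:.2f}")
```

With these invented numbers the standard portions underestimate group intake by about 19% yet preserve the ranking perfectly, mirroring the pattern the abstract reports.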
To explore the utility of cluster analysis in defining complex dietary exposures, using two types of clustering variables separately.
Design
A modified diet history method, combining a 7-day menu book and a 168-item questionnaire, assessed dietary habits. A standardized questionnaire collected information on sociodemographics, lifestyle and health history. Anthropometric information was obtained through direct measurements. The dietary information was collapsed into 43 generic food groups, and converted into variables indicating the per cent contribution of specific food groups to total energy intake. Food patterns were identified by the QUICK CLUSTER procedure in SPSS, in two separate analytical steps using unstandardized and standardized (Z-scores) clustering variables.
Setting
The Malmö Diet and Cancer (MDC) Study, a prospective study in the third largest city of Sweden, with baseline examinations from March 1991 to October 1996.
Subjects
A random sample of 2206 men and 3151 women from the MDC cohort (n = 28098).
Results
Both variable types produced conceptually well separated clusters, confirmed with discriminant analysis. ‘Healthy’ and ‘less healthy’ food patterns were also identified with both types of variables. However, nutrient intake differences across clusters were greater, and the distribution of individuals across clusters more even, with the unstandardized variables. Logistic regression indicated higher risks of past food habit change, underreporting of energy and higher body mass index (BMI) for individuals falling into ‘healthy’ food pattern clusters.
Conclusions
The utility in discriminating dietary exposures appears greater for unstandardized food group variables. Future studies on diet and cancer need to recognize the confounding factors associated with ‘healthy’ food patterns.
The purpose of this study was to measure the reported use of nutrition information on food labels by a population of university students and to determine if label users differed from non-users in terms of gender and specific beliefs related to label information and diet–disease relationships, specifically fat and heart disease and fibre and cancer.
Design
A single-stage cluster sampling technique was used. Data were obtained using a self-administered, validated questionnaire.
Setting
The present investigation took place at the University of Saskatchewan, Canada in the autumn of 1997.
Subjects
A total of 553 students in randomly selected classes in the College of Arts and Science took part in the survey (92% response rate). The sample consisted of roughly equal numbers of males and females, most between the ages of 18 and 24.
Results
There were approximately equal numbers of label users and non-users among males, while label users outnumbered non-users by almost four to one among females. The importance of nutrition information on food labels was the only belief that differed significantly between label users and non-users for both sexes. For females, no other beliefs distinguished label users from non-users. However, for males, significant differences were found between label users and non-users on the beliefs that nutrition information is truthful and that a relationship between fibre and cancer exists.
Conclusions
Females appear to use food labels more often than do males. The only consistently observed difference between label users and non-users (male and female) was that users believed in the importance of nutrition information on food labels while non-users did not.
To evaluate the errors incurred by young adults using single portion size colour food photographs to quantify foods and nutrients consumed at six meals on two non-consecutive days.
Design
Breakfast menus remained the same for the 2 days, but lunch and dinner menus varied. The amounts of food eaten by individuals were determined by weighing individual serving dishes pre- and post-consumption. On the day after eating, all foods consumed were quantified in terms of fractions or multiples of the amounts shown in the food photographs.
Subjects
Thirty adult volunteers (15 male, 15 female), aged 18–36 years, completed the protocol for day one; 27 (90%) completed day two.
Results
Some foods were more difficult to quantify accurately than others. The largest error range was −38.9% to +284.6% (cheese), whereas the smallest errors were incurred for juice (−21.5% to +34.6%, day one). All subjects who consumed muesli (day one) overestimated (+3.7% to +113.7%). No other foods were consistently over- or underestimated. For foods consumed at breakfast by the same subjects on both days, individual estimation errors were inconsistent in magnitude and/or direction. At the group level, most nutrients were estimated to within ±10% of intake; exceptions were thiamin (+10.5%, day one) and vitamin E (−10.1%, day one; −15.3%, day two). Between 63% and 80% of subjects were correctly classified into tertiles on the basis of estimated intakes.
Conclusions
Despite some large food quantification errors, single portion size food photographs were effective when used to estimate nutrient intakes at the group level. It remains to be established whether, under the conditions used in this study, more photographs per food would improve estimates of nutrient intake at the individual level.
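The error percentages reported above follow from comparing each photo-based estimate with the weighed intake. A minimal sketch, with entirely hypothetical gram weights (the study's raw data are not given in the abstract):

```python
# Sketch of the estimation-error metric implied above (hypothetical weights):
# percentage error of a photograph-based estimate against the weighed intake.

def pct_error(estimated_g, weighed_g):
    """Signed percentage error of the estimate relative to the weighed amount."""
    return (estimated_g - weighed_g) / weighed_g * 100

# one hypothetical subject's breakfast: weighed vs. photo-estimated grams
weighed = {"muesli": 60.0, "juice": 200.0, "cheese": 30.0}
estimated = {"muesli": 75.0, "juice": 180.0, "cheese": 55.0}

for food in weighed:
    print(f"{food}: {pct_error(estimated[food], weighed[food]):+.1f}%")
```

Signed errors like these, averaged over subjects, give the group-level figures quoted in the abstract; their spread per food gives the error ranges.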
To evaluate changes over 1 year in weight and body mass index (BMI) among a population-based sample of non-pregnant women in Indonesia and to identify risk factors for developing under- and overnutrition.
Design
Cross-sectional studies in 1996 and 1997 in the same population.
Setting
Purworejo District, central Java, Indonesia.
Subjects
Non-pregnant women (n = 4132) aged 15–49 years who participated in both 1996 and 1997. Based on BMI, women were classified as having chronic energy deficiency (CED), being of normal weight or being obese.
Results
The mean height of the women was below the fifth percentile of international standards. In 1996, 16.2% had CED, 72.2% were normal and 11.6% were obese. In 1997, the corresponding figures were 14.4%, 71.2% and 14.3%, respectively, revealing a significant mean increase in weight and BMI. Among women classified as normal in 1996, 3.0% developed CED in 1997. Significant risk factors for developing CED were experiences of child deaths and non-use of contraceptives. Among women classified as normal in 1996, 5.3% developed obesity in 1997. Here, significant risk factors included most indicators of wealth as well as occupation.
Conclusions
These results should inform future efforts to prevent CED and obesity in the general population, conditions that are both associated with health risks.
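The three-way classification used above rests on BMI cut-offs that the abstract does not state. The sketch below uses the common WHO-style conventions (CED: BMI < 18.5; obese: BMI ≥ 30) purely as assumptions; the study may have used different thresholds.

```python
# Hedged sketch of a BMI-based CED/normal/obese classification. The cutoffs
# are assumptions (common WHO conventions), not values from the study.

def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify(b, ced_cut=18.5, obese_cut=30.0):
    """Classify a BMI value using the assumed cut-offs."""
    if b < ced_cut:
        return "CED"
    if b >= obese_cut:
        return "obese"
    return "normal"

print(classify(bmi(45.0, 1.60)))  # BMI 17.6 → CED
```

Applying such a classifier to each woman in both survey years yields the transition percentages (e.g. normal in 1996 to CED or obese in 1997) reported in the Results.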
To examine the influences of nutritional information and consumer characteristics on meal quality expectations, food selection and subsequent macronutrient intakes of consumers offered a reduced-fat option in a restaurant.
Design
A target full-fat (FF) main restaurant meal option was developed alongside a version substantially reduced in fat and energy (RF). Restaurant patrons were randomly placed into one of four treatment groups varying in the provision of menu information about the target dish and the actual version of that dish served (if ordered). A full-fat blind (FFB) control group was given no nutritional information in the menu and was served the FF version. The other three groups were all served the modified RF version: (i) reduced-fat blind (RFB), who were given no nutritional information; (ii) reduced-fat informed (RFI), who were given nutritional information; and (iii) reduced-fat informed with details (RFID), who were given the same nutritional information plus recipe modification details. Subjects rated their expected and actual liking, the pleasantness of taste, texture and appearance of the dish, how well the dish matched their expectations, and the likelihood of purchasing it again. Additional measures included other dish selections and sociodemographic and attitudinal information.
Setting
A silver service (training) restaurant.
Subjects
Members of the public (n = 279) consuming meals in the restaurant.
Results
The presence of nutritional information on the menu did not significantly increase subsequent intakes of energy and fat from the rest of the meal, and did not significantly influence sensory expectations or post-meal acceptance measures (which also did not differ between the FF and RF versions). Consumer characteristics relating to fat reduction attitudes and behaviours were significantly related to the selection of different dishes.
Conclusions
Provision of RF alternatives in a restaurant can have significant positive dietary benefits. Menu nutritional information did not affect measures of meal acceptance. Further studies should identify which types of information formats might be most effective in enhancing the selection of ‘healthy’ options.
To document the type and volume of drinks given to infants and investigate whether giving supplementary drinks leads to reduced milk consumption.
Design
Carers were asked to record all drinks consumed by the infants in a 24-hour period at two ages, detailing the types and volume taken.
Setting
The Avon Longitudinal Study of Pregnancy and Childhood (ALSPAC).
Subjects
A randomly chosen population sample of over 1000 infants at 4 and 8 months of age.
Results
Infants were grouped by type of milk feed to compare volumes consumed and to examine the use of non-milk drinks. The average volume of drinks consumed over 24 hours was 861 ml at 4 months and 662 ml at 8 months. At 4 months, 69.7% of infants consumed infant formula and 43.0% breast milk. The mean volume of milk consumed by those having only formula was 802 ml; for those having only breast milk it was estimated at 850 ml. The volumes of milk consumed were slightly lower in the groups who also had supplementary drinks. A quarter of infants were given fruit drinks and 14.6% herbal drinks. Supplementary drinks and solids were more likely to be given to formula-fed than to breast-fed infants. At 8 months, formula milk was consumed by 71.4% and breast milk use had decreased (22.9%), but fruit drink use had increased (squash/cordial: 55.8%; fruit juice: 14.9%), with 13.9% of infants having no infant milk at all. More infants were fed formula milk and fewer were fed cows' milk compared with a nationally representative British study conducted 5 years earlier.
Conclusions
Many infants were given supplementary drinks by 4 months; there is some evidence that this led to a small reduction in milk intake. A minority were not being given infant milks at all by 8 months, contrary to British infant feeding recommendations.
To identify the most important motivations for food choice from the point of view of the consumer in the Irish population, and to characterize those subjects who do and do not regard nutrition as a significant consideration in food choice.
Design
As part of a pan-European Union (EU) survey on consumer attitudes to food, nutrition and health, a quota-controlled, nationally representative sample of Irish adults (n = 1009) aged 15 years upwards, completed an interview-assisted, close-ended questionnaire. Subjects selected three factors, from a list of 15, which they believed had the greatest influence on their food choice.
Setting
The interviews for the survey were conducted in subjects' homes.
Results
‘Quality/freshness of food’ was the most frequently selected food choice factor (51%), followed by ‘taste’ (43%) and ‘trying to eat a healthy diet’ (36%). Female gender, increasing age and higher levels of education were found to be independent sociodemographic factors affecting the selection of ‘trying to eat a healthy diet’ as an important factor in food choice.
Conclusions
Although included in the top five most frequently selected factors affecting food choice, nutrition/healthy eating does not appear to have top priority for the majority of Irish adults. There are differences between the various sociodemographic groups within the population; males and younger subjects appear to require specific nutrition promotion messages.
This study examined the relationship between breakfast cereal consumption and non-milk extrinsic sugars (NMES) intake and the possible implications of this for caries in preschool children.
Methods
Data from the 1995 UK National Diet and Nutrition Survey (NDNS) of children aged 1.5–4.5 years were reanalysed. Four-day weighed food records and dental examinations were available for 1450 children living in private households in Britain. Children were classified into age-adjusted tertiles according to the proportion of energy derived from breakfast cereals and the amount of NMES from cereals. There were no significant differences in social class background between any of the groups.
Results
Children with diets high in breakfast cereals as a proportion of total energy (top third) had lower proportional intakes of NMES, compared with low consumers of cereals (lowest third). Consumption of sweetened cereals was positively associated with NMES intake. However, caries experience was unrelated to breakfast cereal consumption, whether presweetened or not.
Conclusions
Although presweetened cereals are relatively high in NMES, their cariogenic potential is probably minimal in the circumstances in which they are normally consumed.
Results of previous studies on diet and gallbladder disease (GBD), defined as having gallstones or having had surgery for gallstones, have been inconsistent. This research examined patterns of food intake in Mexican Americans and their associations with GBD.
Design
Cross-sectional.
Subjects
The study population included 4641 Mexican Americans aged 20–74 years who participated in the 1988–94 third National Health and Nutrition Examination Survey (NHANES III). GBD was diagnosed by ultrasound. Food intake patterns were identified by principal components analysis based on food frequency questionnaire responses. Component scores representing the level of intake of each pattern were categorized into quartiles, and prevalence odds ratios (POR) were estimated relative to the lowest quartile along with 95% confidence intervals (CI).
Results
There were four distinct patterns in women (vegetable, high calorie, traditional, fruit) and three in men (vegetable, high calorie, traditional). After age adjustment, none were associated with GBD in women. However, men in the third (POR = 0.42, 95% CI 0.21–0.85) and fourth (POR = 0.53, 95% CI 0.28–1.01) quartiles of the traditional intake pattern were about half as likely to have GBD as those in the lowest quartile.
Conclusions
These findings add to a growing literature suggesting dietary intake patterns can provide potentially useful and relevant information on diet–disease associations. Nevertheless, methods to do so require further development and validation.
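The quartile-based prevalence odds ratios quoted above follow a standard 2×2 computation. The sketch below uses entirely hypothetical counts (not NHANES III data) and the usual log-OR normal approximation for the 95% confidence interval:

```python
# Sketch of a prevalence odds ratio (POR) with an approximate 95% CI for one
# diet-pattern quartile versus the lowest quartile. Counts are hypothetical.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """POR and approximate CI via the log-OR normal approximation.
    a, b: cases / non-cases in the quartile of interest;
    c, d: cases / non-cases in the reference (lowest) quartile."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: 20/180 with GBD in one quartile, 40/160 in the lowest
por, lo, hi = odds_ratio_ci(20, 180, 40, 160)
print(f"POR = {por:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # → POR = 0.44, 95% CI 0.25-0.79
```

A CI that excludes 1 (as for the third quartile in the abstract) indicates a statistically significant association; the fourth-quartile CI of 0.28–1.01 just crosses 1.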
This first nationwide survey was undertaken to estimate the prevalence rates and severity of iodine deficiency disorders (IDD) and the proportion of households consuming iodized salt.
Design
The country was stratified into two ecological zones and 30 clusters (primary schools) from each zone, including the required numbers of pupils, were selected randomly. A subsample of pupils provided urine and salt samples for the determination of urinary iodine excretion (UIE) and presence of iodate, respectively.
Setting
Yemen.
Subjects
There were a total of 2984 pupils aged 6–12 years, of whom 2003 were boys and 981 girls. The majority of pupils (1800) were from the lowland/coastal areas (zone II) and the rest (1184) from the mountainous regions (zone I).
Results
The total goitre rate (TGR) was 16.8% in the whole country, 31.1% in zone I and 7.4% in zone II. Within zone I the TGR was 32.8% for males and 27.3% for females, while in zone II the corresponding rates were 8.1% and 5.9%; these sex differences were not statistically significant. Only three cases of visible goitre were encountered. The median UIE levels in zone I, zone II and the whole country were 13.6, 18.9 and 17.3 μg/dl, respectively. Based on the UIE cut-off points recommended by WHO, IDD was severe in 4.7% of pupils in zone I and 2.6% in zone II. Mild and moderate IDD were found in 18.5% and 8.7% of pupils, respectively. Nearly 70% of the surveyed pupils had UIE values above 10 μg/dl (no deficiency). Girls had relatively better iodine nutrition, as suggested by higher median UIE levels, and across all age groups median UIE values were above 10 μg/dl. Over half of the households consumed iodized salt.
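The WHO UIE grading referred to above can be sketched as a simple classifier. The abstract cites the WHO cut-off points without listing them; the values below (in μg/dl: severe < 2, moderate 2–4.9, mild 5–9.9, none ≥ 10) are the commonly cited categories and should be treated as assumptions.

```python
# Hedged sketch of UIE-based IDD severity grading. Cutoffs (ug/dl) are the
# commonly cited WHO categories, assumed here rather than taken from the text.

def idd_severity(uie_ug_dl):
    """Grade iodine deficiency from urinary iodine excretion (ug/dl)."""
    if uie_ug_dl < 2:
        return "severe"
    if uie_ug_dl < 5:
        return "moderate"
    if uie_ug_dl < 10:
        return "mild"
    return "none"

print(idd_severity(17.3))  # whole-country median UIE above → none
```

Applying such a grading to each pupil's UIE value yields the per-category percentages (severe, moderate, mild, none) reported in the Results.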
Conclusions
Since the introduction of universal salt iodization in 1996, both the prevalence and severity of IDD in Yemen have been reduced markedly, and Yemen can now be classified as a country with a mild IDD problem. However, with only just over half of households consuming iodized salt, coverage may still be too low to achieve the goal of IDD elimination.