Excess energy intake is recognised as a strong contributing factor to the global rise in overweight and obesity. The aim of this paper was to investigate whether oral sensitivity to complex carbohydrate relates to ad libitum consumption of complex carbohydrate foods in a sample of female adults. Participants’ [(n = 51 females): age 23.0 ± 0.6 years (range 20.0–41.0 years); excluding restrained eaters] sensitivity towards maltodextrin (oral complex carbohydrate) and glucose (sweet taste) was assessed by measuring detection threshold (DT) and suprathreshold intensity perception (ST). A crossover design was used to assess consumption of two different iso-caloric preload milkshakes and ad libitum milkshakes: 1) a glucose-based milkshake and 2) a maltodextrin-based milkshake. Ad libitum intake (primary outcome) and eating rate, liking, hunger, fullness, and prospective consumption ratings were measured. Participants who were more sensitive towards complex carbohydrate (maltodextrin DT) consumed significantly more maltodextrin-based milkshake than less sensitive participants (P = 0.01), and this was independent of liking. Participants who had higher liking for the glucose-based milkshake consumed significantly more of it than participants with lower hedonic ratings (P = 0.049). The results support a role for oral system sensitivity (potentially taste) to complex carbohydrate in the propensity to overconsume complex carbohydrate-based milkshake in a single sitting. The trial was registered at the ANZCTR as ACTRN12617000551392.
Healthcare workers (HCWs) are at risk of acquiring and transmitting respiratory viruses while working in healthcare settings.
To investigate the incidence of and factors associated with HCWs working during an acute respiratory illness (ARI).
HCWs from 9 Canadian hospitals were prospectively enrolled in active surveillance for ARI during the 2010–2011 to 2013–2014 influenza seasons. Daily illness diaries during ARI episodes collected information on symptoms and work attendance.
At least 1 ARI episode was reported by 50.4% of participants each study season. Overall, 94.6% of ill individuals reported working at least 1 day while symptomatic, resulting in an estimated 1.9 days of working while symptomatic and 0.5 days of absence during an ARI per participant-season. In multivariable analysis, the adjusted relative risk of working while symptomatic was higher for physicians and lower for nurses relative to other HCWs. Participants were more likely to work if symptoms were less severe and on the illness onset date compared to subsequent days. The most cited reason for working while symptomatic was that symptoms were mild and the HCW felt well enough to work (67%). Younger participants and those without paid sick leave were more likely to state that they could not afford to stay home.
HCWs worked during most episodes of ARI, most often because their symptoms were mild. Further data are needed to understand how best to balance the costs and risks of absenteeism versus those associated with working while ill.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency and establishment of the CBRNE medical operations science support expert inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Objective: To investigate the effects of methylphenidate on long-term executive and neuropsychological functioning in children with attention problems following TBI, as well as the relationship between methylphenidate-associated changes in lab-based neuropsychological measures of attentional control, processing speed, and executive functioning and parent- or self-report measures of everyday executive functioning. Method: 26 children aged 6–17 years, who were hospitalized for moderate-to-severe blunt head trauma 6 or more months previously, were recruited from a large children’s hospital medical center. Participants were randomized into a double-masked, placebo-controlled cross-over clinical trial. Participants completed a comprehensive neuropsychological battery and parent- and self-report ratings of everyday executive functioning at baseline, and at 4 weeks and 8 weeks following upward titration of medication to an optimal dose or while administered a placebo. Results: Methylphenidate was associated with significant improvements in processing speed, sustained attention, and both lab-based and everyday executive functioning. Significant treatment-by-period interactions were found on a task of sustained attention. Participants who were randomized to the methylphenidate condition for the first treatment period demonstrated random or erratic responding, with slower and more variable response times, when given placebo during the second period. Conclusion: Results indicate that methylphenidate treatment is associated with positive outcomes in processing speed, sustained attention, and both lab-based and everyday measures of executive functioning compared to placebo. Additionally, results suggest sustained attention worsens when medication is discontinued. (JINS, 2019, 25, 740–749)
Legume cover crops can supply a significant amount of nitrogen (N) for cash crops, which is particularly important for organic farmers. Because N mineralization from cover crop residue depends on the amount of biomass, cover crop quality, as well as environmental conditions such as soil moisture and temperature, predicting the amount of N mineralized and the timing of release has been difficult. We have developed a Cover Crop Nitrogen Calculator based on the N subroutine of the CERES crop model and evaluated the use of the predicted N credits on yields of fall broccoli [Brassica oleracea L. (Italica group)] at a research farm and two working farms. Research farm trials consisted of a cowpea (Vigna unguiculata L. Walp.) cover crop and no cover crop treatments, each at four N rates (0N, 0.5N, 1N and 1.5N, with 1N the target N rate of 112 kg N ha−1 in 2013 and 168 kg N ha−1 in 2014 and 2015) in a randomized complete block design. On-farm trials consisted of a cowpea or sunn hemp (Crotalaria juncea L.) cover crop at four N rates (0N, 0.5N, 1N and 1.5N) and a no cover crop treatment at the 1N rate in a completely randomized design. Cover crop biomass and quality (N%, carbohydrates%, cellulose% and lignin%) were measured and used with a 5-yr average soil moisture and soil temperature from a local weather station to predict an N credit. In the cover crop treatments, the N rate was modified by the predicted credit, while the no cover crop treatment received the full N fertilizer rate either as feathermeal (certified organic fields) or as urea (conventional field). At the research farm, broccoli yield increased up to the 0.5N rate, and there was no difference in yield between the no cover 0.5N rate and the cover crop 0.5N rate in 2013, 2014 and 2015. On-farm, we saw an N response in two site-years. In these site-years, there was no difference between the no cover 1N rate and the cover crop 1N rate. At the third site-year, no N response was seen.
Overall, our results showed using the cover crop credit predicted by the Calculator did not reduce yields. The use of a decision support tool such as the Calculator may help farmers better manage N fertilizer when cover crops are used, and increase cover crop adoption.
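The fertilizer adjustment in this design (cover crop plots receive the target N rate minus the Calculator's predicted credit; no-cover plots receive the full rate) can be sketched as follows. The function name, the flooring at zero, and the example credit of 40 kg N/ha are illustrative assumptions, not values or code from the Calculator itself.

```python
def applied_n_rates(target_n, predicted_credit, multipliers=(0.0, 0.5, 1.0, 1.5)):
    """Fertilizer N (kg/ha) applied at each trial rate in a cover crop
    treatment, after subtracting the N credit predicted by the Calculator.
    Flooring at zero (no negative application) is an assumption."""
    return [max(0.0, m * target_n - predicted_credit) for m in multipliers]

# 2013 target rate of 112 kg N/ha with a hypothetical 40 kg/ha predicted credit
print(applied_n_rates(112, 40))  # [0.0, 16.0, 72.0, 128.0]
```

At the 0N rate the credit cannot be subtracted further, so the floor at zero matters only for the lowest rates.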
Irritability and anxiety are two common clinical phenotypes that involve high-arousal negative affect states (anger and fear), and that frequently co-occur. Elucidating how these two forms of emotion dysregulation relate to perturbed neurodevelopment may benefit from alternate phenotyping strategies. One such strategy applies a bifactor latent variable approach that can parse shared versus unique mechanisms of these two phenotypes. Here, we aim to replicate and extend this approach and examine associations with neural structure in a large transdiagnostic sample of youth (N = 331; mean age = 13.57 years, SD = 2.69; 45.92% male). FreeSurfer was used to extract cortical thickness, cortical surface area, and subcortical volume. The current findings replicated the bifactor model and demonstrated measurement invariance as a function of youth age and sex. There were no associations of youths' factor scores with cortical thickness, surface area, or subcortical volume. However, we found strong convergent and divergent validity between parent-reported irritability and anxiety factors and clinician-rated symptoms and impairment. A general negative affectivity factor was robustly associated with overall functional impairment across symptom domains. Together, these results support the utility of the bifactor model as an alternative phenotyping strategy for irritability and anxiety, which may aid in the development of targeted treatments.
Investigations into the existence of life in other parts of the cosmos find strong parallels with studies of the origin and evolution of life on our own planet. In this way, astrobiology and paleobiology are married by their common interest in disentangling the interconnections between life and the surrounding environment. A meeting point of the two sciences is paleometry, which involves a myriad of imaging and geochemical techniques, usually non-destructive, applied to the investigation of the fossil record. In recent decades, paleometry has benefited from unprecedented technological improvement, solving old questions and raising new ones. This advance has been paralleled by conceptual approaches and discoveries fuelled by technological evolution in astrobiological research. In this context, we present new data and review recent advances in the application of paleometry to paleobiology and astrobiology in Brazil, in areas such as biosignatures in Ediacaran microbial mats, biogenicity tests on enigmatic Ediacaran structures, research on Ediacaran metazoan biomineralization, fossil preservation in Cretaceous insects and fish, and finally an experimental study on fish decay to test the effect of distinct types of sediment on soft-tissue preservation, as well as the effects of early diagenesis on fish bone preservation.
The purpose of the current study was to use a mixed-methods approach to assess the perspective of cancer survivors on the bidirectional impact between cancer and their social contexts.
A fixed concurrent triangulation mixed-methods survey design was used with open- and closed-ended questions that were predetermined and administered to participants. Quantitative items included demographic questions and the Life Impact Checklist. Qualitative questions were designed to explore the bidirectional impact between the patient and specific contexts including spirituality/faith, the spousal/partner relationship, and the family. A cross-sectional descriptive approach was used to evaluate the quantitative items and the constant comparative method guided the analysis of open-ended questions.
Among 116 participants (mean age 58.4 years), the majority were female (66.7%), and breast cancer was the most common diagnosis (27.9%). Nearly one-half of the respondents endorsed a positive impact of cancer on their spirituality/faith, but qualitative results suggested less of a bidirectional impact. The importance of the spouse/partner during the cancer experience was emphasized, including the subthemes of instrumental and emotional support; however, there was often a negative impact of cancer on the spouse/partner relationship, including sexual functioning. Survivors indicated family members provided instrumental and emotional support, but not as regularly or directly as a spouse/partner.
Significance of results
Social contexts are important among cancer survivors, with many cancer survivors relying more on their spouse/partner than other family members for support. The cancer experience is stressful not only for survivors, but also for individuals in their social contexts and relationships.
The physical and mechanical properties of many industrially important polymers are profoundly influenced by their degree of crystallinity; such properties include flex modulus, tensile strength, percent elongation, and impact strength. Commonly used polymers influenced by their crystallinity level include polyethylene, polypropylene, polyesters, and nylons. Many of these materials are above their glass transition temperature at room temperature and would be useless were it not for their crystalline phase, which typically has a melting point far above room temperature. The crystalline regions (domains) in these materials are frequently very small, typically in the nanometer range in diameter. These crystalline domains act as reinforcing fillers (in somewhat the same manner as carbon black in rubber) and give strength to the polymer.
OBJECTIVES/SPECIFIC AIMS: This study evaluates the drinking patterns and traits of individuals who partake in high intensity drinking, defined as binge drinking at 2 or more times the minimum binge count (4 drinks for females, 5 drinks for males). METHODS/STUDY POPULATION: We analyzed data from non-treatment-seeking volunteers enrolled in NIAAA screening protocols. The sample included 706 males and 474 females ranging in age from 18 to 91. Subjects were assigned to one of four groups (Non-Binge, Level 1, Level 2, Level 3) based on the highest binge session reported in their Timeline Followback questionnaire. The criteria for each group differed for males and females based on the current NIAAA definitions of binge drinking. The cutoffs for females were 0-3 drinks for Non-Binge, 4-7 drinks for Level 1, 8-11 drinks for Level 2, and 12+ drinks for Level 3. The male drink cutoffs were 0-4, 5-9, 10-14, and 15+, respectively. We looked at various drinking measures (Timeline Followback, Self-Reported Effects of Alcohol (SRE), Alcohol Use Disorders Identification Test (AUDIT)) and trait measures (UPPS-P Impulsive Behavior Scale, Barratt Impulsiveness Scale, Buss–Perry Aggression Questionnaire) to identify mean differences between groups. RESULTS/ANTICIPATED RESULTS: There were significant differences in drinking patterns between the groups for both males and females. Number of drinking days, average drinks per drinking day, and number of heavy drinking days all increased as binge level increased. There were also significant differences between groups in males for trait measures. Level 2 and Level 3 bingers scored significantly higher on impulsivity and aggression than the Level 1 and Non-Binge groups. Ongoing analyses are examining differences among binge groups on other measures including the SRE and AUDIT. Future analyses will explore potential mechanisms underlying the relationships between trait measures and binge drinking using structural equation modeling.
DISCUSSION/SIGNIFICANCE OF IMPACT: This study found significant differences between high-intensity drinkers, or “super bingers”, and lighter binge and non-binge drinkers. Super bingers showed an overall heavier drinking pattern across measures. The elevated aggression, impulsivity, and overall heavy drinking patterns of super bingers suggest a behavioral profile that places this group at particularly high risk of developing alcohol use disorder and related problems. These traits and behaviors may also help identify targets for treatment interventions for alcohol use disorder.
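The sex-specific group assignment described above can be expressed as a small classifier. This is a hypothetical sketch of the stated cutoffs (females: 0-3 / 4-7 / 8-11 / 12+; males: 0-4 / 5-9 / 10-14 / 15+), not code from the study.

```python
def binge_level(max_drinks, sex):
    """Assign a binge group from the highest single-session drink count
    reported in the Timeline Followback; sex is "F" or "M"."""
    # lower bounds of Level 1, Level 2, and Level 3 for each sex
    t1, t2, t3 = {"F": (4, 8, 12), "M": (5, 10, 15)}[sex]
    if max_drinks < t1:
        return "Non-Binge"
    elif max_drinks < t2:
        return "Level 1"
    elif max_drinks < t3:
        return "Level 2"
    return "Level 3"

# the same count can land in different groups depending on sex
print(binge_level(4, "F"), binge_level(4, "M"))  # Level 1 Non-Binge
```

Encoding only the three lower bounds per sex keeps the boundary logic in one place, which makes the differing male/female thresholds easy to audit.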
OBJECTIVES/SPECIFIC AIMS: Alcohol use disorder (AUD) has previously been studied using Timeline Followback (TLFB) interview measures and administration of alcohol within laboratory sessions. However, most of those studies supplied alcohol orally and analyzed drinking across a range of drinking intensity and frequency measures. High intensity binge drinking, i.e., drinking at multiples of the binge threshold (5+ drinks for males, 4+ drinks for females), has been identified as a significant risk factor for developing AUD. In the present study, we examined the relationship between high intensity binge drinking and the behavioral and subjective response to intravenous alcohol in a lab study. METHODS/STUDY POPULATION: Two hundred participants completed a 90-day TLFB interview, wherein the maximum number of drinks in a day established the participant’s binge level status as a Non-Binger (N = 37), Binge Level 1 (N = 96), Binge Level 2 (N = 44), or Binge Level 3 (N = 22). Binge Level 1 corresponds with at least one binge (4-7 drinks for women, 5-9 drinks for men); Binge Level 2 requires drinking at least twice the binge level (8-11 drinks for women, 10-14 drinks for men); and Level 3 requires drinking at least three times the binge level (12+ drinks for women, 15+ drinks for men) on one day. Non-Bingers had no binge level drinking in the 90-day interview. Participants also underwent a 150-minute intravenous alcohol self-infusion session, in which they pressed a button to receive an infusion of an ethanol solution. During this session, participants also completed subjective questionnaires including the Alcohol Urge Questionnaire (AUQ), Biphasic Alcohol Effects Scale (BAES), and Drug Effects Questionnaire (DEQ). Kruskal-Wallis and chi-square tests were used to examine the effect of group on alcohol infusion and subjective response measures.
RESULTS/ANTICIPATED RESULTS: A chi-square test for association showed statistically significant differences between groups in reaching binge level status (0.08% breath alcohol content) during the alcohol infusion session in the lab, χ2(3) = 23.321, p < 0.001. However, Binge Level 2 and Binge Level 3 did not differ significantly (0 < 1 < 2 = 3). Binge level groups showed significant differences in the number of button presses during the lab session (H(3) = 36.955, p < 0.001), peak breath alcohol concentration in the lab session (H(3) = 19.870, p < 0.001), and total binges in the TLFB (H(3) = 90.296, p < 0.001). Increased self-administration measures were proportional to the binge intensity level across groups, with no differences between Binge Level 2 and Binge Level 3 (0 < 1 < 2 = 3). For subjective measures, a Kruskal-Wallis H test showed statistically significant differences between groups in the AUQ score following the priming infusion, H(3) = 11.489, p = 0.009, with bingers at all levels reporting higher scores compared to non-bingers (0 < 1 = 2 = 3). There was also a statistically significant difference between groups in the BAES Stimulation score following the priming infusion, H(3) = 9.023, p = 0.029, with differences seen between non-bingers and level 2 and level 3 bingers (0 = 1 < 2 = 3). DISCUSSION/SIGNIFICANCE OF IMPACT: This study demonstrated that high intensity binge drinkers were more likely to reach binge level, and consumed more alcohol overall, during a human lab alcohol administration study. Binge intensity level was also associated with higher stimulation and urge for alcohol following priming exposures, which may in turn drive the consumption of greater amounts of alcohol, a pattern associated with greater risk for AUD.
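The Kruskal-Wallis H statistics reported above (e.g. H(3) = 36.955 for button presses) are computed from pooled ranks across the groups. A minimal pure-Python sketch of the statistic, assuming no tied observations (so no tie correction), is:

```python
from itertools import chain

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples.
    Minimal sketch: assumes no tied observations (no tie correction)."""
    pooled = sorted(chain.from_iterable(groups))
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks
    n = len(pooled)
    # H = 12/(n(n+1)) * sum(R_i^2 / n_i) - 3(n+1), R_i = rank sum of group i
    return 12.0 / (n * (n + 1)) * sum(
        sum(rank[v] for v in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

print(round(kruskal_h([1, 2, 3], [4, 5, 6]), 3))  # 3.857
```

For two fully separated groups of three, H = 27/7 ≈ 3.857, the textbook value; in practice a library routine with tie correction (e.g. `scipy.stats.kruskal`) would be used on the four group samples.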
To describe the epidemiology of surgical site infections (SSIs) after pediatric ambulatory surgery.
Observational cohort study with 60 days follow-up after surgery.
The study took place in 3 ambulatory surgical facilities (ASFs) and 1 hospital-based facility in a single pediatric healthcare network.
Children <18 years undergoing ambulatory surgery were included in the study. Of 19,777 eligible surgical encounters, 8,502 patients were enrolled.
Data were collected through parental interviews and from chart reviews. We assessed 2 outcomes: (1) National Healthcare Safety Network (NHSN)–defined SSI and (2) evidence of possible infection using a definition developed for this study.
We identified 21 NHSN SSIs for a rate of 2.5 SSIs per 1,000 surgical encounters: 2.9 per 1,000 at the hospital-based facility and 1.6 per 1,000 at the ASFs. After restricting the search to procedures completed at both facilities and adjustment for patient demographics, there was no difference in the risk of NHSN SSI between the 2 types of facilities (odds ratio, 0.7; 95% confidence interval, 0.2–2.3). Within 60 days after surgery, 404 surgical patients had some or strong evidence of possible infection obtained from parental interview and/or chart review (rate, 48 per 1,000 surgical encounters). Of 306 cases identified through parental interviews, 176 cases (57%) did not have chart documentation. In our multivariable analysis, older age and black race were associated with a reduced risk of possible infection.
The rate of NHSN-defined SSI after pediatric ambulatory surgery was low, although a substantial additional burden of infectious morbidity related to surgery might not have been captured by standard surveillance strategies and definitions.
This analysis was conducted to evaluate the evidence of the efficacy of iron biofortification interventions on iron status and functional outcomes. Iron deficiency is a major public health problem worldwide, with a disproportionate impact on women and young children, particularly those living in resource-limited settings. Biofortification, or the enhancing of micronutrient content in staple crops, is a promising and sustainable agriculture-based approach to improve nutritional status. Previous randomised efficacy trials and meta-analyses have demonstrated that iron-biofortification interventions improved iron biomarkers; however, no systematic reviews to date have examined the efficacy of biofortification interventions on health outcomes. We conducted a systematic review of the efficacy of iron-biofortified staple crops on iron status and functional outcomes: cognitive function (e.g. attention, memory) and physical performance. Five studies from three randomised efficacy trials (i.e. rice, pearl millet, beans) conducted in the Philippines, India and Rwanda were identified for inclusion in this review. Iron status (Hb, serum ferritin, soluble transferrin receptor, total body iron, α-1-acid glycoprotein) was measured at baseline and endline in each trial; two studies reported cognitive outcomes, and no studies reported other functional outcomes. Meta-analyses were conducted using DerSimonian and Laird random-effects methods. Iron-biofortified crop interventions significantly improved cognitive performance in attention and memory domains, compared with conventional crops. There were no significant effects on categorical outcomes such as iron deficiency or anaemia. Further studies are needed to determine the efficacy of iron-biofortified staple crops on human health, including additional functional outcomes and other high-risk populations.
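The DerSimonian and Laird random-effects pooling named above can be sketched in a few lines. This is a generic illustration of the method with hypothetical inputs, not the review's actual analysis code.

```python
def dersimonian_laird(effects, variances):
    """Pool study effect sizes with DerSimonian-Laird random effects.
    Returns (pooled estimate, tau^2 between-study variance, standard error)."""
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)       # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2, (1.0 / sum(w_re)) ** 0.5

# two hypothetical studies with effects 0.2 and 0.8, each with variance 0.1
print(dersimonian_laird([0.2, 0.8], [0.1, 0.1]))  # (0.5, 0.08, 0.3)
```

When the study effects are homogeneous, Q falls below its degrees of freedom, tau² truncates to zero, and the estimate reduces to the fixed-effect (inverse-variance) pool.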
Weeds have acquired evolutionary adaptations to the diverse crop and weed management strategies used in cropping systems. Therefore, changes in crop production practices such as conventional to organic systems, tillage-based to no-till systems, and diversity in crop rotations can result in differences in weed community composition that have management implications. A study was carried out to understand the weed community dynamics in a long-term alternative cropping systems study at Scott, SK, Canada. Long-term (18-yr) weed community composition data in wheat (Triticum aestivum L.) in ORG (organic), RED (reduced-input, no-till), and HIGH (high-input, conventional tillage) systems with three levels of crop rotation diversity, LOW (low diversity), DAG (diversified annual grains), and DAP (diversified annuals and perennials), were used to study the effect of different cropping systems and the effect of environment (random temporal effects) on residual weed community composition using the principal response curve (PRC) technique. The interaction between cropping systems and year-to-year random environmental changes was the predominant factor causing fluctuations in weed community composition, and year-to-year random change was the single most influential factor overall. Organic systems clearly differed from the two conventional systems in most years and had more diverse weed communities. The two conventional systems exhibited similar weed composition in most years. In this study, the use of the PRC method allowed capture of the real temporal dynamics reflected in the cropping system-by-time interaction.
This study further concludes that moving from a tillage-based, high-input conventional system to a no-till, reduced-input system did not cause significant changes in the weed community composition over the study period, but diversity in organic systems was high, probably due to increased occurrence of some difficult-to-control species.
This essay seeks to reinterpret both the gendered rhetoric of the First Red Scare and the reasons why many feminists came under attack in the years following World War I. It underscores the ways in which women's activist concerns were de-legitimized through accusations of Bolshevism, but also highlights the very real attractions that the Soviet system held for American women seeking peace, economic independence, voting rights, professional opportunity, and sexual freedom. Although a number of historians have demonstrated the ways in which a focus upon gender and women offers important insights into the First Red Scare, they have given only minimal attention to the Soviet Union's appeal, presumably wishing to avoid giving credence to inflammatory and exaggerated right-wing rhetoric. However, this tendency has the effect of distorting the historical record and, in particular, of eliding revolutionary Russia's role in fostering the American feminist imagination. Attention to several prominent targets of the First Red Scare, including Louise Bryant, Emma Goldman, and Rose Pastor Stokes, helps to clarify these dynamics.
Gluten is only partially digested by intestinal enzymes and can generate peptides that alter intestinal permeability, facilitating bacterial translocation and thus affecting the immune system. Few studies have addressed the role of dietary gluten in the development of colitis. Therefore, we investigated the effects of a wheat gluten-containing diet on the evolution of dextran sodium sulphate (DSS)-induced colitis. Mice were fed a standard diet without (colitis group) or with 4·5 % wheat gluten (colitis + gluten) for 15 d and received DSS solution (1·5 %, w/v) instead of water during the last 7 d. Compared with the colitis group, colitis + gluten mice presented a worse clinical score, a larger extension of colonic injury area, and increased mucosal inflammation. Both intestinal permeability and bacterial translocation were increased, facilitating bacterial migration to peripheral organs. The mechanism by which the gluten-containing diet exacerbates colitis appears to be related to changes in protein production and organisation in adhesion junctions and desmosomes. The protein α-E-catenin was especially reduced in mice fed gluten, which compromised the localisation of E-cadherin and β-catenin proteins, weakening the structure of desmosomes. The epithelial damage caused by gluten included shortening of microvilli, a high number of digestive vacuoles, and changes in the endosome/lysosome system. In conclusion, our results show that a wheat gluten-containing diet exacerbates the mucosal damage caused by colitis, reducing intestinal barrier function and increasing bacterial translocation. These effects are related to the weakening and disorganisation of adhesion junctions and desmosomes as well as shortening of microvilli and modification of the endocytic vesicle route.
Adherence to dietary guidelines (DG) may result in higher intake of polyphenols via increased consumption of fruits, vegetables and whole grains. We compared polyphenol dietary intake and urinary excretion between two intervention groups in the Cardiovascular risk REduction Study: Supported by an Integrated Dietary Approach, a 12-week parallel-arm, randomised controlled trial (n 161; sixty-four males, ninety-seven females; aged 40–70 years). One group adhered to UK DG, whereas the other group consumed a representative UK diet (control). We estimated polyphenol dietary intake, using a 4-d food diary (4-DFD) and FFQ, and analysed 24-h polyphenol urinary excretion by liquid chromatography-tandem MS on a subset of participants (n 46 control; n 45 DG). A polyphenol food composition database for 4-DFD analysis was generated using the Phenol-Explorer and USDA databases. Total polyphenol intake by 4-DFD at endpoint (geometric means with 95 % CI, adjusted for baseline and sex) was significantly higher in the DG group (1279 mg/d per 10 MJ; 1158, 1412) than in the control group (1084 mg/d per 10 MJ; 980, 1197). The greater total polyphenol intake in the DG group was attributed to higher intakes of anthocyanins, proanthocyanidins and hydroxycinnamic acids, with the primary food sources being fruits, cereal products, nuts and seeds. FFQ estimates of flavonoid intake also detected greater intake in the DG group compared with the control group. 24-h urinary excretion showed consistency with the 4-DFD in the ability to discriminate between dietary intervention groups for six of ten selected individual polyphenols. In conclusion, following UK DG increased total polyphenol intake by approximately 20 %, but not all polyphenol subclasses corresponded with this finding.