To explore associations of whole grain and cereal fibre intake with CVD risk factors in Australian adults.
Cross-sectional analysis. Intakes of whole grain and cereal fibre were examined in association with BMI, waist circumference (WC), blood pressure (BP), serum lipid concentrations, C-reactive protein, systolic BP, fasting glucose and HbA1c.
Australian Health Survey 2011–2013.
A population-representative sample of 7665 participants over 18 years old.
Highest whole grain consumers (T3) had lower BMI (T0 26·8 kg/m2, T3 26·0 kg/m2, P < 0·0001) and WC (T0 92·2 cm, T3 90·0 cm, P = 0·0005) compared with non-consumers (T0), although only WC remained significant after adjusting for dietary and lifestyle factors, including cereal fibre intake (P = 0·03). Whole grain intake was marginally inversely associated with fasting glucose (P = 0·048) and HbA1c (P = 0·03) after adjusting for dietary and lifestyle factors, including cereal fibre intake. Cereal fibre intake was inversely associated with BMI (P < 0·0001) and WC (P < 0·0008) and tended to be inversely associated with total cholesterol, LDL-cholesterol and apo-B concentrations, although associations were attenuated after further adjusting for BMI and lipid-lowering medication use.
The extent to which cereal fibre is responsible for the CVD-protective associations of whole grains may vary depending on the mediators involved. Longer-term intervention studies directly comparing whole grain and non-whole grain diets of similar cereal fibre contents (such as through the use of bran or added-fibre refined grain products) are needed to confirm independent effects.
To investigate how intakes of whole grains and cereal fibre were associated with risk factors for CVD in UK adults.
Cross-sectional analyses examined associations between whole grain and cereal fibre intakes and adiposity measurements, serum lipid concentrations, C-reactive protein, systolic blood pressure, fasting glucose, HbA1c, homocysteine and a combined CVD relative risk score.
The National Diet and Nutrition Survey (NDNS) Rolling Programme 2008–2014.
A nationally representative sample of 2689 adults.
Participants in the highest quartile (Q4) of whole grain intake had lower waist–hip ratio (Q1 0·872; Q4 0·857; P = 0·04), HbA1c (Q1 5·66 %; Q4 5·47 %; P = 0·01) and homocysteine (Q1 9·95 µmol/l; Q4 8·76 µmol/l; P = 0·01) compared with participants in the lowest quartile (Q1), after adjusting for dietary and lifestyle factors, including cereal fibre intake. Whole grain intake was inversely associated with C-reactive protein using multivariate analysis (P = 0·02), but this was not significant after final adjustment for cereal fibre. Cereal fibre intake was also inversely associated with waist–hip ratio (P = 0·03) and homocysteine (P = 0·002) in multivariate analysis.
Similar inverse associations of whole grain and cereal fibre intakes with CVD risk factors suggest the relevance of cereal fibre in the protective effects of whole grains. However, whole grain associations often remained significant after adjusting for cereal fibre intake, suggesting additional constituents may be relevant. Intervention studies are needed to compare cereal fibre intake from non-whole grain sources with whole grain intake.
National guidance cautions against low-intensity interventions for people with personality disorder, but evidence from trials is lacking.
To test the feasibility of conducting a randomised trial of a low-intensity intervention for people with personality disorder.
Single-blind, feasibility trial (trial registration: ISRCTN14994755). We recruited people aged 18 or over with a clinical diagnosis of personality disorder from mental health services, excluding those with a coexisting organic or psychotic mental disorder. We randomly allocated participants via a remote system on a 1:1 ratio to six to ten sessions of Structured Psychological Support (SPS) or to treatment as usual. We assessed social functioning, mental health, health-related quality of life, satisfaction with care and resource use and costs at baseline and 24 weeks after randomisation.
A total of 63 participants were randomly assigned to either SPS (n = 33) or treatment as usual (n = 30). Twenty-nine (88%) of those in the active arm of the trial received one or more sessions (median 7). Among 46 (73%) who were followed up at 24 weeks, social dysfunction was lower (−6.3, 95% CI −12.0 to −0.6, P = 0.03) and satisfaction with care was higher (6.5, 95% CI 2.5 to 10.4; P = 0.002) in those allocated to SPS. Statistically significant differences were not found in other outcomes. The cost of the intervention was low and total costs over 24 weeks were similar in both groups.
SPS may provide an effective low-intensity intervention for people with personality disorder and should be tested in fully powered clinical trials.
Lieder and Griffiths rightly urge that computational cognitive models be constrained by resource usage, but they should go further. The brain's primary function is to regulate resource usage. As a consequence, resource usage should not simply select among algorithmic models of “aspects of cognition.” Rather, “aspects of cognition” should be understood as existing in the service of resource management.
The regeneration niche defines the specific environmental requirements of the early phases of a plant's life cycle. It is critical for the long-term persistence of plant populations, particularly for obligate seeders that are highly vulnerable to stochastic events in fire-prone ecosystems. Here, we assessed germination characteristics and the relationship between population structure, soil seed bank density and fire response in Stachystemon vinosus (Euphorbiaceae), a rare endemic shrub from Western Australia, from burnt and long unburnt habitats. Many plants in long unburnt habitat were similar in size to those in recently burnt habitat. Soil seed bank density was related to plant abundance and fire history with density lower in burnt than unburnt sites. Thus, inter-fire recruitment may play a critical role in the requirements of the study species. To assess the dormancy status and germination requirements we used a ‘move-along’ experiment with temperatures from six seasonal phases of the year. Seeds were incubated under light and dark conditions, with and without smoked water, and with and without dry after-ripening. Germination was most effective when seeds were treated with smoked water and incubated in the dark at temperatures resembling autumn/winter conditions. After-ripening increased germination in light and dark incubated seeds in the absence of smoked water but was unnecessary for optimal germination in smoked water treated seeds. Irrespective of treatment, seeds showed a requirement for cooler temperatures for germination. These results suggest that rising temperatures and changes in fire regime associated with global warming may alter future germination responses of Stachystemon vinosus.
Consumption of certain berries appears to slow postprandial glucose absorption, attributable to polyphenols, which may benefit exercise and cognition, reduce appetite and/or oxidative stress. This randomised, crossover, placebo-controlled study determined whether polyphenol-rich fruits added to carbohydrate-based foods produce a dose-dependent moderation of postprandial glycaemic, glucoregulatory hormone, appetite and ex vivo oxidative stress responses. Twenty participants (eighteen males/two females; 24 (sd 5) years; BMI: 27 (sd 3) kg/m2) consumed one of five cereal bars (approximately 88 % carbohydrate) containing no fruit ingredients (reference), freeze-dried black raspberries (10 or 20 % total weight; LOW-Rasp and HIGH-Rasp, respectively) and cranberry extract (0·5 or 1 % total weight; LOW-Cran and HIGH-Cran), on trials separated by ≥5 d. Postprandial peak/nadir from baseline (Δmax) and incremental postprandial AUC over 60 and 180 min for glucose and other biochemistries were measured to examine the dose-dependent effects. Glucose AUC0–180 min trended towards being higher (43 %) after HIGH-Rasp v. LOW-Rasp (P=0·06), with no glucose differences between the raspberry and reference bars. Relative to reference, HIGH-Rasp resulted in a 17 % lower Δmax insulin, 3 % lower C-peptide (AUC0–60 min) and 3 % lower glucose-dependent insulinotropic polypeptide (AUC0–180 min) (all P<0·05). No treatment effects were observed for the cranberry bars regarding glucose and glucoregulatory hormones, nor were there any treatment effects for either berry type regarding ex vivo oxidation, appetite-mediating hormones or appetite. Fortification with freeze-dried black raspberries (approximately 25 g, containing 1·2 g of polyphenols) seems to slightly improve the glucoregulatory hormone and glycaemic responses to a high-carbohydrate food item in young adults but did not affect appetite or oxidative stress responses at doses or with methods studied herein.
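The incremental postprandial AUC used above can be sketched with the trapezoidal rule. This is a minimal illustration, not the authors' analysis code; the clipping of negative increments to zero is a common glycaemic-iAUC convention assumed here, since the abstract does not specify the method, and the glucose values below are hypothetical.

```python
import numpy as np

def incremental_auc(times_min, conc):
    """Incremental AUC (iAUC) by the trapezoidal rule, relative to the
    time-zero (baseline) concentration. Negative increments are clipped
    to zero (an assumed convention for postprandial glycaemic iAUC)."""
    t = np.asarray(times_min, dtype=float)
    delta = np.clip(np.asarray(conc, dtype=float) - conc[0], 0.0, None)
    # trapezoidal rule over the clipped above-baseline increments
    return float(np.sum((delta[1:] + delta[:-1]) / 2.0 * np.diff(t)))

# Hypothetical postprandial glucose curve (mmol/l), 0-180 min
t = [0, 30, 60, 90, 120, 180]
glucose = [5.0, 7.5, 6.8, 6.0, 5.5, 5.1]
iauc_180 = incremental_auc(t, glucose)  # mmol/l x min above baseline
```

Computing the same statistic over the first 60 min only (`incremental_auc(t[:3], glucose[:3])`) gives the AUC0–60 min analogue reported for insulin and C-peptide.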
Several grass and broadleaf weed species around the world have evolved multiple-herbicide resistance at alarmingly increasing rates. Research on the biochemical and molecular resistance mechanisms of multiple-resistant weed populations indicates a prevalence of herbicide metabolism catalyzed by enzyme systems such as cytochrome P450 monooxygenases and glutathione S-transferases and, to a lesser extent, by glucosyl transferases. A symposium was conducted to gain an understanding of the current state of research on metabolic resistance mechanisms in weed species that pose major management problems around the world. These topics, as well as future directions of investigations that were identified in the symposium, are summarized herein. In addition, the latest information on selected topics such as the role of safeners in inducing crop tolerance to herbicides, selectivity to clomazone, glyphosate metabolism in crops and weeds, and bioactivation of natural molecules is reviewed.
Whole grain intake is associated with lower CVD risk in epidemiological studies. It is unclear to what extent cereal fibre, located primarily within the bran, is responsible. This review aimed to evaluate associations between intakes of whole grain, cereal fibre and bran and CVD risk. Academic databases were searched for human studies published before March 2018. Observational studies reporting whole grain and cereal fibre or bran intake in association with any CVD-related outcome were included. Studies were separated into those defining whole grain using a recognised definition (containing the bran, germ and endosperm in their natural proportions) (three studies, seven publications) and those using an alternative definition, such as including added bran as a whole grain source (eight additional studies, thirteen publications). Intakes of whole grain, cereal fibre and bran were similarly associated with lower risk of CVD-related outcomes. Within the initial analysis, where studies used the recognised whole grain definition, results were less likely to show attenuation after adjustment for cereal fibre content. The fibre component of grain foods appears to play an important role in the protective effects of whole grains. After adjusting for fibre content, associations remained, suggesting that additional components within the whole grain, and the bran component, may contribute to the cardio-protective association. The limited number of studies and considerable discrepancy in defining and calculating whole grain intake limit conclusions. Future research should utilise a consistent definition and a methodical approach to calculating whole grain intake to contribute to a greater body of consistent evidence surrounding whole grains.
Utilising routine surveillance data, this study presents a method for generating a baseline comparison that can be used in future foodborne outbreak investigations following a case–case methodology. Salmonella and Campylobacter cases (2012–2015) from Maricopa County, AZ were compared to determine differences in risk factors, symptoms and demographics. For foods and other risk factors, adjusted odds ratios were developed using Campylobacter as the reference. Comparisons were also made for three major Salmonella subtypes, Typhimurium, Enteritidis and Poona, as compared with Campylobacter. Salmonella cases were younger, while Campylobacter cases were more likely to be Hispanic and female. Campylobacter cases more often reported consuming peppers, sprouts, poultry, queso fresco, eggs and raw nuts, and more often reported contact with animal products or birds, visiting a farm or dairy, owning a pet, having a sick pet, swimming in a river, lake or pond, or handling multiple raw meats. Salmonella cases more often reported visiting a petting zoo and contact with a reptile. There were significant variations by Salmonella subtype in both foods and exposures. We recommend that departments conduct this analysis to generate a baseline comparison and a running average of relevant odds ratios, allowing staff to focus on trace-back of contaminated food items earlier in the outbreak investigation process.
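The core statistic in such a case–case comparison is an odds ratio with Campylobacter as the reference group. A minimal sketch of a crude (unadjusted) odds ratio with a Woolf 95 % confidence interval follows; the exposure counts below are hypothetical, for illustration only, and do not come from the Maricopa County data.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed comparison cases (e.g. Salmonella with reptile contact)
    b = unexposed comparison cases
    c = exposed reference cases (e.g. Campylobacter with reptile contact)
    d = unexposed reference cases"""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, (lo, hi)

# Hypothetical counts: reptile contact among Salmonella vs. Campylobacter
or_est, ci = odds_ratio(30, 170, 10, 190)
```

A CI excluding 1 flags the exposure as differing between the two pathogens, which is the signal a running baseline of these ratios would surface early in an investigation.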
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
Fe is an essential nutrient for many bacteria, and Fe supplementation has been reported to affect the composition of the gut microbiota in both Fe-deficient and Fe-replete individuals outside pregnancy. This study examined whether the dose of Fe in pregnancy multivitamin supplements affects the overall composition of the gut microbiota in overweight and obese pregnant women in early pregnancy. Women participating in the SPRING study with a faecal sample obtained at 16 weeks’ gestation were included in this substudy. For each subject, the brand of multivitamin used was recorded. Faecal microbiome composition was assessed by 16S rRNA sequencing and analysed with the QIIME software suite. Dietary intake of Fe was assessed using a FFQ at 16 weeks’ gestation. Women were grouped as receiving low (<60 mg/d, n 94) or high (≥60 mg/d; n 65) Fe supplementation. The median supplementary Fe intake in the low group was 10 (interquartile range (IQR) 5–10) v. 60 (IQR 60–60) mg/d in the high group (P<0·001). Dietary Fe intake did not differ between the groups (10·0 (IQR 7·4–13·3) v. 9·8 (IQR 8·2–13·2) mg/d). Fe supplementation did not significantly affect the composition of the faecal microbiome at any taxonomic level. Network analysis showed that the gut microbiota in the low Fe supplementation group had a higher predominance of SCFA producers. Pregnancy multivitamin Fe content has a minor effect on the overall composition of the gut microbiota of overweight and obese pregnant women at 16 weeks’ gestation.
Polycystic ovary syndrome (PCOS) affects ~7% of reproductive age women. Although its etiology is unknown, in animals, excess prenatal testosterone (T) exposure induces PCOS-like phenotypes. While measuring fetal T in humans is infeasible, demonstrating in utero androgen exposure using a reliable newborn biomarker, anogenital distance (AGD), would provide evidence for a fetal origin of PCOS and potentially identify girls at risk. Using data from a pregnancy cohort (The Infant Development and Environment Study), we tested the novel hypothesis that infant girls born to women with PCOS have longer AGD, suggesting higher fetal T exposure, than girls born to women without PCOS. During pregnancy, women reported whether they ever had a PCOS diagnosis. After birth, infant girls underwent two AGD measurements: anofourchette distance (AGD-AF) and anoclitoral distance (AGD-AC). We fit adjusted linear regression models to examine the association between maternal PCOS and girls’ AGD. In total, 300 mother–daughter dyads had complete data and 23 mothers reported PCOS. AGD was longer in the daughters of women with a PCOS diagnosis compared with daughters of women with no diagnosis (AGD-AF: β=1.21, P=0.05; AGD-AC: β=1.05, P=0.18). Results were stronger in analyses limited to term births (AGD-AF: β=1.65, P=0.02; AGD-AC: β=1.43, P=0.09). Our study is the first to examine AGD in offspring of women with PCOS. Our results are consistent with findings that women with PCOS have longer AGD and suggest that during PCOS pregnancies, daughters may experience elevated T exposure. Identifying the underlying causes of PCOS may facilitate early identification and intervention for those at risk.
Solvency II is currently one of the most sophisticated insurance regulatory regimes in the world. It is built around the principles of market consistency and embedding strong risk management and governance within insurance companies. For business with long-term guarantees, the original basis produced outcomes that were unacceptable to the member states, and the original design was amended through Omnibus II. The working party has looked back at the outcome of the final regulation and comments on how well Solvency II has fared, principally from a UK perspective, relative to its initial goals of improved consumer protection, harmonisation, effective risk management and financial stability. We review Pillar 1’s market-consistent valuation (including the risk margin and transitional measures) as well as the capital requirements (including internal models). We look at the impact this has on asset and liability management, pro-cyclicality and product design. We look at Pillars 2 and 3 in respect of the Own Risk and Solvency Assessment, liquidity and disclosure. Finally, we stand back and look at harmonisation and the implications of Brexit. In summary, we conclude that Solvency II represents a huge improvement over Solvency I, although it has not fully achieved the goals it aspired to. There are acknowledged shortfalls and imperfections where adjustments to Solvency II are likely. There remain other concerns around pro-cyclicality, and the appropriateness of market consistency is still open to criticism. It is hoped that the paper and the discussion that goes with it provide an insight into where Solvency II has taken European insurance regulation and the directions in which it could evolve.
OBJECTIVES/SPECIFIC AIMS: Accumulation of cholesterol-laden macrophages in arterial walls leads to atherosclerosis. LXRs induce expression of genes that are atheroprotective in macrophages, including CCR7, a chemokine receptor that promotes their emigration from the plaque. CCR7 expression has been shown to be negatively regulated by phosphorylation of LXRα at S198 and is reduced in diabetic mice that show impaired plaque regression. I hypothesized that LXRα phosphorylation at S198 diminishes macrophage emigration from atherosclerotic plaque and contributes to impaired regression in diabetes. METHODS/STUDY POPULATION: Inducible LXRα S198A phosphorylation-deficient knock-in mice were used as donors for bone marrow transplantation into mice prone to develop atherosclerosis. Plaques were developed by placing mice on a Western diet, and regression was induced by lowering their lipid levels. Aortic plaques were then analyzed using morphometric, histological and molecular analyses in control and diabetic mice expressing either LXRα WT or LXRα S198A during regression. RESULTS/ANTICIPATED RESULTS: Surprisingly, lack of phosphorylation increased plaque macrophage content and impaired regression under normoglycemic conditions; however, it did not exacerbate diabetic regression. Plaques in diabetic mice were associated with increased LXRα S198 phosphorylation. Consistent with this, LXRα phosphorylation was enhanced in macrophages cultured under hyperglycemic conditions, indicating glucose-dependent regulation of LXRα phosphorylation. Monocyte trafficking studies revealed that lack of phosphorylation and diabetes independently increase recruitment of monocytes into the plaque, which might contribute to increased macrophage content. Importantly, I found that diabetes also increases macrophage retention in the plaque, which is reversed in the absence of phosphorylation.
We predict that this increased retention results from inhibition of emigration of plaque macrophages through enhanced phosphorylation in diabetes. DISCUSSION/SIGNIFICANCE OF IMPACT: These findings suggest that inhibiting LXRα phosphorylation could be beneficial in diabetic atherosclerosis, reversing the accumulation of macrophages in the plaque. This study provides insight into the regulation of plaque macrophage trafficking through LXRα S198 phosphorylation.
We aimed to identify sociodemographic, lifestyle and behavioural determinants of consumption of sugar-sweetened beverages (SSB) and artificially sweetened beverages (ASB) among adults in Cambridgeshire, UK.
Cross-sectional data were obtained from a cohort of 9991 adults born between 1950 and 1975. An FFQ was used to assess consumption of beverages and other dietary factors. Multivariable logistic regression was used to examine potential determinants of consuming SSB and ASB (≥1 serving/d).
Recruitment from general practice surgeries to participate in the ongoing population-based Fenland Study.
Adults (n 9991) aged 30–64 years from three areas of Cambridgeshire, UK.
Prevalence estimates for daily SSB and ASB consumption were 20·4 % (n 2041) and 8·9 % (n 893), respectively. SSB consumption (OR; 95 % CI) was more common in men than women (1·33; 1·17, 1·50) and among those reporting lower income (<£20 000/year) than those reporting higher income (>£40 000/year; 1·31; 1·09, 1·58). In contrast, daily ASB consumption was more common among women than men (1·62; 1·34, 1·96), those on weight-loss diets than those who were not (2·58; 2·05, 3·24) and those reporting higher income than lower income (1·53; 1·16, 2·00). Factors associated with higher consumption of each of SSB and ASB included being a younger adult, being overweight/obese, having fewer years of education, eating meals or snack foods while watching television, and skipping breakfast (P<0·05 each).
Frequent consumers of SSB and ASB differ by several sociodemographic characteristics. However, increased BMI, younger age and unhealthy eating behaviours are common to both groups.
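The multivariable logistic regression behind the adjusted odds ratios above can be sketched with a small Newton–Raphson fit; exponentiating a coefficient gives the adjusted OR for that predictor. This is an illustrative sketch, not the authors' analysis: the simulated sex and income indicators, effect sizes and sample below are all hypothetical.

```python
import numpy as np

def logistic_fit(X, y, n_iter=25):
    """Fit a multivariable logistic regression by Newton-Raphson.
    Returns [intercept, coef_1, ...]; np.exp(coef) gives adjusted ORs."""
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))     # fitted probabilities
        grad = X.T @ (y - p)                    # score vector
        hess = X.T @ (X * (p * (1 - p))[:, None])  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

# Hypothetical data: daily SSB consumption vs. sex and a low-income flag
rng = np.random.default_rng(0)
n = 2000
male = rng.integers(0, 2, n)
low_income = rng.integers(0, 2, n)
true_logit = -1.5 + 0.29 * male + 0.27 * low_income
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

beta = logistic_fit(np.column_stack([male, low_income]), y)
adj_or = np.exp(beta[1:])  # adjusted ORs for male and low income
```

With mutually adjusted predictors in one model, each OR is interpreted holding the others fixed, which is how the sex and income contrasts in the abstract are reported.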
Sauropodomorpha included the largest known terrestrial vertebrates and was the first dinosaur clade to achieve a global distribution. This success is associated with their early adoption of herbivory, and sauropod gigantism has been hypothesized to be a specialization for bulk feeding and obligate high-fiber herbivory. Here, we apply a combination of biomechanical character analysis and comparative phylogenetic methods with the aim of quantifying the evolutionary mechanics of the sauropodomorph feeding apparatus. We test for the role of convergence to common feeding function and divergence toward functional optima across sauropodomorph evolution, quantify the rate of evolution for functional characters, and test for coincident evolutionary rate shifts in craniodental functional characters and body mass. Results identify a functional shift toward increased cranial robustness, increased bite force, and the onset of static occlusion at the base of the Sauropoda, consistent with a shift toward bulk feeding. Trends toward similarity in functional characters are observed in Diplodocoidea and Titanosauriformes. However, diplodocids and titanosaurs retain significant craniodental functional differences, and evidence for convergent adoption of a common “adaptive zone” between them is weak. Modeling of craniodental character and body-mass evolution demonstrates that these functional shifts were not correlated with evolutionary rate shifts. Instead, a significant correlation between body mass and characters related to bite force and cranial robustness suggests a correlated-progression evolutionary mode, with positive-feedback loops between body mass and dietary specializations fueling sauropod gigantism.
No existing models of alcohol prevention concurrently adopt universal and selective approaches. This study aims to evaluate the first combined universal and selective approach to alcohol prevention.
A total of 26 Australian schools with 2190 students (mean age: 13.3 years) were randomized to receive: universal prevention (Climate Schools); selective prevention (Preventure); combined prevention (Climate Schools and Preventure; CAP); or health education as usual (control). Primary outcomes were alcohol use, binge drinking and alcohol-related harms at 6, 12 and 24 months.
Climate, Preventure and CAP students demonstrated significantly lower growth in their likelihood to drink and binge drink, relative to controls over 24 months. Preventure students displayed significantly lower growth in their likelihood to experience alcohol harms, relative to controls. While adolescents in both the CAP and Climate groups demonstrated slower growth in drinking compared with adolescents in the control group over the 2-year study period, CAP adolescents demonstrated faster growth in drinking compared with Climate adolescents.
Findings support universal, selective and combined approaches to alcohol prevention. Particularly novel are the findings of no advantage of the combined approach over universal or selective prevention alone.
We agree with Lake and colleagues on their list of “key ingredients” for building human-like intelligence, including the idea that model-based reasoning is essential. However, we favor an approach that centers on one additional ingredient: autonomy. In particular, we aim toward agents that can both build and exploit their own internal models, with minimal human hand engineering. We believe an approach centered on autonomous learning has the greatest chance of success as we scale toward real-world complexity, tackling domains for which ready-made formal models are not available. Here, we survey several important examples of the progress that has been made toward building autonomous agents with human-like abilities, and highlight some outstanding challenges.
Morphological responses of nonmammalian herbivores to external ecological drivers have not been quantified over extended timescales. Herbivorous nonavian dinosaurs are an ideal group to test for such responses, because they dominated terrestrial ecosystems for more than 155 Myr and included the largest herbivores that ever existed. The radiation of dinosaurs was punctuated by several ecologically important events, including extinctions at the Triassic/Jurassic (Tr/J) and Jurassic/Cretaceous (J/K) boundaries, the decline of cycadophytes, and the origin of angiosperms, all of which may have had profound consequences for herbivore communities. Here we present the first analysis of morphological and biomechanical disparity for sauropodomorph and ornithischian dinosaurs in order to investigate patterns of jaw shape and function through time. We find that morphological and biomechanical mandibular disparity are decoupled: mandibular shape disparity follows taxonomic diversity, with a steady increase through the Mesozoic. By contrast, biomechanical disparity builds to a peak in the Late Jurassic that corresponds to increased functional variation among sauropods. The reduction in biomechanical disparity following this peak coincides with the J/K extinction, the associated loss of sauropod and stegosaur diversity, and the decline of cycadophytes. We find no specific correspondence between biomechanical disparity and the proliferation of angiosperms. Continual ecological and functional replacement of pre-existing taxa accounts for disparity patterns through much of the Cretaceous, with the exception of several unique groups, such as psittacosaurids that are never replaced in their biomechanical or morphological profiles.