Cereal products provide 50 % of the iron and 30 % of the zinc in the UK diet. However, despite this high content, the bioavailability of minerals from cereals is low. This review discusses strategies to increase mineral bioavailability from cereal-based foods. Iron and zinc are localised to specific tissue structures within cereals; however, the cell walls of these structures are resistant to digestion in the human gastrointestinal tract and therefore the bioaccessibility of these essential minerals from foods for absorption in the intestine is limited. In addition, minerals are stored in cereals bound to phytate, which is the main dietary inhibitor of mineral absorption. Recent research has focused on ways to enhance mineral bioavailability from cereals. Current strategies include disruption of plant cell walls to increase mineral release (bioaccessibility) during digestion; increasing the mineral:phytate ratio either by increasing the mineral content through conventional breeding and/or agronomic biofortification, or by reducing phytate levels; and genetic biofortification to increase the mineral content in the starchy endosperm, which is used to produce white wheat flour. While much of this work is at an early stage, there is potential for these strategies to lead to the development of cereal-based foods with enhanced nutritional qualities that could address the low mineral status in the UK and globally.
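The "mineral:phytate ratio" strategy is usually assessed via the phytate:mineral molar ratio, a standard indicator of likely zinc and iron bioavailability. A minimal sketch with illustrative values (not measurements from this review):

```python
# Phytate:mineral molar ratio -- a standard indicator of how well zinc or
# iron is likely to be absorbed from a food. All input values below are
# illustrative, not data from the review.
PHYTATE_MOLAR_MASS = 660.0  # g/mol, phytic acid
MINERAL_MOLAR_MASS = {"zinc": 65.4, "iron": 55.85}  # g/mol

def phytate_mineral_molar_ratio(phytate_mg, mineral_mg, mineral):
    """Return the phytate:mineral molar ratio for a food portion."""
    phytate_mol = phytate_mg / PHYTATE_MOLAR_MASS
    mineral_mol = mineral_mg / MINERAL_MOLAR_MASS[mineral]
    return phytate_mol / mineral_mol

# e.g. a hypothetical 100 g of wholemeal flour: 800 mg phytate, 3 mg zinc
ratio = phytate_mineral_molar_ratio(800, 3, "zinc")
print(round(ratio, 1))  # -> 26.4; ratios above ~15 suggest poor zinc absorption
```

Reducing phytate or raising the mineral content both act on this ratio, which is why the review groups them as a single strategy.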
Mental health problems are elevated in autistic individuals but there is limited evidence on the developmental course of problems across childhood. We compare the level and growth of anxious-depressed, behavioral and attention problems in an autistic and typically developing (TD) cohort.
Latent growth curve models were applied to repeated parent-report Child Behavior Checklist data from age 2–10 years in an inception cohort of autistic children (Pathways, N = 397; 84% boys) and a general population TD cohort (Wirral Child Health and Development Study; WCHADS; N = 884, 49% boys). Percentile plots were generated to quantify the differences between autistic and TD children.
Autistic children showed elevated levels of mental health problems, but this was substantially reduced by accounting for IQ and sex differences between the autistic and TD samples. There were small differences in growth patterns; anxious-depressed problems were particularly elevated at preschool age and attention problems in late childhood. Higher family income predicted a lower baseline level on all three dimensions, but a steeper increase in anxious-depressed problems. Higher IQ predicted a lower level of attention problems and a faster decline over childhood. Female sex predicted a higher level of anxious-depressed problems and a faster decline in behavioral problems. Social-affect autism symptom severity predicted an elevated level of attention problems. Autistic girls' problems were particularly elevated relative to their same-sex non-autistic peers.
Autistic children, and especially girls, show elevated mental health problems compared to TD children and there are some differences in predictors. Assessment of mental health should be integrated into clinical practice for autistic children.
We assessed the prevalence of antibiotic prescriptions among ambulatory patients tested for coronavirus disease 2019 (COVID-19) in a large public US healthcare system and found a low overall rate of antibiotic prescriptions (6.7%). Only 3.8% of positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) tests were associated with an antibiotic prescription within 7 days.
Numerous theories posit different core features of borderline personality disorder (BPD). Recent advances in network analysis provide a method of examining the relative centrality of BPD symptoms, as well as the replicability of findings across samples. Additionally, despite increasing research supporting the validity of BPD in adolescents, clinicians remain reluctant to diagnose BPD in this age group. Establishing the replicability of the syndrome across adolescents and adults informs clinical practice and research. This study examined the stability of BPD symptom networks and the centrality of symptoms across samples varying in age and clinical characteristics.
We conducted cross-sectional analyses of BPD symptoms assessed with semi-structured diagnostic interviews in the Collaborative Longitudinal Study of Personality Disorders (CLPS), the Methods to Improve Diagnostic Assessment and Services (MIDAS) study, and an adolescent clinical sample. Network attributes, including edge (partial association) strength and node (symptom) expected influence, were compared.
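The edge weights in such symptom networks, partial associations between symptom pairs controlling for all others, can be derived from the inverse covariance (precision) matrix. A minimal sketch on simulated data; published symptom networks typically add regularisation (e.g. the graphical lasso), which this omits:

```python
import numpy as np

# Partial correlations from the precision matrix:
#   r_ij = -P_ij / sqrt(P_ii * P_jj)
# Data are simulated: 500 "respondents", 5 "symptoms", with one
# built-in association between symptoms 0 and 1.
rng = np.random.default_rng(0)
data = rng.normal(size=(500, 5))
data[:, 1] += 0.6 * data[:, 0]  # induce a true symptom 0 - symptom 1 edge

precision = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

# The induced 0-1 edge stands out; the remaining edges hover near zero.
print(partial_corr.round(2))
```

Expected influence for a node is then simply the sum of its (signed) edge weights to all other nodes.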
The three networks were largely similar and strongly correlated. Affective instability and identity disturbance emerged as relatively central symptoms across the three samples, and relationship difficulties across adult networks. Differences in network attributes were more evident between networks varying both in age and in BPD symptom severity level.
Findings highlight the relative importance of affective, identity, and relationship symptoms, consistent with several leading theories of BPD. The network structure of BPD symptoms appears generally replicable across multiple large samples including adolescents and adults, providing further support for the validity of the diagnosis across these developmental phases.
Transforming towards global sustainability requires a dramatic acceleration of social change. Hence, there is growing interest in finding ‘positive tipping points’ at which small interventions can trigger self-reinforcing feedbacks that accelerate systemic change. Examples have recently been seen in power generation and personal transport, but how can we identify positive tipping points that have yet to occur? We synthesise theory and examples to provide initial guidelines for creating enabling conditions, sensing when a system can be positively tipped, who can trigger it, and how they can trigger it. All of us can play a part in triggering positive tipping points.
Recent work on positive tipping points towards sustainability has focused on social-technological systems and the agency of policymakers to tip change, whilst earlier work identified social-ecological positive feedbacks triggered by diverse actors. We bring these together to consider positive tipping points across social-technological-ecological systems and the potential for multiple actors and interventions to trigger them. Established theory and examples provide several generic mechanisms for triggering tipping points. From these we identify specific enabling conditions, reinforcing feedbacks, actors and interventions that can contribute to triggering positive tipping points in the adoption of sustainable behaviours and technologies. Actions that can create enabling conditions for positive tipping include targeting smaller populations, altering social network structure, providing relevant information, reducing price, improving performance, desirability and accessibility, and coordinating complementary technologies. Actions that can trigger positive tipping include social, technological and ecological innovations, policy interventions, public investment, private investment, broadcasting public information, and behavioural nudges. Positive tipping points can help counter widespread feelings of disempowerment in the face of global challenges and help unlock ‘paralysis by complexity’. A key research agenda is to consider how different agents and interventions can most effectively work together to create system-wide positive tipping points whilst ensuring a just transformation.
Social media summary
We identify key actors and actions that can enable and trigger positive tipping points towards global sustainability.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
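For readers unfamiliar with polygenic scores: a PGS is, in essence, a weighted sum of a person's allele dosages, with weights taken from GWAS effect sizes. A minimal illustration with invented dosages and weights (not this study's data or pipeline):

```python
import numpy as np

# A polygenic score is a weighted sum of allele dosages (0/1/2 copies of
# the effect allele), weighted by per-allele GWAS effect sizes.
# Everything below is simulated for illustration only.
rng = np.random.default_rng(1)
n_people, n_variants = 4, 6
dosages = rng.integers(0, 3, size=(n_people, n_variants))  # 0, 1 or 2 copies
betas = rng.normal(0, 0.1, size=n_variants)                # GWAS weights

pgs = dosages @ betas  # one score per person

# Scores are usually standardised before being used as predictors,
# e.g. of age at onset, as in the analyses summarised above.
pgs_z = (pgs - pgs.mean()) / pgs.std()
print(pgs_z.round(2))
```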
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for primary care depression RCTs that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses in such trials (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95%CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors: the duration of anxiety; duration of depression; comorbid panic disorder; and a history of antidepressant treatment were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
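The second stage of a two-stage random-effects meta-analysis pools per-study estimates with inverse-variance weights; a common estimator of the between-study variance is DerSimonian-Laird. A minimal sketch with invented study estimates (not data from this review):

```python
import math

# DerSimonian-Laird random-effects pooling of per-study estimates.
# Study estimates and standard errors below are invented for illustration.
estimates = [0.10, 0.25, 0.55, 0.35]
std_errs = [0.05, 0.08, 0.06, 0.07]

w = [1 / se**2 for se in std_errs]  # fixed-effect (inverse-variance) weights
pooled_fe = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)

# Cochran's Q and the between-study variance tau^2
q = sum(wi * (yi - pooled_fe) ** 2 for wi, yi in zip(w, estimates))
df = len(estimates) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each study's variance
w_re = [1 / (se**2 + tau2) for se in std_errs]
pooled_re = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"pooled = {pooled_re:.3f} (SE {se_re:.3f})")
```

With heterogeneous studies, tau² inflates each study's variance, so the random-effects pooled SE is wider than the fixed-effect one.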
Overweight and obesity are universal health challenges. Recent evidence emphasises the potential benefits of addressing psychological factors associated with obesity in dietary programmes. This pilot study investigated the efficacy and acceptability of a combined online and face-to-face dietary intervention that used self-compassion, goal-setting and self-monitoring to improve dietary behaviour, as well as psychological factors associated with dietary behaviour.
An embedded mixed-methods design was used, comprising a 4-week before-after trial and one-on-one interviews. Quantitative outcomes were the levels of self-compassion; eating pathology; depression, anxiety and stress; and dietary intake. Qualitative outcomes were participants’ perceptions of the acceptability of the intervention.
UNSW Kensington campus.
Fourteen participants with overweight and obesity aged between 18 and 55 years.
Results showed that the intervention significantly improved self-compassion and some aspects of dietary intake (e.g. decrease in energy intake) at Week Four compared with Week Zero. Some aspects of eating pathology also significantly decreased (e.g. Eating Concern). However, changes in self-compassion over the 4 weeks did not significantly predict Week Four study outcomes, except for level of stress. Most participants found self-compassion, goal-setting and self-monitoring to be essential for dietary behaviour change. However, participants also indicated that an online programme needed to be efficient, simple and interactive.
In conclusion, the current study provides preliminary but promising findings of an effective and acceptable combined online and face-to-face intervention that used self-compassion, goal-setting and self-monitoring to improve dietary habits. However, the results need to be examined in future long-term randomised controlled trials.
Academic Medical Centers (AMCs) offer patient care and perform research. Increasingly, AMCs advertise to the public in order to garner income that can support these dual missions. In what follows, we raise concerns about the ways that advertising blurs important distinctions between them. Such blurring is detrimental to AMC efforts to fulfill critically important ethical responsibilities pertaining both to science communication and to clinical research: marketing campaigns can employ hype that weakens research integrity and contributes to therapeutic misconception and misestimation, undermining the informed consent process that is essential to the ethical conduct of research. We offer an ethical analysis of common advertising practices that justifies these concerns. We also suggest the need for a deliberative body, convened by the Association of American Medical Colleges and others, to develop a set of voluntary guidelines that AMCs can use to avoid the problems found in many current AMC advertising practices.
Some UK insurers have been using real-world economic scenarios for more than 30 years. Popular approaches have included random walks, time series models, arbitrage-free models with added risk premiums, and 1-year Value at Risk distribution fits. Based on interviews with experienced practitioners as well as historical documents and meeting minutes, this paper traces historical model evolution in the United Kingdom and abroad. We examine the possible catalysts for changes in modelling practice, with a particular emphasis on regulatory and socio-cultural influences. We apply past lessons to offer guidance on the future direction of capital market modelling, which has been key to business and strategy decisions.
Self-screening using an electronic version of the Malnutrition Universal Screening Tool (‘MUST’) has been developed but its implementation requires investigation. A total of 100 outpatients (mean age 50 (sd 16) years; 57 % male) self-screened with an electronic version of ‘MUST’ and were then screened by a healthcare professional (HCP) to assess concurrent validity. Ease of use, time to self-screen and prevalence of malnutrition were also assessed. A further twenty outpatients (mean age 54 (sd 15) years; 55 % male) examined preference between self-screening with paper and electronic versions of ‘MUST’. For the three-category classification of ‘MUST’ (low, medium and high risk), agreement between electronic self-screening and HCP screening was 94 % (κ=0·74, se 0·092; P<0·001). For the two-category classification (low risk; medium+high risk) agreement was 96 % (κ=0·82, se 0·085; P<0·001), comparable with previously reported paper-based self-screening. In all, 15 % of patients categorised themselves as ‘at risk’ of malnutrition (5 % medium, 10 % high). Electronic self-screening took 3 min (sd 1·2 min), 40 % faster than previously reported for the paper-based version. Patients found the tool easy or very easy to understand (99 %) and complete (98 %). Patients who assessed both tools found the electronic tool easier to complete (65 %) and preferred it (55 %) to the paper version. Electronic self-screening using ‘MUST’ in a heterogeneous group of hospital outpatients is acceptable, user-friendly and has ‘substantial to almost-perfect’ agreement with HCP screening. The electronic format appears to be as acceptable as, and often preferred to, the validated paper-based ‘MUST’ self-screening tool.
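The agreement statistic reported here, Cohen's kappa, corrects raw percentage agreement for the agreement expected by chance. A minimal sketch using an invented 2×2 confusion table (not the study's actual counts):

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# The confusion table below is invented for illustration.
def cohens_kappa(table):
    """Compute kappa from a square confusion table of raw counts."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n
    row_marg = [sum(table[i]) / n for i in range(k)]
    col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_exp = sum(row_marg[i] * col_marg[i] for i in range(k))
    return (p_obs - p_exp) / (1 - p_exp)

# rows: self-screen result (low risk, at risk); columns: HCP screen result
table = [[83, 2],
         [2, 13]]
print(round(cohens_kappa(table), 2))  # -> 0.84
```

Note that 96 % raw agreement can correspond to a noticeably lower kappa when one category dominates, which is why kappa is reported alongside percentage agreement.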
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
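The reported cooling from ~6 400 K to ~2 100 K can be related to the shift of the thermal emission peak via Wien's displacement law. The temperatures are from the abstract; treating the transient as a simple blackbody is our illustrative simplification:

```python
# Wien's displacement law: lambda_peak = b / T, with b = 2.898e-3 m*K.
# Temperatures are those quoted for SSS17a/DLT17ck; the blackbody framing
# is a simplification for illustration.
WIEN_B = 2.898e-3  # m*K, Wien displacement constant

def peak_wavelength_nm(temp_k):
    """Peak blackbody emission wavelength in nanometres."""
    return WIEN_B / temp_k * 1e9

print(round(peak_wavelength_nm(6400)))  # -> 453 (blue, optical)
print(round(peak_wavelength_nm(2100)))  # -> 1380 (near-infrared)
```

The peak moving from the optical into the near-infrared over the 7-d period is consistent with the multi-wavelength follow-up strategy the paper describes.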
Sulfometuron, when applied as a foliar and/or soil application, prevented regrowth of bahiagrass. Sulfometuron application did not reduce regrowth of centipedegrass regardless of method of application. Sulfometuron was absorbed by the roots and foliage of centipedegrass and bahiagrass. Symplasmic translocation of the herbicide was evident in both species. Translocation of foliar-applied sulfometuron increased from approximately 1% at 48 h after application to 23% at 72 h in bahiagrass. Metabolism of sulfometuron was greater in centipedegrass (69% of foliar-applied, 10% of root-applied) at 72 h after application than in bahiagrass (30% of foliar-applied and 4% of root-applied). Tolerance of centipedegrass to sulfometuron appeared to be related to a high degree of herbicide metabolism in this species.
Although high dose n-3 PUFA supplementation reduces exercise- and hyperpnoea-induced bronchoconstriction (EIB/HIB), there are concurrent issues with cost, compliance and gastrointestinal discomfort. It is thus pertinent to establish the efficacy of lower n-3 PUFA doses. Eight male adults with asthma and HIB and eight controls without asthma were randomly supplemented with two n-3 PUFA doses (6·2 g/d (3·7 g EPA and 2·5 g DHA) and 3·1 g/d (1·8 g EPA and 1·3 g DHA)) and a placebo, each for 21 d followed by 14 d washout. A eucapnic voluntary hyperpnoea (EVH) challenge was performed before and after treatments. Outcome measures remained unchanged in the control group. In the HIB group, the peak fall in forced expiratory volume in 1 s (FEV1) after EVH at day 0 (−1005 (sd 520) ml, −30 (sd 18) %) was unchanged after placebo. The peak fall in FEV1 was similarly reduced from day 0 to day 21 of 6·2 g/d n-3 PUFA (−1000 (sd 460) ml, −29 (sd 17) % v. −690 (sd 460) ml, −20 (sd 15) %) and 3·1 g/d n-3 PUFA (−970 (sd 480) ml, −28 (sd 18) % v. −700 (sd 420) ml, −21 (sd 15) %) (P<0·001). Baseline fraction of exhaled nitric oxide was reduced by 24 % (P=0·020) and 31 % (P=0·018) after 6·2 and 3·1 g/d n-3 PUFA, respectively. Peak increases in 9α, 11β PGF2 after EVH were reduced by 65 % (P=0·009) and 56 % (P=0·041) after 6·2 and 3·1 g/d n-3 PUFA, respectively. In conclusion, 3·1 g/d n-3 PUFA supplementation attenuated HIB and markers of airway inflammation to a similar extent as a higher dose. Lower doses of n-3 PUFA thus represent a potentially beneficial adjunct treatment for adults with asthma and EIB.