Familial co-aggregation studies of eating disorders (EDs) and schizophrenia reveal shared genetic and environmental factors, yet their etiological and clinical relationship remains unclear. We evaluate the influence of schizophrenia family history on clinical outcomes of EDs.
We conducted a cohort evaluation of the association between family history of schizophrenia and ED clinical features, psychiatric comorbidities, and somatic and mental health burden in individuals born in Sweden 1977–2003 with anorexia nervosa (AN) or other EDs (OED: bulimia nervosa, binge-eating disorder, and ED not otherwise specified).
Of 12 424 individuals with AN and 20 716 individuals with OED, 599 (4.8%) and 1118 (5.4%), respectively, had a family history of schizophrenia (in up to third-degree relatives). Among individuals with AN, schizophrenia in first-degree relatives was significantly associated with increased comorbid attention-deficit/hyperactivity disorder (ADHD) [HR (95% CI) 2.26 (1.27–3.99)], substance abuse disorder (SUD) [HR (95% CI) 1.93 (1.25–2.98)], and anxiety disorders [HR (95% CI) 1.47 (1.08–2.01)], but higher lowest illness-associated body mass index (BMI) [1.14 kg/m2, 95% CI (0.19–2.10)]. Schizophrenia in any relative (up to third-degree) in AN was significantly associated with higher somatic and mental health burden, but lower ED psychopathology scores [−0.29, 95% CI (−0.54 to −0.04)]. Schizophrenia in first-degree relatives in individuals with OED was significantly associated with increased comorbid ADHD, obsessive-compulsive disorder, SUD, anxiety disorders, somatic and mental health burden, and suicide attempts.
We observed different patterns of ED-related outcomes, psychiatric comorbidity, and illness burden in individuals with EDs with and without family histories of schizophrenia and provide new insights into the diverse manifestations of EDs.
We examined parent- and adolescent-reported executive functioning (EF) behaviors following pediatric traumatic brain injury (TBI) in the context of Online Family Problem-Solving Therapy (OFPST) and moderators of change in EF behaviors.
In total, 274 families were randomized to OFPST or an internet resource comparison group. Parents and adolescents completed the Behavior Rating Inventory of Executive Function at four time points. Mixed models were used to examine EF behaviors, assessing the effects of visit, treatment group, rater, TBI severity, age, socioeconomic status, and family functioning.
Parents rated their adolescents’ EF as poorer (F(3,1156) = 220.15, p < .001; M = 58.11, SE = 0.73) than adolescents rated themselves (M = 51.81, SE = 0.73). Across raters, EF behaviors were poorer for adolescents whose parents had less education (F(3,1156) = 8.60, p = .003; M = 56.76, SE = 0.98) than for those with more education (M = 53.16, SE = 0.88). Age at baseline interacted with visit (F(3,1156) = 5.05, p = .002), such that families of older adolescents reported improvement in EF behaviors over time. Family functioning also interacted with visit (F(3, 1156) = 2.61, p = .049), indicating more improvement in EF behaviors over time in higher functioning families. There were no effects of treatment or TBI severity.
We identified a discrepancy between parent- and adolescent-reported EF, suggesting reduced awareness of deficits in adolescents with TBI. We also found that poorer family functioning and younger age were associated with poorer recovery after TBI, whereas adolescents of parents with less education were reported as having greater EF deficits across time points.
Current COVID-19 guidelines recommend symptom-based screening and regular nasopharyngeal (NP) testing for healthcare personnel in high-risk settings. We sought to estimate case detection percentages with various routine NP and saliva testing frequencies.
Simulation modeling study.
We constructed a sensitivity function based on the average infectiousness profile of symptomatic coronavirus disease 2019 (COVID-19) cases to determine the probability of being identified at the time of testing. This function was fitted to reported data on the percent positivity of symptomatic COVID-19 patients using NP testing. We then simulated a routine testing program with different NP and saliva testing frequencies to determine case detection percentages during the infectious period, as well as the presymptomatic stage.
Routine biweekly NP testing, once every 2 weeks, identified an average of 90.7% (SD, 0.18) of cases during the infectious period and 19.7% (SD, 0.98) during the presymptomatic stage. With a weekly NP testing frequency, the corresponding case detection percentages were 95.9% (SD, 0.18) and 32.9% (SD, 1.23), respectively. A 5-day saliva testing schedule had a similar case detection percentage as weekly NP testing during the infectious period, but identified ~10% more cases (mean, 42.5%; SD, 1.10) during the presymptomatic stage.
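The effect of testing frequency on case detection described above can be approximated with a short Monte Carlo sketch. Everything below is an illustrative assumption: the study fitted its sensitivity function to percent-positivity data, which is not reproduced here, so a toy triangular sensitivity profile, a 10-day infectious period, and the `detection_probability` helper are all placeholders rather than the authors' model.

```python
import random

def detection_probability(sens, period_days, infectious_days=10.0,
                          trials=20000, seed=0):
    """Monte Carlo estimate of the fraction of cases caught by routine
    testing repeated every `period_days`, given a test sensitivity
    profile sens(t) over days since infection (hypothetical)."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(trials):
        # each case enters the testing schedule at a random phase
        t = rng.uniform(0.0, period_days)
        while t < infectious_days:
            if rng.random() < sens(t):
                detected += 1
                break
            t += period_days
    return detected / trials

def toy_sens(t, peak_day=5.0, peak_sens=0.9, width=5.0):
    # toy triangular sensitivity peaking mid-infection (assumed shape)
    return max(0.0, peak_sens * (1.0 - abs(t - peak_day) / width))

weekly = detection_probability(toy_sens, period_days=7.0)
every_5_days = detection_probability(toy_sens, period_days=5.0)
```

Under this toy profile, shortening the testing interval from 7 to 5 days raises the detected fraction, mirroring the direction (though not the magnitudes) of the reported results.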
Our findings highlight the utility of routine noninvasive saliva testing for frontline healthcare workers to protect vulnerable patient populations. A 5-day saliva testing schedule should be considered to help identify silent infections and prevent outbreaks in nursing homes and healthcare facilities.
To investigate the influences of dietary riboflavin (RF) addition on nutrient digestion and rumen fermentation, eight rumen-cannulated Holstein bulls were randomly allocated to four treatments in a repeated 4 × 4 Latin square design. The daily RF addition level for each bull in the control, low-RF, medium-RF and high-RF treatments was 0, 300, 600 and 900 mg, respectively. With increasing RF addition, DM intake was not affected, average daily gain tended to increase linearly, and feed conversion ratio decreased linearly. Total-tract digestibilities of DM, organic matter, crude protein (CP) and neutral-detergent fibre (NDF) increased linearly. Rumen pH decreased quadratically, and total volatile fatty acids (VFA) increased quadratically. Acetate molar percentage and the acetate:propionate ratio increased linearly, but propionate molar percentage and ammonia-N content decreased linearly. Rumen effective degradability of DM increased linearly and that of NDF increased quadratically, but that of CP was unaltered. Activity of cellulase and populations of total bacteria, protozoa, fungi, dominant cellulolytic bacteria, Prevotella ruminicola and Ruminobacter amylophilus increased linearly. A linear increase was observed in urinary total purine derivatives excretion. The data suggested that dietary RF addition was essential for rumen microbial growth, and that no further increase in performance and rumen total VFA concentration was observed when increasing the RF level from 600 to 900 mg/d in dairy bulls.
The aim of the present study was to identify reports of the prevalence of tinnitus in China and to present these findings in a review format.
This study assessed and collated published prevalence estimates of tinnitus and tinnitus severity, creating a narrative synthesis of the data from publications identified from a combination of Chinese and English language databases.
A total of 23 studies were included. Tinnitus prevalence ranged from 4.3 per cent to 51.33 per cent, but varied with age and gender. The largest decade-to-decade increase in prevalence occurred between the fifth and sixth decades of life, and the highest prevalence was in the seventh decade, at 32.47 per cent. There is also evidence that tinnitus prevalence is related to certain risk factors, including comorbid disorders.
The prevalence of tinnitus in mainland China in this study is consistent with global data. With increasing awareness of the prevalence of tinnitus in China, the development of epidemiological standards is a priority.
The coronavirus disease 2019 (COVID-19) pandemic represents an unprecedented threat to mental health. Herein, we assessed the impact of COVID-19 on subthreshold depressive symptoms and identified potential mitigating factors.
Participants were from Depression Cohort in China (ChiCTR registry number 1900022145). Adults (n = 1722) with subthreshold depressive symptoms were enrolled between March and October 2019 in a 6-month, community-based interventional study that aimed to prevent clinical depression using psychoeducation. A total of 1506 participants completed the study in Shenzhen, China: 726 participants, who completed the study between March 2019 and January 2020 (i.e. before COVID-19), comprised the ‘wave 1’ group; 780 participants, who were enrolled before COVID-19 and completed the 6-month endpoint assessment during COVID-19, comprised ‘wave 2’. Symptoms of depression, anxiety and insomnia were assessed at baseline and endpoint (i.e. 6-month follow-up) using the Patient Health Questionnaire-9 (PHQ-9), Generalised Anxiety Disorder-7 (GAD-7) and Insomnia Severity Index (ISI), respectively. Measures of resilience and regular exercise were assessed at baseline. We compared the mental health outcomes between wave 1 and wave 2 groups. We additionally investigated how mental health outcomes changed across disparate stages of the COVID-19 pandemic in China, i.e. peak (7–13 February), post-peak (14–27 February), remission plateau (28 February−present).
COVID-19 increased the risk for three mental health outcomes: (1) depression (odds ratio [OR] = 1.30, 95% confidence interval [CI]: 1.04–1.62); (2) anxiety (OR = 1.47, 95% CI: 1.16–1.88) and (3) insomnia (OR = 1.37, 95% CI: 1.07–1.77). The highest proportions of probable depression and anxiety were observed post-peak, at 52.9% and 41.4%, respectively. Greater baseline resilience scores had a protective effect on the three main outcomes (depression: OR = 0.26, 95% CI: 0.19–0.37; anxiety: OR = 0.22, 95% CI: 0.14–0.33 and insomnia: OR = 0.18, 95% CI: 0.11–0.28). Furthermore, regular physical activity mitigated the risk for depression (OR = 0.79, 95% CI: 0.79–0.99).
The COVID-19 pandemic exerted a highly significant and negative impact on symptoms of depression, anxiety and insomnia. Mental health outcomes fluctuated as a function of the duration of the pandemic and were alleviated to some extent with the observed decline in community-based transmission. Augmenting resilience and maintaining regular exercise may provide an opportunity to mitigate the risk for mental health symptoms during this severe public health crisis.
Winter half-year precipitation dominates variations in hydroclimatic conditions in North Xinjiang, but few researchers have focused on this very important aspect of the Holocene climate. Here we report multiproxy evidence of Holocene hydroclimate changes from the sediments of Wulungu Lake in North Xinjiang. The site is a closed terminal lake fed mainly by meltwater from snow and ice, and today the area is climatically dominated by the westerlies. Grain-size end-member analysis implies an important mode of variation that indicates a gradually increasing moisture trend, with superimposed centennial-scale variations, since 8000 cal yr BP. From 8000 to 5350 cal yr BP, a permanent lake developed, and the lake level gradually rose. Between 5350 and 500 cal yr BP, the moisture status increased rapidly, with the wettest climate occurring between 3200 and 500 cal yr BP. After 500 cal yr BP, the lake level fell. The trend of increasing Holocene wetness indicates a rising winter precipitation in North Xinjiang during the Holocene. This was due to an increase in upwind vapor concentrations caused by increased evaporation and strength of the westerlies, which were determined by the increasing boreal winter insolation and its latitudinal gradient.
An acute gastroenteritis (AGE) outbreak caused by norovirus at a hospital in Shanghai, China, was studied for molecular epidemiology, host susceptibility and the role of serological responses. Rectal and environmental swabs, paired serum samples and saliva specimens were collected. Pathogens were detected by real-time polymerase chain reaction and DNA sequencing. Histo-blood group antigen (HBGA) phenotypes of saliva samples and their binding to norovirus protruding proteins were determined by enzyme-linked immunosorbent assay. The HBGA-binding interfaces and the surrounding region were analysed with the MegAlign program of DNAstar 7.1. Twenty-seven individuals in two care units developed AGE, at attack rates of 9.02% and 11.68%. Eighteen (78.2%) symptomatic and five (38.4%) asymptomatic individuals were positive for GII.6/b norovirus. Saliva-based HBGA phenotyping showed that all symptomatic and asymptomatic cases were A, B, AB or O secretors. Only four (16.7%) of the 24 tested serum samples showed low blockade activity against HBGA–norovirus binding at the acute phase, whereas 11 (45.8%) samples at the convalescent stage showed seroconversion of such blockade. Specific blockade antibody in the population played an essential role in this norovirus epidemic. The wide HBGA-binding spectrum of GII.6 supports the need for continued health attention and surveillance in different settings.
The neuropeptide oxytocin is proposed as a promising therapy for social dysfunction by modulating amygdala-mediated social-emotional behavior. Although clinical trials report some benefits of chronic treatment, it is unclear whether efficacy may be influenced by dose frequency or genotype.
In a randomized, double-blind, placebo-controlled pharmaco-functional magnetic resonance imaging trial (150 male subjects), we investigated acute and different chronic (every day or on alternate days for 5 days) intranasal oxytocin (24 international units) effects and oxytocin receptor genotype-mediated treatment sensitivity on amygdala responses to face emotions. We also investigated similar effects on resting-state functional connectivity between the amygdala and prefrontal cortex.
A single dose of oxytocin reduced amygdala responses to all face emotions. For threatening (fear and anger) and happy faces, this effect was abolished after daily doses for 5 days but maintained by doses given every other day. The latter dose regime also enhanced the associated attenuation of anxious arousal for fear faces. Oxytocin effects on reducing amygdala responses to face emotions occurred only in AA homozygotes of rs53576 and A carriers of rs2254298. The effects of oxytocin on resting-state functional connectivity were not influenced by either dose frequency or receptor genotype.
Infrequent chronic oxytocin administration may be therapeutically most efficient and its anxiolytic neural and behavioral actions are highly genotype-dependent in males.
No studies have directly compared the effectiveness and safety of direct oral anticoagulants (DOACs) and warfarin in patients with atrial fibrillation (AF), with or without a history of ischemic stroke and transient ischemic attack (TIA). This is important for two reasons: first, previous research reports important differences between DOACs and warfarin across other patient subgroups, and second, patients with previous stroke or TIA have a high risk of recurrent stroke.
Using 2012–2014 Medicare claims data, we identified patients newly diagnosed with AF in 2013–2014 who started taking apixaban, dabigatran, rivaroxaban, or warfarin. We categorized the patients according to whether they had a history of stroke or TIA. We constructed Cox proportional hazard models that included indicator variables for treatment groups, a history of stroke or TIA, and the interaction between them, and controlled for demographic and clinical characteristics.
The hazard ratio (HR) for stroke with dabigatran, compared with warfarin, was 0.64 (95% confidence interval [CI]: 0.48–0.85) for patients with a history of stroke or TIA and 0.94 (95% CI: 0.75–1.16) for patients without a history of stroke or TIA (p-value for interaction = 0.034). In patients with previous stroke or TIA, the risk of stroke was lower with dabigatran (HR 0.64, 95% CI: 0.48–0.85) and rivaroxaban (HR 0.70, 95%CI: 0.56–0.87), compared with apixaban, but there was no difference for patients in the other subgroup.
DOACs were generally more effective than warfarin for preventing stroke. The superiority of dabigatran was more pronounced in patients with a history of stroke or TIA. The comparative effectiveness of DOACs differed substantially between patients with and without a history of stroke or TIA; specifically, apixaban was less effective in patients with a history of stroke or TIA. Our results reinforce the need to tailor anticoagulation to patient characteristics and to support the investigation of the underlying mechanisms associated with DOACs.
This chapter comprises the following sections: names, taxonomy, subspecies and distribution, descriptive notes, habitat, movements and home range, activity patterns, feeding ecology, reproduction and growth, behavior, parasites and diseases, status in the wild, and status in captivity.
Achieving sub-picometer precision measurements of atomic column positions in high-resolution scanning transmission electron microscope images using nonrigid registration (NRR) and averaging of image series requires careful optimization of experimental conditions and the parameters of the registration algorithm. On experimental data from SrTiO3, sub-pm precision requires alignment of the sample to the zone axis to within 1 mrad tilt and sample drift of less than 1 nm/min. At fixed total electron dose for the series, precision in the fast scan direction improves with shorter pixel dwell time to the limit of our microscope hardware, but the best precision along the slow scan direction occurs at 6 μs/px dwell time. Within the NRR algorithm, the “smoothness factor” that penalizes large estimated shifts is the most important parameter for sub-pm precision, but in general, the precision of NRR images is robust over a wide range of parameters.
There is a set of n bandits and at every stage, two of the bandits are chosen to play a game, with the result of a game being learned. In the “weak regret problem,” we suppose there is a “best” bandit that wins each game it plays with probability at least p > 1/2, with the value of p being unknown. The objective is to choose bandits to maximize the number of times that one of the competitors is the best bandit. In the “strong regret problem”, we suppose that bandit i has unknown value vi, i = 1, …, n, and that i beats j with probability vi/(vi + vj). One version of strong regret is interested in maximizing the number of times that the contest is between the players with the two largest values. Another version supposes that at any stage, rather than choosing two arms to play a game, the decision maker can declare that a particular arm is the best, with the objective of maximizing the number of stages in which the arm with the largest value is declared to be the best. In the weak regret problem, we propose a policy and obtain an analytic bound on the expected number of stages over an infinite time frame that the best arm is not one of the competitors when this policy is employed. In the strong regret problem, we propose a Thompson sampling type algorithm and empirically compare its performance with others in the literature.
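The strong-regret setting described above lends itself to a compact simulation. The sketch below is an illustrative Thompson-style heuristic, not the authors' algorithm: each arm keeps a Beta posterior over its overall win rate, the two arms with the highest posterior samples are matched each round, and game outcomes follow the stated model in which arm i beats arm j with probability vi/(vi + vj). The function `simulate` and its parameters are hypothetical names introduced here for illustration.

```python
import random

def simulate(values, n_rounds, seed=0):
    """Fraction of rounds in which the two arms with the largest true
    values play each other, under a Beta-posterior Thompson heuristic."""
    rng = random.Random(seed)
    n = len(values)
    wins = [0] * n
    losses = [0] * n
    top_two = set(sorted(range(n), key=lambda k: values[k])[-2:])
    best_pair_count = 0
    for _ in range(n_rounds):
        # sample a plausible win rate for each arm from its Beta posterior
        samples = [rng.betavariate(wins[k] + 1, losses[k] + 1)
                   for k in range(n)]
        order = sorted(range(n), key=lambda k: samples[k])
        i, j = order[-1], order[-2]  # match the two highest samples
        if {i, j} == top_two:
            best_pair_count += 1
        # play the game: i beats j with probability v_i / (v_i + v_j)
        if rng.random() < values[i] / (values[i] + values[j]):
            wins[i] += 1; losses[j] += 1
        else:
            wins[j] += 1; losses[i] += 1
    return best_pair_count / n_rounds
```

With well-separated values such as [1, 2, 4, 8], the posteriors of the weaker arms concentrate below those of the two strongest arms, so the best pair is matched in a growing share of rounds; this is only a sketch of the idea, not a bound of the kind derived in the paper.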
Nutrition during the periconceptional period influences postnatal cardiovascular health. We determined whether in vitro embryo culture and transfer, which are manipulations of the nutritional environment during the periconceptional period, dysregulate postnatal blood pressure and blood pressure regulatory mechanisms. Embryos were either transferred to an intermediate recipient ewe (ET) or cultured in vitro in the absence (IVC) or presence of human serum (IVCHS) and a methyl donor (IVCHS+M) for 6 days. Basal blood pressure was recorded at 19–20 weeks after birth. Mean arterial pressure (MAP) and heart rate (HR) were measured before and after varying doses of phenylephrine (PE). mRNA expression of signaling molecules involved in blood pressure regulation was measured in the renal artery. Basal MAP did not differ between groups. Baroreflex sensitivity, set point, and upper plateau were also maintained in all groups after PE stimulation. Adrenergic receptors alpha-1A (αAR1A), alpha-1B (αAR1B), and angiotensin II receptor type 1 (AT1R) mRNA expression were not different from controls in the renal artery. These results suggest there is no programmed effect of ET or IVC on basal blood pressure or the baroreflex control mechanisms in adolescence, but future studies are required to determine the impact of ET and IVC on these mechanisms later in the life course when developmental programming effects may be unmasked by age.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
In order to maximize the utility of future studies of trilobite ontogeny, we propose a set of standard practices that relate to the collection, nomenclature, description, depiction, and interpretation of ontogenetic series inferred from articulated specimens belonging to individual species. In some cases, these suggestions may also apply to ontogenetic studies of other fossilized taxa.
Precise instrumental calibration is of crucial importance to 21-cm cosmology experiments. The Murchison Widefield Array’s (MWA) Phase II compact configuration offers us opportunities for both redundant calibration and sky-based calibration algorithms; using the two in tandem is a potential approach to mitigate calibration errors caused by inaccurate sky models. The MWA Epoch of Reionization (EoR) experiment targets three patches of the sky (dubbed EoR0, EoR1, and EoR2) with deep observations. Previous work in Li et al. (2018) and (2019) studied the effect of tandem calibration on the EoR0 field and found that it yielded no significant improvement in the power spectrum (PS) over sky-based calibration alone. In this work, we apply similar techniques to the EoR1 field and find a distinct result: the improvements in the PS from tandem calibration are significant. To understand this result, we analyse both the calibration solutions themselves and the effects on the PS over three nights of EoR1 observations. We conclude that the presence of the bright radio galaxy Fornax A in EoR1 degrades the performance of sky-based calibration, which in turn enables redundant calibration to have a larger impact. These results suggest that redundant calibration can indeed mitigate some level of model incompleteness error.
To evaluate the impacts of guanidinoacetic acid (GAA) and coated folic acid (CFA) on growth performance, nutrient digestion and hepatic gene expression, fifty-two Angus bulls were assigned to four groups in a 2 × 2 factorial design. CFA was supplemented at 0 or 6 mg/kg dietary DM folic acid in diets with GAA at 0 (GAA−) or 0·6 g/kg DM (GAA+), respectively. Average daily gain (ADG), feed efficiency and hepatic creatine concentration increased with GAA or CFA addition, and the magnitude of the increase in these parameters was greater when CFA was added to GAA− diets than to GAA+ diets. Blood creatine concentration increased with GAA or CFA addition, and a greater increase was observed when CFA was supplemented in GAA+ diets than in GAA− diets. DM intake was unchanged, but rumen total SCFA concentration and digestibilities of DM, crude protein, neutral-detergent fibre and acid-detergent fibre increased with the addition of GAA or CFA. The acetate:propionate ratio was unaffected by GAA but increased with CFA addition. Blood concentrations of albumin, total protein and insulin-like growth factor-1 (IGF-1) increased with GAA or CFA addition. Blood folate concentration was decreased by GAA but increased with CFA addition. Hepatic expression of IGF-1, phosphoinositide 3-kinase, protein kinase B, mammalian target of rapamycin and ribosomal protein S6 kinase increased with GAA or CFA addition. The results indicated that combined supplementation of GAA and CFA did not increase ADG further compared with GAA or CFA addition alone.
The Emeishan large igneous province (ELIP) in SW China is considered to be a typical mantle-plume-derived LIP. The picrites of the ELIP formed at relatively high temperatures, providing one of the important lines of evidence for the role of a mantle plume. Here we report trace-element data on olivine phenocrysts in the Dali picrites from the ELIP. The olivines are Ni-rich and characterized by high (>1.4) 100×Mn/Fe values and low (<13) 10 000×Zn/Fe values, indicating a peridotite-dominated source. Because the olivine–melt Ni partition coefficient (KD(Ni)ol/melt) decreases at high temperatures and pressures, picrites derived from peridotite melting at high pressure that crystallized olivines at lower pressure can generate high concentrations of Ni in olivine phenocrysts, excluding the necessity of a metasomatic pyroxenite contribution. Based on the Al-in-olivine thermometer, the olivine crystallization temperature and mantle potential temperature (TP) were calculated at c. 1491°C and c. 1559°C, respectively. Our results are c. 200°C higher than those of the normal asthenospheric mantle and are consistent with the role of a mantle thermal plume for the ELIP.
A disruption database characterizing the current quench of disruptions with the ITER-like tungsten divertor has been developed on EAST. It provides a large number of plasma parameters describing the pre-disruptive plasma, current quench time, eddy current, and mitigation by massive impurity injection, and shows that the current quench time strongly depends on the magnetic energy and the post-disruption electron temperature. Further, the energy balance and magnetic energy dissipation during the current quench phase have been analysed. Magnetic energy is demonstrated to be dissipated mainly by ohmic reheating and inductive coupling, and both channels have strong effects on the current quench time. In addition, massive gas injection is an efficient method to speed up the current quench and increase the fraction of impurity radiation.