The recently advancing Taku Glacier is excavating subglacial sediments at high rates over multi-decadal timescales. However, sediment redistribution over shorter timescales remains unquantified. We use a variety of methods to study subglacial and proglacial sediment redistribution on decadal, seasonal, and daily timescales to gain insight into sub- and proglacial landscape formation. Both excavation and deposition were observed from 2003 to 2015 (−2.8 ± 0.9 m a⁻¹ to +2.9 ± 0.9 m a⁻¹). The observed patterns imply that a subglacial conduit has occupied the same site over the past decade. Outwash fans on the subaerial end moraine experience fluvial sediment reworking almost year-round, with net sediment gain in winter, net sediment loss in summer, and an overall mass gain between 2005 and 2015. We estimate that tens of meters of sediment still underlie the glacier terminus, sediment that can be remobilized during future activity. However, imminent retreat from the proglacial moraine will limit its sediment supply, leaving the moraine vulnerable to erosion by bordering rivers. Retreat into an over-deepened basin will leave the glacier vulnerable to increased frontal ablation and accelerating retreat.
Life events (LEs) are a risk factor for first onset and relapse of psychotic disorders. However, the impact of LEs on specific symptoms – namely reality distortion, disorganization, negative symptoms, depression, and mania – remains unclear. Moreover, the differential effects of negative v. positive LEs are poorly understood.
The present study utilizes an epidemiologic cohort of patients (N = 428) ascertained at first-admission for psychosis and followed for a decade thereafter. Symptoms were assessed at 6-, 24-, 48-, and 120-month follow-ups.
We examined symptom change within-person and found that negative events in the previous 6 months predicted an increase in reality distortion (β = 0.07), disorganized (β = 0.07), manic (β = 0.08), and depressive symptoms (β = 0.06), and a decrease in negative symptoms (β = −0.08). Conversely, positive LEs predicted fewer reality distortion (β = −0.04), disorganized (β = −0.04), and negative (β = −0.13) symptoms, and were unrelated to mood symptoms. A between-person approach to the same hypotheses confirmed that negative LEs predicted change in all symptoms, while positive LEs predicted change only in negative symptoms. In contrast, symptoms rarely predicted future LEs.
These findings confirm that LEs have an effect on symptoms, and thus contribute to the burden of psychotic disorders. That negative LEs increase positive symptoms and decrease negative symptoms suggests at least two different mechanisms underlying the relationship between LEs and symptoms. Our findings underscore the need for increased symptom monitoring following negative LEs, as symptoms may worsen during that time.
Each summer, surface melting of the margin of the Greenland Ice Sheet exposes a distinctive visible stratigraphy that is related to past variability in subaerial dust deposition across the accumulation zone and subsequent ice flow toward the margin. Here we map this surface stratigraphy along the northern margin of the ice sheet using mosaicked Sentinel-2 multispectral satellite imagery from the end of the 2019 melt season and finer-resolution WorldView-2/3 imagery for smaller regions of interest. We trace three distinct transitions in apparent dust concentration and the top of a darker basal layer. The three dust transitions have been identified previously as representing late-Pleistocene climatic transitions, allowing us to develop a coarse margin chronostratigraphy for northern Greenland. Substantial folding of late-Pleistocene stratigraphy is observed but uncommon. The oldest conformal surface-exposed ice in northern Greenland is likely located adjacent to Warming Land and may be up to ~55 thousand years old. Basal ice is commonly exposed hundreds of metres from the ice margin and may indicate a widespread frozen basal thermal state. We conclude that the ice margin across northern Greenland offers multiple opportunities to recover paleoclimatically distinct ice relative to previously studied regions in southwestern Greenland.
Decades of commitment to the basic principles of the Danish welfare state have been discarded with a new social policy reducing the benefits for people already at the bottom of the income ladder. The political intention is to increase job search via economic incentives that increase the gap between benefit income and market income. Using a panel dataset with benefit recipients, we show that the intended job search effect did not materialise to any significant extent; rather, the affected people became poorer because the vast majority of individuals could not respond to the economic incentives in the intended manner. Joblessness was not due to lack of incentives. This study confirms the importance of employability and self-efficacy, but it shows that health is an underlying variable that explains both of these factors and the recipients’ difficulties in getting a job. The results have two major social policy implications. Access to early retirement schemes should be easier for recipients who have serious health problems and therefore cannot respond to economic incentives, and there should be an increased focus on how to help the recipients without major health problems to develop self-efficacy.
Two common approaches to identifying subgroups of patients with bipolar disorder are clustering methodology (mixture analysis) based on the age of onset, and birth cohort analysis. This study investigates whether a birth cohort effect influences the results of clustering on the age of onset, using a large, international database.
The database includes 4037 patients with a diagnosis of bipolar I disorder, previously collected at 36 collection sites in 23 countries. Generalized estimating equations (GEE) were used to adjust the data for country median age, and in some models, birth cohort. Model-based clustering (mixture analysis) was then performed on the age of onset data using the residuals. Clinical variables in subgroups were compared.
There was a strong birth cohort effect. Without adjusting for birth cohort, clustering found three subgroups. After adjusting for birth cohort, or when considering only those born after 1959, two subgroups were found. In both the two- and three-subgroup solutions, the youngest subgroup was more likely to have a family history of mood disorders and a first episode with depressed polarity. However, without adjusting for birth cohort (three subgroups), family history and polarity of the first episode could not be distinguished between the middle and oldest subgroups.
These results using international data confirm prior findings using single country data, that there are subgroups of bipolar I disorder based on the age of onset, and that there is a birth cohort effect. Including the birth cohort adjustment altered the number and characteristics of subgroups detected when clustering by age of onset. Further investigation is needed to determine if combining both approaches will identify subgroups that are more useful for research.
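The two-step approach described above — adjusting age of onset for cohort effects, then applying model-based clustering (mixture analysis) to the residuals — can be sketched as follows. This is an illustrative sketch on simulated data: the study used GEE adjustment, which is approximated here with a simple linear detrending, and all data and parameter values are invented.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
n = 1000
birth_year = rng.integers(1930, 1990, size=n)
# Simulate two onset subgroups plus a linear cohort effect
# (later cohorts -> earlier reported onset); values are invented.
component = rng.choice([0, 1], size=n, p=[0.6, 0.4])
onset = np.where(component == 0,
                 rng.normal(18, 3, n),
                 rng.normal(30, 6, n)) - 0.05 * (birth_year - 1960)

# Step 1: regress onset on birth year and keep the residuals
X = birth_year.reshape(-1, 1)
resid = onset - LinearRegression().fit(X, onset).predict(X)

# Step 2: fit Gaussian mixtures with 1-4 components; pick the best by BIC
R = resid.reshape(-1, 1)
models = {k: GaussianMixture(n_components=k, random_state=0).fit(R)
          for k in range(1, 5)}
best_k = min(models, key=lambda k: models[k].bic(R))
print("best number of subgroups by BIC:", best_k)
```

With the cohort trend removed first, the mixture model recovers the subgroup structure rather than an artefact of the cohort effect, which mirrors why the adjusted and unadjusted analyses above found different numbers of subgroups.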
Suicide is a leading cause of death worldwide and is largely preventable. The social media site Twitter is used by individuals to express suicidal intentions. It is not yet feasible to contact each Twitter user to confirm risk. Instead, it may be possible to validate risk by linguistic analysis. Psychological linguistic theory suggests that language is a reliable way of measuring people's internal thoughts and emotions; however, the linguistics of suicidality on Twitter is yet to be fully explored.
Objectives & aim
The aim of this study is to characterise the linguistic styles of suicide-related posts on Twitter for the purposes of predicting suicide risk.
The Linguistic Inquiry and Word Count (LIWC) program was used to compare the linguistic features of suicide-related tweets previously coded for suicide risk by humans with a set of matched controls. Logistic regression was then used for predictive modelling.
The suicide-related tweets had significantly different linguistic profiles from the control tweets. The “strongly concerning” suicide tweets contained fewer words than all other tweets, and, unsurprisingly, references to ‘death’ were significantly more frequent in this group. Several other linguistic differences were also observed. The final model, which distinguished “strongly concerning” suicide risk from the controls, had 97.7% sensitivity and 99.8% specificity.
This study confirms that the linguistic features of suicide-related Twitter posts are different from general Twitter posts and that these linguistic profiles may be used to predict suicide risk in Twitter users.
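The modelling pipeline described above — linguistic features fed into a logistic regression, then evaluated by sensitivity and specificity — can be sketched roughly as follows. The two features and all data here are simulated stand-ins, not LIWC output or actual tweets.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 400
# 1 = "strongly concerning" post, 0 = matched control (simulated labels)
concerning = rng.integers(0, 2, size=n)
# Hypothetical linguistic features: concerning posts are shorter and
# use death-related words at a higher rate (values are invented)
word_count = np.where(concerning, rng.normal(8, 2, n), rng.normal(15, 4, n))
death_rate = np.where(concerning, rng.normal(0.12, 0.03, n),
                      rng.normal(0.02, 0.01, n))
X = np.column_stack([word_count, death_rate])

clf = LogisticRegression().fit(X, concerning)
pred = clf.predict(X)

tp = np.sum((pred == 1) & (concerning == 1))
tn = np.sum((pred == 0) & (concerning == 0))
sensitivity = tp / np.sum(concerning == 1)
specificity = tn / np.sum(concerning == 0)
print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f}")
```

In practice the feature matrix would hold the full set of LIWC category scores, and performance would be assessed on held-out data rather than the training set used in this toy example.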
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Among the most disabling and fatal psychiatric illnesses, eating disorders (EDs) often manifest early in life, which encourages investigations into in utero and perinatal environmental risk factors. The objective of this study was to determine whether complications during pregnancy and birth and perinatal conditions are associated with later eating disorder risk in offspring and whether these associations are unique to EDs.
All individuals born in Denmark to Danish-born parents 1989–2010 were included in the study and followed from their 6th birthday until the end of 2016. Exposure to factors related to pregnancy, birth, and perinatal conditions was determined using national registers, as were hospital-based diagnoses of anorexia nervosa (AN), bulimia nervosa, and eating disorder not otherwise specified during follow-up. For comparison, diagnoses of depressive, anxiety, and obsessive-compulsive disorders were also included. Cox regression was used to compare hazards of psychiatric disorders in exposed and unexposed individuals.
1 167 043 individuals were included in the analysis. We found that, similar to the comparison disorders, prematurity was associated with increased eating disorder risk. Conversely, the risk of EDs, especially AN, increased with parental age, a pattern that differed from the more U-shaped patterns observed for depressive and anxiety disorders.
Our results suggest that pregnancy and early life are vulnerable developmental periods when exposures may influence offspring mental health, including eating disorder risk, later in life. The results suggest that some events pose more global transdiagnostic risk whereas other patterns, such as increasing parental ages, appear more specific to EDs.
We develop the concept of character level for the complex irreducible characters of finite, general or special, linear and unitary groups. We give characterizations of the level of a character in terms of its Lusztig label and in terms of its degree. Then we prove explicit upper bounds for character values at elements with not-too-large centralizers and derive upper bounds on the covering number and mixing time of random walks corresponding to these conjugacy classes. We also characterize the level of the character in terms of certain dual pairs and prove explicit exponential character bounds for the character values, provided that the level is not too large. Several further applications are also provided. Related results for other finite classical groups are obtained in the sequel [Guralnick et al. ‘Character levels and character bounds for finite classical groups’, Preprint, 2019, arXiv:1904.08070] by different methods.
Precision medicine is changing the way people are diagnosed and treated into a more personalized approach. In medical research, several statistical methods have been proposed for estimating personalized treatment effects. However, in nutritional science these methods have hardly been used. By re-evaluation of pre-treatment biomarker data, we demonstrate how two diets cause differential weight loss depending on pre-treatment fasting plasma glucose (FPG) and fasting insulin (FI) levels.
Materials and Methods
Overweight people with increased waist circumference were randomly assigned to receive an ad libitum New Nordic Diet (NND) high in dietary fiber and whole grain or an Average Danish (Western) Diet (ADD) for 26 weeks. All foods were provided free of charge. Body weight was measured throughout the study, and blood was drawn before randomization, from which FPG and FI were analyzed. Weight was described by linear mixed models including biomarker-diet group interactions, covariate adjustment, and participant-specific random effects. Personalized predictions of additional weight loss from NND compared to ADD given specific values of FPG or FI were estimated as contrasts of intercepts and slopes obtained from the biomarker-diet group interaction term.
Baseline FPG predicted a 3.00 (1.18;4.83, n = 181, P = 0.001) kg larger weight loss per mmol/L from choosing NND over ADD. For instance, a baseline FPG level of 4.7 mmol/L would lead to an average of 1.42 kg larger weight loss on NND vs. ADD (above 0.41 kg with 95% certainty), whereas the average effect size would be 8.33 kg (above 5.50 kg with 95% certainty) among subjects with FPG level of 7.0 mmol/L. Among individuals with FPG < 5.6 mmol/L, each pmol/L lower baseline FI predicted a 0.039 (95% CI 0.017;0.061, n = 143, P < 0.001) kg larger weight loss from choosing NND over ADD. For instance, a baseline FI level of 25 pmol/L would lead to an average larger weight loss of 4.10 kg on NND vs. ADD (> 2.51 kg with 99% certainty). Likewise, a baseline FI level of 75 pmol/L would result in an average effect size of 2.15 kg (> 1.11 kg with 99% certainty).
Use of pre-treatment FPG and FI led to truly individualized predictions of the treatment effect of introducing more fiber and whole grain in the diet on weight loss, ranging from almost no effect to losing more than 8 kg. These findings tentatively suggest that re-evaluating data from existing randomized controlled trials with suitable statistical methods may have great potential.
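The personalized predictions above are linear contrasts in baseline FPG: the extra weight loss on NND versus ADD changes by 3.00 kg per mmol/L. A minimal numeric check, back-solving the line from the reported effect at FPG = 4.7 mmol/L (the intercept is our inference, not a quantity stated in the abstract):

```python
slope = 3.00          # reported: kg extra weight loss per mmol/L baseline FPG
effect_at_4p7 = 1.42  # reported effect at FPG = 4.7 mmol/L

def predicted_effect(fpg):
    """Predicted NND-vs-ADD weight-loss difference (kg) at a given FPG."""
    return effect_at_4p7 + slope * (fpg - 4.7)

# At FPG = 7.0 mmol/L this gives ~8.3 kg, consistent (within rounding)
# with the reported 8.33 kg effect size
print(round(predicted_effect(7.0), 2))
```

That the two reported effect sizes sit on a single line with the reported slope is exactly what the contrast-of-intercepts-and-slopes construction implies.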
Both blood- and milk-based biomarkers have been analysed for decades in research settings, although often only in one herd, and without focus on the variation in the biomarkers that are specifically related to herd or diet. Biomarkers can be used to detect physiological imbalance and disease risk and may have a role in precision livestock farming (PLF). For use in PLF, it is important to quantify normal variation in specific biomarkers and the source of this variation. The objective of this study was to estimate the between- and within-herd variation in a number of blood metabolites (β-hydroxybutyrate (BHB), non-esterified fatty acids, glucose and serum IGF-1), milk metabolites (free glucose, glucose-6-phosphate, urea, isocitrate, BHB and uric acid), milk enzymes (lactate dehydrogenase and N-acetyl-β-D-glucosaminidase (NAGase)) and composite indicators for metabolic imbalances (Physiological Imbalance-index and energy balance), to help facilitate their adoption within PLF. Blood and milk were sampled from 234 Holstein dairy cows from 6 experimental herds, each in a different European country, and offered a total of 10 different diets. Blood was sampled on 2 occasions at approximately 14 days-in-milk (DIM) and 35 DIM. Milk samples were collected twice weekly (in total 2750 samples) from DIM 1 to 50. Multilevel random regression models were used to estimate the variance components and to calculate the intraclass correlations (ICCs). The ICCs for the milk metabolites, when adjusted for parity and DIM at sampling, demonstrated that between 12% (glucose-6-phosphate) and 46% (urea) of the variation in the metabolites’ levels could be associated with the herd-diet combination. Intraclass correlations related to the herd-diet combination were generally higher for blood metabolites, from 17% (cholesterol) to approximately 46% (BHB and urea). The high ICCs for urea suggest that this biomarker can be used for monitoring at herd level.
The low within-cow variance for NAGase indicates that few samples would be needed to describe a cow's status, and potentially a general reference value could be used. The low ICC for most of the biomarkers and the larger within-cow variation emphasise that multiple samples would be needed, most likely from individual cows, to make the biomarkers useful for monitoring. The majority of biomarkers were influenced by parity and DIM, which indicates that these should be accounted for if the biomarkers are to be used for monitoring.
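The intraclass correlations discussed above have a simple form: the share of total variance attributable to the grouping, here the herd-diet combination. A minimal sketch, with hypothetical variance components on a standardised scale (the variance estimates themselves would come from the multilevel random regression models):

```python
def icc(var_between, var_within):
    """Intraclass correlation: between-group share of total variance."""
    return var_between / (var_between + var_within)

# Hypothetical variance components chosen to echo the reported range:
print(round(icc(0.46, 0.54), 2))  # urea-like: high herd-level share
print(round(icc(0.12, 0.88), 2))  # glucose-6-phosphate-like: low share
```

A high ICC means herd-level monitoring can work (as suggested for urea); a low ICC means most variation is within cows, so repeated individual-cow sampling is needed, matching the conclusion above.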
The main purpose of this study was to find several early factors affecting stayability in rabbit females. To reach this goal, 203 females were used from their first artificial insemination to their sixth parturition. Throughout that period, 48 traits were recorded, considered to be performance, metabolic and immunological indicators. These traits were initially recorded in the females’ first reproductive cycle. Later, females removed due to death or culling and those that were not removed were identified. A first analysis explored whether it was possible to classify females as those reaching or not reaching the mean lifespan of a rabbit female (the fifth reproductive cycle) using information from the first reproductive cycle. The analysis results showed that 97% of the non-removed females were classified correctly, whereas only 60% of the removed females were classified as animals to be removed. The reason for this difference lies in the characteristics of the model, which was designed using early traits and was able to identify only the cases in which females would be removed due to performance, metabolic or immunologic imbalances in their early lives. Our results suggest that the model defines the necessary conditions, but not the sufficient ones, for females to remain alive in the herd. The aim of a second analysis was to find the main early differences between the non-removed and removed females. The live weight records taken in the first cycle indicated that the females removed in their first cycle were lighter, while those removed in their second cycle were heavier with longer stayability (−203 and +202 g on average, respectively; P < 0.05). Non-removed females showed higher glucose and lower beta-hydroxybutyrate concentrations in the first cycle than the removed females (+4.8 and −10.7%, respectively; P < 0.05). The average B lymphocyte counts in the first cycle were 22.7% higher in the non-removed females group (P < 0.05).
The females removed in the first reproductive cycle presented a higher granulocyte/lymphocyte ratio in this cycle than those that at least reached the second cycle (4.81 v. 1.66; P < 0.001). Consequently, females not removed by the sixth parturition showed adequate body development and energy levels, less immunological stress and a more mature immune function in the first reproductive cycle. The females that deviated from this pattern were at higher risk of being removed from the herd.
Economic pressures continue to mount on modern-day livestock farmers, forcing them to increase herd sizes in order to remain commercially viable. The natural consequence of this is to drive the farmer and the animal further apart. However, closer attention to the animal not only positively impacts animal welfare and health but can also increase the capacity of the farmer to achieve a more sustainable production. State-of-the-art precision livestock farming (PLF) technology is one such means of bringing the animals closer to the farmer in the face of expanding systems. Contrary to some current opinions, it can offer an alternative philosophy to ‘farming by numbers’. This review addresses the key technology-oriented approaches to monitoring animals and demonstrates how image and sound analyses can be used to build ‘digital representations’ of animals, giving an overview of some of the core concepts of PLF tool development and value discovery during PLF implementation. The key to developing such a representation is measuring important behaviours and events in the livestock buildings. The application of image and sound analysis enables more advanced applications and has enormous potential in the industry. Ultimately, what matters is the accuracy of the developed PLF applications in commercial farming systems, as this will make farmers embrace the technological development and ensure progress within the PLF field in favour of the livestock animals and their well-being.
Paediatric hearing loss rates in Ghana are currently unknown.
A cross-sectional study was conducted in peri-urban Kumasi, Ghana; children (aged 3–15 years) were recruited from randomly selected households. Selected children underwent otoscopic examination prior to in-community pure tone screening using the portable ShoeBox audiometer. The LittlEars auditory questionnaire was also administered to caregivers and parents.
Data were collected from 387 children. After conditioning, 362 children were screened using monaural pure tones presented at 25 dB. Twenty-five children could not be conditioned to behavioural audiometric screening. Eight children were referred based on audiometric screening results. Of those, four were identified as having hearing loss. Four children scored less than the maximum mark of 35 on the LittlEars questionnaire. Of those, three had hearing loss as identified through pure tone screening. The predominant physical finding on otoscopy was ear canal cerumen impaction.
Paediatric hearing loss is prevalent in Ghana and should be treated as a public health problem warranting further evaluation and epidemiological characterisation.
To establish the reliability of the application of National Healthcare Safety Network (NHSN) central-line–associated bloodstream infection (CLABSI) criteria within established reporting systems internationally.
Diagnostic-test accuracy systematic review.
We conducted a search of Medline, SCOPUS, the Cochrane Library, CINAHL (EbscoHost), and PubMed (NCBI). Cohort studies were eligible for inclusion if they compared publicly reported CLABSI rates and were conducted by independent and expertly trained reviewers using NHSN/Centers for Disease Control (or equivalent) criteria. Two independent reviewers screened, extracted data, and assessed risk of bias using the QUADAS 2 tool. Sensitivity, specificity, negative and positive predictive values were analyzed.
A systematic search identified 1,259 publications; 9 studies were eligible for inclusion (n = 7,160 central lines). Publicly reported CLABSI rates were more likely to be underestimated (7 studies) than overestimated (2 studies). Specificity ranged from 0.70 (95% confidence interval [CI], 0.58–0.81) to 0.99 (95% CI, 0.99–1.00) and sensitivity ranged from 0.42 (95% CI, 0.15–0.72) to 0.88 (95% CI, 0.77–0.95). Four studies, which included a consecutive series of patients (whole cohort), reported CLABSI incidence between 9.8% and 20.9%, and absolute CLABSI rates were underestimated by 3.3%–4.4%. The risk of bias was low to moderate in most included studies.
Our findings suggest consistent underestimation of true CLABSI incidence within publicly reported rates, weakening the validity and reliability of surveillance measures. Auditing, education, and adequate resource allocation are necessary to ensure that surveillance data are accurate and suitable for benchmarking and quality improvement measures over time.
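The accuracy measures analysed in the review (sensitivity, specificity, positive and negative predictive values) all derive from a 2×2 table of audited versus publicly reported CLABSI status. A small helper, with illustrative counts rather than data from any included study:

```python
def accuracy_measures(tp, fp, fn, tn):
    """Diagnostic accuracy from a 2x2 table: audited (truth) vs. reported."""
    return {
        "sensitivity": tp / (tp + fn),  # true CLABSIs that were reported
        "specificity": tn / (tn + fp),  # non-CLABSIs correctly not reported
        "ppv": tp / (tp + fp),          # reported CLABSIs that were real
        "npv": tn / (tn + fn),          # unreported lines truly CLABSI-free
    }

# Hypothetical audit of 1000 central lines
m = accuracy_measures(tp=42, fp=5, fn=18, tn=935)
print({k: round(v, 2) for k, v in m.items()})
```

A large `fn` count relative to `tp` is what drives the low sensitivities and the underestimated rates reported above: missed true cases depress the publicly reported incidence.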
Women suffering from first onset postpartum mental disorders (PPMD) have a highly elevated risk of suicide. The current study aimed to: (1) describe the risk of self-harm among women with PPMD and (2) investigate the extent to which self-harm is associated with later suicide.
We conducted a register-based cohort study linking national Danish registers. This identified women with any recorded first inpatient or outpatient contact to a psychiatric facility within 90 days after giving birth to their first child. The main outcome of interest was defined as the first hospital-registered episode of self-harm. Our cohort consisted of 1 202 292 women representing 24 053 543 person-years at risk.
Among 1554 women with severe first onset PPMD, 64 had a first-ever hospital record of self-harm. Women with PPMD had a hazard ratio (HR) for self-harm of 6.2 (95% CI 4.9–8.0), compared to mothers without mental disorders; but self-harm risk was lower in PPMD women compared to mothers with non-PPMD [HR: 10.1, (95% CI 9.6–10.5)] and childless women with mental disorders [HR: 9.3 (95% CI 8.9–9.7)]. Women with PPMD and records of self-harm had a significantly greater risk for later suicide compared with all other groups of women in the cohort.
Women with PPMD had a high risk of self-harm, although lower than risks observed in other psychiatric patients. However, PPMD women who had self-harmed constituted a vulnerable group at significantly increased risk of later suicide.
Understanding the association between diet quality and cardiometabolic risk by education level is important for preventing increased cardiometabolic risk in the Mexican population, especially considering pre-existing disparities in diet quality. The present study examined the cross-sectional association of overall diet quality with cardiometabolic risk, overall and by education level, among Mexican men and women.
Cardiometabolic risk was defined by using biomarkers and diet quality by the Mexican Diet Quality Index. We computed sex-specific multivariable logistic regression models.
Mexican men (n 634) and women (n 875) participating in the Mexican National Health and Nutrition Survey 2012.
We did not find associations of diet quality with cardiometabolic risk factors in the total sample or in men by education level. However, we observed that for each 10-unit increase in the dietary quality score, the odds of diabetes risk in women with no reading/writing skills was 0·47 (95 % CI 0·26, 0·85) relative to the odds in women with ≥10 years of school (referent). Similarly, for each 10-unit increase in the dietary quality score, the odds of having three v. no lipid biomarker levels beyond the risk threshold in lower-educated women was 0·27 (95 % CI 0·12, 0·63) relative to the odds in higher-educated women.
Diet quality has a stronger protective association with some cardiometabolic disease risk factors for lower- than higher-educated Mexican women, but no association with cardiometabolic disease risk factors among men. Future research will be needed to understand what diet factors could be influencing the cardiometabolic disease risk disparities in this population.
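The "per 10-unit increase" odds ratios quoted above follow from the usual rescaling of a logistic regression coefficient: odds ratios multiply across units. A minimal sketch, with a hypothetical per-unit coefficient chosen only to reproduce the reported 0·47:

```python
import math

# Hypothetical per-unit odds ratio, back-solved so that the 10-unit OR
# matches the 0.47 reported for diabetes risk in lower-educated women
or_per_unit = math.exp(math.log(0.47) / 10)

# ORs multiply across units: a 10-unit change compounds the per-unit OR
or_per_10 = or_per_unit ** 10
print(round(or_per_unit, 3), round(or_per_10, 2))
```

Equivalently, if the fitted log-odds coefficient per score unit is β, the per-10-unit odds ratio is exp(10·β), which is why modest per-unit effects can correspond to sizeable per-10-unit ratios.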
This paper reviews the effects of extended lactation (EXT) as a strategy in dairy cattle on milk production and persistency, reproduction, milk quality, lifetime performance of the cow and finally the economic effects at herd and farm levels, as well as the impact on greenhouse gas emissions at product level. Primiparous cows are able to produce equal or more milk per feeding day during EXT compared with a standard 305-d lactation, whereas results for multiparous cows are inconsistent. Cows managed for EXT can achieve a higher lifetime production while delivering milk with unchanged or improved quality properties. Delaying insemination enhances mounting behaviour and allows insemination after the cow’s energy balance has become positive. However, in most cases EXT has no effect or a non-significant positive effect on reproduction. The EXT strategy sets off a cascade of effects at herd and farm level. Thus, the EXT strategy leads to fewer calvings and thereby fewer expected diseases, fewer replacement heifers and fewer dry days per cow per year. The optimal lifetime scenario for milk production was modelled to be an EXT of 16 months for first-parity cows followed by an EXT of 10 months for later lactations. Modelling studies of herd dynamics indicate a positive effect of EXT on lifetime efficiency (milk per dry matter intake), mainly originating from benefits of EXT on daily milk yield in primiparous cows and the reduced number of replacement heifers. Consequently, EXT also leads to reduced total meat production at herd level. For the farmer, EXT can give the same economic return as a traditional lactation period. At farm level, EXT can contribute to a reduction in the environmental impact of dairy production, mainly as a consequence of the reduced production of beef.
A wider dissemination of the EXT concept will be supported by methods to predict which cows may be most suitable for EXT, and clarification of how milking frequency and feeding strategy through the lactation can be organised to support milk yield and an appropriate body condition at the next calving.
Tear staining (TS) in the pig has been related to different stressors and may be a useful tool for assessing animal welfare on farm. The aim of the current study was to investigate TS across the finisher period and its possible relation to age, growth, sex and experimentally induced stressors. The study included 80 finisher pens divided between three batches. Within each batch, the pens either included pigs with docked or undocked tails, had straw provided (150 g/pig/day) or not, and had a low (1.21 m²/pig, 11 pigs) or high stocking density (0.73 m²/pig, 18 pigs). Tear staining (scores 1 to 4; from smaller to larger tear stain area, respectively) and tail damage were scored on each individual pig three times per week over the 9-week study period, and the individual maximum TS score within each week was chosen for further analysis. Data were analysed using logistic regression separately for each of the four possible TS score levels. The TS scores 1 and 2 decreased with weeks into the study period and were negatively related to the average daily gain (ADG) of the pigs, whereas the TS score 4 increased with weeks into the study period and was positively related to ADG. None of the TS scores differed between females and castrated males, and neither straw provision nor lowering the stocking density affected the TS scores. However, the TS score 1 decreased in the last week before an event of tail damage (at least one pig in the pen with a bleeding tail wound), whereas the TS score 4 increased. The results of the current study advocate a relation between TS and factors such as age, growth and stress in the pig, while no relation was found between TS and the environmental factors of straw provision and lowered stocking density. The relations to age and growth are important to take into consideration if TS is used as a welfare assessment measure in the pig in the future.