The welfare and management of calves are of increasing interest and also influence the performance of these animals in later life. The aim of this study was to assess the management and environmental conditions under which pre-weaned dairy calves are reared on commercial Irish dairy farms. We included 47 spring-calving, pasture-based herds in this study. Herd- and animal-specific data, such as mortality rate, age and breed, were gathered from all participants via the HerdPlus® database. Information pertaining to management practices was collected by interviewing the principal calf rearer, while an assessment of calf housing facilities was conducted to identify the conditions in which calves were reared. The environmental assessment included measurements of space allowance per calf, as well as feeding equipment hygiene. Calf behaviour was assessed using video observations, while accounting for the number of calves present in a group and the space available per calf. Faecal samples were also collected to determine the presence of enteric pathogens among calves. To compare calf space allowance, group size and the presence of enteric pathogens early and late in the calving season, each farm was visited twice. Calf mortality was not associated with herd size, space allowance per calf or post-colostrum feeding practices. Higher calf mortality was identified among herds which reported an onset of calf pneumonia during weeks 8 to 10 of the calving season. This study demonstrates that factors associated with calf welfare on commercial Irish dairy farms (e.g. space allowance, mortality rate) are independent of herd size. Some management practices, however, such as the methods used for treating health issues, can affect the rate of calf mortality experienced. Calf mortality, for example, was lower in herds which treated diarrhoea cases by administering electrolytes while continuing to offer milk.
Behavioural observations indicate that smaller group sizes could promote expression of positive behaviours, potentially resulting from an overall improvement in welfare. Space allowance per calf was not associated with observed behaviour frequencies. We also identified that similar rates of calf mortality are experienced across herds of different sizes.
To investigate protein intake patterns over the day and their association with total protein intake in older adults.
Cross-sectional study utilising the dietary data collected through two non-consecutive, dietary record-assisted 24-h recalls. Days with low protein intake (n 290) were defined using the RDA (<0·8 g protein/kg adjusted BW/d). For each day, the amount and proportion of protein ingested at every hour of the day and during morning, mid-day and evening hours was calculated. Amounts and proportions were compared between low and high protein intake days and related to total protein intake and risk of low protein intake.
A total of 739 Dutch community-dwelling adults aged ≥70 years.
The mean protein intake was 76·3 (sd 0·7) g/d. At each hour of the day, the amount of protein ingested was higher on days with a high protein intake than on days with a low protein intake and associated with a higher total protein intake. The proportion of protein ingested during morning hours was higher (22 v. 17 %, P < 0·0001) on days with a low protein intake, and a higher proportion of protein ingested during morning hours was associated with a lower total protein intake (P < 0·0001) and a higher odds of low protein intake (OR 1·04, 95 % CI 1·03, 1·06). For the proportion of protein intake during mid-day or evening hours, opposite but weaker associations were found.
In this sample, timing of protein intake was associated with total protein intake. Additional studies need to clarify the importance of these findings to optimise protein intake.
Mismatch negativity (MMN) and its neuromagnetic analogue (MMNm) are event-related brain responses elicited by changes in a sequence of auditory events, and they serve as indexes of early cognitive processing. MMN consistently reveals pre-attentive neural information-processing deficits in schizophrenia. MMN can be assessed with different methods (electroencephalography, EEG; magnetoencephalography, MEG) and with different paradigms, using either the “traditional” oddball design (20% rare deviants) or the so-called “optimum” design (50% rare deviants, each varying in one of five parameters), but the latter had not yet been applied to schizophrenia.
Both designs were compared in 12 patients with schizophrenia and 12 healthy controls using MEG and EEG. Automated, observer-independent data analysis rendered the procedures suitable for clinical applications.
The optimum design was fastest at detecting MMN changes. MEG had the best signal-to-noise ratio. In addition, the MMN reduction in schizophrenia was most pronounced when measured with MEG in the optimum paradigm.
Optimized MMN paradigms, especially MMNm, improve sensitivity and speed for the detection of schizophrenia endophenotypes. Dysfunctions in this disorder may lie primarily in the fast and automatic encoding of stimulus features in the auditory cortex. Of note, these optimum MMN measures may not reflect one unitary mechanism that is equally affected in schizophrenia.
Frowning expresses negative emotions like anger, fear, and sadness. According to the facial feedback hypothesis, suppression of frowning will also diminish the corresponding negative emotions. Hence, mood improvement has been observed in patients who underwent treatment of glabellar frown lines with botulinum neurotoxin. This observation suggests the possibility that the intervention may be employed for the management of psychiatric disorders associated with negative emotions. Preliminary data from an open case series indicate that the intervention might improve the symptoms of depression.
Aims & objectives
To test, within a clinical trial, whether an onabotulinumtoxinA injection into the glabellar region is beneficial as an adjunctive treatment of major depression.
We used a randomized, double-blinded, placebo-controlled study design (n = 30; ClinicalTrials.gov, number, NCT00934687).
We show that a single onabotulinumtoxinA treatment quickly leads to a strong and sustained improvement in partly chronic major depression that had not responded sufficiently to previous treatment. For the primary end-point, the Hamilton Depression Rating Scale (HAM-D17) score six weeks after treatment compared to baseline, onabotulinumtoxinA recipients showed 37.9% (8.34 points) more improvement than placebo-treated participants (F = 12.30, p = 0.002, η2 = 0.31, d = 1.28).
Our findings support the concept that the facial musculature not only expresses, but also regulates, mood states. As it stands, treatment of glabellar frown lines with botulinum neurotoxin can be considered for depressed patients with the objective of inducing mood-lifting effects.
The quality of life in patients with depression may be a measure of the efficiency of its management. Although quality of life is a subjective concept, difficult to assess, it may be reflected by the degree of social adaptation and the individual's level of functioning.
The study evaluates the evolution over time of depressive symptoms and of some parameters reflecting quality of life in patients diagnosed with depression who are on antidepressant treatment.
To highlight the evolution over time of depressive symptoms and of patients’ perceptions of some aspects of quality of life.
Twenty-three patients were included who met the criteria for a depressive episode, single or within recurrent depressive disorder, according to the International Classification of Diseases (ICD-10-AM), and who required antidepressant treatment. Subjects were evaluated at baseline and after 12 weeks of treatment using the Hamilton Rating Scale for Depression (HAMD), the Sheehan Disability Scale (SDS) and the Social Adjustment Scale – Self-report (SASS).
A statistically significant decrease in mean HAMD scores was observed at the second administration. Statistically significant differences between the two administrations were registered for the 17 items of the SASS scale. Statistically significant correlations between HAMD scores and some of the SDS areas were observed.
Results showed a favorable course of depressive symptoms under treatment and changes over time in subjects’ perceptions of several aspects evaluated with the SASS for the group studied. Statistically significant correlations between HAMD scores and some SDS areas were observed.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Steroid treatment has been widely used for immunologic and inflammatory disorders. Psychiatric symptoms are not uncommon complications of corticosteroid treatment. Correlations between the hypothalamic-pituitary-adrenal (HPA) axis and various psychoses have already been established in the specialty literature (HPA activity modified by drugs or otherwise, glucocorticoid receptor downregulation, reduced hippocampal volume). The prevalence of corticosteroid-induced psychotic disorders is around 5–6%. Most corticosteroid-induced symptoms start during the first few weeks after treatment initiation, but onset can occur as early as the first 3–4 days. We report the case of a 30-year-old woman who was brought to the psychiatric emergency room for psychomotor agitation, auditory and visual hallucinations, bizarre delusions, disorganized thinking and altered behavior. The patient had no personal or family history of psychiatric illness. One month earlier, she had been admitted to a neurosurgery ward and underwent lumbar surgery for an L4–L5 disc protrusion; at discharge, eight days later, she began treatment with methylprednisolone 80 mg/day for three days. One week later, psychotic symptoms emerged that resulted in her hospitalization in our ward for apparent steroid-induced psychosis. Treatment with risperidone (up to 6 mg/day) and diazepam (10 mg/day, rapidly discontinued) was initiated. Endocrinological examination revealed altered plasma cortisol. The psychosis resolved several weeks later and the patient was discharged. Psychiatric complications induced by steroids underline the physician's role in educating patients and their families about these side effects and their early recognition.
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (87Sr/86Sr) suggest that fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
The association between intake of different dairy products and the risk of stroke remains unclear. We therefore investigated substitutions between dairy product subgroups and risk of stroke. We included 36 886 Dutch men and women. Information about dairy product intake was collected through a FFQ. Dairy products were grouped as low-fat milk, whole-fat milk, buttermilk, low-fat yogurt, whole-fat yogurt, cheese and butter. Incident stroke cases were identified in national registers. We used Cox proportional hazards regression to calculate associations for substitutions between dairy products with the rate of stroke. During a median follow-up of 15·2 years we identified 884 stroke cases (503 ischaemic and 244 haemorrhagic). Median intake of total dairy products was four servings/d. Low-fat yogurt substituted for whole-fat yogurt was associated with a higher rate of ischaemic stroke (hazard ratio (HR) = 2·58 (95 % CI 1·11, 5·97)/serving per d). Whole-fat yogurt as a substitution for any other subgroup was associated with a lower rate of ischaemic stroke (HR between 0·33 and 0·36/serving per d). We did not observe any associations for haemorrhagic stroke. In conclusion, whole-fat yogurt as a substitution for low-fat yogurt, cheese, butter, buttermilk or milk, regardless of fat content, was associated with a lower rate of ischaemic stroke.
The increasing attention to global warming is likely to contribute to the introduction of policies or other incentives to reduce greenhouse gas (GHG) emissions related to livestock production, including dairy. The dairy sector is an important contributor to GHG emissions. Clinical mastitis (CM), an intramammary infection, results in reduced milk production and fertility and increases culling and mortality of cows, and therefore has a negative impact on the efficiency (output/input) of milk production. This may increase GHG emissions per unit of product. Our objective was to estimate the impact of CM in dairy cows on GHG emissions of milk production for the Dutch situation. A dynamic stochastic simulation model was developed to simulate the dynamics and losses of CM for individual lactations. Each cow was assigned a parity (1 to 5+), a milk production level and a calving interval (CI). Based on parity, cows had a risk of CM, with a maximum of three cases per lactation. Pathogens causing CM were classified as gram-positive bacteria, gram-negative bacteria or other. Based on the parity and pathogen combination, cows had a reduced milk production, discarded milk, a prolonged CI and a risk of removal (culling and mortality), all of which reduce the productivity of dairy cows and therefore increase GHG emissions per unit of product. Using life cycle assessment, emissions of GHGs were estimated from cradle to farm gate for processes along the milk production chain that are affected by CM. Processes included were feed production, enteric fermentation and manure management. Emissions of GHGs were expressed as kg CO2 equivalents per ton of fat-and-protein-corrected milk (kg CO2e/t FPCM). Emissions of cows with CM increased on average by 57.5 (6.2%) kg CO2e/t FPCM compared with cows without CM. This increase was caused by removal (39%), discarded milk (38%), reduced milk production (17%) and prolonged CI (6%).
The GHG emissions increased by 48 kg CO2e/t FPCM for cows with one case of CM, by 69 kg CO2e/t FPCM for cows with two cases of CM and by 92 kg CO2e/t FPCM for cows with three cases of CM compared with cows without CM. Preventing CM can be an effective strategy for farmers to reduce GHG emissions and can contribute to sustainable development of the dairy sector, because this also can improve the income of farmers and the welfare of cows. The impact of CM on GHG emissions, however, will vary between farms due to environmental conditions and management practices.
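The reported decomposition can be reproduced with simple arithmetic. A minimal sketch using only the figures quoted above; the implied baseline emission intensity is a derived value, not one reported in the study:

```python
# Back-of-envelope decomposition of the reported GHG increase due to
# clinical mastitis (CM); input figures are taken from the abstract.
avg_increase = 57.5   # kg CO2e/t FPCM, cows with CM vs. cows without
rel_increase = 0.062  # the same increase expressed as 6.2%

# Implied baseline emission intensity of cows without CM (derived, not reported):
baseline = avg_increase / rel_increase  # ~927 kg CO2e/t FPCM

# Reported contribution of each loss pathway to the average increase:
shares = {
    "removal": 0.39,
    "discarded milk": 0.38,
    "reduced milk production": 0.17,
    "prolonged calving interval": 0.06,
}
contributions = {k: v * avg_increase for k, v in shares.items()}  # kg CO2e/t FPCM
```

The shares sum to 100%, so the per-pathway contributions add back up to the 57.5 kg CO2e/t FPCM average increase.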
Studies investigating the underlying mechanisms of hallucinations in patients with schizophrenia suggest that an imbalance between top-down expectations and bottom-up processing underlies these errors in perception. This study evaluates this hypothesis by testing whether individuals drawn from the general population who have had auditory hallucinations (AH) have more misperceptions in auditory language perception than those who have never hallucinated.
We used an online survey to determine the presence of hallucinations. Participants filled out the Questionnaire for Psychotic Experiences and participated in an auditory verbal recognition task to assess both correct perceptions (hits) and misperceptions (false alarms). A hearing test was performed to screen for hearing problems.
A total of 5115 individuals from the general Dutch population participated in this study. Participants who reported AH in the week preceding the test had a higher false alarm rate in their auditory perception compared with those without such (recent) experiences. The more recently the AH had been experienced, the more mistakes participants made. While the presence of verbal AH (AVH) was predictive of the false alarm rate in auditory language perception, the presence of non-verbal or visual hallucinations was not.
The presence of AVH predicted the false alarm rate in auditory language perception, whereas the presence of non-verbal auditory or visual hallucinations did not, suggesting that enhanced top-down processing does not transfer across modalities. More false alarms were observed in participants who reported more recent AVH. This is in line with models of enhanced influence of top-down expectations in persons who hallucinate.
Dietary guidelines for pure fruit juice consumption differ between countries regarding whether pure fruit juice is an acceptable alternative to fruit. Currently, little is known about pure fruit juice consumption and the risk of CVD. In this prospective cohort study, we studied the association of pure fruit juice and fruit consumption with the incidence of fatal and non-fatal CVD, CHD and stroke and investigated the differences in association with pure fruit juice consumption between low and high fruit consumers. A validated FFQ was used to estimate dietary intake of 34 560 participants (26·0 % men and 74·0 % women) aged 20–69 years from the European Prospective Investigation into Cancer and Nutrition–Netherlands study. Adjusted hazard ratios (HR) were estimated using Cox regression after an average follow-up of 14·6 years. Compared with no consumption, pure fruit juice consumption up to 7 glasses/week – but not consumption of ≥8 glasses – was significantly associated with a reduced risk of CVD and CHD, with HR from 0·83 (95 % CI 0·73, 0·95) to 0·88 (95 % CI 0·80, 0·97). Consumption of 1–4 and 4–8 glasses/week was significantly associated with a lower risk of stroke, with HR of 0·80 (95 % CI 0·64, 0·99) and 0·76 (95 % CI 0·61, 0·94), respectively. Associations did not differ considerably between low and high fruit consumers. The highest three quintiles of fruit consumption (≥121 g/d) were significantly associated with a lower incidence of CVD, with HR of 0·87 (95 % CI 0·78, 0·97) and 0·88 (95 % CI 0·80, 0·98). In conclusion, although we observed favourable associations of moderate pure fruit juice consumption with CVD, for now consumption of whole fruit should be preferred because the evidence for the health benefits of fruit is more conclusive.
Household surveys are one of the most commonly used tools for generating insight into rural communities. Despite their prevalence, few studies comprehensively evaluate the quality of data derived from farm household surveys. We critically evaluated a series of standard reported values and indicators that are captured in multiple farm household surveys, and then quantified their credibility, consistency and, thus, their reliability. Surprisingly, even variables which might be considered ‘easy to estimate’ had instances of non-credible observations. In addition, measurements of maize yields and land owned were found to be less reliable than other stationary variables. This lack of reliability has implications for monitoring food security status, poverty status and the land productivity of households. Despite this rather bleak picture, our analysis also shows that if the same farm households are followed over time, the sample sizes needed to detect substantial changes are in the order of hundreds of surveys, and not in the thousands. Our research highlights the value of targeted and systematised household surveys and the importance of ongoing efforts to improve data quality. Improvements must be based on the foundations of robust survey design, transparency of experimental design and effective training. The quality and usability of such data can be further enhanced by improving coordination between agencies, incorporating mixed modes of data collection and continuing systematic validation programmes.
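The claim that hundreds, rather than thousands, of repeated surveys suffice can be illustrated with a standard paired-design power calculation. A minimal sketch, assuming a conventional two-sided 5% significance level, 80% power and an illustrative standardised effect size; none of these values are taken from the study:

```python
import math

def paired_sample_size(effect_size, z_alpha=1.959964, z_beta=0.841621):
    """Sample size to detect a mean change in a paired (panel) design.

    effect_size: standardised mean change (delta / sd of the change).
    Defaults correspond to a two-sided 5% test with 80% power.
    """
    return math.ceil(((z_alpha + z_beta) / effect_size) ** 2)

# An illustrative 'substantial change' of 0.2 standard deviations
# already lands in the hundreds, not the thousands:
n = paired_sample_size(0.2)  # 197 households
```

Following the same households over time removes between-household variance from the comparison, which is why the required sample stays modest.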
The transition period is the most critical period in the lactation cycle of dairy cows. Extended lactations reduce the frequency of transition periods, the number of calves and the related labour for farmers. This study aimed to assess the impact of lactations extended by 2 or 4 months on milk yield and net partial cash flow (NPCF) at herd level, and on greenhouse gas (GHG) emissions per unit of fat- and protein-corrected milk (FPCM), using a stochastic simulation model. The model simulated individual lactations for 100 herds of 100 cows with a baseline lactation length (BL), and for 100 herds with lactations extended by 2 or 4 months for all cows (All+2 and All+4), or for heifers only (H+2 and H+4). Baseline lactation length herds produced 887 t (SD: 13) milk/year. The NPCF, based on revenues for milk, surplus calves and culled cows, and costs for feed, artificial insemination, calving management and rearing of youngstock, was k€174 (SD: 4)/BL herd per year. Extended lactations reduced the milk yield of the herd by 4.1% for All+2, 6.9% for All+4, 1.1% for H+2 and 2.2% for H+4, and reduced the NPCF per herd per year by k€7 for All+2, k€12 for All+4, k€2 for H+2 and k€4 for H+4 compared with BL herds. Extended lactations increased GHG emissions in CO2-equivalents per t FPCM by 1.0% for All+2, by 1.7% for All+4, by 0.2% for H+2 and by 0.4% for H+4, but this could be compensated by an increase in the lifespan of dairy cows. Subsequently, production level and lactation persistency were increased to assess the importance of these aspects for the impact of extended lactations. The increase in production level and lactation persistency increased the milk production of BL herds by 30%. Moreover, the reductions in milk yield for All+2 and All+4 compared with BL herds were only 0.7% and 1.1% per year, and the milk yield in H+2 and H+4 herds was similar to that of BL herds.
The resulting NPCF was equal to BL for All+2 and All+4 and increased by k€1 for H+2 and H+4 due to lower costs for insemination and calving management. Moreover, GHG emissions per t FPCM were equal to BL herds or reduced (0% to −0.3%) when lactations were extended. We concluded that, depending on lactation persistency, extending lactations of dairy cows can have a positive or negative impact on the NPCF and GHG emissions of milk production.
LiGAPS-Beef (Livestock simulator for Generic analysis of Animal Production Systems – Beef cattle) is a generic, mechanistic model designed to quantify potential and feed-limited growth, which provides insight into the biophysical scope to increase beef production (i.e. the yield gap). Furthermore, it enables identification of the biophysical factors that define and limit growth, which provides insight into management strategies to mitigate yield gaps. The aim of this paper, the third in a series of three, is to evaluate the performance of LiGAPS-Beef against independent experimental data. After model calibration, independent data were used from six experiments in Australia, one in Uruguay and one in the Netherlands. The experiments represented three cattle breeds and a wide range of climates, feeding strategies and cattle growth rates. The mean difference between simulated and measured average daily gains (ADGs) was 137 g/day across all experiments, which equals 20.1% of the measured ADGs. The root mean square error was 170 g/day, which equals 25.0% of the measured ADGs. LiGAPS-Beef successfully simulated, on a daily basis, the factors that defined and limited growth during the experiments (genotype, heat stress, digestion capacity, energy deficiency and protein deficiency). The simulated factors agreed well with the reported occurrence of heat stress, energy deficiency and protein deficiency at specific periods during the experiments. We conclude that the level of accuracy of LiGAPS-Beef is acceptable and provides a good basis for acquiring insight into the potential and feed-limited production of cattle in different beef production systems across the world. Furthermore, its capacity to identify factors that define or limit growth and production provides scope for using the model for yield gap analysis.
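The two evaluation metrics reported above can be sketched as follows. The ADG values used here are invented for illustration, and the "mean difference" is interpreted as a mean absolute difference between paired simulated and measured values:

```python
import math

def mean_abs_difference(simulated, measured):
    # Mean absolute difference between paired simulated and measured values.
    return sum(abs(s - m) for s, m in zip(simulated, measured)) / len(measured)

def rmse(simulated, measured):
    # Root mean square error over the same pairs.
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured)) / len(measured))

# Invented ADGs (g/day), for illustration only:
sim = [900.0, 1100.0, 800.0]
obs = [1000.0, 1000.0, 1000.0]
mad = mean_abs_difference(sim, obs)  # mean absolute difference, g/day
err = rmse(sim, obs)                 # RMSE, g/day; always >= mad
```

RMSE weights large errors more heavily than the mean absolute difference, which is why the abstract's RMSE (170 g/day) exceeds its mean difference (137 g/day).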
The role that vitamin D plays in pulmonary function remains uncertain. Epidemiological studies reported mixed findings for serum 25-hydroxyvitamin D (25(OH)D)–pulmonary function association. We conducted the largest cross-sectional meta-analysis of the 25(OH)D–pulmonary function association to date, based on nine European ancestry (EA) cohorts (n 22 838) and five African ancestry (AA) cohorts (n 4290) in the Cohorts for Heart and Aging Research in Genomic Epidemiology Consortium. Data were analysed using linear models by cohort and ancestry. Effect modification by smoking status (current/former/never) was tested. Results were combined using fixed-effects meta-analysis. Mean serum 25(OH)D was 68 (sd 29) nmol/l for EA and 49 (sd 21) nmol/l for AA. For each 1 nmol/l higher 25(OH)D, forced expiratory volume in the 1st second (FEV1) was higher by 1·1 ml in EA (95 % CI 0·9, 1·3; P<0·0001) and 1·8 ml (95 % CI 1·1, 2·5; P<0·0001) in AA (Prace difference=0·06), and forced vital capacity (FVC) was higher by 1·3 ml in EA (95 % CI 1·0, 1·6; P<0·0001) and 1·5 ml (95 % CI 0·8, 2·3; P=0·0001) in AA (Prace difference=0·56). Among EA, the 25(OH)D–FVC association was stronger in smokers: per 1 nmol/l higher 25(OH)D, FVC was higher by 1·7 ml (95 % CI 1·1, 2·3) for current smokers and 1·7 ml (95 % CI 1·2, 2·1) for former smokers, compared with 0·8 ml (95 % CI 0·4, 1·2) for never smokers. In summary, the 25(OH)D associations with FEV1 and FVC were positive in both ancestries. In EA, a stronger association was observed for smokers compared with never smokers, which supports the importance of vitamin D in vulnerable populations.
The expected increase in the global demand for livestock products calls for insight into the scope to increase actual production levels across the world. This insight can be obtained by using theoretical concepts of production ecology. These concepts distinguish three production levels for livestock: potential (i.e. theoretical maximum) production, which is defined by genotype and climate only; feed-limited production, which is limited by feed quantity and quality; and actual production. The difference between the potential or feed-limited production and the actual production is the yield gap. The objective of this paper, the first in a series of three, is to present a mechanistic, dynamic model simulating potential and feed-limited production for beef cattle, which can be used to assess yield gaps. A novelty of this model, named LiGAPS-Beef (Livestock simulator for Generic analysis of Animal Production Systems – Beef cattle), is the identification of the defining factors (genotype and climate) and limiting factors (feed quality and available feed quantity) for cattle growth by integrating sub-models on thermoregulation, feed intake and digestion, and energy and protein utilisation. Growth of beef cattle is simulated at the animal and herd level. The model is designed to be applicable to different beef production systems across the world. Main model inputs are breed-specific parameters, daily weather data, information about housing, and data on feed quality and quantity. Main model outputs are live weight gain, feed intake and feed efficiency (FE) at the animal and herd level. Here, the model is presented, and its use is illustrated for Charolais and Brahman × Shorthorn cattle in France and Australia. Potential and feed-limited production were assessed successfully, and we show that the FE of herds is highest for breeds best adapted to the local climate conditions. LiGAPS-Beef also identified the factors that define and limit growth and production of cattle.
Hence, we argue the model has scope to be used as a tool for the assessment and analysis of yield gaps in beef production systems.
The model LiGAPS-Beef (Livestock simulator for Generic analysis of Animal Production Systems – Beef cattle) has been developed to assess potential and feed-limited growth and production of beef cattle in different areas of the world and to identify the processes responsible for the yield gap. Sensitivity analysis and evaluation of model results against experimental data are important steps after model development. The first aim of this paper, therefore, is to identify which parameters affect the output of LiGAPS-Beef most by conducting sensitivity analyses. The second aim is to evaluate the accuracy of the thermoregulation sub-model and the feed intake and digestion sub-model against experimental data. Sensitivity analysis was conducted using a one-at-a-time approach. The upper critical temperature (UCT) simulated with the thermoregulation sub-model was most affected by the body core temperature and by parameters affecting latent heat release from the skin. The lower critical temperature (LCT) and UCT were considerably affected by weather variables, especially ambient temperature and wind speed. Sensitivity analysis for the feed intake and digestion sub-model showed that the digested protein per kg feed intake was affected to a larger extent than the metabolisable energy (ME) content. Sensitivity analysis for LiGAPS-Beef was conducted for ¾ Brahman×¼ Shorthorn cattle in Australia and Hereford cattle in Uruguay. Body core temperature, the conversion of digestible energy to ME, net energy requirements for maintenance, and several parameters associated with heat release affected feed efficiency at the herd level most. The sensitivity analyses have therefore provided insight into which parameters should be investigated in more detail when applying LiGAPS-Beef. Model evaluation was conducted by comparing model simulations with independent data from experiments.
Measured heat production in experiments corresponded fairly well to the heat production simulated with the thermoregulation sub-model. Measured ME contents from two data sets corresponded well to the ME contents simulated with the feed intake and digestion sub-model. The relative mean absolute errors were 9.3% and 6.4% of the measured ME contents for the two data sets. In conclusion, model evaluation indicates the thermoregulation sub-model can deal with a wide range of weather conditions, and the feed intake and digestion sub-model with a variety of feeds, which corresponds to the aim of LiGAPS-Beef to simulate cattle in different beef production systems across the world.
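The one-at-a-time approach used in the sensitivity analysis above can be sketched generically: perturb each parameter individually, hold the others at their baseline values, and record the relative change in model output. The toy model below is purely illustrative and not part of LiGAPS-Beef:

```python
def oat_sensitivity(model, params, delta=0.10):
    """One-at-a-time (OAT) sensitivity: perturb each parameter by +10%
    in turn and record the relative change in model output."""
    base = model(params)
    effects = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1 + delta)
        effects[name] = (model(perturbed) - base) / base
    return effects

# Toy model (illustrative only): output depends linearly on 'a' and
# quadratically on 'b', so OAT flags 'b' as the more influential parameter.
toy = lambda p: p["a"] * p["b"] ** 2
effects = oat_sensitivity(toy, {"a": 2.0, "b": 3.0})
```

A known limitation of OAT, worth keeping in mind when interpreting such results, is that it does not capture interactions between parameters.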
Mineral phosphorus (P) used to fertilise crops is derived from phosphate rock, which is a finite resource. Preventing and recycling mineral P waste in the food system, therefore, are essential to sustain future food security and long-term availability of mineral P. The aim of our modelling exercise was to assess the potential of preventing and recycling P waste in a food system, in order to reduce the dependency on phosphate rock. To this end, we modelled a hypothetical food system designed to produce sufficient food for a fixed population with a minimum input requirement of mineral P. This model included representative crop and animal production systems, and was parameterised using data from the Netherlands. We assumed no import or export of feed and food. We furthermore assumed small P soil losses and no net P accumulation in soils, which is typical for northwest European conditions. We first assessed the minimum P requirement in a baseline situation, that is, 42% of crop waste is recycled and humans derive 60% of their dietary protein from animals (PA). Results showed that about 60% of the P waste in this food system resulted from wasting P in human excreta. We subsequently evaluated P input for alternative situations to assess the (combined) effect of: (1) preventing waste of crop and animal products, (2) fully recycling waste of crop products, (3) fully recycling waste of animal products and (4) fully recycling human excreta and industrial processing water. Recycling of human excreta showed the most potential to reduce P waste from the food system, followed by prevention and finally recycling of agricultural waste. Fully recycling P could reduce mineral P input by 90%. Finally, for each situation, we studied the impact of varying the proportion of PA in the human diet from 0% to 80%.
The optimal amount of animal protein in the diet depended on whether P waste from animal products was prevented or fully recycled: if it was, then a small amount of animal protein in the human diet resulted in the most sustainable use of P; if it was not, then the most sustainable use of P would result from a complete absence of animal protein in the human diet. Our results apply to our hypothetical situation. The principles included in our model, however, also hold for food systems with, for example, different climatic and soil conditions, farming practices, representative types of crops and animals and population densities.
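At steady state (no net soil accumulation), the mineral P input the model minimises equals the P leaving the cycle through unrecycled waste streams. A toy mass-balance sketch of that accounting — all flow values and stream names below are illustrative, not the Dutch parameterisation used in the study:

```python
def mineral_p_requirement(flows, recycle):
    """Steady-state mineral P input needed to offset P lost from the cycle:
    each waste stream contributes its flow times its unrecycled fraction.
    Simplified accounting only, not the study's full food-system model."""
    return sum(flow * (1.0 - recycle.get(stream, 0.0))
               for stream, flow in flows.items())

# Illustrative P flows (arbitrary kg P units), not the study's data.
flows = {"crop_waste": 10.0, "animal_waste": 5.0,
         "human_excreta": 25.0, "soil_loss": 2.0}

# Baseline: 42% of crop waste recycled, nothing else recovered.
baseline_input = mineral_p_requirement(flows, {"crop_waste": 0.42})

# Full recycling of all waste streams except unavoidable soil losses.
full_recycling_input = mineral_p_requirement(
    flows, {"crop_waste": 1.0, "animal_waste": 1.0, "human_excreta": 1.0})
```

With these toy numbers, full recycling cuts the required mineral input to the unavoidable soil losses alone, mirroring the qualitative conclusion above that recycling human excreta offers the largest single reduction.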
C-reactive protein (CRP) is a candidate biomarker for major depressive disorder (MDD), but it is unclear how peripheral CRP levels relate to the heterogeneous clinical phenotypes of the disorder.
To explore CRP in MDD and its phenotypic associations.
We recruited 102 treatment-resistant patients with MDD currently experiencing depression, 48 treatment-responsive patients with MDD not currently experiencing depression, 48 patients with depression who were not receiving medication and 54 healthy volunteers. High-sensitivity CRP in peripheral venous blood, body mass index (BMI) and questionnaire assessments of depression, anxiety and childhood trauma were measured. Group differences in CRP were estimated, and partial least squares (PLS) analysis explored the relationships between CRP and specific clinical phenotypes.
Compared with healthy volunteers, BMI-corrected CRP was significantly elevated in the treatment-resistant group (P = 0.007; Cohen's d = 0.47), but not significantly so in the treatment-responsive (d = 0.29) or untreated (d = 0.18) groups. PLS yielded an optimal two-factor solution that accounted for 34.7% of variation in clinical measures and for 36.0% of variation in CRP. Clinical phenotypes most strongly associated with CRP and heavily weighted on the first PLS component were vegetative depressive symptoms, BMI, state anxiety and feeling unloved as a child or wishing for a different childhood.
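For a single response variable such as CRP, the first PLS component weights each predictor in proportion to its covariance with the response, which is what makes the component loadings interpretable as "phenotypes most strongly associated with CRP". A minimal single-response (PLS1) sketch on synthetic data — the variables and effect sizes are invented for illustration, not the study's measurements:

```python
import random

def pls1_first_weights(X, y):
    """First-component PLS weights for a single response (PLS1/NIPALS):
    proportional to the covariance of each centred predictor with y,
    normalised to unit length."""
    n, p = len(X), len(X[0])
    xbar = [sum(row[j] for row in X) / n for j in range(p)]
    ybar = sum(y) / n
    w = [sum((X[i][j] - xbar[j]) * (y[i] - ybar) for i in range(n))
         for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    return [v / norm for v in w]

random.seed(1)
n, p = 300, 5
# Synthetic clinical measures (e.g. symptom scores, BMI); columns are i.i.d. here.
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
# Synthetic "CRP", driven mainly by the first clinical variable.
y = [0.9 * row[0] + 0.2 * row[1] + random.gauss(0, 0.3) for row in X]
w = pls1_first_weights(X, y)
```

The predictor constructed to drive the response receives the largest absolute weight, analogous to vegetative symptoms and BMI loading most heavily on the first component in the study.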
CRP was elevated in patients with MDD, and more so in treatment-resistant patients. Other phenotypes associated with elevated CRP included childhood adversity and specific depressive and anxious symptoms. We suggest that patients with MDD stratified for proinflammatory biomarkers, like CRP, have a distinctive clinical profile that might be responsive to second-line treatment with anti-inflammatory drugs.
Declaration of interest
S.R.C. consults for Cambridge Cognition and Shire; and his input in this project was funded by a Wellcome Trust Clinical Fellowship (110049/Z/15/Z). E.T.B. is employed half time by the University of Cambridge and half time by GlaxoSmithKline; he holds stock in GlaxoSmithKline. In the past 3 years, P.J.C. has served on an advisory board for Lundbeck. N.A.H. consults for GlaxoSmithKline. P.d.B., D.N.C.J. and W.C.D. are employees of Janssen Research & Development, LLC., of Johnson & Johnson, and hold stock in Johnson & Johnson. The other authors report no financial disclosures or potential conflicts of interest.
Higher-educated people often have healthier diets, but it is unclear whether specific dietary patterns exist within educational groups. We therefore aimed to derive dietary patterns in the total population and by educational level, and to investigate whether these patterns differed in their composition and in their associations with the incidence of fatal and non-fatal CHD and stroke. Patterns were derived using principal components analysis in 36 418 participants of the European Prospective Investigation into Cancer and Nutrition-Netherlands cohort. Self-reported educational level was used to create three educational groups. Dietary intake was estimated using a validated semi-quantitative FFQ. Hazard ratios were estimated using Cox proportional hazards analysis after a mean follow-up of 16 years. In the three educational groups, 'Western', 'prudent' and 'traditional' patterns similar to those in the total population were derived. However, with higher educational level, lower population-derived scores for the 'Western' and 'traditional' patterns and a higher score on the 'prudent' pattern were observed. These differences in the distribution of the factor scores illustrate the association between education and food consumption. After adjustments, no differences in associations between population-derived dietary patterns and the incidence of CHD or stroke were found between the educational groups (P for interaction between 0·21 and 0·98). In conclusion, although the dietary patterns derived in the total population and within educational groups did not differ, small differences existed between educational groups in the consumption of food groups among participants considered adherent to the population-derived patterns (Q4). This did not result in different associations with incident CHD or stroke between educational groups.
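Deriving a dietary pattern with principal components analysis amounts to finding the leading eigenvector of the (centred) food-group intake covariance matrix; projecting each participant's intakes onto it gives that participant's pattern score. A minimal sketch of the first component via power iteration — the food-group structure below is synthetic, not the FFQ data:

```python
import random

def first_principal_component(X, iters=200):
    """Leading eigenvector of the covariance matrix of X via power iteration.
    Projecting the centred rows onto it gives the first pattern score
    for each participant."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    Xc = [[row[j] - means[j] for j in range(p)] for row in X]
    cov = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        v = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    scores = [sum(Xc[i][j] * v[j] for j in range(p)) for i in range(n)]
    return v, scores

random.seed(2)
# Synthetic intakes: two correlated "Western"-style food groups sharing a
# common factor, plus one independent food group.
base = [random.gauss(0, 1) for _ in range(400)]
X = [[b + random.gauss(0, 0.3), b + random.gauss(0, 0.3), random.gauss(0, 1)]
     for b in base]
v, scores = first_principal_component(X)
```

The two correlated food groups load together on the first component while the independent group does not, which is the mechanism by which PCA groups co-consumed foods into named patterns such as 'Western' or 'prudent'.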