Intermittent explosive disorder (IED) is characterised by impulsive anger attacks that vary greatly across individuals in severity and consequence. Understanding IED subtypes has been limited by lack of large, general population datasets including assessment of IED. Using the 17-country World Mental Health surveys dataset, this study examined whether behavioural subtypes of IED are associated with differing patterns of comorbidity, suicidality and functional impairment.
IED was assessed using the Composite International Diagnostic Interview in the World Mental Health surveys (n = 45 266). Five behavioural subtypes were created based on type of anger attack. Logistic regression assessed association of these subtypes with lifetime comorbidity, lifetime suicidality and 12-month functional impairment.
The lifetime prevalence of IED in all countries was 0.8% (s.e.: 0.0). The two subtypes involving anger attacks that harmed people (‘hurt people only’ and ‘destroy property and hurt people’), collectively comprising 73% of those with IED, were characterised by high rates of externalising comorbid disorders. The remaining three subtypes involving anger attacks that destroyed property only, destroyed property and threatened people, and threatened people only, were characterised by higher rates of internalising than externalising comorbid disorders. Suicidal behaviour did not vary across the five behavioural subtypes but was higher among those with (v. those without) comorbid disorders, and among those who perpetrated more violent assaults.
The most common IED behavioural subtypes in these general population samples are associated with high rates of externalising disorders. This contrasts with findings from clinical studies of IED, which observe a preponderance of internalising disorder comorbidity. This disparity between population and clinical studies, together with the marked heterogeneity that characterises the diagnostic entity of IED, suggests that the disorder requires considerably more research.
Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively-reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally-ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely-related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Survey data from a range of sites confirms that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for primary prevention of secondary disorders.
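The product-limit (Kaplan–Meier) estimation of absolute risk mentioned in the methods can be illustrated with a minimal sketch. The event times and censoring indicators below are invented for illustration, not taken from the WMH data:

```python
# Minimal product-limit (Kaplan-Meier) estimator of survival S(t).
# Data are (time, event) pairs; event=1 marks onset of the secondary
# disorder, event=0 marks censoring. Values are illustrative only.
data = [(2, 1), (3, 0), (5, 1), (5, 1), (8, 0), (10, 1)]

def kaplan_meier(data):
    """Return [(t, S(t))] evaluated at each distinct event time."""
    times = sorted({t for t, e in data if e == 1})
    s, curve = 1.0, []
    for t in times:
        at_risk = sum(1 for ti, _ in data if ti >= t)
        events = sum(1 for ti, e in data if ti == t and e == 1)
        s *= 1 - events / at_risk          # conditional survival at t
        curve.append((t, s))
    return curve

curve = kaplan_meier(data)
# the cumulative ("absolute") risk by time t is 1 - S(t)
```

The absolute risk of a secondary disorder by a given time is then one minus the survival estimate at that time.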
Current feed evaluation systems often assume that fermented starch (i.e. resistant starch (RS)) yields less energy than digested starch. However, growth rates of pigs fed low- and high-RS diets are often the same when feed is available ad libitum. This may be explained by the effect of RS on digestive processes, which can change feeding behavior and, consequently, energy utilization. This study aims to investigate the effect of RS on nutrient digestion and digesta passage rate in pigs, in combination with its effect on feeding behavior and growth performance under ad libitum conditions. In experiment 1, 20 male pigs (40 ± 2.82 kg) were fed diets containing either 50% waxy maize starch (low in RS (LRS)) or high-amylose maize starch (high in RS (HRS)), and soluble and insoluble indigestible markers. After 14 days of adaptation to the diets, pigs were fed hourly to reach steady state (6 h), dissected, and digesta were collected from eight segments. From the collected samples, nutrient digestion and passage rate of the solid and liquid digesta fractions were determined. In experiment 2, 288 pigs (80 ± 0.48 kg; sex ratio per pen 1 : 1; boar : gilt) were housed in groups of six. Pigs were fed one of the experimental diets ad libitum and slaughtered at approximately 115 kg. Feed intake, growth and carcass parameters were measured. Ileal starch digestibility was greater for LRS-fed than for HRS-fed pigs (98.0% v. 74.0%; P < 0.001), and the additional undigested starch in HRS-fed pigs was fermented in the large intestine. No effects of RS on digesta passage rate of the solid or liquid digesta fraction or on feeding behavior were observed. Growth rate and feed intake did not differ between diets, whereas feed efficiency of HRS-fed pigs was 1%-unit higher than that of LRS-fed pigs (P = 0.041). The efficiency of feed used for carcass gain did not differ between diets, indicating that the difference in feed efficiency was determined by the non-carcass fraction.
Despite a 30% greater RS intake (of total starch) with HRS than with LRS, carcass gain and feed efficiency used for carcass gain were unaffected. RS affected neither digesta passage rate nor feeding behavior, suggesting that the difference in energy intake between fermented and digested starch is compensated for post-absorptively. Our results indicate that the net energy value of fermented starch currently used in pig feed evaluation systems is underestimated and should be reconsidered.
Misalignment of day/night and feeding rhythms has been shown to increase fat deposition and the risk for metabolic disorders in humans and rodents. In most studies, however, food intake and intake patterns are not controlled. We studied the effects of circadian misalignment on energy expenditure in pigs while controlling for food intake as well as intake patterns. Twelve groups of five male pigs were housed in respiration chambers and fed the same sequential meals bihourly, representing 15, 10, 25, 30 and 20 % of the daily allowance, either during the day (10.00–18.00 hours; DF) or during the night (22.00–06.00 hours; NF). Paired feeding was applied to ensure equal gross energy intake between treatments. Apparent total tract digestibility, energy balances and heat partitioning were measured and analysed using a mixed linear model. Apparent total tract energy and DM digestibility tended to be lower for NF-pigs than DF-pigs (P < 0·10). Heat production was 3 % lower for NF-pigs than DF-pigs (P < 0·026), increasing fat retention by 7 % in NF-pigs (P = 0·050). NF-pigs were less active than DF-pigs during the feeding period, but more active during the fasting period. RMR was greater for DF-pigs than NF-pigs during the fasting period. Methane production was 30 % greater in NF-pigs than DF-pigs (P < 0·001). In conclusion, circadian misalignment has little effect on nutrient digestion, but alters nutrient partitioning, ultimately increasing fat deposition. The causality of the association between circadian misalignment and methane production rates remains to be investigated.
Improvement in depression within the first 2 weeks of antidepressant treatment predicts good outcomes, but non-improvers can still respond or remit, whereas improvers often do not.
We aimed to investigate whether early improvement of individual depressive symptoms better predicts response or remission.
We obtained individual patient data of 30 trials comprising 2184 placebo-treated and 6058 antidepressant-treated participants. Primary outcome was week 6 response; secondary outcomes were week 6 remission and week 12 response and remission. We compared models that only included improvement in total score by week 2 (total improvement model) with models that also included improvement in individual symptoms.
For week 6 response, the area under the receiver operating characteristic curve and negative and positive predictive values of the total improvement model were 0.73, 0.67 and 0.74 compared with 0.77, 0.70 and 0.71 for the item improvement model. Model performance decreased for week 12 outcomes. Of predicted non-responders, 29% actually did respond by week 6 and 43% by week 12, which was decreased from the baseline (overall) probabilities of 51% by week 6 and 69% by week 12. In post hoc analyses with continuous rather than dichotomous early improvement, including individual items did not enhance model performance.
Examining individual symptoms adds little to the predictive ability of early improvement. Additionally, early non-improvement does not rule out response or remission, particularly after 12 rather than 6 weeks. Therefore, our findings suggest that routinely adapting pharmacological treatment because of limited early improvement would often be premature.
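The discrimination and predictive-value metrics reported above (AUC, PPV, NPV) can be illustrated with a small self-contained sketch; the early-improvement scores and response labels below are invented for illustration:

```python
# Toy illustration of the reported metrics: AUC of a continuous
# early-improvement score, and PPV/NPV after dichotomising it.
# Scores and outcomes are invented, not taken from the trials.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
responded = [1, 1, 0, 1, 0, 1, 0, 0]

def auc(scores, labels):
    """Probability that a random responder outranks a random non-responder."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def ppv_npv(scores, labels, cutoff):
    """Positive and negative predictive values at a given cutoff."""
    pred = [s >= cutoff for s in scores]
    tp = sum(p and y for p, y in zip(pred, labels))
    fp = sum(p and not y for p, y in zip(pred, labels))
    tn = sum(not p and not y for p, y in zip(pred, labels))
    fn = sum(not p and y for p, y in zip(pred, labels))
    return tp / (tp + fp), tn / (tn + fn)
```

The abstract's point that predicted non-responders still often respond corresponds to 1 minus the NPV computed here.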
This study aimed to evaluate the effects of an intervention including nutritional telemonitoring, nutrition education, and follow-up by a nurse on nutritional status, diet quality, appetite, physical functioning and quality of life of Dutch community-dwelling elderly. We used a parallel arm pre-test post-test design with 214 older adults (average age 80 years) who were allocated to the intervention group (n 97) or control group (n 107), based on the municipality. The intervention group received a 6-month intervention including telemonitoring measurements, nutrition education and follow-up by a nurse. Effect measurements took place at baseline, after 4·5 months, and at the end of the study. The intervention improved nutritional status of participants at risk of undernutrition (β (T1)=2·55; 95 % CI 1·41, 3·68; β (T2)=1·77; 95 % CI 0·60, 2·94) and scores for compliance with Dutch guidelines for the intake of vegetables (β=1·27; 95 % CI 0·49, 2·05), fruit (β=1·24; 95 % CI 0·60, 1·88), dietary fibre (β=1·13; 95 % CI 0·70, 1·57), protein (β=1·20; 95 % CI 0·15, 2·24) and physical activity (β=2·13; 95 % CI 0·98, 3·29). The intervention did not have an effect on body weight, appetite, physical functioning and quality of life. In conclusion, this intervention leads to improved nutritional status in older adults at risk of undernutrition, and to improved diet quality and physical activity levels of community-dwelling elderly. Future studies with a longer duration should focus on older adults at higher risk of undernutrition than this study population to investigate whether the impact of the intervention on nutritional and functional outcomes can be improved.
Many astronomers working in the field of AstroInformatics write code as part of their work. Although the programming language of choice is Python, a small number (8%) use R. R has its specific strengths in the domain of statistics, and is often viewed as limited in the size of data it can handle. However, Microsoft R Server is a product that removes these limitations by being able to process much larger amounts of data. I present some highlights of R Server, by illustrating how to fit a convolutional neural network using R. The specific task is to classify galaxies, using only images extracted from the Sloan Digital Skyserver.
A standardised, national, 160-item FFQ, the FFQ-NL 1.0, was recently developed for Dutch epidemiological studies. The objective was to validate the FFQ-NL 1.0 against multiple 24-h recalls (24hR) and recovery and concentration biomarkers. The FFQ-NL 1.0 was filled out by 383 participants (25–69 years) from the Nutrition Questionnaires plus study. For each participant, one to two urinary and blood samples and one to five (mean 2·7) telephone-based 24hR were available. Group-level bias, correlation coefficients, attenuation factors, de-attenuated correlation coefficients and ranking agreement were assessed. Compared with the 24hR, the FFQ-NL 1.0 estimated the intake of energy and macronutrients well. However, it underestimated intakes of SFA and trans-fatty acids and alcohol and overestimated intakes of most vitamins by >5 %. The median correlation coefficient was 0·39 for energy and macronutrients, 0·30 for micronutrients and 0·30 for food groups. The FFQ underestimated protein intake by an average of 16 % and K by 5 %, relative to their urinary recovery biomarkers. Attenuation factors were 0·44 and 0·46 for protein and K, respectively. Correlation coefficients were 0·43–0·47 between (fatty) fish intake and plasma EPA and DHA and 0·24–0·43 between fruit and vegetable intakes and plasma carotenoids. In conclusion, the overall validity of the newly developed FFQ-NL 1.0 was acceptable to good. The FFQ-NL 1.0 is well suited for future use within Dutch cohort studies among adults.
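A hedged sketch of the de-attenuation step mentioned above: under the standard Willett-style correction, the observed FFQ-versus-recall correlation is scaled up to account for within-person day-to-day variation in the 24hR reference method. The variance ratio and correlation values below are invented, and the study's exact correction may differ:

```python
import math

# De-attenuated correlation (Willett-style sketch, an assumption here):
#   r_true ~ r_obs * sqrt(1 + lambda / n)
# where lambda is the within- to between-person variance ratio of the
# reference method and n is the number of recalls per person.
# All numbers below are illustrative only.

def deattenuate(r_obs, var_ratio, n_recalls):
    """Correct an observed correlation for random error in the reference."""
    return r_obs * math.sqrt(1 + var_ratio / n_recalls)

r_corrected = deattenuate(0.30, 2.0, 2.7)   # hypothetical inputs
```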
The aim of this study was to test whether the ability to obtain information about more than one letter at a glance develops prior to conventional reading. The study included 55 Dutch-speaking prereaders (mean age = 63.56 months, SD = 6.55) and 45 Hebrew-speaking prereaders (mean age = 66.71 months, SD = 8.35). In a perceptual span task, one letter was projected in the fovea and another to the right or to the left, at a distance of 4 or 6 letter positions from the center letter. A second perceptual span task included letter-like forms instead of letters. Eye-tracking was used to verify that children fixated on the center letter or letter-like form during the task. For both Hebrew- and Dutch-speaking children, obtaining information about two letters/forms was easier when the parafoveal letter/form was projected to the right. Hemispheric dominance, rather than the dominant reading direction (right to left in Hebrew, left to right in Dutch), may explain this preference for the right, which may mean that left-to-right reading is easier to learn than right-to-left reading. We did find, nevertheless, some evidence that reading direction in the dominant orthography affected how children divided attention over letters.
CHD may ensue from chronic systemic low-grade inflammation. Diet is a modifiable risk factor for both, and its optimisation may reduce post-operative mortality, atrial fibrillation and cognitive decline. In the present study, we investigated the usual dietary intakes of patients undergoing elective coronary artery bypass grafting (CABG), emphasising food groups and nutrients with putative roles in the inflammatory/anti-inflammatory balance. From November 2012 to April 2013, we approached ninety-three consecutive patients (80 % men) undergoing elective CABG. Of these, fifty-five were finally included (84 % men, median age 69 years; range 46–84 years). The median BMI was 27 (range 18–36) kg/m2. The dietary intake items were fruits (median 181 g/d; range 0–433 g/d), vegetables (median 115 g/d; range 0–303 g/d), dietary fibre (median 22 g/d; range 9–45 g/d), EPA+DHA (median 0·14 g/d; range 0·01–1·06 g/d), vitamin D (median 4·9 μg/d; range 1·9–11·2 μg/d), saturated fat (median 13·1 % of energy (E%); range 9–23 E%) and linoleic acid (LA; median 6·3 E%; range 1·9–11·3 E%). The percentages of patients with dietary intakes below recommendations were 62 % (fruits; recommendation 200 g/d), 87 % (vegetables; recommendation 150–200 g/d), 73 % (dietary fibre; recommendation 30–45 g/d), 91 % (EPA+DHA; recommendation 0·45 g/d), 98 % (vitamin D; recommendation 10–20 μg/d) and 13 % (LA; recommendation 5–10 E%). The percentages of patients with dietary intakes above recommendations were 95 % (saturated fat; recommendation < 10 E%) and 7 % (LA). The dietary intakes of patients proved comparable with the average nutritional intake of the age- and sex-matched healthy Dutch population. These unbalanced pre-operative diets may put patients at risk of unfavourable surgical outcomes, since they promote a pro-inflammatory state. We conclude that there is an urgent need for intervention trials aiming at rapid improvement of their diets to reduce peri-operative risks.
Rates of obesity are increasing in women of childbearing age, with negative impacts on maternal and offspring health. Emerging evidence suggests in utero origins of respiratory health in offspring of obese mothers, but the mechanisms are unknown. Changes in maternal cortisol levels are one potential factor, as cortisol levels are altered in obesity and cortisol is separately implicated in the development of offspring wheeze. We aimed to assess whether increased pre-pregnancy maternal body mass index (BMI) was associated with offspring early life wheezing, and whether this was mediated by altered cortisol levels in the mother. In a prospective community-based cohort (Amsterdam Born Children and their Development cohort), women completed questionnaires during pregnancy and at 3–5 months post-delivery regarding self-history of asthma and atopy, and wheezing of their offspring (n=4860). Pre-pregnancy BMI was recorded and serum total cortisol levels were measured in a subset of women (n=2227) at their first antenatal visit. A total of 20.2% (n=984) of the women were overweight or obese and 10.3% reported wheezing in their offspring. Maternal BMI was associated with offspring wheezing (1 unit (kg/m2) increase, OR: 1.03; 95% CI: 1.00–1.05), after correction for confounders. Although maternal cortisol levels were lower in overweight mothers and those with a history of asthma, maternal cortisol levels did not mediate the increased offspring wheezing. Pre-pregnancy BMI thus influences offspring wheezing, and this effect is not mediated by lower cortisol levels. As the prevalence of obesity in women of childbearing age is increasing, further studies are needed to investigate modifiable maternal factors to avoid risk of wheezing in young children.
Bioethics has made remarkable progress as a scholarly and applied field. A mere fledgling in the 1960s, it is now firmly established in hospitals, medical schools, and government agencies and boasts a number of professional associations and a handsome collection of journals.
During the early phase of a large-scale accident with release of radioactivity to the atmosphere, it is essential to notify and inform competent authorities as early and as extensively as possible. Only when the accident is rapidly notified and information is continuously made available in the form of real-time monitoring data and dispersion forecasts are decision makers able to define appropriate countermeasures. The Chernobyl accident taught us that information exchange should be carried out in a harmonised and consistent manner. Although several European countries already had developed automatic monitoring networks by 1986 and in some cases established bilateral agreements to exchange this information, the size of the accident demonstrated the need to extend such schemes to the continental scale. It became important to have commonly agreed international data formats and procedures in place. Over the past 25 years, the European Commission has invested in improving the rapid exchange of information and data in the event of a major accident. For the early phase of emergency support, it has focussed on three closely related systems: the early notification system ECURIE, the automatic data exchange platform EURDEP and the atmospheric dispersion model exchange and evaluation system ENSEMBLE. Starting from the legal background, we describe these information systems in detail with an emphasis on their current status and their planned future developments.
The aim of this study is to specify the concept of ‘healthy ageing’ from both western and non-western cultural perspectives, and to compare the views of academics and older lay people. Thirty-four published peer-reviewed full papers in English and Chinese (traditional characters) were identified using electronic database searches. The key components of their definitions of healthy ageing were extracted and categorised into 12 domains. The results show that, in general, lay definitions (as described in 11 studies) included more domains (independency, family, adaptation, financial security, personal growth, and spirituality) and more diversity in the healthy ageing concept than academic views (which tend to focus more on physical and mental health and social functioning in later life). Certain domains were valued differently across cultures. As shown in previous studies, the findings affirm that healthy ageing is a multi-dimensional and complex concept and that there are substantial differences across cultures. Moreover, we found that there are pronounced variations in the conceptualisation of healthy ageing between academics and older lay people. Generally, older lay people perceive healthy ageing more broadly than the maintenance of physical, mental and social functioning. We suggest that academic researchers should integrate the more holistic perspectives of older lay people and cultural diversity into the classical ‘physical–mental–social’ healthy ageing concept.
The objective of the present paper is to review the methods of measuring micronutrient intake adequacy for individuals and for populations in order to ascertain best practice. A systematic review was conducted to locate studies on the methodological aspects of measuring nutrient adequacy. The results showed that for individuals, qualitative methods (to find probability of adequacy) and quantitative methods (to find confidence of adequacy) have been proposed for micronutrients where there is enough data to set an average nutrient requirement (ANR). If a micronutrient does not have an ANR, an adequate intake (AI) is often defined and can be used to assess adequacy, provided the distribution of daily intake over a number of days is known. The probability of an individual's intake being excessive can also be compared with the upper level of safe intake and the confidence of this estimate determined in a similar way. At the population level, adequacy can be judged from the ANR using the probability approach or its short cut – the estimated average requirement cut-point method. If the micronutrient does not have an ANR, adequacy cannot be determined from the average intake and must be expressed differently. The upper level of safe intake can be used for populations in a similar way to that for individuals. All of the methodological studies reviewed were from the American continent and all used the methodology described in the Institute of Medicine publications. The present methodology should now be adapted for use in Europe.
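The estimated average requirement cut-point method described above reduces, in its simplest form, to the proportion of the population whose usual intake falls below the ANR. A minimal sketch with invented intake values:

```python
# EAR cut-point method: the prevalence of inadequate intake in a
# population is approximated by the fraction of individuals whose
# usual intake is below the average nutrient requirement (ANR).
# The intakes (mg/d) and ANR below are invented for illustration.
usual_intakes_mg = [55, 62, 48, 90, 71, 40, 66, 58, 83, 45]
anr_mg = 60

def prevalence_inadequate(intakes, anr):
    """Fraction of the population with usual intake below the ANR."""
    return sum(x < anr for x in intakes) / len(intakes)

prevalence = prevalence_inadequate(usual_intakes_mg, anr_mg)
```

The cut-point shortcut assumes, among other things, that intake and requirement are uncorrelated and that requirements are symmetrically distributed; the full probability approach relaxes these assumptions.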
All is clouded by desire: as fire by smoke, as a mirror by dust.… Wisdom is clouded by desire, the ever present enemy of the wise.
– Krishna to Arjuna, Bhagavadgita, book 3
Most of us, academics or laypersons, accept the straightforward and simple syllogism: scientific discovery → change or improvement in medical practices → new ethical problems → bioethical advice or solutions. While there are some cynics among us, the progression from science to practice to ethical adjustment seems logical, if not natural. But the real world, alas, is not a clean and logical place. Our tidy idea of a natural progression conceals a messy process where competing information, demands, pressures, values, and emotions interact to produce scientific discovery, new medical practices, and bioethical advice. We may prefer to see bioethics as part of a well-organized division of labor, but it is, in fact, one part of the complex world of science, medicine, and health care.
Called upon to provide reasoned and defensible advice on the ethical quandaries that emerge from this jumble, bioethicists must find a way to collect information, strip away confounding factors, and zero in on the essential dilemma. Clearing away the clutter provides clarity for bioethics, but it also obscures the social origins of the facts that are brought to bear in bioethical deliberation. When consultant bioethicists consider ethical problems associated with genetic therapy, for example, they ask about risks and benefits of the procedure, scrutinize the informed-consent process, question the effects of altered genes on the population, and consider who will and will not have access to new treatments.
ELSA (European Leadership in Space Astrometry) is an EU-funded research project 2006–2010, contributing to the scientific preparations for the Gaia mission while training young researchers in space astrometry and related subjects. Nine postgraduate (PhD) students and five postdocs have been recruited to the network. Their research focuses on the principles of global astrometric, photometric, and spectroscopic measurements from space, instrument modelling and calibration, and numerical analysis tools and data processing methods relevant for Gaia.
Truncated models are indirect methods to estimate the size of a hidden population which, in contrast to the capture–recapture method, can be used on a single information source. We estimated the coverage of a tuberculosis screening programme among illicit drug users and homeless persons with a mobile digital X-ray unit between 1 January 2003 and 31 December 2005 in Rotterdam, The Netherlands, using truncated models. The screening programme reached about two-thirds of the estimated target population at least once annually. The intended coverage (at least two chest X-rays per person per year) was about 23%. We conclude that simple truncated models can be used relatively easily on available single-source routine data to estimate the size of a population of illicit drug users and homeless persons. We assumed that the most likely overall bias in this study would be overestimation of the population size, and therefore that the true coverage of the targeted mobile tuberculosis screening programme would be higher than estimated.
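The abstract does not specify which truncated model was fitted; one common single-source (zero-truncated) estimator, Chao's lower bound, can serve as an illustrative sketch. The screening-frequency counts below are invented:

```python
# Chao's lower-bound estimator for a hidden population from a single
# source of repeated observations:
#   N_hat = n + f1**2 / (2 * f2)
# where n is the number of distinct people observed, and f1, f2 are
# the numbers seen exactly once and exactly twice. The study's own
# truncated model may differ; counts here are hypothetical.

def chao_estimate(freq_counts):
    """freq_counts[k] = number of people observed exactly k times (k >= 1)."""
    n = sum(freq_counts.values())
    f1, f2 = freq_counts.get(1, 0), freq_counts.get(2, 0)
    return n + f1 ** 2 / (2 * f2)

freq = {1: 400, 2: 100, 3: 30, 4: 10}   # hypothetical screening counts
estimated_total = chao_estimate(freq)   # observed n = 540
coverage = sum(freq.values()) / estimated_total
```

Coverage of the programme is then the observed count divided by the estimated total population.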
Gigahertz Peaked Spectrum (GPS) radio galaxies are generally thought to be the young counterparts of classical extended radio sources and to live in massive ellipticals. GPS sources are vital for studying the early evolution of radio-loud AGN, the trigger of their nuclear activity, and the importance of feedback in galaxy evolution. We study the Parkes half-Jansky sample of GPS radio galaxies, for which all host galaxies have now been identified and 80% have their redshifts determined (0.122 < z < 1.539). Analysis of the absolute magnitudes of the GPS host galaxies shows that at z > 1 they are on average a magnitude fainter than classical 3C radio galaxies. This suggests that the AGN in young radio galaxies have not yet much influenced the overall properties of the host galaxy. However, their rest-frame UV luminosities indicate a low level of excess compared with passive evolution models.
The aim of this study was to describe a systematic process of record-linkage, cross-validation, case-ascertainment and capture–recapture analysis to assess the quality of tuberculosis registers and to estimate the completeness of notification of incident tuberculosis cases in The Netherlands in 1998. After record-linkage and cross-validation 1499 tuberculosis patients were identified, of whom 1298 were notified, resulting in an observed under-notification of 13·4%. After adjustment for possible imperfect record-linkage and remaining false-positive hospital cases observed under-notification was 7·3%. Log-linear capture–recapture analysis initially estimated a total number of 2053 (95% CI 1871–2443) tuberculosis cases, resulting in an estimated under-notification of 36·8%. After adjustment for possible imperfect record-linkage and remaining false-positive hospital cases various capture–recapture models estimated under-notification at 13·6%. One of the reasons for the higher than expected estimated under-notification in a country with a well-organized system of tuberculosis control might be that some tuberculosis cases, e.g. extrapulmonary tuberculosis, are managed by clinicians less familiar with notification of infectious diseases. This study demonstrates the possible impact of violation of assumptions underlying capture–recapture analysis, especially the perfect record-linkage, perfect positive predictive value and absent three-way interaction assumptions.
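The log-linear capture–recapture analysis described above generalises the classical two-source (Lincoln–Petersen) estimator to multiple registers with interaction terms. A minimal two-source sketch with invented counts:

```python
# Two-source capture-recapture (Lincoln-Petersen) sketch:
#   N_hat = n_a * n_b / n_ab
# where n_a and n_b are the cases found in registers A and B, and
# n_ab the cases linked in both. This assumes perfect record-linkage
# and independent sources, the very assumptions the study examines.
# Counts below are hypothetical, not the study's data.

def lincoln_petersen(n_a, n_b, n_ab):
    """Estimate total population size from two overlapping registers."""
    return n_a * n_b / n_ab

n_notified, n_hospital, n_both = 1298, 900, 780   # hypothetical counts
n_hat = lincoln_petersen(n_notified, n_hospital, n_both)
under_notification = 1 - n_notified / n_hat
```

Violations of the independence or perfect-linkage assumptions bias the estimate, which is why the study fits log-linear models with interaction terms across three sources rather than relying on this two-source form.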