Psychiatric services providing care for patients and their families confronted with a first psychotic episode need to be sensitive to patients’ and families’ preferences. Ten patients, ten family members and ten professional caregivers compiled a list of 42 preferences regarding the treatment of a first psychotic episode. In total, 99 patients, 100 family members and 263 professional caregivers evaluated these preferences, producing an order of priorities. There is considerable agreement among the groups of respondents regarding their top ten priorities, especially concerning information on diagnosis and medication, but we also found important differences between groups. The results suggest that psychiatric services should pay close attention to psycho-education and early outpatient intervention.
Surface melt on the coastal Antarctic ice sheet (AIS) determines the viability of its ice shelves and the stability of the grounded ice sheet, but very few in situ melt rate estimates exist to date. Here we present a benchmark dataset of in situ surface melt rates and energy balance from nine sites in the eastern Antarctic Peninsula (AP) and coastal Dronning Maud Land (DML), East Antarctica, seven of which are located on AIS ice shelves. Meteorological time series from eight automatic and one staffed weather station (Neumayer), ranging in length from 15 months to almost 24 years, serve as input for an energy-balance model to obtain consistent surface melt rates and energy-balance results. We find that surface melt rates exhibit large temporal, spatial and process variability. Intermittent summer melt in coastal DML is primarily driven by absorption of shortwave radiation, while non-summer melt events in the eastern AP occur during föhn events that force a large downward directed turbulent flux of sensible heat. We use the in situ surface melt rate dataset to evaluate melt rates from the regional atmospheric climate model RACMO2 and validate a melt product from the QuikSCAT satellite.
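The energy-balance approach described here closes the surface energy budget and, at a melting surface, attributes the residual flux to melt. A minimal sketch of that bookkeeping in Python follows; the flux values, variable names and the simple positive-residual rule are illustrative assumptions, not the authors' model or RACMO2.

```python
# Minimal surface-melt sketch: when the skin temperature sits at the melting
# point, the positive residual of the surface energy balance is consumed by
# melt. All inputs are illustrative; a real model iterates the skin temperature.

L_FUSION = 3.34e5   # latent heat of fusion of ice (J/kg)
RHO_WATER = 1000.0  # density of water (kg/m^3)

def melt_rate_mm_we_per_day(sw_net, lw_net, sensible, latent, ground,
                            surface_at_melting_point):
    """Return surface melt in mm water equivalent per day.

    sw_net, lw_net   : net shortwave / longwave radiation (W/m^2, + toward surface)
    sensible, latent : turbulent heat fluxes (W/m^2, + toward surface)
    ground           : subsurface conductive flux (W/m^2, + toward surface)
    """
    residual = sw_net + lw_net + sensible + latent + ground  # W/m^2
    if not surface_at_melting_point or residual <= 0.0:
        return 0.0
    melt_kg_per_m2_s = residual / L_FUSION                   # kg m^-2 s^-1
    return melt_kg_per_m2_s / RHO_WATER * 1000.0 * 86400.0   # mm w.e./day

# Summer DML-style case: melt driven mainly by absorbed shortwave radiation.
print(melt_rate_mm_we_per_day(80.0, -40.0, 5.0, -10.0, 2.0, True))
# Foehn-style case: a large downward sensible-heat flux sustains melt.
print(melt_rate_mm_we_per_day(10.0, -30.0, 60.0, -5.0, 0.0, True))
```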
Forage maize (Zea mays L.) is often grown year after year on the same land on many intensive dairy farms in north-west Europe. This practice causes agronomic problems such as weed resistance and declining soil quality, which may be solved by ley–arable farming. In the current study, forage maize was grown at different nitrogen (N) fertilization levels for 3 years on permanent arable land and on temporary arable land after ploughing out different types of grass–clover swards. Swards differed in management (grazing or cutting) and age (temporary or permanent). Maize yield and soil residual mineral N content were measured after the maize harvest. The management of the ploughed-out grass–clover swards had no effect on maize yield, but their age had a clear effect. The N fertilizer replacement value (NFRV) of all ploughed grass–clover swards was >170 kg N/ha in the first year after ploughing. In the third year after ploughing, the NFRV of the permanent sward still exceeded 200 kg N/ha, whereas that of the temporary swards decreased to 30 kg N/ha on average. Soil residual nitrate (NO3−) remained below the local legal threshold of 90 kg NO3−-N/ha, except for the ploughed-out permanent sward in the third year after ploughing (166 kg NO3−-N/ha). The current study highlights the potential of forage maize–ley rotations for saving fertilizer N, which benefits both the environment and the profitability of dairy production in north-western Europe.
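The NFRV quoted above is, in essence, the fertilizer-N rate that continuous maize would need in order to match the yield obtained after a ploughed-out sward with no fertilizer at all. A hedged sketch of that interpolation is below; the response-curve points are invented for illustration and do not come from the paper.

```python
import numpy as np

# Hypothetical N response of maize on permanent arable land:
# fertilizer N applied (kg N/ha) -> dry-matter yield (t DM/ha).
n_rates = np.array([0.0, 50.0, 100.0, 150.0, 200.0, 250.0])
yields_reference = np.array([8.0, 10.5, 12.3, 13.4, 14.0, 14.3])

def nfrv(yield_after_sward_unfertilized):
    """Fertilizer N (kg/ha) the reference field needs to reach the yield
    that the ploughed-out sward delivered with zero fertilizer N."""
    # Invert the monotone response curve by linear interpolation.
    return float(np.interp(yield_after_sward_unfertilized,
                           yields_reference, n_rates))

# E.g. unfertilized maize after a ploughed permanent sward yields 13.8 t DM/ha.
print(f"NFRV ≈ {nfrv(13.8):.0f} kg N/ha")  # ≈ 183 kg N/ha with these numbers
```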
Introduction of biofortified cassava as a school lunch can increase vitamin A intake, but may increase the risk of other deficiencies owing to the poor nutrient profile of cassava. We assessed the potential effect of introducing a yellow cassava-based school lunch, combined with additional food-based recommendations (FBR), on vitamin A and overall nutrient adequacy using Optifood (a linear programming tool).
Design
Cross-sectional study to assess dietary intakes (24 h recall) and derive model parameters (list of foods consumed, median serving sizes, food and food (sub)group frequency distributions, food cost). Three scenarios were modelled, each a daily diet including: (i) no school lunch; (ii) a standard 5 d school lunch with maize/beans; and (iii) a 5 d school lunch with yellow cassava. Each scenario, and scenario (iii) with additional FBR, was assessed on overall nutrient adequacy using recommended nutrient intakes (RNI).
Setting
Eastern Kenya.
Subjects
Primary-school children (n 150) aged 7–9 years.
Results
The best food pattern of the yellow cassava-based lunch scenario achieved 100 % of the RNI for six nutrients, compared with three nutrients in the no-lunch scenario and five in the standard-lunch scenario. FBR with yellow cassava that included small dried fish improved nutrient adequacy, but could not ensure adequate intake of fat (52 % of the average requirement), riboflavin (50 % of RNI), folate (59 % of RNI) and vitamin A (49 % of RNI).
Conclusions
Introduction of a yellow cassava-based school lunch complemented with FBR potentially improves vitamin A adequacy, but alternative interventions are needed to ensure overall dietary adequacy. Optifood is useful for assessing the potential contribution of a biofortified crop to nutrient adequacy and for developing additional FBR to address remaining nutrient gaps.
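Optifood itself is a dedicated linear-programming tool, but the underlying idea, choosing food quantities that satisfy nutrient constraints within realistic serving bounds at minimal cost, can be sketched with a generic LP solver. The foods, prices and nutrient numbers below are placeholders, not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Toy diet problem in the spirit of Optifood: minimise cost subject to
# meeting RNIs, with serving bounds reflecting observed food patterns.
foods = ["yellow cassava", "small dried fish", "beans", "kale"]
cost = np.array([0.10, 0.80, 0.30, 0.15])   # cost per serving (placeholder)

# Rows: vitamin A (ug RAE), protein (g), folate (ug) per serving (placeholder).
nutrients = np.array([
    [120.0, 15.0,  0.0, 90.0],   # vitamin A
    [  1.5, 18.0,  8.0,  2.0],   # protein
    [ 25.0, 10.0, 80.0, 60.0],   # folate
])
rni = np.array([500.0, 30.0, 300.0])

# linprog solves "A_ub @ x <= b_ub", so express nutrient >= RNI as -A x <= -RNI.
res = linprog(c=cost,
              A_ub=-nutrients, b_ub=-rni,
              bounds=[(0, 6)] * len(foods),   # 0-6 servings/day per food
              method="highs")

if res.success:
    for food, servings in zip(foods, res.x):
        print(f"{food}: {servings:.1f} servings/day")
    print(f"daily cost: {res.fun:.2f}")
```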
Objectives: In complex real-life situations, memories for temporal and spatial information are naturally linked, since sequential events coincide in time and space. This patient study investigated whether this connection is inseparable or whether these processes are functionally dissociable. Methods: Spatial object-location and temporal-order memory tasks were administered to 36 stroke patients and 44 healthy control participants. Results: At the group level, patients with a stroke in the left hemisphere performed worse on temporal-order memory than the control participants. At the individual level, using a multiple case-study approach, a clear pattern of dissociations was found between memory for temporal and for spatial features. Conclusions: These findings indicate that location and temporal-order memory involve functionally separable processes, adding to our understanding of how context information is processed in human memory. (JINS, 2017, 23, 421–430)
Supplementation with n-3 fatty acids may improve long-term outcomes of renal transplant recipients (RTR). Recent evidence suggests that EPA and DHA affect outcomes differently from α-linolenic acid (ALA). We examined the prospective associations of EPA–DHA and ALA intakes with graft failure and all-cause mortality in 637 RTR. During 3·1 years (interquartile range 2·7, 3·8) of follow-up, forty-one RTR developed graft failure and sixty-seven died. In age- and sex-adjusted analyses, EPA–DHA and ALA intakes were not associated with graft failure. EPA–DHA intake was not significantly associated with mortality (hazard ratio (HR) 0·79; 95% CI 0·54, 1·15 per 0·1 energy% difference). ALA intake was significantly associated with mortality (HR 1·17; 95% CI 1·04, 1·31 per 0·1 energy% difference). This association remained following adjustments for BMI, proteinuria and intakes of fat, carbohydrate and protein. RTR in the highest tertile of ALA intake exhibited an approximately 2-fold higher mortality risk (HR 2·21; 95% CI 1·23, 3·97) compared with the lowest tertile. In conclusion, ALA intake may be associated with increased mortality in RTR. Future RCT are needed to confirm these results.
Current ultra-high-risk (UHR) criteria appear insufficient to predict the imminent onset of a first-episode psychosis, as a meta-analysis showed that only about 20% of UHR patients have a psychotic outcome after 2 years. We therefore aimed to develop a stage-dependent predictive model in UHR individuals who were seeking help for co-morbid disorders.
Method
Baseline data on symptomatology, and environmental and psychological factors of 185 UHR patients (aged 14–35 years) participating in the Dutch Early Detection and Intervention Evaluation study were analysed with Cox proportional hazard analyses.
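A Cox proportional hazards analysis of time to transition, as used here, can be sketched with the lifelines package; the toy data frame and column names below are assumptions for illustration, not the EDIE dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy UHR follow-up data: months until transition (or censoring) plus
# candidate baseline predictors. Column names are illustrative only.
df = pd.DataFrame({
    "months_followup": [18, 7, 18, 12, 18, 4, 18, 15],
    "transitioned":    [0, 1, 0, 1, 0, 1, 0, 1],   # 1 = psychosis onset
    "blunted_affect":  [0, 1, 0, 1, 0, 1, 1, 0],
    "social_decline":  [2, 8, 1, 6, 3, 9, 4, 7],   # e.g. drop in functioning
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followup", event_col="transitioned")
cph.print_summary()   # hazard ratios with 95% CIs

# A prognostic index is the linear predictor; tertiles give risk classes.
df["prognostic_index"] = cph.predict_partial_hazard(df).to_numpy().ravel()
df["risk_class"] = pd.qcut(df["prognostic_index"], 3,
                           labels=["low", "middle", "high"])
```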
Results
At 18 months, the overall transition rate was 17.3%. The final predictor model included five variables: observed blunted affect [hazard ratio (HR) 3.39, 95% confidence interval (CI) 1.56–7.35, p < 0.001], subjective complaints of impaired motor function (HR 5.88, 95% CI 1.21–6.10, p = 0.02), beliefs about social marginalization (HR 2.76, 95% CI 1.14–6.72, p = 0.03), decline in social functioning (HR 1.10, 95% CI 1.01–1.17, p = 0.03), and distress associated with suspiciousness (HR 1.02, 95% CI 1.00–1.03, p = 0.01). The positive predictive value of the model was 80.0%. The resulting prognostic index stratified the general risk into three risk classes with significantly different survival curves. In the highest risk class, transition to psychosis emerged on average ⩾8 months earlier than in the lowest risk class.
Conclusions
The prediction of a first-episode psychosis in help-seeking UHR patients was improved by using a stage-dependent prognostic model including negative psychotic symptoms (observed blunted affect, subjective impaired motor functioning), impaired social functioning and distress associated with suspiciousness. Treatment intensity may be stratified and personalized using this risk stratification.
Tricyclic antidepressants (TCAs) and selective serotonin reuptake inhibitors (SSRIs) may be associated with lower heart rate variability (HRV), a condition associated with increased mortality risk. We aimed to investigate the association between TCAs, SSRIs and HRV in a population-based study.
Method
In the prospective Rotterdam Study cohort, up to five electrocardiograms (ECGs) per participant were recorded (1991–2012). Two HRV variables were studied based on 10-s ECG recordings: the standard deviation of normal-to-normal RR intervals (SDNN) and the root mean square of successive RR interval differences (RMSSD). We compared the HRV on ECGs recorded during use of antidepressants with the HRV on ECGs recorded during non-use of any antidepressant. Additionally, we analysed the change in HRV on consecutive ECGs: participants who started or stopped using antidepressants before the second ECG were compared with participants who used no antidepressants at either ECG.
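The two HRV measures used here have simple definitions over the normal-to-normal (NN) intervals of a recording: SDNN is their standard deviation and RMSSD the root mean square of successive differences. A small sketch with made-up RR intervals in milliseconds:

```python
import numpy as np

def sdnn(nn_ms):
    """Standard deviation of normal-to-normal RR intervals (ms)."""
    return float(np.std(nn_ms, ddof=1))

def rmssd(nn_ms):
    """Root mean square of successive NN interval differences (ms)."""
    diffs = np.diff(nn_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# RR intervals from a hypothetical 10-s ECG strip (ms), ~72 beats/min.
rr = np.array([830, 845, 810, 860, 825, 840, 815, 850, 835, 820, 845, 830])
print(f"SDNN  = {sdnn(rr):.1f} ms")
print(f"RMSSD = {rmssd(rr):.1f} ms")
```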
Results
We included 23 647 ECGs from 11 729 participants (59% women, mean age 64.6 years at baseline). Compared to ECGs recorded during non-use of antidepressants (n = 22 971), SDNN and RMSSD were lower in ECGs recorded during use of TCAs (n = 296) and SSRIs (n = 380). Participants who started using TCAs before the second ECG had a decrease in HRV and those who stopped had an increase in HRV compared to consistent non-users (p < 0.001). Starting or stopping SSRIs was not associated with HRV changes.
Conclusion
TCAs were associated with a lower HRV in all analyses, indicating a real drug effect. For SSRIs the results are mixed, indicating a weaker association, possibly due to other factors.
Previous research has established the relationship between cannabis use and psychotic disorders. Whether cannabis use is related to transition to psychosis in patients at ultra-high risk (UHR) for psychosis remains unclear. The present study aimed to review the existing evidence on the association between cannabis use and transition to psychosis in UHR samples.
Method
A search of PsycINFO, Embase and Medline was conducted covering 1996 to August 2015. The search yielded 5559 potentially relevant articles, which were screened on title and abstract. Subsequently, 36 articles were screened on full text for eligibility. Two random-effects meta-analyses were performed. First, we compared transition rates to psychosis of UHR individuals with lifetime cannabis use against non-cannabis-using UHR individuals. Second, we compared transition rates of UHR individuals with a current DSM-IV cannabis abuse or dependence diagnosis against lifetime users and non-using UHR individuals.
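The random-effects pooling of odds ratios described here can be reproduced with the standard DerSimonian–Laird estimator on log odds ratios. A self-contained sketch follows; the per-study 2×2 counts are invented for illustration only.

```python
import numpy as np

# Hypothetical per-study counts: (transitions, total) for cannabis-using and
# non-using UHR groups. These numbers are illustrative, not the review's data.
studies = [  # (events_exposed, n_exposed, events_unexposed, n_unexposed)
    (12, 60, 15, 90), (8, 40, 10, 70), (20, 110, 18, 120),
    (5, 35, 9, 55), (14, 80, 12, 95),
]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                 # non-events in each group
    log_or.append(np.log((a * d) / (b * c)))
    var.append(1/a + 1/b + 1/c + 1/d)     # Woolf variance of the log OR
log_or, var = np.array(log_or), np.array(var)

# DerSimonian-Laird between-study variance tau^2.
w = 1 / var
q = np.sum(w * (log_or - np.sum(w * log_or) / np.sum(w)) ** 2)
df = len(studies) - 1
c_dl = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - df) / c_dl)

# Random-effects pooled OR with 95% CI.
w_re = 1 / (var + tau2)
pooled = np.sum(w_re * log_or) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"OR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```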
Results
We found seven prospective studies reporting on lifetime cannabis use in UHR subjects (n = 1171). Of these studies, five also examined current cannabis abuse or dependence. Lifetime cannabis use was not significantly associated with transition to psychosis [odds ratio (OR) 1.14, 95% confidence interval (CI) 0.856–1.524, p = 0.37]. A second meta-analysis yielded an OR of 1.75 (95% CI 1.135–2.710, p = 0.01), indicating a significant association between current cannabis abuse or dependence and transition to psychosis.
Conclusions
Our results show that cannabis use was only predictive of transition to psychosis in those who met criteria for cannabis abuse or dependence, tentatively suggesting a dose–response relationship between current cannabis use and transition to psychosis.
Severe depression can be a life-threatening disorder, especially in elderly patients. A fast-acting treatment is crucial for this group. Electroconvulsive therapy (ECT) may work faster than medication.
Aims
To compare the speed of remission using ECT v. medication in elderly in-patients.
Method
The speed of remission in in-patients with a DSM-IV diagnosis of major depression (baseline MADRS score ≥20) was compared between 47 participants (mean age 74.0 years, s.d. = 7.4) from an ECT randomised controlled trial (RCT) and 81 participants (mean age 72.2 years, s.d. = 7.6) from a medication RCT (nortriptyline v. venlafaxine).
Results
Mean time to remission was 3.1 weeks (s.d. = 1.1) for the ECT group and 4.0 weeks (s.d. = 1.0) for the medication group; the adjusted hazard ratio for remission within 5 weeks (ECT v. medication) was 3.4 (95% CI 1.9–6.2).
Conclusions
Considering the substantially higher speed of remission, ECT deserves a more prominent position in the treatment of elderly patients with severe depression.
Hypertension is highly prevalent among renal transplant recipients (RTR) and a risk factor for graft failure and cardiovascular events. Protein intake has been claimed to affect blood pressure (BP) in the general population and may affect renal function. We examined the association of dietary protein with BP and renal function in RTR. We included 625 RTR (age 53 (sd 13) years; 57 % male). Protein intake was assessed with a FFQ, differentiating between animal and plant protein. BP was measured according to a strict protocol. Creatinine clearance and albuminuria were measured as renal parameters. Protein intake was 83 (sd 12) g/d, of which 63 % derived from animal sources. BP was 136 (sd 17) mmHg systolic (SBP) and 83 (sd 11) mmHg diastolic (DBP). Creatinine clearance was 66 (sd 26) ml/min; albuminuria 41 (10–178) mg/24 h. An inverse, though statistically non-significant, association was found between total protein intake and both SBP (β = − 2·22 mmHg per sd, P= 0·07) and DBP (β = − 0·48 mmHg per sd, P= 0·5). Protein intake was not associated with creatinine clearance. Although albuminuria was slightly higher in the highest tertile of animal protein intake than in the lowest tertile (66 v. 33 mg/d, P= 0·03), linear regression analyses did not reveal significant associations between dietary protein and albuminuria. Protein intake exceeded the current recommendations. Nevertheless, within the range of protein intake in our RTR population, we found no evidence for an association of dietary protein with BP or renal function. Intervention studies focusing on different protein types are warranted to clarify their effect on BP and renal function in RTR.
A decline in everyday cognitive functioning is important for diagnosing dementia. Informant questionnaires, such as the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE), are used to measure this. Previous studies have reported conflicting results on the IQCODE's ability to discriminate between Alzheimer's disease (AD), mild cognitive impairment (MCI), and cognitively healthy elderly. We aimed to investigate whether specific groups of items are more useful than others in discriminating between these patient groups. Informants of 180 AD, 59 MCI, and 89 patients with subjective memory complaints (SMC) completed the IQCODE. To investigate the grouping of questionnaire items, we used a two-dimensional graded response model (GRM). The association between IQCODE, age, gender, education, and diagnosis was modeled using structural equation modeling. The GRM with two groups of items fitted better than the unidimensional model. However, the high correlation between the dimensions (r=.90) suggested unidimensionality. The structural model showed that the IQCODE was able to differentiate between all patient groups. The IQCODE can be considered unidimensional and a useful addition to diagnostic screening in a memory clinic setting, as it was able to distinguish between AD, MCI, and SMC and was not influenced by gender or education. (JINS, 2011, 17, 674–681)
Clinical research shows that nutritional intervention is necessary to prevent malnutrition in head and neck cancer patients undergoing radiotherapy. The objective of the present study was to assess the value of individually adjusted counselling by a dietitian compared with standard nutritional care (SC). A prospective study, conducted between 2005 and 2007, compared individual dietary counselling (IDC, covering optimal energy and protein requirements) with SC by an oncology nurse (standard nutritional counselling). Endpoints were weight loss, BMI and malnutrition (5 % weight loss/month) before, during and after the treatment. Thirty-eight patients were included, evenly distributed over the two groups. A significant reduction in weight loss was found 2 months after the treatment (P = 0·03) for IDC compared with SC. Malnutrition in patients receiving IDC decreased over time, while malnutrition increased in patients receiving SC (P = 0·02). Early and intensive individualised dietary counselling by a dietitian therefore produces clinically relevant effects, decreasing weight loss and malnutrition compared with SC in patients with head and neck cancer undergoing radiotherapy.
Fibroblast growth factor (FGF)-2 (basic FGF) is a potent angiogenic molecule involved in tumor progression, and is one of several growth factors with a central role in ovarian carcinogenesis. We hypothesized that common single nucleotide polymorphisms (SNPs) in the FGF2 gene may alter angiogenic potential and thereby susceptibility to ovarian cancer. We analyzed 25 FGF2 tagging SNPs (tagSNPs) using five independent study populations from the United States and Australia. Analysis was restricted to non-Hispanic White women with serous ovarian carcinoma (1269 cases and 2829 controls). There were no statistically significant associations between any FGF2 SNP and ovarian cancer risk. There were two nominally significant associations between heterozygosity for two FGF2 SNPs (rs308379 and rs308447; p < .05) and serous ovarian cancer risk in the combined dataset, but the rare-homozygote estimates did not achieve statistical significance, nor were they consistent with a log-additive model of inheritance. Overall, genetic variation in FGF2 does not appear to play a role in susceptibility to ovarian cancer.
Although symptoms such as fatigue, headache and pain in bones and muscles are common after disasters, risk factors for these symptoms among disaster survivors have rarely been studied. We examined predisposing, precipitating and perpetuating factors for these physical symptoms among survivors of a man-made disaster. In addition, we examined whether risk factors for physical symptoms differ between survivors and controls.
Method
Survivors completed a questionnaire 3 weeks (n=1567), 18 months and 4 years after the disaster. Symptoms and risk factors were measured using validated questionnaires. A comparison group was included at waves 2 and 3 (n=821). Random coefficient analysis (RCA) was used to study risk factors for symptoms.
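Random coefficient analysis is a mixed-effects regression with subject-specific intercepts (and, with repeated waves, possibly subject-specific slopes). A minimal sketch with statsmodels follows; the data frame, variable names and simulated values are assumptions for illustration, not the survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy long-format panel: each survivor measured at up to three waves.
rng = np.random.default_rng(0)
n, waves = 200, 3
df = pd.DataFrame({
    "subject":   np.repeat(np.arange(n), waves),
    "wave":      np.tile(np.arange(waves), n),
    "female":    np.repeat(rng.integers(0, 2, n), waves),
    "immigrant": np.repeat(rng.integers(0, 2, n), waves),
})
df["symptoms"] = (5 + 1.0 * df["female"] + 1.0 * df["immigrant"]
                  + rng.normal(0, 2, len(df)))

# Random coefficient model: fixed effects for the risk factors, plus a
# random intercept per subject and a random slope over waves.
model = smf.mixedlm("symptoms ~ female + immigrant + wave",
                    df, groups=df["subject"], re_formula="~wave")
result = model.fit()
print(result.summary())   # fixed-effect betas with 95% CIs
```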
Results
Female gender [beta (β)=1.0, 95% confidence interval (CI) 0.6–1.4], immigrant status (β=1.0, 95% CI 0.6–1.4) and pre-disaster psychological problems (β=0.8, 95% CI 0.1–1.4) were predisposing factors for symptoms. Although disaster-related factors were predictors, their association with symptoms was not very strong, and its magnitude was reduced when perpetuating factors were added. Intrusions and avoidance, depression, anxiety and sleeping problems were important perpetuating factors for physical symptoms among survivors and mediated the association between traumatic stress and physical symptoms. Risk factors for symptoms were comparable between survivors and controls.
Conclusions
The results indicate that health-care workers should be alert for physical symptoms among female survivors, immigrant survivors and individuals with a high level of psychological problems both before and after a disaster.
Type 2 diabetes mellitus (DM2) is a common metabolic disorder. DM2 is associated with cognitive impairments and with depressive symptoms, which occur in about one third of patients. In the current study we compared the cognitive profile and psychological well-being of 119 patients with DM2 (mean age: 66 ± 6 years; mean disease duration: 9 ± 6 years) with those of 55 age- and education-matched control participants. Groups were compared on cognitive performance in five major cognitive domains, psychological well-being [assessed with the Symptom Checklist (SCL)-90-R and the Beck Depression Inventory (BDI-II)] and abnormalities on brain MRI. We hypothesized an interrelationship between cognition, MRI abnormalities, and psychological well-being. DM2 patients performed significantly worse than controls on cognitive tasks, especially on tasks that required more mental efficiency, although the differences were modest (effect sizes Cohen d < .6). We speculate that DM2 patients have a diminished ability to efficiently process unstructured information. Patients with DM2 had significantly higher scores on the SCL-90-R (p < .001) and on the BDI-II (p < .001) and worse MRI ratings than controls, but psychological distress did not correlate with cognition, MRI ratings or biomedical characteristics. Contrary to our hypothesis, cognitive disturbances and psychological distress thus seem to be independent symptoms of the same disease. (JINS, 2007, 13, 288–297)