Conspiracy theories have flourished during the COVID-19 pandemic. Conspiratorial thinking is characterised by the strong conviction that a situation one perceives as unjust is the result of a deliberate conspiracy by a group of people with bad intentions. Conspiratorial thinking appears to have many similarities with paranoid delusions.
To explore the nature, consequences, and social-psychological dimensions of conspiratorial thinking, and describe similarities and differences with paranoid delusions.
A critical assessment of the relevant literature on conspiratorial thinking and paranoid delusions.
Conspiratorial thinking meets epistemic, existential, and social needs. It provides clarity in uncertain times and connection with an in-group of like-minded people. Both conspiratorial thinking and paranoid delusions involve an unjust, persistent, and sometimes bizarre conviction. Unlike conspiracy theorists, people with a paranoid delusion are almost always the only target of the presumed conspiracy, and they usually stand alone in their conviction. Furthermore, conspiracy theories are based less on unusual experiences of the inner self, of reality, or of interpersonal contacts.
Conspiratorial thinking is common in uncertain circumstances. It provides a sense of grip, certainty, moral superiority and social support. Extreme conspiratorial thinking seems to fit current psychiatric definitions of paranoid delusions, but there are also important differences. To distinguish paranoid delusions from conspiratorial thinking, conventional definitions of delusions need to be deepened. Instead of the strong focus on the erroneous content of delusions, more attention should be given to the underlying idiosyncratic, altered way of experiencing reality.
Clinicians in mental healthcare have few objective tools to identify and analyse their patients’ care needs. Clinical decision aids are tools that can support this process.
This study examines whether clinicians working with a clinical decision aid (TREAT) (1) discuss more of their patients’ care needs than in usual treatment, and (2) agree on more evidence-based treatment decisions.
Clinicians participated in consultations (n=166) with patients diagnosed with psychotic disorders from four Dutch mental healthcare institutions. Primary outcomes were measured with the modified Clinical Decision-making in Routine Care questionnaire and combined with psychiatric, physical and social wellbeing related care needs. A multilevel analysis compared discussed care needs and evidence-based treatment decisions between treatment as usual (TAU) before, TAU after and the TREAT-condition.
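The between-condition comparison can be sketched as a dummy-coded regression. The sketch below is illustrative only: it uses simulated scores, ordinary least squares instead of the study's multilevel model, and made-up effect sizes; only the contrast structure (TAU-before as reference, dummies for TAU-after and TREAT) mirrors the analysis described above.

```python
import numpy as np

# Illustrative sketch (simulated data, NOT the study's): dummy-coded
# comparison of discussed care needs across three conditions.
rng = np.random.default_rng(0)
n = 50                                           # hypothetical consultations per condition
conditions = np.repeat([0, 1, 2], n)             # 0=TAU-before, 1=TAU-after, 2=TREAT
# Simulated outcome: assumed effects of +5 (TAU-after) and +18 (TREAT)
score = 40 + 5 * (conditions == 1) + 18 * (conditions == 2) + rng.normal(0, 8, 3 * n)

# Design matrix: intercept + dummies for TAU-after and TREAT (TAU-before = reference)
X = np.column_stack([np.ones(3 * n), conditions == 1, conditions == 2]).astype(float)
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(beta)  # beta[2] estimates the TREAT vs TAU-before difference
```

A real analysis would add a random intercept per clinician (the grouping level in the multilevel model) rather than pooling all consultations.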
First, a significant increase in discussed care needs was found for TREAT compared with both TAU conditions (b = 20.2, SE = 5.2, p < 0.01 and b = 15.8, SE = 5.4, p = 0.01). Next, a significant increase in evidence-based treatment decisions for care needs was observed for TREAT compared with both TAU conditions (b = 16.7, SE = 4.8, p < 0.01 and b = 16.0, SE = 5.1, p = 0.01).
TREAT improved the discussion of physical health issues and social wellbeing-related topics. It also increased evidence-based treatment decisions for care needs that are sometimes overlooked and difficult to treat. Our findings suggest that TREAT helps clinicians make sense of ROM data and improves guideline-informed treatment decisions.
The study examined the association between depressive symptoms and iron status, anaemia, body weight and pubertal status among Mexican adolescent girls.
In this cross-sectional study, depressive symptoms were assessed by the 6-item Kutcher Adolescent Depression Scale, and latent class analysis (LCA) was used to identify and characterise groups of girls based on depressive symptoms. Iron status and inflammation were assessed using ferritin and soluble transferrin receptor, C-reactive protein and alpha-1-acid glycoprotein, respectively. Multiple logistic and linear regressions were applied to model class membership as a function of iron status, anaemia, body weight and pubertal status.
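As a minimal illustration of the regression idea above (modelling depression-class membership as a function of iron status), the sketch below computes an unadjusted odds ratio with a Wald 95% CI from a 2×2 table. The counts are invented for illustration and are not the study's data; the study itself fitted multiple logistic regressions with covariates.

```python
from math import exp, log, sqrt

# Hypothetical 2x2 table (fabricated counts, chosen so the unadjusted OR = 2.0,
# in the ballpark of the reported associations):
a, b = 40, 30   # iron-deficient: 'likely depressed' / not depressed
c, d = 60, 90   # iron-replete:   'likely depressed' / not depressed

or_ = (a * d) / (b * c)                      # cross-product (unadjusted) odds ratio
se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR), Wald approximation
ci = (exp(log(or_) - 1.96 * se_log_or),
      exp(log(or_) + 1.96 * se_log_or))      # 95% confidence interval
print(round(or_, 2), tuple(round(x, 2) for x in ci))
```

A logistic regression reduces to exactly this when iron status is the only predictor; adding covariates (age, body weight, pubertal status) yields adjusted odds ratios instead.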
We collected data from 408 girls aged 12–20 years.
Public schools in northern Mexico.
LCA yielded three classes of depressive symptoms: 44·4 % of the adolescents were ‘unlikely to be depressed’, 41·5 % were ‘likely to be depressed’ and 14·1 % were ‘highly likely to be depressed’. Our analyses demonstrated that iron-deficient girls had greater odds of being ‘likely depressed’ (OR 2·01, 95 % CI 1·01, 3·00) or ‘highly likely depressed’ (OR 2·80, 95 % CI 1·76, 3·84). Linear regression analyses revealed that lower Hb concentrations and higher body weight increased the probability of being ‘likely depressed’. There was no evidence that depressive symptoms were associated with age at menarche and years since menstruation.
This study shows that iron-deficient adolescent girls are more likely to suffer from depressive symptoms and that lower concentrations of Hb and higher body weight increased the probability of experiencing depressive symptoms.
Zn deficiency arising from inadequate dietary intake of bioavailable Zn is common in children in developing countries. Because house crickets are a rich source of Zn, their consumption could be an effective public health measure to combat Zn deficiency. This study used Optifood, a tool based on linear programming analysis, to develop food-based dietary recommendations (FBR) and predict whether dietary house crickets can improve both Zn and overall nutrient adequacy of children’s diets. Two quantitative, multi-pass 24-h recalls from forty-seven children aged 2 and 3 years residing in rural Kenya were collected and used to derive model parameters, including a list of commonly consumed foods, median serving sizes and frequency of consumption. Two scenarios were modelled: (i) FBR based on locally available foods and (ii) FBR based on locally available foods with house crickets. Results revealed that Zn would cease to be a problem nutrient when house crickets were included in children’s diets (population reference intake coverage for Zn increased from 89 % to 121 % in the best-case scenario). FBR based on both scenarios could ensure nutrient adequacy for all nutrients except for fat, but the energy percentage (E%) for fat was higher when house crickets were included in the diet (23 E% v. 19 E%). This measure, combined with realistic changes in dietary practices, could therefore improve dietary Zn content and ensure adequacy for twelve nutrients for Kenyan children. Further research is needed to render these theoretical recommendations practical.
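The linear-programming idea behind Optifood can be sketched as follows. Everything here is an illustrative assumption, not Optifood's actual model: three stand-in foods, invented nutrient contents, and a single energy constraint instead of Optifood's full nutrient and food-pattern constraint set.

```python
from scipy.optimize import linprog

# Illustrative only: choose daily servings of three foods to maximise zinc
# intake subject to an energy cap and plausible serving limits.
zn_mg   = [1.1, 1.9, 5.4]   # mg Zn per serving: porridge, beans, crickets (made up)
kcal    = [180, 120, 110]   # kcal per serving (made up)
max_srv = [4, 3, 2]         # assumed realistic upper bounds on servings/day

res = linprog(
    c=[-z for z in zn_mg],          # linprog minimises, so maximise Zn via -Zn
    A_ub=[kcal], b_ub=[1000],       # total energy from these foods <= 1000 kcal
    bounds=list(zip([0, 0, 0], max_srv)),
)
servings, total_zn = res.x, -res.fun
print([round(s, 2) for s in servings], round(total_zn, 2))
```

Because crickets have the highest Zn per kcal in this toy model, the optimiser fills their serving limit first, which is the LP analogue of "crickets resolve the problem nutrient".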
People with psychotic disorders receive mental healthcare services mainly for their psychiatric care needs. However, patients often experience multiple physical or social wellbeing-related care needs as well. This study aims to identify care needs, investigate their changes over time and examine their association with mental healthcare consumption and evidence-based pharmacotherapy.
This study combined annually obtained routine outcome monitoring (ROM) data with care consumption data of people with a long-term psychotic illness receiving treatment in four Dutch mental healthcare institutes between 2012 and 2016. Existing treatment algorithms were used to determine psychiatric, physical and social wellbeing-related care needs based on self-report questionnaires, semi-structured interviews and physical parameters. Care consumption was measured in hours of outpatient mental healthcare consumption per year. Generalised estimating equation models were used to calculate odds ratios of care needs and their associations with time, mental healthcare consumption and medication use.
Participants (n = 2054) had on average 7.4 care needs per measurement and received 25.4 h of care per year. Physical care needs were the most prevalent and persistent, and people with more care needs received more mental healthcare. Care needs for psychotic symptoms and most social wellbeing-related care needs decreased, whereas the chance of being overweight significantly increased with subsequent years of care. Several positive associations were found between care needs and mental healthcare consumption, as well as positive relations between care needs and evidence-based pharmacotherapy.
This longitudinal study presents a novel approach to identifying care needs and their association with mental healthcare consumption and pharmacotherapy. Identifying care needs in this way, based on ROM, can assist daily clinical practice. A recovery-oriented view and well-coordinated collaboration between clinicians and general practitioners, together with shared decisions about which care needs to treat, can improve treatment delivery. Special attention is required for improving physical health in psychosis care, which, despite appropriate pharmacotherapy and increasing care consumption, remains troublesome.
Cognitive deficits may be characteristic of only a subgroup of first-episode psychosis (FEP) patients, and the link with clinical and functional outcomes is less pronounced than previously thought. This study aimed to identify cognitive subgroups in a large sample of FEP patients using a clustering approach with healthy controls as a reference group, subsequently linking the cognitive subgroups to clinical and functional outcomes.
A total of 204 FEP patients were included. Hierarchical cluster analysis was performed using the baseline Brief Assessment of Cognition in Schizophrenia (BACS). Cognitive subgroups were compared with 40 healthy controls and linked to longitudinal clinical and functional outcomes (PANSS, GAF, self-reported WHODAS 2.0) up to 12-month follow-up.
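The clustering step can be sketched as Ward hierarchical clustering on z-scored cognitive profiles, cut into three clusters. The data below are simulated under assumed group means (preserved, moderately impaired, severely impaired relative to controls), not the study's BACS scores.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Simulated z-scored profiles on 6 hypothetical subtests (NOT real BACS data):
rng = np.random.default_rng(1)
preserved = rng.normal(0.0, 0.5, size=(70, 6))    # near the control mean
moderate  = rng.normal(-1.0, 0.5, size=(70, 6))   # ~1 SD below controls
severe    = rng.normal(-2.5, 0.5, size=(60, 6))   # markedly impaired
profiles = np.vstack([preserved, moderate, severe])

# Ward linkage minimises within-cluster variance; cut the tree at 3 clusters
labels = fcluster(linkage(profiles, method="ward"), t=3, criterion="maxclust")
print(np.bincount(labels)[1:])  # recovered cluster sizes
```

In practice the number of clusters is not fixed in advance but chosen from the dendrogram or a fit index; three is assumed here to mirror the study's result.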
Three distinct cognitive clusters emerged: relative to controls, we found one cluster with preserved cognition (n = 76), one moderately impaired cluster (n = 74) and one severely impaired cluster (n = 54). Patients with severely impaired cognition had more severe clinical symptoms at baseline, 6- and 12-month follow-up as compared to patients with preserved cognition. General functioning (GAF) in the severely impaired cluster was significantly lower than in those with preserved cognition at baseline and showed trend-level effects at 6- and 12-month follow-up. No significant differences in self-reported functional outcome (WHODAS 2.0) were present.
Current results demonstrate the existence of three distinct cognitive subgroups, corresponding with clinical outcome at baseline, 6- and 12-month follow-up. Importantly, the cognitively preserved subgroup was larger than the severely impaired group. Early identification of discrete cognitive profiles can offer valuable information about the clinical outcome but may not be relevant in predicting self-reported functional outcomes.
Biofortified yellow cassava has been developed to alleviate vitamin A deficiency. We examined the potential contribution of yellow cassava to total retinol activity equivalent (RAE) intake if white cassava were replaced by yellow cassava among pre-school Nigerian children. Dietary intake was assessed as part of a randomised controlled trial. Pre-school children (n 176) were randomly assigned to receive either white cassava (WC) or yellow cassava (YC) for 17 weeks. Dietary intake assessments were conducted during the intervention and 1 month after, when children had resumed their habitual diet. Differences in RAE intake between groups and time points were compared using linear mixed model regression analysis. During the intervention, median RAE intake was 536 µg/d in the YC group and 301 µg/d in the WC group (P < 0·0001). YC contributed approximately 40 % to total RAE intake. Of the children, 9 % in the YC group and 29 % in the WC group had RAE intake below the Estimated Average Requirement. After the intervention, median RAE intake was 300 µg/d and did not differ between intervention groups (P = 0·5). The interaction effect of group and time showed a 37 % decrease in RAE intake in the YC group after the intervention (Exp(β) = 0·63; 95 % CI 0·56, 0·72). If WC were replaced by YC after the intervention, the potential contribution of YC to total RAE intake was estimated at approximately 32 %. YC increased total RAE intake and showed substantially lower inadequacy of intake. It is therefore recommended as a good source of provitamin A in cassava-consuming regions.
Resilience is a cross-disciplinary concept that is relevant for understanding the sustainability of the social and environmental conditions in which we live. Most research normatively focuses on building or strengthening resilience, despite growing recognition of the importance of breaking the resilience of, and thus transforming, unsustainable social-ecological systems. Undesirable resilience (cf. lock-ins, social-ecological traps), however, is not only less explored in the academic literature, but its understanding is also more fragmented across different disciplines. This disparity can inhibit collaboration among researchers exploring interdependent challenges in sustainability sciences. In this article, we propose that the term lock-in may contribute to a common understanding of undesirable resilience across scientific fields.
The scope of this presentation is to investigate the possible relationship between a delayed sleep phase and the timing of other activities, such as meals, and related disturbances of eating habits.
The literature will be reviewed and preliminary data from our own research regarding these associations will be presented.
The delayed sleep phase has been shown to occur in many children and in around 80% of adults with ADHD (Van Veen et al., accepted for publication, 2009). Adult patients with ADHD who get up relatively too early according to their biological clock tend to skip breakfast. Skipping breakfast is associated with binge eating in the afternoon; both binge eating and skipping breakfast are associated with overweight and obesity. According to the literature, the prevalence of ADHD and sleep problems is increased in obese patients; the higher the BMI, the higher the chance of ADHD.
ADHD and the very frequently comorbid delayed sleep phase disorder may be associated with a delayed timing of meals that may lead to overweight and obesity.
Children with ADHD may have chronic sleeping problems, associated with circadian rhythm disturbances. Little is known about sleep in adults with ADHD.
We studied the prevalence and type of sleeping problems in 120 adults with ADHD using an interview questionnaire.
Of the 120 adults with ADHD, 78% had difficulty going to bed on time (going to bed between 1 and 3 am). Almost 70% reported sleep onset problems, and more than 50% had difficulty sleeping through the night. Almost 70% had difficulty getting up in the morning and 62% felt sleepy during the day. In more than 60%, these sleeping problems had been present all their lives. These results are very similar to earlier data presented by Dodson (Dodson, 1999). Several explanations for these sleeping problems may be considered (Kooij et al., 2001; Oosterloo et al., 2006; Boonstra et al., 2007). However, the frequently occurring sleeping pattern of being a ‘night owl’, with restless sleep and difficulty getting up in the morning, may be associated with delayed sleep phase syndrome, as was recently shown in children with ADHD and sleep onset problems (van der Heijden et al., 2006; van der Heijden et al., 2005; Weiss et al., 2006). We are currently studying the circadian rhythm in adults by measuring the Dim Light Melatonin Onset (DLMO) in saliva in ADHD patients with sleep onset problems (ADHD+SO), compared with ADHD patients without sleep onset problems (ADHD-SO).
About 70% of adults with ADHD have sleep onset problems compatible with a delayed sleep phase pattern. First data of DLMO in adult ADHD patients with and without sleep onset problems will be discussed.
Patients who seek treatment often suffer from negative auditory vocal hallucinations (‘voices’). However, some of these patients also report positive or useful voices that they wish to preserve. When this wish is neglected by their therapist, patients may reject therapy or show low compliance. This study describes the prevalence, characteristics and course of, and the attributions made to, these voices in psychotic and non-psychotic patients.
One hundred and thirty-one patients of a Voices Clinic and 65 members of the Dutch Resonance Foundation were assessed with the Positive and Useful Voices Inquiry. Data were analysed using Pearson's chi-square, one-way ANOVA, Student's t-test and Cronbach's alpha statistics.
First voices are most often reported as negative. The lifetime prevalence of positive voices ranged from 50 to 75%, and useful voices were reported by 40 to 60% of respondents, with higher prevalences among members of the Resonance Foundation. Positive voices occurred more often among non-psychotic patients. No significant association was found between voice characteristics and diagnosis. Attribution of protective power to positive voices had the strongest association with positive experience. Useful voices that give advice are experienced as the most useful. Over 30% of respondents want to keep their positive and/or useful hallucinations. This wish is significantly associated with perceived control over the voices.
The prevalence of positive and useful voices is considerable and therefore clinically relevant. A substantial proportion of patients wish to preserve these voices.
Studies in children suggest that neurocognitive performance is a possible endophenotype for ADHD. We wished to establish a first connection between key genetic polymorphisms and neurocognitive performance in adults with ADHD.
We genotyped 45 adults with ADHD at four key candidate polymorphisms for the disorder (DRD4 48 bp repeat, DRD4 120 bp duplicated repeat, SLC6A3 40 bp VNTR, and COMT Val158Met). We then sub-grouped the sample for each polymorphism by genotype or by the presence of the (putative) ADHD risk allele and compared the performance of the subgroups on a large battery of neurocognitive tests.
The COMT Val158Met polymorphism was related to differences in IQ and reaction time, both of the DRD4 polymorphisms (48 bp repeat and 120 bp duplication) showed an association with verbal memory skills, and the SLC6A3 40 bp VNTR polymorphism could be linked to differences in inhibition.
Our findings contribute to the complicated search for possible endophenotypes for (adult) ADHD.
During this presentation, the first pharmacogenetic study on response to methylphenidate (MPH) in adults with ADHD will be reported.
We performed a stratified analysis of the association between response to MPH, assessed under double-blind conditions, in 42 adults with ADHD, and polymorphisms in the genes encoding the dopamine transporter, SLC6A3 (DAT1), the norepinephrine transporter, SLC6A2 (NET), and the dopamine receptor D4, DRD4.
Polymorphisms in the DRD4 and the SLC6A2 (NET) genes were not significantly associated with the response to MPH treatment; however, the VNTR polymorphism in the 3' untranslated region of SLC6A3 (DAT1) was significantly associated with an increased likelihood of a response to MPH treatment (odds ratio 5.4; 95% CI 1.4-21.9) in heterozygous 10-repeat allele carriers in comparison with the 10/10 homozygotes: 52.2% of the participants heterozygous for the 10-repeat allele improved significantly on MPH treatment, whereas only 22.2% of the 10/10 homozygous individuals did.
This study confirms that the SLC6A3 (DAT1) genotype may have an influential role in determining the response to MPH in the treatment of ADHD. The SLC6A3 (DAT1) gene might be a factor worth evaluating further in the future regarding choice of treatment and possibly dose adjustment.
Several factors may contribute to duration of untreated psychosis (DUP): patient-delay, referral-delay and treatment-delay caused by mental health care services (MHS-delay). In order to find the most effective interventions to reduce DUP, it is important to know what factors in these pathways to care contribute to DUP.
To examine the relationships between the constituents of treatment delay, migration status and urbanicity.
In first-episode psychotic patients (n = 182) from rural, urban and highly urbanized areas, DUP, migration status and pathways to care were determined.
Mean DUP was 53.6 weeks (median 8.9, SD = 116.8). Patient-delay was significantly longer for patients from highly urbanized areas and for first-generation immigrants. MHS-delay was longer for patients who were already being treated by MHS for other diagnoses.
Specific interventions focusing on patients living in highly urbanized areas and on first-generation immigrants are needed in order to shorten patient delay. MHS should improve early detection of psychosis in patients already in treatment for other diagnoses.
Routine Outcome Monitoring (ROM) has become part of the treatment process in mental health care. However, studies have indicated that few clinicians in psychiatry use the outcome of ROM in their daily work. The aim of this study was to explore the degree of ROM use in clinical practice as well as the explanatory factors of this use.
In the Northern Netherlands, a ROM-protocol (ROM-Phamous) for patients with a psychotic disorder has been implemented. To establish the degree of ROM-Phamous use in clinical practice, the ROM results of patients (n = 204) were compared to the treatment goals formulated in their treatment plans. To investigate factors that might influence ROM use, clinicians (n = 32) were asked to fill out a questionnaire about ROM-Phamous.
Care domains that were problematic according to the ROM-Phamous results were mentioned in the treatment plan in 28% of cases on average (range 5–45%). The use of ROM-Phamous in the treatment process varies considerably among clinicians. Most of the clinicians find ROM-Phamous both useful and important for good clinical practice. In contrast, the perceived ease-of-use is low and most clinicians report insufficient time to use ROM-Phamous.
More frequent use of ROM by clinicians should be facilitated. This could be achieved by improving the fit with clinical routines and the ease-of-use of ROM systems. It is important for all stakeholders to invest in integrating ROM into clinical practice. Eventually, this might improve the diagnostics and treatment of patients in mental health care.
Protein is important for growth, maintenance and protection of the body. Both adequacy of protein quantity and protein quality in the diet are important to guarantee obtaining all the essential amino acids. Protein–energy malnutrition is widely present in developing countries such as Nigeria and might result in stunting and wasting. Needs for protein differ depending on age and physiological status and are higher during growth, pregnancy and lactation. The present review assessed protein quantity and quality in diets of Nigerian infants, children, adolescents, and pregnant and lactating women. Literature reviews and calculations were performed to assess adequacy of Nigerian protein intake and to examine the Nigerian diet. The digestible indispensable amino acid score was used to calculate protein quality of nine Nigerian staple foods and of a mixture of foods. The Nigerian population had mostly adequate protein intake when compared with the most recent protein recommendations by the FAO (2013) and WHO/FAO/UNU (2007). An important exception was the protein intake of adolescent girls and pregnant and lactating women. Most of the assessed Nigerian plant-based staple foods were of low protein quality and predominantly lacked the amino acid lysine. The addition of animal-source foods can bridge the protein quality gap created by predominance of plant-based foods in the Nigerian diet. The methodology of this review can be applied to other low- and middle-income countries where diets are often plant-based and lack variety, which might influence protein intake adequacy.
Saliva and urine are the two main body fluids sampled when breast milk intake is measured with the 2H oxide dose-to-mother technique. However, these two body fluids may generate different estimates of breast milk intake due to differences in isotope enrichment. Therefore, we aimed to assess how the estimated amount of breast milk intake differs when based on saliva and urine samples and to explore whether the total energy expenditure of the mothers is related to breast milk output. We used a convenience sample of thirteen pairs of mothers and babies aged 2–4 months, who were exclusively breastfed and apparently healthy. To assess breast milk intake, we administered doubly labelled water to the mothers and collected saliva samples from them, while simultaneously collecting both saliva and urine from their babies over a 14-d period. Isotope ratio MS was used to analyse the samples for 2H and 18O enrichments. Mean breast milk intake based on saliva samples was significantly higher than that based on urine samples (854·5 v. 812·8 g/d, P = 0·029). This can be attributed to slightly higher isotope enrichments in saliva and to a poorer model fit for urine samples as indicated by a higher square root of the mean square error (14·6 v. 10·4 mg/kg, P = 0·001). Maternal energy expenditure was not correlated with breast milk output. Our study suggests that saliva sampling generates slightly higher estimates of breast milk intake and is more precise as compared with urine and that maternal energy expenditure does not influence breast milk output.
Nosocomial outbreaks due to multidrug-resistant microorganisms in rehabilitation centers have rarely been reported. We report an outbreak of extended-spectrum beta-lactamase (ESBL)–producing Klebsiella pneumoniae (ESBL-K. pneumoniae) on a single ward in a rehabilitation center in Rotterdam, The Netherlands.
A 40-bed ward of a rehabilitation center in the Netherlands.
In October 2016, 2 patients were found to be colonized by genetically indistinguishable ESBL-K. pneumoniae isolates. Therefore, an outbreak management team was installed, which drew up a contact-tracing plan. In addition to general outbreak measures, specific measures were formulated to allow continuation of the rehabilitation process. Environmental cultures were also taken. Multiple-locus variable-number tandem-repeat analysis and amplified fragment-length polymorphism analysis were used to determine strain relatedness. Selected isolates were subjected to whole-genome multilocus sequence typing.
The outbreak lasted 8 weeks. In total, 14 patients were colonized with an ESBL-K. pneumoniae, of whom 11 patients had an isolate belonging to sequence type 307. Overall, 163 environmental cultures were taken. Several sites of a household washing machine were repeatedly found to be contaminated with the outbreak strain. This machine was used to wash lifting slings and patient clothing contaminated with feces. The outbreak was contained after taking the machine temporarily out of service and implementing a reinforced and adapted protocol on the use of this machine.
We conclude that in this outbreak, the route of transmission of the outbreak strain via the household washing machine played a major role.
Fragmented habitats generally harbour small populations that are potentially more prone to local extinctions caused by biotic factors such as parasites. We evaluated the effects of botflies (Cuterebra apicalis) on naturally fragmented populations of the gracile mouse opossum (Gracilinanus agilis). We examined how sex, a food supplementation experiment, season and daily climatic variables affected body condition and haemoglobin concentration in animals that were or were not parasitized by botflies. Although parasitism did not affect body condition, haemoglobin concentrations were lower in parasitized animals. Among non-parasitized individuals, haemoglobin concentration increased with increasing maximum temperature and decreasing relative humidity, a climatic pattern found at the peak of the dry season. Among parasitized animals, however, the opposite relationship between haemoglobin concentration and relative humidity occurred, as a consequence of parasite-induced anaemia interacting with dehydration as an additional stressor. We conclude that it is critical to assess how climate affects animal health (through blood parameters) in order to understand the population-level consequences of parasitism for the survival of individuals and hence the viability of small populations.