Social connectedness might positively influence the course of clinical symptoms in people with psychotic disorders.
Objectives
This study examines satisfaction with social connectedness (SSC) as a predictor of positive and negative symptoms in people with a psychotic disorder.
Methods
Data from the Pharmacotherapy Monitoring and Outcome Survey (PHAMOUS, 2014–2019) were used from patients diagnosed with a psychotic disorder (N=2109). Items about social connectedness from the Manchester Short Assessment of Quality of Life (MANSA) were used to measure SSC. Linear mixed models were used to estimate the association of SSC with the Positive and Negative Syndrome Scale (PANSS) after one and two years, tested against α=0.01. Analyses were adjusted for baseline symptoms, time since onset, gender and age. Additionally, the fluctuation of positive and negative symptom scores over time was estimated.
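For readers who want to see the shape of such an analysis, here is a minimal sketch of a linear mixed model of this kind in Python with statsmodels; the column names (`panss_pos_next_year`, `ssc`, `patient_id`, etc.) are hypothetical, not actual PHAMOUS variable names.

```python
# Hedged sketch, not the study's code: a linear mixed model predicting
# next-year PANSS positive scores from satisfaction with social
# connectedness (SSC), adjusted for baseline symptoms, time since onset,
# gender and age, with a random intercept per patient.
import statsmodels.formula.api as smf

def fit_ssc_model(df):
    model = smf.mixedlm(
        "panss_pos_next_year ~ ssc + panss_pos_baseline"
        " + years_since_onset + gender + age",
        data=df,
        groups=df["patient_id"],  # repeated yearly assessments per patient
    )
    return model.fit()

# fit_ssc_model(df).summary() would report the SSC coefficient with its
# 95% CI and p-value, to be judged against the study's alpha of 0.01.
```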
Results
The mean duration of illness in the sample was 18.8 years (SD 10.7), with >65% showing only small variation in positive and negative symptoms over a two- to five-year period. After adjustment for covariates, SSC was negatively associated with positive symptoms after one year (β=-0.47, p<0.001, 95% CI -0.70 to -0.25) and two years (β=-0.59, p<0.001, 95% CI -0.88 to -0.30), and with negative symptoms after one year (β=-0.52, p<0.001, 95% CI -0.77 to -0.27). The prediction of negative symptoms was not significant at two years.
Conclusions
This research indicates that interventions targeting SSC might positively affect mental health in people with psychosis: SSC was a small but robust predictor of future levels of positive symptoms, and it predicted negative symptoms at one year.
Clinicians in mental healthcare have few objective tools for identifying and analysing their patients' care needs. Clinical decision aids are tools that can support this process.
Objectives
This study examines whether clinicians working with a clinical decision aid (TREAT) 1) discuss more of their patients' care needs than under usual treatment, and 2) agree on more evidence-based treatment decisions.
Methods
Clinicians participated in consultations (n=166) with patients diagnosed with psychotic disorders from four Dutch mental healthcare institutions. Primary outcomes were measured with the modified Clinical Decision-making in Routine Care questionnaire and combined with psychiatric, physical and social wellbeing-related care needs. A multilevel analysis compared discussed care needs and evidence-based treatment decisions between treatment as usual (TAU) before, TAU after, and the TREAT condition.
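A hedged sketch of what such a multilevel comparison could look like, with consultations nested in clinicians and TAU-before as the reference condition; the variable names and the nesting structure are assumptions, not the study's actual code.

```python
# Sketch only: multilevel comparison of discussed care needs across
# TAU-before, TAU-after and TREAT, with consultations nested in clinicians.
import statsmodels.formula.api as smf

def compare_conditions(df):
    model = smf.mixedlm(
        # Treatment coding: each coefficient contrasts a condition against
        # the TAU-before reference (the b-values reported in the Results).
        "discussed_care_needs ~ C(condition, Treatment(reference='TAU_before'))",
        data=df,
        groups=df["clinician_id"],  # random intercept per clinician
    )
    return model.fit()
```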
Results
First, a significant increase in discussed care needs was found for TREAT compared with both TAU conditions (b = 20.2, SE = 5.2, p = 0.00 and b = 15.8, SE = 5.4, p = 0.01). Next, a significant increase in evidence-based treatment decisions for care needs was observed for TREAT compared with both TAU conditions (b = 16.7, SE = 4.8, p = 0.00 and b = 16.0, SE = 5.1, p = 0.01).
Conclusions
TREAT improved the discussion of physical health issues and social wellbeing-related topics. It also increased evidence-based treatment decisions for care needs that are sometimes overlooked and difficult to treat. Our findings suggest that TREAT helps clinicians make sense of ROM data and improves guideline-informed care.
Most research on COVID-19 effects has focused on the general population. Here we measure its impact on users of Dutch FACT (Flexible Assertive Community Treatment) and autism outpatient services during the first two waves of the pandemic.
Objectives
This study aimed to: 1) investigate participants’ mental health, 2) assess experiences with outpatient services, and 3) assess respondents’ experiences with governmental measures in the Netherlands during the first and second wave of COVID-19.
Methods
Respondents (wave 1: n=100; wave 2: n=150) reported on mental health, experiences with outpatient care, government measures and information services in an online survey.
Results
Findings demonstrate that happiness was rated on average 6 out of 10 and that 70% of respondents scored below average on resilience. Positive consequences for mental health (a more ordered world, time for reflection) were similar during both waves; prominent negative consequences included decreased social interaction and new or increased problems with mental health and daily functioning from wave 1 to wave 2. Lifestyle changed for 50% of respondents in both waves, although this was only partly attributed to the pandemic. Substance use hardly changed during either wave. Continuation of mental healthcare was highly appreciated in both waves (75-80% scored ≥7 on a 10-point scale). (Video) calling was the most frequently mentioned positive care experience; missing face-to-face contact with care providers was considered the most negative. COVID-19 measures were felt to be less doable in the second wave. Vaccination willingness was approximately 70%.
Conclusions
Results show a nuanced but clear picture of experiences during both waves. Continuation of services through tele-health was well received. Monitoring of the long-term impact is needed.
The objective of the present study was to evaluate the contribution of voluntary fortified foods and supplements to reducing micronutrient shortfalls in the UK population. A secondary analysis of the UK National Diet and Nutrition Survey was conducted (2012/13–2013/14, N 2546, 1·5–95 years). Micronutrient intakes were derived from food consumption intake data and food composition data and calculated as the proportion below or above the Dietary Reference Values for males and females of different age groups, for those on a base diet only, users of fortified foods but no supplements, and users of fortified foods and supplements. Of the population consuming a base diet only, 21–45 % and 5–29 % fell below the Estimated Average Requirement (EAR) for minerals and vitamins, respectively. About 3–13 % fewer consumers of fortified foods fell below the EAR for vitamins and minerals. Supplements barely reduced the prevalence of intakes below the EAR. Among supplement non-users and users, 99 and 96 % failed to meet the reference intakes for vitamin D. More women than men were at risk of inadequate micronutrient intakes. The prevalence of inadequacies declined with increasing age. Voluntary fortified foods, but not supplements, made a meaningful contribution to intakes of vitamins and minerals, without risk of unacceptably high intakes. These insights may help the UK to define approaches to address micronutrients of concern in vulnerable groups.
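As an illustration of the central computation, here is a minimal pandas sketch of deriving the proportion below the EAR per nutrient, sex, age group and consumer category; the table layout and column names are hypothetical.

```python
# Sketch: percentage of consumers below the Estimated Average Requirement
# (EAR), split by sex, age group and consumer category (base diet only,
# fortified foods, fortified foods + supplements). Columns are hypothetical.
import pandas as pd

def pct_below_ear(intakes: pd.DataFrame, ears: pd.DataFrame) -> pd.DataFrame:
    # ears maps (nutrient, sex, age_group) -> EAR value
    merged = intakes.merge(ears, on=["nutrient", "sex", "age_group"])
    merged["below_ear"] = merged["daily_intake"] < merged["ear"]
    return (
        merged
        .groupby(["nutrient", "sex", "age_group", "consumer_group"])["below_ear"]
        .mean()          # proportion below the EAR
        .mul(100)        # express as a percentage
        .reset_index(name="pct_below_ear")
    )
```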
People with psychotic disorders receive mental healthcare services mainly for their psychiatric care needs. However, patients often experience multiple physical or social wellbeing-related care needs as well. This study aims to identify care needs, investigate their changes over time and examine their association with mental healthcare consumption and evidence-based pharmacotherapy.
Methods
This study combined annually obtained routine outcome monitoring (ROM) data with care consumption data of people with a long-term psychotic illness receiving treatment in four Dutch mental healthcare institutes between 2012 and 2016. Existing treatment algorithms were used to determine psychiatric, physical and social wellbeing-related care needs based on self-report questionnaires, semi-structured interviews and physical parameters. Care consumption was measured in hours of outpatient mental healthcare consumption per year. Generalised estimating equation models were used to calculate odds ratios of care needs and their associations with time, mental healthcare consumption and medication use.
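A minimal sketch of such a GEE model in Python with statsmodels, assuming a long-format table with one row per patient-year; the column names are hypothetical and the covariates shown are illustrative only.

```python
# Sketch: generalised estimating equations for a binary care need, with an
# exchangeable working correlation over a patient's yearly measurements.
# exp(coef) gives odds ratios of the kind reported in the study.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def care_need_odds(df, need="overweight"):
    model = smf.gee(
        f"{need} ~ year + care_hours + evidence_based_pharmacotherapy",
        groups="patient_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()
    return np.exp(result.params)  # odds ratios per covariate
```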
Results
Participants (n = 2054) had on average 7.4 care needs per measurement and received 25.4 h of care per year. Physical care needs were the most prevalent and persistent, and people with more care needs received more mental healthcare. Care needs for psychotic symptoms and most social wellbeing-related care needs decreased, whereas the chance of being overweight increased significantly with subsequent years of care. Several positive associations were found between care needs and mental healthcare consumption, as well as between care needs and evidence-based pharmacotherapy.
Conclusions
This longitudinal study presents a novel approach to identifying care needs and their association with mental healthcare consumption and pharmacotherapy. Identifying care needs in this way, based on ROM, can assist daily clinical practice. A recovery-oriented view, a well-coordinated collaboration between clinicians and general practitioners, and shared decisions about which care needs to treat can improve treatment delivery. Special attention is required for improving physical health in psychosis care, which, despite appropriate pharmacotherapy and increasing care consumption, remains troublesome.
OBJECTIVES/GOALS: African Americans (AA) are over-represented on the waitlist for kidney transplant and are often unaware of how waitlist acceptance practices differ across transplant programs and influence access to transplant. We will develop a culturally sensitive transplant program report card to communicate these variations. METHODS/STUDY POPULATION: Scientific Registry of Transplant Recipients (SRTR) data will be used to identify clinical factors strongly associated with AA access to transplant. Interviews and focus groups with AA kidney transplant candidates and their families will collect feedback on the SRTR report card and inform the development of the culturally sensitive report card. Additional focus groups will evaluate its effect on knowledge and medical decision making. We will collaborate with stakeholders, including AA transplant candidates and their families, transplant programs, SRTR, and providers, to identify strategies to disseminate the report card in the AA community. RESULTS/ANTICIPATED RESULTS: To date, no investigation has systematically collected feedback on the SRTR transplant program report card from AA candidates to ensure that the tool is accessible and effective in the AA community. We hypothesize that a culturally sensitive report card will improve AA candidates’ knowledge of program factors that impact access to transplant and enable informed decisions about where they pursue a transplant evaluation. The results of this study have the potential to change how AA patients are counselled while seeking transplantation. DISCUSSION/SIGNIFICANCE OF IMPACT: A culturally sensitive report card can reach more AA patients and enable more informed decision making by providing education about differences in transplant programs that may impact their access to transplant. In the future, we will design a trial to evaluate the prototype.
The A allele of the 5-HT2A gene (–1438A/G polymorphism) has been associated with anorexia nervosa in four studies, but not in three others. One possibility to explain such a discrepancy is that the A allele acts as a modifying rather than a vulnerability allele. To test this hypothesis, we increased our initial sample of 102 trios [Mol. Psychiatry 7 (2002) 90] with 43 new patients with anorexia nervosa and 98 healthy controls. In addition to confirming the absence of association on the global sample of 145 patients, we found that patients with the A allele had a significantly later age at onset of the disease (P = 0.032). Furthermore, the A allele was also transmitted with an older age at onset (P = 0.023) using a quantitative-trait TDT approach. The A allele may thus act as a modifying factor (delaying onset), potentially explaining variations of allele frequency across samples, in which differences in average age at onset are not only possible, but also expected. Taking into account vulnerability genes, but also genes modifying the expression of the disorder, will help to disentangle the complexity of the etiological factors involved in anorexia nervosa.
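As a deliberately simplified illustration of the quantitative-trait TDT idea (not the specific method used in the study), one can test whether transmissions of the A allele from heterozygous parents co-occur with a later age at onset:

```python
# Simplified sketch of a quantitative-trait TDT: regress age at onset on
# the number of A alleles transmitted from heterozygous (A/G) parents.
# Published QTDT methods model within-family transmission more carefully;
# the dictionary keys below are hypothetical.
import numpy as np
from scipy import stats

def qtdt_sketch(trios):
    x = np.array([t["n_A_from_het_parents"] for t in trios])  # 0, 1 or 2
    y = np.array([t["age_at_onset"] for t in trios])
    slope, intercept, r, p, se = stats.linregress(x, y)
    return slope, p  # positive slope: A transmission with later onset
```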
Studies in the general population show that cannabis use has a beneficial effect on metabolic disorders. Given the increased cardiometabolic risk in patients with psychotic disorders, as well as their prevalent use of cannabis, we aim to investigate whether such effects are also evident in these patients.
Method
3176 patients with chronic psychotic disorders from mental health institutions in the Netherlands were included in the study. With multivariate regression analyses we examined the effects of cannabis use on metabolic risk factors: BMI, waist circumference, blood pressure (BP), cholesterol, HDL-C, LDL-C, triglycerides, glucose and HbA1c. Age, sex, smoking, alcohol use and antipsychotic drugs were included as confounders. Next, we examined the change in metabolic risk factors after one year of follow-up for cannabis users, non-users, discontinuers and starters.
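A minimal sketch of the kind of adjusted regression described, fitted per metabolic outcome; the column names and coding (cannabis_use as 0/1) are assumptions, not the study's actual variables.

```python
# Sketch: regression of each metabolic outcome on cannabis use with the
# confounders listed in the abstract, repeated per outcome.
import statsmodels.formula.api as smf

OUTCOMES = ["bmi", "waist", "diastolic_bp", "hba1c"]  # illustrative subset

def cannabis_effects(df):
    results = {}
    for outcome in OUTCOMES:
        fit = smf.ols(
            f"{outcome} ~ cannabis_use + age + sex + smoking"
            " + alcohol_use + antipsychotic_dose",  # cannabis_use coded 0/1
            data=df,
        ).fit()
        results[outcome] = (fit.params["cannabis_use"],
                            fit.pvalues["cannabis_use"])
    return results
```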
Results
We found a significant negative association between cannabis use and BMI (p=0.003), waist circumference (p<0.001), diastolic BP (p=0.015) and HbA1c (p=0.004). One year later, patients who had discontinued their cannabis use showed a greater increase in BMI (p=0.002) and waist circumference (p=0.011) than other patients. They also showed a greater increase in diastolic BP than non-users (p=0.036) or starters (p=0.004).
Conclusion
Discontinuation of cannabis use increased metabolic risk. Stopping cannabis use is often an important treatment goal because it reduces psychotic symptoms. However, physicians should be aware of the increased metabolic risk in patients who discontinue cannabis use. Extra attention should be paid to the monitoring and treatment of metabolic parameters in these patients to prevent cardiovascular disease and premature cardiovascular mortality.
Breast-feeding is associated with a lower risk of developing obesity during childhood and adulthood compared with feeding infant milk formula (IMF). Previous studies have shown that an experimental IMF (eIMF; comprising Nuturis®) programmed mouse pups for lower body weight and fat mass gain in adulthood when challenged with a high-fat diet (HFD), compared with a control IMF (cIMF). Nuturis has a lipid composition and structure more similar to those of breast milk. Here, we tested the long-term effects of a similar eIMF with an adapted lipid composition, compared with a cIMF, on body weight, glucose homoeostasis, liver and adipose tissue. Nutrient composition was similar for the eIMF and cIMF; the lipid fractions comprised approximately 50 % milk fat. C57BL/6JOlaHsd mice were fed cIMF or eIMF from postnatal day (PN) 16–42, followed by an HFD until PN168. Feeding eIMF v. cIMF in early life resulted in a lower body weight (–9 %) and body fat deposition (–14 %) in adulthood (PN105). The effect appeared transient: from PN126 onwards, after 12 weeks of HFD, eIMF-fed mice caught up with controls, and body and fat weights became comparable between groups. Glucose and energy metabolism were similar between groups. At dissection (PN168), eIMF-fed mice showed larger (+27 %) epididymal fat depots and a lower (–26 %) liver weight without clear morphological aberrations. Our data suggest that the size and coating, but not the lipid composition, of IMF fat globules underlie the programming effect observed. Prolonged exposure to an HFD challenge partly overrules the programming effect of early diet.
The volcanic mega-event of the Minoan Santorini eruption constitutes a time anchor in the 2nd millennium BCE that is inherently independent of archaeology and political history. It was a geological event, yet the dimension of time in geology is no different from that in archaeology or human history. Why then does archaeological dating usually place the Minoan Santorini eruption in the 18th Dynasty, around 1500 BCE, whilst radiocarbon dating of the volcanic event at Akrotiri (Thera) yielded a calibrated age of 1646–1603 cal BCE, a difference of more than a century? The crux of the problem apparently lies in the correlation between archaeological strata and political history. We present radiocarbon dates of Ashkelon Phases 10 and 11 in comparison with Tell el-Dabca and the Santorini eruption, based only on 14C dating. Tell el-Dabca Phase D/2 is slightly older than the volcanic event, but Phase D/1 or Phase C/2-3 could have witnessed the eruption. Ashkelon Phase 11 has radiocarbon dates similar to those of Tell el-Dabca Phases E/2, E/1 and D/3, all significantly older than the Minoan eruption. It seems that the duration of Ashkelon Phase 10 includes the temporal occurrence of the Minoan Santorini eruption within the Second Intermediate Period.
In this response to the reply by Shahack-Gross and Finkelstein (2017), we present additional data from our research at Horvat Haluqim. These include phytolith percentages and multicellular phytolith stomata in a thin section of a layer in Terraced Field 12, dated by radiocarbon (14C) to the Late Bronze–Early Iron Age. We also show thin-section evidence of aggrading sediment laminations in this terraced field. A new 14C date of the Early Islamic period is given for Terraced Field 7, and differences in terrace wall architecture are highlighted. We revisit the interpretation by Shahack-Gross and Finkelstein in relation to herd management. Our 14C dates attest that terrace agriculture based on runoff/floodwater irrigation occurred in the Negev Highlands during several periods, including the Iron Age.
Shahack-Gross and Finkelstein (2015) further developed their theory, based on microarchaeology, that there was no agriculture in the Negev Highlands during the Iron Age. In this rejoinder we critically evaluate their article and propose that their conclusion is an example of overinterpretation of a small amount of indirect data. Based on phytoliths in two courtyards and a few rooms, i.e. structures not related to farming, they inferred the absence of agriculture during the Iron Age across an area of 2000 km². We present new radiocarbon, macroarchaeological and microarchaeological data from Horvat Haluqim, showing that agriculture in the Negev Highlands based on runoff/floodwater capture and related terrace wall construction did not begin with the Roman–Byzantine period. Terrace agriculture in the Negev is older and also includes the Iron Age.
The maximum spreading of drops impacting on smooth and rough surfaces is measured from low to high impact velocity for liquids with different surface tensions and viscosities. We demonstrate that dynamic wetting plays an important role in the spreading at low velocity, characterized by the dynamic contact angle at maximum spreading. In the energy balance, we account for the dynamic wettability by introducing the capillary energy at zero impact velocity, which relates to the spreading ratio at zero impact velocity. Correcting the measured spreading ratio by the spreading ratio at zero velocity, we find a correct scaling behaviour for low and high impact velocity and, by interpolation between the two, we find a universal scaling curve. The influence of the liquid as well as the nature and roughness of the surface are taken into account properly by rescaling with the spreading ratio at zero velocity, which, as demonstrated, is equivalent to accounting for the dynamic contact angle.
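One way to write down the rescaling described, assuming the conventional Weber and Reynolds numbers, is the following; this is a sketch of the idea, not the paper's exact notation.

```latex
% Spreading ratio and its zero-impact-velocity limit (notation assumed):
\[
  \beta_{\max} = \frac{D_{\max}}{D_0}, \qquad
  \beta_0 = \lim_{V \to 0} \beta_{\max},
\]
% With We = \rho D_0 V^2/\sigma and Re = \rho D_0 V/\mu, the corrected
% spreading ratio has clean limits in both regimes:
\[
  \sqrt{\beta_{\max}^{2} - \beta_{0}^{2}} \;\sim\;
  \begin{cases}
    \mathrm{We}^{1/2}, & \text{low impact velocity (capillary regime)}, \\
    \mathrm{Re}^{1/5}, & \text{high impact velocity (viscous regime)},
  \end{cases}
\]
% so plotting \sqrt{\beta_{\max}^2 - \beta_0^2}/\mathrm{Re}^{1/5} against
% \mathrm{We}\,\mathrm{Re}^{-2/5} collapses liquids and surfaces onto a
% single curve, with wettability entering only through \beta_0.
```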
An olive branch is traditionally a symbol of peace, but not necessarily in the context of chronological problems in the Eastern Mediterranean region and the Near East during the second millennium BC. Cherubini et al. (above) strongly attack the radiocarbon dating by Friedrich et al. (2006) of an ancient olive branch buried by volcanic tephra during the Minoan Santorini eruption. The criticism stems from their investigation of growth rings in modern olive trees on Santorini. The authors attempt, with additional arguments beyond their botanical investigation, to defend the traditional low chronology of the Santorini eruption of around 1500 BC. However, they ignore other crucial publications with radiocarbon dates concerning the Santorini eruption. In this response, we evaluate and negate their main arguments, and present our own conclusions.
Responsibility for health and social care services is being delegated from central to local authorities in an increasing number of countries. In the Netherlands, the planned transfer of responsibility for day care for people with dementia from the central government to municipalities is a case in point. The impacts of this decentralisation process for innovative care concepts such as day care at green care farms are largely unknown. We therefore interviewed representatives of municipalities and green care farms to explore what consequences they expected of decentralisation for their organisations and people with dementia. Our study shows that communication and collaboration between municipalities and green care farms is relatively limited. Consequently, municipalities are insufficiently aware of how green care farms can help them to perform their new tasks and green care farmers know little about what municipalities expect from them in the new situation. We therefore recommend that municipalities and green care farms keep each other informed about their responsibilities, duties and activities to ensure a tailored package of future municipal services for people with dementia.
The present review describes brain imaging technologies that can be used to assess the effects of nutritional interventions in human subjects. Specifically, we summarise the biological relevance of their outcome measures, practical use and feasibility, and recommended use in short- and long-term nutritional studies. The brain imaging technologies described consist of MRI, including diffusion tensor imaging, magnetic resonance spectroscopy and functional MRI, as well as electroencephalography/magnetoencephalography, near-IR spectroscopy, positron emission tomography and single-photon emission computerised tomography. In nutritional interventions and across the lifespan, brain imaging can detect macro- and microstructural, functional, electrophysiological and metabolic changes linked to broader functional outcomes, such as cognition. Imaging markers can be considered as specific for one or several brain processes and as surrogate instrumental endpoints that may provide sensitive measures of short- and long-term effects. For the majority of imaging measures, little information is available regarding their correlation with functional endpoints in healthy subjects; therefore, imaging markers generally cannot replace clinical endpoints that reflect the overall capacity of the brain to behaviourally respond to specific situations and stimuli. The principal added value of brain imaging measures for human nutritional intervention studies is their ability to provide unique in vivo information on the working mechanism of an intervention in hypothesis-driven research. Selection of brain imaging techniques and target markers within a given technique should mainly depend on the hypothesis regarding the mechanism of action of the intervention, level (structural, metabolic or functional) and anticipated timescale of the intervention's effects, target population, availability and costs of the techniques.
Objective
To examine the use of vitamin D supplements during infancy among the participants in an international infant feeding trial.
Design
Longitudinal study.
Setting
Information about vitamin D supplementation was collected through a validated FFQ at the age of 2 weeks and monthly between the ages of 1 month and 6 months.
Subjects
Infants (n 2159) with a biological family member affected by type 1 diabetes and with increased human leucocyte antigen-conferred susceptibility to type 1 diabetes from twelve European countries, the USA, Canada and Australia.
Results
Daily use of vitamin D supplements was common during the first 6 months of life in Northern and Central Europe (>80 % of the infants), with somewhat lower rates observed in Southern Europe (>60 %). In Canada, vitamin D supplementation was more common among exclusively breast-fed infants than among other infants (e.g. 71 % v. 44 % at 6 months of age). Less than 2 % of infants in the USA and Australia received any vitamin D supplementation. Across the study, higher gestational age, older maternal age and longer maternal education were associated with greater use of vitamin D supplements.
Conclusions
In the European countries, most of the infants received vitamin D supplements during the first 6 months of life, whereas in Canada only half did, and in the USA and Australia very few were given supplementation.
Traditional archaeological approaches in the central Negev Desert have employed excavation techniques for post-prehistoric periods in which stratigraphy is based on architecture, while material culture forms the basis for dating assessment and chronology. Such an approach was understandable, as it focused on the most visible remains of past human habitation. However, the detailed habitation record is in the soil rather than in the walls. Moreover, ceramics and stone tools in desert cultures often have limited time resolution in terms of absolute chronology. The rural desert site of Horvat Haluqim in the central Negev yielded 2 habitation periods with the traditional methodology: (1) Roman period, 2nd–3rd centuries CE; (2) Iron Age IIA, 10th century BCE. We have conducted initial excavations at Horvat Haluqim in small building remains that were never excavated before. Our excavation methodology focuses on detailed examination of the archaeological soil in building structures, coupled with accelerator mass spectrometry (AMS) radiocarbon dating for chronology, and micromorphology of undisturbed soil samples to study stratigraphy and soil contents at the microscopic scale. Here, we report preliminary results, concentrating on the 14C dates. These suggest a much longer habitation history at the site during the Iron Age. The 14C dates obtained so far from these building remains cover Iron Age I, II, III, and the Persian period. The oldest calibrated date (charred C4 plants) in a rectangular building structure (L100) is 1129–971 BCE (60.5%, highest relative probability). The youngest calibrated date in a round building structure (L700) is 540–411 BCE (57.9%, highest relative probability). This excavation methodology provides additional “eyes” to look at past human habitation in the Negev Desert, revealing more periods and more detail than was possible with traditional schemes and ceramic dating.
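For readers unfamiliar with calibration, here is a toy sketch of the standard algorithm behind ranges such as “1129–971 BCE (60.5%)”: the measured 14C age is scored against a calibration curve over calendar years and the result is normalised. In practice the curve comes from a published dataset such as IntCal20 and dedicated software (e.g. OxCal) is used; the array layout here is an assumption.

```python
# Toy sketch of 14C calibration: score a conventional radiocarbon age
# (with its 1-sigma error) against a calibration curve that maps calendar
# years to expected 14C ages (with their own errors), then normalise.
import numpy as np

def calibrate(c14_age, c14_err, cal_years, curve_c14, curve_err):
    """Return calendar years with a normalised probability for each."""
    total_var = c14_err ** 2 + curve_err ** 2          # combined variance
    density = np.exp(-0.5 * (c14_age - curve_c14) ** 2 / total_var)
    density /= np.sqrt(total_var)                      # Gaussian likelihood
    return cal_years, density / density.sum()

# A "highest relative probability" range such as 1129-971 BCE (60.5%) is
# then the most compact set of calendar years whose probabilities sum to
# that fraction of the distribution.
```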