Treatment with antipsychotics is associated with an increased risk of type 2 diabetes mellitus (T2D), and increased levels of inflammatory biomarkers are present in patients with T2D. We previously demonstrated that the glucagon-like peptide-1 receptor agonist liraglutide significantly reduced glucometabolic disturbances and body weight in prediabetic, overweight/obese schizophrenia-spectrum disorder patients treated with clozapine or olanzapine. This study aims to assess the involvement of cytokines in the therapeutic effects of liraglutide.
Serum concentrations of 10 cytokines (interferon-γ [IFN-γ], tumor necrosis factor-α, interleukin 1β [IL-1β], IL-2, IL-4, IL-6, IL-8, IL-10, IL-12p70, and IL-13) from fasting prediabetic and normal glucose-tolerant (NGT) patients with schizophrenia-spectrum disorders were measured using multiplexed immunoassays. Prediabetic patients were randomized to 16 weeks of treatment with liraglutide or placebo, and cytokines were measured again at the end of the treatment.
IFN-γ (1.98 vs 1.17 pg/ml, P = .001), IL-4 (0.02 vs 0.01 pg/ml, P < .001), and IL-6 (0.73 vs 0.46 pg/ml, P < .001) were significantly higher in prediabetic (n = 77) vs NGT patients (n = 31). No significant changes in cytokine levels following treatment with liraglutide (n = 37) vs placebo (n = 40) were found.
Prediabetic vs NGT patients with schizophrenia-spectrum disorders treated with clozapine or olanzapine had increased serum levels of several proinflammatory cytokines, further substantiating the link between inflammation and T2D. Treatment with liraglutide did not affect the investigated cytokines. These findings require confirmation in larger samples.
New guidelines for peanut allergy prevention in high-risk infants recommend introducing peanut during infancy but do not address breastfeeding or maternal peanut consumption. We assessed the independent and combined associations of these factors with peanut sensitization in the general-population CHILD birth cohort (N = 2759 mother–child dyads). Mothers reported peanut consumption during pregnancy, the timing of first infant peanut consumption, and breastfeeding duration. Child peanut sensitization was determined by skin prick testing at 1, 3, and 5 years. Overall, 69% of mothers regularly consumed peanuts and 36% of infants were fed peanut in the first year (20% while breastfeeding and 16% after breastfeeding cessation). Infants who were introduced to peanut early (before 1 year) after breastfeeding cessation had a 66% reduced risk of sensitization at 5 years compared to those who were not introduced early (1.9% vs. 5.8% sensitization; aOR 0.34, 95% CI 0.14–0.68). This risk was further reduced if mothers introduced peanut early while breastfeeding and regularly consumed peanut themselves (0.3% sensitization; aOR 0.07, 0.01–0.25). In longitudinal analyses, these associations were driven by higher odds of outgrowing early sensitization and lower odds of late-onset sensitization. There was no apparent benefit (or harm) from maternal peanut consumption without breastfeeding. Taken together, these results suggest that the combination of maternal peanut consumption and breastfeeding at the time of peanut introduction during infancy may help to decrease the risk of peanut sensitization. Mechanistic and clinical intervention studies are needed to confirm and understand this “triple exposure” hypothesis.
To compare sensitivity of specimens for COVID-19 diagnosis, we tested 151 nasopharyngeal/midturbinate swab pairs from 117 COVID-19 inpatients using reverse-transcriptase polymerase chain reaction (RT-PCR). Sensitivity was 94% for nasopharyngeal and 75% for midturbinate swabs (P = .0001). In 88 nasopharyngeal/midturbinate pairs with matched saliva, sensitivity was 86% for nasopharyngeal swabs and 88% for combined midturbinate swabs/saliva.
Pertussis is a highly contagious infectious disease and remains an important cause of mortality and morbidity worldwide. Over the last decade, vaccination has greatly reduced the burden of pertussis. Yet, uncertainty in individual vaccination coverage and ineffective case surveillance systems make it difficult to estimate burden and the related quantity of population-level susceptibility, which determines population risk. These issues are more pronounced in low-income settings, where coverage is often overestimated and case numbers are under-reported. Serological data provide a direct characterisation of the landscape of susceptibility to infection and can be combined with vaccination coverage and basic theory to estimate rates of exposure to natural infection. Here, we analysed cross-sectional data on seropositivity against pertussis to identify spatial and age patterns of susceptibility in children in Madagascar. A large proportion of individuals surveyed were seronegative; however, there were patterns suggestive of natural infection in all the regions analysed. Improvements in vaccination coverage are needed to help prevent additional burden of pertussis in the country.
Given that smoking results in poor physical and mental health, reducing tobacco harm is of high importance. Recommendations published by the National Institute for Health and Care Excellence to reduce smoking harms included provision of support, use of nicotine-containing products and commissioning of smoking cessation services.
This report explores the difficulties in obtaining such support, as observed in a recently conducted randomised controlled trial in patients with severe mental ill health, and outlines suggestions for improving provision.
Data collected during the Smoking Cessation Intervention for Severe Mental Ill Health Trial (SCIMITAR+) (trial registration ISRCTN72955454) were reviewed to identify the difficulties experienced across the trial with regard to access to and provision of nicotine replacement therapy (NRT). Actions taken to facilitate access and provision of NRT were collated to outline how provision could be better facilitated.
Access to NRT varied across study settings and in some instances proved impossible for patients. Difficulty in access was irrespective of a diagnosis of severe mental ill health. Where NRT was provided, it was not always in accordance with NICE guidelines.
Availability of smoking cessation support and NRT provision would benefit from being made clearer, simpler and more easily accessible, so as to enhance smoking cessation rates.
The prevalence of psychotic experiences (PEs) is higher in low- and middle-income countries (LAMIC) than in high-income countries (HIC). Here, we examine whether this effect is explicable by measurement bias.
A community sample from 13 countries (N = 7141) was used to examine the measurement invariance (MI) of a frequently used self-report measure of PEs, the Community Assessment of Psychic Experiences (CAPE), in LAMIC (n = 2472) and HIC (n = 4669). The CAPE measures positive (e.g. hallucinations), negative (e.g. avolition) and depressive symptoms. MI analyses were conducted with multiple-group confirmatory factor analyses.
MI analyses showed similarities in the structure and understanding of the CAPE factors between LAMIC and HIC. Partial scalar invariance was found, allowing for latent score comparisons. Residual invariance was not found, indicating that sum score comparisons are biased. A comparison of latent scores before and after MI adjustment showed both overestimation (e.g. avolition: d = 0.03 before vs d = −0.42 after adjustment) and underestimation (e.g. magical thinking: d = −0.03 before vs d = 0.33 after adjustment) of PEs in LAMIC relative to HIC. After adjusting the CAPE for MI, participants from LAMIC reported significantly higher levels on most CAPE factors but a significantly lower level of avolition.
Previous studies using sum scores to compare differences across countries are likely to be biased. The direction of the bias involves both over- and underestimation of PEs in LAMIC compared to HIC. Nevertheless, the study confirms the basic finding that PEs are more frequent in LAMIC than in HIC.
An observational study was conducted to characterize high-touch surfaces in emergency departments and hemodialysis facilities. Certain surfaces were touched with much greater frequency than others. A small number of surfaces accounted for the majority of touch episodes. Prioritizing disinfection of these surfaces may reduce pathogen transmission within healthcare environments.
Background: The healthcare environment can serve as a reservoir for many microorganisms and, in the absence of appropriate cleaning and disinfection, can contribute to pathogen transmission. Identification of high-touch surfaces (HTS) in hospital patient rooms has allowed the recognition of surfaces that represent the greatest transmission risk and prioritization of cleaning and disinfection resources for infection prevention. HTS in other healthcare settings, including high-volume and high-risk settings such as emergency departments (EDs) and hemodialysis facilities (HDFs), have not been well studied or defined. Methods: Observations were conducted in 2 EDs and 3 HDFs using structured observation tools. All touch episodes, defined as hand-to-surface contact regardless of hand hygiene and/or glove use, were recorded. Touches by healthcare personnel, patients, and visitors were included. Surfaces were classified as being allocated to individual patients or shared among multiple patients. The number of touch episodes per hour was calculated for each surface to rank surfaces by frequency of touch. Results: In total, 28 hours of observation (14 hours each in EDs and HDFs) were conducted, during which 1,976 touch episodes were observed across 62 surfaces. On average, more touch episodes were observed per hour in HDFs than in EDs (89 vs 52, respectively). The most frequently touched surfaces in EDs included stretcher rails, privacy curtains, visitor chair armrests and seats, and patient bedside tables, which together accounted for 68.8% of all touch episodes in EDs (Fig. 1). Frequently touched surfaces in HDFs included both shared and single-patient surfaces: 27.8% and 72.2% of HDF touch episodes, respectively. The most frequently touched surfaces in HDFs were supply cart drawers, dialysis machine control panels and keyboards, handwashing faucet handles, bedside work tables, and bed rail or dialysis chair armrests, which accounted for 68.4% of all touch episodes recorded.
Conclusions: To our knowledge, this is the first quantitative study to identify HTS in EDs and HDFs. Our observations reveal that certain surfaces within these environments are subject to a substantially greater frequency of hand contact than others and that a relatively small number of surfaces account for most touch episodes. Notably, whereas HTS in EDs were primarily single-patient surfaces, HTS in HDFs included surfaces shared in the care of multiple patients, which may represent an even greater risk of patient-to-patient pathogen transmission than single-patient surfaces. The identification of HTS in EDs and HDFs contributes to a better understanding of the risk of environment-related pathogen transmission in these settings and may allow prioritization and optimization of cleaning and disinfection resources within facilities.
Fifty to ninety percent of individuals with major neurocognitive disorder (MNCD) have neuropsychiatric symptoms (NPS) [1]. Agitation and aggression are among the most persistent and treatment-refractory symptom clusters. These NPS are associated with increased risk of institutionalization, psychotropic medication use, caregiver burden, and mortality [2].
Safe and effective treatments for NPS are lacking. Consensus guidelines emphasize the initial use of non-pharmacologic approaches, though supportive evidence is limited [3].
Extensive research has established the safety and efficacy of ECT in elderly patients with depression and other psychiatric conditions [6]. Clinical experience suggests that ECT is a valuable treatment option in treatment-refractory NPS related to MNCD [7-10]. However, data supporting the efficacy and safety of this practice are scant.
Materials and Methods:
Patients admitted to geriatric psychiatry inpatient units who met the inclusion criteria were recruited from 2 Vancouver sites and 3 units at Ontario Shores. These patients had an anesthesia consultation to evaluate whether they could safely undergo ECT. Consent was obtained from their substitute decision makers. All enrolled patients were already on psychotropic medications.
Postprandial glycaemia and insulinaemia are important risk factors for type 2 diabetes. The prevalence of insulin resistance in adolescents is increasing, but it is unknown how adolescent participant characteristics such as BMI, waist circumference, fitness and maturity offset may explain responses to a standard meal. The aim of the present study was to examine how such participant characteristics affect the postprandial glycaemic and insulinaemic responses to an ecologically valid mixed meal. Data from the control trials of three separate randomised, crossover experiments were pooled, resulting in a total of 108 participants (fifty-two boys, fifty-six girls; aged 12·5 (SD 0·6) years; BMI 19·05 (SD 2·66) kg/m2). A fasting blood sample was taken for the calculation of fasting insulin resistance, using the homoeostatic model assessment of insulin resistance (HOMA-IR). Further capillary blood samples were taken before and 30, 60 and 120 min after a standardised lunch, providing 1·5 g/kg body mass of carbohydrate, for the quantification of blood glucose and plasma insulin total AUC (tAUC). Hierarchical multiple linear regression demonstrated that the significant predictors of plasma insulin tAUC were waist circumference, physical fitness and HOMA-IR (F(3,98) = 36·78, P < 0·001, adjusted R2 = 0·515). The variance in blood glucose tAUC was not significantly explained by the predictors used (F(7,94) = 1·44, P = 0·198). Significant predictors of HOMA-IR were BMI and maturity offset (F(2,102) = 14·06, P < 0·001, adjusted R2 = 0·021). In summary, the key findings of the study are that waist circumference, followed by physical fitness, best explained the insulinaemic response to an ecologically valid standardised meal in adolescents. This has important behavioural consequences because these variables can be modified.
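The two summary measures used above, HOMA-IR and tAUC, follow standard definitions and can be sketched as follows. This is a minimal illustration with made-up values, not data from the study; 22.5 is the conventional HOMA-IR normalising constant, and the trapezoidal rule is the usual way of computing total AUC from sparse sampling times.

```python
def homa_ir(fasting_glucose_mmol_l, fasting_insulin_uu_ml):
    """Homeostatic model assessment of insulin resistance (HOMA-IR).

    Conventional formula:
    (fasting glucose [mmol/l] * fasting insulin [uU/ml]) / 22.5.
    """
    return (fasting_glucose_mmol_l * fasting_insulin_uu_ml) / 22.5


def total_auc(times_min, concentrations):
    """Total area under the curve (tAUC) by the trapezoidal rule,
    the usual summary for postprandial glucose and insulin responses."""
    pairs = zip(zip(times_min, concentrations),
                zip(times_min[1:], concentrations[1:]))
    return sum((c0 + c1) / 2 * (t1 - t0) for (t0, c0), (t1, c1) in pairs)


# Illustrative values only (not data from the study):
print(homa_ir(5.0, 9.0))                                  # 2.0
# Glucose (mmol/l) at 0, 30, 60 and 120 min, matching the sampling scheme above:
print(total_auc([0, 30, 60, 120], [4.5, 7.2, 6.1, 5.0]))  # 708.0 (mmol/l x min)
```

The same `total_auc` helper applies unchanged to plasma insulin values; only the units of the result differ.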
Investigation of treatments that effectively treat adults with post-traumatic stress disorder from childhood experiences (Ch-PTSD) and are well tolerated by patients is needed to improve outcomes for this population.
The purpose of this study was to compare the effectiveness of two trauma-focused treatments, imagery rescripting (ImRs) and eye movement desensitisation and reprocessing (EMDR), for treating Ch-PTSD.
We conducted an international, multicentre, randomised clinical trial, recruiting adults with Ch-PTSD from childhood trauma occurring before 16 years of age. Participants were randomised to treatment condition and assessed by blind raters at multiple time points. Participants received up to twelve 90-minute sessions of either ImRs or EMDR, biweekly.
A total of 155 participants were included in the final intent-to-treat analysis. Drop-out rates were low, at 7.7%. A generalised linear mixed model of repeated measures showed that observer-rated post-traumatic stress disorder (PTSD) symptoms significantly decreased for both ImRs (d = 1.72) and EMDR (d = 1.73) at the 8-week post-treatment assessment. Similar results were seen with secondary outcome measures and self-reported PTSD symptoms. There were no significant differences between the two treatments on any standardised measure at post-treatment and follow-up.
ImRs and EMDR treatments were found to be effective in treating PTSD symptoms arising from childhood trauma, and in reducing other symptoms such as depression, dissociation and trauma-related cognitions. The low drop-out rates suggest that the treatments were well tolerated by participants. The results from this study provide evidence for the use of trauma-focused treatments for Ch-PTSD.
Trifludimoxazin, a new protoporphyrinogen oxidase–inhibiting herbicide, is being evaluated for possible use as a soil-residual herbicide treatment in cotton for control of small-seeded annual broadleaf weeds. Laboratory and greenhouse studies were conducted to compare the vertical mobility and cotton tolerance of trifludimoxazin with those of flumioxazin and saflufenacil, two protoporphyrinogen oxidase–inhibiting herbicides currently registered for use in cotton, in three West Texas soils. Vertical soil mobility of trifludimoxazin was similar to that of flumioxazin in Acuff loam and Olton loam soils, but trifludimoxazin was more mobile than flumioxazin in the Amarillo loamy sand soil. The depth of trifludimoxazin movement after a 2.5-cm irrigation event ranged from 2.5 to 5.0 cm in all soils, which would not allow for crop selectivity based on herbicide placement, because ideal cotton seeding depth is from 0.6 to 2.54 cm. Greenhouse studies indicated that PRE treatments were more injurious than the 14-d preplant treatment when summarized across soils for the three herbicides (43% and 14% injury, respectively). No differences in visual cotton response or dry weight were observed after preplant trifludimoxazin compared with the nontreated control within each of the three West Texas soils, and the response was similar to that of preplant flumioxazin across soils. On the basis of these results, a use pattern for trifludimoxazin in cotton may be established with a preplant restriction of more than 14 d before cotton planting.
Newly established populations of endangered species can help mitigate declines elsewhere and can be a valuable genetic reservoir. When these populations are located within anthropogenic habitats, they may also help mitigate the potential biodiversity loss created by urbanization. The Red-crowned Amazon Amazona viridigenalis is an endangered species that has become naturalized in multiple urban areas throughout the United States and Mexico, and these populations may currently outnumber the population within their historical habitat. While these urban populations may hold the majority of this endangered species, very few studies have analyzed the status and trends of this species, or of threatened parrots in general, in urban areas. Our study focuses on an urban Red-crowned Amazon population in the Lower Rio Grande Valley (LRGV) of Texas: the only parrot population currently recognized as native to the United States. To determine a timeline of Red-crowned Amazon arrival and growth in the LRGV, we reviewed published literature and online citizen science databases. To quantify current population levels and trends, we conducted 412 surveys at all known roost sites throughout the LRGV from January 2016 through April 2019. We also quantified the ratio of adult and juvenile parrots at roosts. Our data suggest the species has been present in the LRGV consistently since the 1970s and showed rapid growth from the mid-1990s through roughly 2016. Roost counts suggest there is currently a minimum LRGV population of about 680 and the population has been relatively stable over the last 3.5 years. Productivity averaged 19% over three breeding seasons, suggesting successful internal reproduction. This study provides important baseline information for the management and conservation of Red-crowned Amazons in the region and provides a valuable timeline on the beginnings and trends of this recently established urban population of Amazona parrot.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in the AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10−10); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10−6 in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10−3; p = 2.29 × 10−3), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10−3). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
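At their core, the polygenic risk scores used in analyses like the one above are weighted sums of risk-allele counts. A minimal sketch follows; the allele counts and per-allele effect sizes are hypothetical, and real PRS pipelines additionally handle strand flips, linkage-disequilibrium clumping and missing genotypes.

```python
def polygenic_risk_score(allele_counts, effect_sizes):
    """PRS for one individual: the sum over variants of
    (number of effect alleles carried: 0, 1 or 2) * (per-allele effect size)."""
    if len(allele_counts) != len(effect_sizes):
        raise ValueError("one effect size per variant is required")
    return sum(g * b for g, b in zip(allele_counts, effect_sizes))


# Hypothetical individual carrying 0, 1 and 2 effect alleles at three variants,
# with made-up per-allele effect sizes (log-odds scale):
print(round(polygenic_risk_score([0, 1, 2], [0.10, 0.20, 0.30]), 9))  # 0.8
```

Scores computed this way are then standardised across the cohort and tested for association with the outcome (here, septic shock susceptibility or 28-day mortality).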
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data.
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
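The calibration step described above, mapping a measured 14C age onto calendar ages through the curve, can be sketched in miniature. The curve values below are invented for illustration (real work uses the published IntCal20 table); the likelihood treats the measurement and curve uncertainties as independent Gaussians whose variances add in quadrature.

```python
import math

# A tiny, made-up slice of a calibration curve, for illustration only:
# (calendar age cal BP, mean 14C age BP, curve 1-sigma in 14C yr).
CURVE = [
    (3000, 2850, 20),
    (3010, 2862, 20),
    (3020, 2871, 21),
    (3030, 2885, 21),
    (3040, 2899, 22),
]


def calibrate(c14_age, c14_err):
    """Return a normalised probability for each calendar year in CURVE.

    The likelihood of a calendar age is a Gaussian in 14C space, with the
    measurement and curve uncertainties combined in quadrature.
    """
    weights = []
    for cal_bp, mu, sig in CURVE:
        var = c14_err ** 2 + sig ** 2
        w = math.exp(-((c14_age - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)
        weights.append((cal_bp, w))
    total = sum(w for _, w in weights)
    return [(cal_bp, w / total) for cal_bp, w in weights]


posterior = calibrate(c14_age=2870, c14_err=25)
best = max(posterior, key=lambda p: p[1])
print(best[0])  # 3020 -- the calendar age with the highest probability
```

Because the real curve has plateaus and wiggles, the resulting probability distribution is often multimodal, which is why calibrated dates are reported as ranges rather than point estimates.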
While medical nutrition therapy is an essential part of the care for critically ill patients, uncertainty exists about the right form, dosage, timing and route in relation to the phases of critical illness. As enteral nutrition (EN) is often withheld or interrupted during the intensive care unit (ICU) stay, combined EN and parenteral nutrition (PN) may represent an effective and safe option to achieve energy and protein goals as recommended by international guidelines. We hypothesise that critically ill patients at high nutritional risk may benefit from such a combined approach during their stay on the ICU. Therefore, we aim to test whether an early combination of EN and high-protein PN (EN+PN) is effective in reaching energy and protein goals in patients at high nutritional risk, while avoiding overfeeding. This approach will be tested in the EFFORTcombo trial presented here. Nutritionally high-risk ICU patients will be randomised to either high-protein (≥2·2 g/kg per d) or low-protein (≤1·2 g/kg per d) targets. In the high-protein group, patients will receive EN+PN; in the low-protein group, patients will be given EN alone. EN will be started in accordance with international guidelines in both groups. Efforts will be made to reach nutrition goals within 48–96 h. The efficacy of the proposed nutritional strategy will be tested as an innovative approach by functional outcomes at ICU and hospital discharge, as well as at a 6-month follow-up.
Even in cases with complexity, simple techniques can be useful to target a specific symptom. Intrusive mental images are highly disruptive, drive emotion, and contribute to maintaining psychopathology. Cognitive science suggests that we might target intrusive images using competing tasks.
We describe an imagery competing task technique within cognitive behavioural therapy (CBT) with a patient with bipolar disorder and post-traumatic stress disorder (PTSD) symptoms. The intervention – including Tetris computer game-play – was used (1) to target a specific image within one therapy session, and (2) to manage multiple images in daily life.
A single case (AB) design was used. (1) To target a specific image, the patient brought the image to mind and, after mental rotation instructions and game-play practice, played Tetris for 10 minutes. Outcomes, pre- and post-technique, were: vividness/distress ratings when the image was brought to mind; reported intrusion frequency over a week. (2) To manage multiple images, the patient used the intervention after an intrusive image occurred. Outcomes were weekly measures of: (a) imagery characteristics; (b) symptoms of PTSD, anxiety, depression and mania.
(1) For the target image, there were reductions in vividness (80% to 40%), distress (70% to 0%), and intrusion frequency (daily to twice/week). (2) For multiple images, there were reductions from baseline to follow-up in (a) imagery vividness (38%), realness (66%) and compellingness (23%), and (b) PTSD symptoms (Impact of Events Scale-Revised score 26.33 to 4.83).
This low-intensity intervention aiming to directly target intrusive mental imagery may offer an additional, complementary tool in CBT.
The experience of mental imagery is a common part of everyday life for most people, and much of this mental imagery has an emotional tone. For example, we may enjoy anticipating an upcoming holiday in our imagination, or an unpleasant image we saw on the television the previous evening may suddenly flash into our mind and bring with it a feeling of sadness or disgust. Scientific research into mental imagery has demonstrated its capacity to evoke emotion, and this is likely to play a role in the important functions that mental imagery appears to have in everyday life. However, the experience of emotional mental imagery is not always helpful, and dysfunctions in emotional mental imagery are observed across a range of areas of mental health, such as depression and anxiety disorders. At the same time, the properties of emotional mental imagery can be deliberately harnessed, for example in psychological therapies. The research presented in this chapter highlights the importance of being aware of the capacity of mental imagery to evoke emotion and the properties of emotional mental imagery when studying the imagination, and raises a number of suggestions for furthering interdisciplinary research in this area.