The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
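For readers unfamiliar with PRS construction, a score is conventionally computed as the sum of an individual's risk-allele counts weighted by GWAS effect sizes. The Python sketch below illustrates that weighted sum under stated assumptions; it is not the study's pipeline (which would also involve variant clumping/thresholding and ancestry adjustment), and all variable names and values are hypothetical.

```python
# Minimal sketch of a polygenic risk score as a weighted allele sum.
# Hypothetical inputs: `genotypes` is an (n_individuals, n_snps) array of
# risk-allele counts (0/1/2); `betas` holds per-SNP effect sizes
# (log odds ratios) taken from GWAS summary statistics.
import numpy as np

def polygenic_risk_score(genotypes: np.ndarray, betas: np.ndarray) -> np.ndarray:
    """PRS_i = sum_j beta_j * G_ij, standardized across individuals."""
    raw = genotypes @ betas                 # weighted sum of risk alleles
    return (raw - raw.mean()) / raw.std()   # z-score for comparability

# Toy example: 4 individuals, 3 SNPs
G = np.array([[0, 1, 2], [1, 1, 0], [2, 0, 1], [0, 2, 2]], dtype=float)
b = np.array([0.05, -0.02, 0.11])           # hypothetical effect sizes
print(polygenic_risk_score(G, b))
```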
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (adjusted OR for daily use of high-potency cannabis = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
Cannabis use and familial vulnerability to psychosis have been associated with social cognition deficits. This study examined the potential relationship between cannabis use and cognitive biases underlying social cognition and functioning in patients with first episode psychosis (FEP), their siblings, and controls.
Methods
We analyzed a sample of 543 participants with FEP, 203 siblings, and 1168 controls from the EU-GEI study using a correlational design. We used logistic regression analyses to examine the influence of clinical group, lifetime cannabis use frequency, and potency of cannabis use on cognitive biases, accounting for demographic and cognitive variables.
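As a hedged illustration of how the odds ratios and confidence intervals reported below are typically obtained from a logistic regression, the following sketch fits a model and exponentiates its coefficients; the data and variable names are invented, not the EU-GEI data.

```python
# Sketch: logistic regression yielding odds ratios with 95% CIs,
# the form of estimate reported in the Results. Names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "bias_present": rng.integers(0, 2, 500),               # e.g. FRP deficit yes/no
    "group": rng.choice(["control", "sibling", "FEP"], 500),
    "cannabis_freq": rng.choice(["never", "occasional", "daily"], 500),
    "age": rng.normal(30, 8, 500),
})

model = smf.logit("bias_present ~ C(group) + C(cannabis_freq) + age",
                  data=df).fit(disp=0)
odds_ratios = np.exp(model.params)        # OR = exp(coefficient)
ci = np.exp(model.conf_int())             # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```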
Results
FEP patients showed increased odds of facial recognition processing (FRP) deficits (OR = 1.642, CI 1.123–2.402) relative to controls but not of speech illusions (SI) or jumping to conclusions (JTC) bias, with no statistically significant differences relative to siblings. Daily and occasional lifetime cannabis use were associated with decreased odds of SI (OR = 0.605, CI 0.368–0.997 and OR = 0.646, CI 0.457–0.913, respectively) and JTC bias (OR = 0.625, CI 0.422–0.925 and OR = 0.602, CI 0.460–0.787, respectively) compared with lifetime abstinence, but not with FRP deficits, in the whole sample. Within the cannabis user group, low-potency cannabis use was associated with increased odds of SI (OR = 1.829, CI 1.297–2.578), FRP deficits (OR = 1.393, CI 1.031–1.882), and JTC bias (OR = 1.661, CI 1.271–2.171) relative to high-potency cannabis use, with comparable effects in the three clinical groups.
Conclusions
Our findings suggest increased odds of cognitive biases in FEP patients who have never used cannabis and in low-potency users. Future studies should elucidate this association and its potential implications.
There are many central nervous system (CNS) pathologies that are managed in the neurointensive care unit. Neurocritical patients are a diverse group with vastly different presentations, management, expected duration of their clinical course, and disease-related long-term outcomes. Clinical entities include traumatic brain injury (TBI), ischemic stroke, aneurysmal subarachnoid hemorrhage (aSAH), intraparenchymal hemorrhages (ICH), spinal cord injury (SCI), brain tumors, postoperative craniotomy patients, and nonsurgical diseases, such as myasthenia gravis, Guillain–Barré syndrome, and CNS infections (meningitis and encephalitis).
There are a variety of bedside neurosurgical and neurocritical care procedures that may be required to provide care, mitigate the effects of the primary neurologic pathology, and improve outcomes. Despite the many advances in neurosurgical and neurocritical care in the last several decades, complications from these procedures, while generally rare, can still occur (Table 16.1).
Field experiments were conducted at Clayton and Rocky Mount, NC, during summer 2020 to determine the growth and fecundity of Palmer amaranth plants that survived glufosinate with and without grass competition in cotton. Glufosinate (590 g ai ha⁻¹) was applied to Palmer amaranth early postemergence (5 cm tall), mid-postemergence (7 to 10 cm tall), and late postemergence (>10 cm tall) and at orthogonal combinations of those timings. Nontreated Palmer amaranth was grown in weedy, weed-free in-crop (WFIC), and weed-free fallow (WFNC) conditions for comparisons. Palmer amaranth control decreased as larger plants were treated; no plants survived the sequential glufosinate applications in either experiment. The apical and circumferential growth of Palmer amaranth surviving glufosinate treatments was reduced by more than 44% compared with the WFIC and WFNC Palmer amaranth in both experiments. The biomass of Palmer amaranth plants surviving glufosinate was reduced by more than 62% compared with the WFIC and WFNC plants in all experiments. The fecundity of Palmer amaranth surviving glufosinate treatments was reduced by more than 73% compared with WFNC Palmer amaranth in all experiments. Remarkably, plants that survived glufosinate were as fecund as WFIC plants only in the Grass Competition experiment. The results show that despite decreased vegetative growth, Palmer amaranth surviving glufosinate treatment remains fecund and can be as fecund as nontreated plants in cotton. These results suggest that a glufosinate-treated grass weed may not exert a significant interspecific competition effect on Palmer amaranth that survives glufosinate. Glufosinate should be applied to 5- to 7-cm Palmer amaranth to halt vegetative growth and reproduction.
Palmer amaranth (Amaranthus palmeri S. Watson, AMAPA) is one of the most troublesome weeds in North America due to its rapid growth rate, substantial seed production, competitiveness, and the evolution of herbicide-resistant populations. Though frequently encountered in the South, Midwest, and Mid-Atlantic regions of the United States, A. palmeri was recently identified in soybean [Glycine max (L.) Merr.] fields in Genesee, Orange, and Steuben counties, NY, where glyphosate was the primary herbicide for in-crop weed control. This research, conducted in 2023, aimed to (1) describe the dose response of three putative resistant NY A. palmeri populations to glyphosate, (2) determine their mechanisms of resistance, and (3) assess their sensitivity to other postemergence herbicides commonly used in NY crop production systems. Based on the effective dose necessary to reduce aboveground biomass by 50% (ED50), the NY populations were 42 to 67 times more resistant to glyphosate than a glyphosate-susceptible population. Additionally, the NY populations had elevated EPSPS gene copy numbers, ranging from 25 to 135, located within extrachromosomal circular DNA (eccDNA). Label-rate applications of Weed Science Society of America (WSSA) Group 2 herbicides killed up to 42% of the NY populations of A. palmeri. Some variability was observed among populations in response to WSSA Group 5 and 27 herbicides. All populations were effectively controlled by labeled rates of herbicides belonging to WSSA Groups 4, 10, 14, and 22. Additional research is warranted to confirm whether the NY populations have evolved multiple resistance to herbicides within other WSSA groups and to develop effective A. palmeri management strategies suitable for NY crop production.
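As a hedged illustration of how an ED50 is estimated from dose-response data (the study itself likely used dedicated dose-response software such as the R drc package), the sketch below fits a three-parameter log-logistic curve to made-up biomass data.

```python
# Illustrative sketch, not the study's analysis: fit a three-parameter
# log-logistic dose-response curve and read off the ED50.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, b, d, e):
    """y = d / (1 + (dose/e)^b); e is the ED50 (dose halving biomass)."""
    return d / (1.0 + (dose / e) ** b)

dose = np.array([0.1, 10, 100, 500, 1000, 2000, 4000])       # g ae ha^-1 (hypothetical)
biomass = np.array([20.1, 19.5, 17.8, 12.2, 8.9, 4.1, 1.6])  # g per plant (hypothetical)

(b, d, ed50), _ = curve_fit(log_logistic, dose, biomass,
                            p0=[1.0, 20.0, 500.0], bounds=(0, np.inf))
print(f"ED50 ≈ {ed50:.0f} g ae ha^-1")
# A resistance index is then ED50(resistant) / ED50(susceptible).
```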
Coastal wetlands are hotspots of carbon sequestration, and their conservation and restoration can help to mitigate climate change. However, there remains uncertainty about when and where coastal wetland restoration can most effectively act as natural climate solutions (NCS). Here, we synthesize current understanding to illustrate the requirements for coastal wetland restoration to benefit climate, and discuss potential paths forward that address key uncertainties impeding implementation. To be effective as NCS, coastal wetland restoration projects must accrue climate cooling benefits that would not occur without management action (additionality), be implementable (feasibility), and persist over management-relevant timeframes (permanence). Several issues make it uncertain whether these minimum requirements are met. First, coastal wetlands serve as both a landscape source and sink of carbon for other habitats, increasing uncertainty in additionality. Second, coastal wetlands can migrate outside of project footprints as they respond to sea-level rise, increasing uncertainty in permanence. To address these first two issues, a system-wide approach may be necessary, rather than basing cooling benefits only on changes that occur within project boundaries. Third, the need for NCS to function over management-relevant decadal timescales means methane responses may need to be included in coastal wetland restoration planning and monitoring. Finally, there is uncertainty about how much data are required to justify restoration action. We summarize the minimum data required to make a binary decision on whether a management action yields a net cooling benefit, noting that these data are more readily available than the data required to quantify the magnitude of cooling benefits for carbon-crediting purposes. By reducing uncertainty, coastal wetland restoration can be implemented at the scale required to contribute significantly to addressing the current climate crisis.
Seattle Children’s Research Institute is identifying the amount and type of health equity scholarship being conducted institution wide. However, methods for categorizing how scholarship is equity-focused are lacking. We developed and evaluated the reliability of a health equity scholarship coding schema applied to Seattle Children’s affiliated scholarship.
Methods:
A 2021–2022 Ovid MEDLINE affiliation search yielded 3551 affiliated scholarship records, with 1079 records identified via an existing filter as scholarship addressing social determinants of health. Through reliability testing and examination of concordance and discordance across three independent coders of these records, we developed a coding schema to classify health equity scholarship (yes/no). For records classified as health equity scholarship, coders assigned a maturity rating from one to five reflecting the extent to which the scholarship addressed inequities. Subsequent reliability testing, including a new coder, was conducted on 992 additional affiliated scholarship records (Oct 2022–June 2023), along with testing of the sensitivity and specificity of the existing filter relative to the new coding schema.
Results:
Reliability for identifying health equity scholarship was consistently high (Fleiss kappas ≥ .78), and categorization of health equity scholarship into maturity levels was moderate (Fleiss kappas ≥ .47). The coding schema identified additional health equity scholarship not captured by the existing filter for social determinants of health scholarship. Based on the new schema, 23.3% of Seattle Children's affiliated scholarship published October 2021–June 2023 was health equity focused.
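To illustrate the inter-rater reliability statistic reported above, the following sketch computes Fleiss' kappa for a toy set of three coders' yes/no classifications using statsmodels; the ratings are invented.

```python
# Sketch: Fleiss' kappa for 3 coders classifying 6 records as
# health equity scholarship (1) or not (0). Ratings are invented.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

ratings = np.array([   # rows = records, columns = coders
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 0],
])
table, _ = aggregate_raters(ratings)   # per-record counts for each category
print(f"Fleiss kappa = {fleiss_kappa(table):.2f}")
```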
Conclusions:
This new coding schema can be used to identify and categorize health equity scholarship, helping to quantify the health equity focus of portfolios of human-focused research.
Military Servicemembers and Veterans are at elevated risk for suicide, but rarely self-identify to their leaders or clinicians regarding their experience of suicidal thoughts. We developed an algorithm to identify posts containing suicide-related content on a military-specific social media platform.
Methods
Publicly shared social media posts (n = 8449) from a military-specific social media platform were reviewed and labeled by our team for the presence/absence of suicidal thoughts and behaviors, then used to train several machine learning models to identify such posts.
Results
The best performing model was a deep learning (RoBERTa) model that incorporated post text and metadata; it detected suicidal posts with sensitivity of 0.85, specificity of 0.96, precision of 0.64, an F1 score of 0.73, and an area under the precision-recall curve of 0.84. Compared to non-suicidal posts, suicidal posts were more likely to contain explicit mentions of suicide, descriptions of risk factors (e.g. depression, PTSD) and help-seeking, and first-person singular pronouns.
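For concreteness, the sketch below shows how the reported evaluation metrics are conventionally computed from a classifier's predictions; the labels and scores are toy values, not the study's data.

```python
# Sketch: sensitivity, specificity, precision, F1, and AUPRC
# from toy predictions of a binary suicidal-post classifier.
import numpy as np
from sklearn.metrics import confusion_matrix, precision_recall_curve, auc

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 0, 0])
y_score = np.array([0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3, 0.35, 0.05])
y_pred = (y_score >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)          # a.k.a. recall
specificity = tn / (tn + fp)
precision = tp / (tp + fp)
f1 = 2 * precision * sensitivity / (precision + sensitivity)

prec_curve, rec_curve, _ = precision_recall_curve(y_true, y_score)
auprc = auc(rec_curve, prec_curve)    # area under the precision-recall curve
print(sensitivity, specificity, precision, f1, auprc)
```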
Conclusions
Our results demonstrate the feasibility and potential promise of using social media posts to identify at-risk Servicemembers and Veterans. Future work will use this approach to deliver targeted interventions to social media users at risk for suicide.
We assessed adverse events in hospitalized patients receiving selected vesicant antibiotics or vasopressors administered through midline catheters or peripherally inserted central catheters (PICCs). The rates of catheter-related bloodstream infections, thrombosis, and overall events were similar across the two groups, while the rate of occlusion was higher in the PICC group.
Globally, mental disorders account for almost 20% of disease burden and there is growing evidence that mental disorders are associated with various social determinants. Tackling the United Nations Sustainable Development Goals (UN SDGs), which address known social determinants of mental disorders, may be an effective way to reduce the global burden of mental disorders.
Objectives
To examine the evidence base for interventions that seek to improve mental health through targeting the social determinants of mental disorders.
Methods
We conducted a systematic review of reviews, using a five-domain conceptual framework which aligns with the UN SDGs (PROSPERO registration: CRD42022361534). PubMed, PsycInfo, and Scopus were searched from 01 January 2012 until 05 October 2022. Citation follow-up and expert consultation were used to identify additional studies. Systematic reviews including interventions seeking to change or improve a social determinant of mental disorders were eligible for inclusion. Study screening, selection, data extraction, and quality appraisal were conducted in accordance with PRISMA guidelines. The AMSTAR-2 was used to assess included reviews and results were narratively synthesised.
Results
Over 20,000 records were screened, and 101 eligible reviews were included. Most reviews were of low, or critically low, quality. Reviews included interventions which targeted sociocultural (n = 31), economic (n = 24), environmental (n = 19), demographic (n = 15), and neighbourhood (n = 8) determinants of mental disorders. Interventions demonstrating the greatest promise for improved mental health from high and moderate quality reviews (n = 37) included: digital and brief advocacy interventions for female survivors of intimate partner violence; cash transfers for people in low-middle-income countries; improved work schedules, parenting programs, and job clubs in the work environment; psychosocial support programs for vulnerable individuals following environmental events; and social and emotional learning programs for school students. Few effective neighbourhood-level interventions were identified.
Conclusions
This review presents interventions with the strongest evidence base for the prevention of mental disorders and highlights synergies where addressing the UN SDGs can be beneficial for mental health. A range of issues across the literature were identified, including barriers to conducting randomised controlled trials and lack of follow-up limiting the ability to measure long-term mental health outcomes. Interdisciplinary and novel approaches to intervention design, implementation, and evaluation are required to improve the social circumstances and mental health experienced by individuals, communities, and populations.
Researchers increasingly rely on aggregations of radiocarbon dates from archaeological sites as proxies for past human populations. This approach has been critiqued on several grounds, including the assumptions that material is deposited, preserved, and sampled in proportion to past population size. However, various attempts to quantitatively assess the approach suggest there may be some validity in assuming date counts reflect relative population size. To add to this conversation, here we conduct a preliminary analysis coupling estimates of ethnographic population density with late Holocene radiocarbon dates across all counties in California. Results show that counts of late Holocene radiocarbon-dated archaeological sites increase significantly as a function of ethnographic population density, a trend that is robust across varying sampling windows over the last 5000 years, though the majority of variation in dated-site counts remains unexplained by population density. Outliers reveal how departures from the central trend may be influenced by regional differences in research traditions, development-driven contract work, organic preservation, and landscape taphonomy. Overall, this exercise provides some support for the “dates-as-data” approach and offers insights into the conditions under which its underlying assumptions may or may not hold.
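As a hedged illustration of the kind of count model that could underlie the reported trend, the sketch below fits a Poisson regression of dated-site counts on log population density using simulated data for 58 hypothetical counties; it is not the authors' analysis.

```python
# Sketch: Poisson regression of dated-site counts on ethnographic
# population density. Data are simulated, not the study's.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
pop_density = rng.gamma(2.0, 1.5, size=58)      # persons per km^2 (hypothetical)
site_counts = rng.poisson(np.exp(0.5 + 0.3 * np.log(pop_density + 1)))
df = pd.DataFrame({"sites": site_counts,
                   "log_density": np.log(pop_density + 1)})

fit = smf.glm("sites ~ log_density", data=df,
              family=sm.families.Poisson()).fit()
print(fit.summary())  # positive log_density coefficient -> counts rise with density
```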
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19 and to furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th-century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 More recently, the COVID-19 pandemic exposed many of the pre-existing problems within the US healthcare system, including (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care for non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to a reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic also accelerated cooperation between government entities and the healthcare system, resulting in swift implementation of mitigation measures and the development of new therapies and vaccines at unprecedented speed, despite our fragmented healthcare delivery system and political divisions. Still, widespread misinformation and disinformation, together with political divisions, eroded trust in the public health system and prevented even uptake of mitigation measures, vaccines, and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising antimicrobial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
Over the last two decades, self-practice/self-reflection (SP/SR) has been advocated as a useful experiential teaching method on CBT training programmes. As part of this, theoretical positions point to the importance of there being an explicit process of ‘bridging’ between what is learnt about the self (personal development) and the implications of this for clinical practice (professional development). However, exactly how participants experience this synthesis as part of their engagement in SP/SR has not yet been clarified. As such, the present study set out to explicate trainee CBT therapists’ experiences of this process, in order to further our understanding of how they synthesise their personal and professional development during training. Nineteen trainees took part in the study, each consenting to a 1000-word written summary of their learning from SP/SR being entered into the dataset and analysed using thematic analysis. The analysis identified five interconnected themes, illustrating how trainees had (i) identified self-schemas, (ii) increased their awareness of personal context, and (iii) conceptualised the role of the self in the therapeutic process; they had then achieved (iv) personal–professional development via experiential change methods, resulting in (v) perceived benefits for their clinical practice. SP/SR may therefore be a useful vehicle to enhance personal and professional development during training by helping trainees to understand and address the role of the self in cognitive behavioural psychotherapy. Tentative implications for CBT training and practice have been offered.
Key learning aims
(1) To summarise key theoretical positions and research outcomes underpinning the use of SP/SR as a CBT training method to enhance personal and professional development.
(2) To understand trainee experiences of synthesising personal and professional development from SP/SR during training.
(3) To consider implications for CBT training and ongoing professional practice.
Recent theories suggest that for youth highly sensitive to incentives, perceiving more social threat may contribute to social anxiety (SA) symptoms. In 129 girls (ages 11–13) oversampled for shy/fearful temperament, we thus examined how interactions between neural responses to social reward (vs. neutral) cues (measured during anticipation of peer feedback) and perceived social threat in daily peer interactions (measured using ecological momentary assessment) predict SA symptoms two years later. No significant interactions emerged when neural reward function was modeled as a latent factor. Secondary analyses showed that higher perceived social threat was associated with more severe SA symptoms two years later only for girls with higher basolateral amygdala (BLA) activation to social reward cues at baseline. Interaction effects were specific to BLA activation to social reward (not threat) cues, though a main effect of BLA activation to social threat (vs. neutral) cues on SA emerged. Unexpectedly, interactions between social threat and BLA activation to social reward cues also predicted generalized anxiety and depression symptoms two years later, suggesting possible transdiagnostic risk pathways. Perceiving high social threat may be particularly detrimental for youth highly sensitive to reward incentives, potentially due to mediating reward learning processes, though this remains to be tested.
Field experiments were conducted at Clayton and Rocky Mount, North Carolina, during the summer of 2020 to determine the growth and fecundity of Palmer amaranth plants that survived glufosinate with and without grass competition in soybean crops. Glufosinate (590 g ai ha⁻¹) was applied at early postemergence (when Palmer amaranth plants were 5 cm tall), mid-postemergence (7–10 cm), and late postemergence (>10 cm) and at orthogonal combinations of those timings. Nontreated Palmer amaranth was grown in weedy (i.e., intraspecific and grass competition), weed-free in-crop (WFIC), and weed-free fallow (WFNC) conditions for comparisons. In both experiments, no Palmer amaranth plants survived the sequential glufosinate applications, and control decreased as plants were treated at larger sizes. The apical and circumferential growth rate of Palmer amaranth surviving glufosinate was reduced by more than 44% compared with the WFNC Palmer amaranth. The biomass of Palmer amaranth plants that survived glufosinate was reduced by more than 87% compared with the WFNC Palmer amaranth. The fecundity of Palmer amaranth that survived glufosinate was reduced by more than 70% compared with WFNC Palmer amaranth. Palmer amaranth plants that survived glufosinate were as fecund as the WFIC Palmer amaranth in both experiments in soybean fields. The results show that despite the significant decrease in vegetative growth rate of Palmer amaranth that survived glufosinate, plants can be as fecund as nontreated plants. The trends in growth and fecundity of Palmer amaranth that survives glufosinate with and without grass competition were similar. These results suggest that glufosinate-treated grass weeds may not reduce the growth or fecundity of Palmer amaranth that survives glufosinate.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
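As a hedged illustration of the Z-score comparison mentioned above, the sketch below implements the standard test for a difference between regression coefficients estimated in two independent groups (a Clogg/Paternoster-style test); the coefficient values are invented.

```python
# Sketch: z-test for whether a risk factor's coefficient differs
# between independently fitted women's and men's models.
import math
from scipy.stats import norm

def coef_diff_z(b_women: float, se_women: float,
                b_men: float, se_men: float) -> tuple[float, float]:
    """z = (b1 - b2) / sqrt(se1^2 + se2^2); returns (z, two-sided p)."""
    z = (b_women - b_men) / math.sqrt(se_women**2 + se_men**2)
    return z, 2 * (1 - norm.cdf(abs(z)))

# Hypothetical coefficients for pre-traumatic anxiety predicting PTSD severity
z, p = coef_diff_z(b_women=0.21, se_women=0.05, b_men=0.38, se_men=0.06)
print(f"z = {z:.2f}, p = {p:.4f}")
```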
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. The analyses did not identify any risk factors to which women were more vulnerable than men, suggesting that other mechanisms must be sought to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Marine litter poses a complex challenge in Indonesia, necessitating a well-informed and coordinated strategy for effective mitigation. This study investigates the seasonality of plastic concentrations around Sulawesi Island in central Indonesia during monsoon-driven wet and dry seasons. Using open data and methodologies, including the HYCOM ocean model and the Parcels particle-tracking framework, we simulated the dispersal of plastic waste over 3 months during both the southwest and northeast monsoons. Our research extended beyond data analysis, as we actively engaged with local communities, researchers, and policymakers through a range of outreach initiatives, including the development of a web application to visualize model results. Our findings underscore the substantial influence of monsoon-driven currents on surface plastic concentrations, highlighting the seasonal variation in the risk to different regional seas. This study adds to the evidence provided by coarser-resolution regional ocean modelling studies, emphasizing that seasonality is a key driver of plastic pollution within the Indonesian archipelago. Inclusive international collaboration and a community-oriented approach were integral to our project, and we recommend that future initiatives similarly engage researchers, local communities, and decision-makers with marine litter modelling results. This study aims to support the application of model results in solutions to the marine litter problem.
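As a hedged illustration of the kind of Lagrangian particle-tracking simulation described above, the sketch below advects virtual plastic particles through HYCOM surface currents using the open-source Parcels library; the file names, variable mappings, release locations, and run settings are assumptions rather than the study's actual configuration.

```python
# Minimal Parcels sketch: advect virtual plastic particles with HYCOM
# surface currents for one ~3-month monsoon season. Inputs are hypothetical.
from datetime import timedelta
from parcels import FieldSet, ParticleSet, JITParticle, AdvectionRK4

# HYCOM output typically stores surface velocities as water_u/water_v
# (the filename here is an assumption)
fieldset = FieldSet.from_netcdf(
    filenames={"U": "hycom_sulawesi.nc", "V": "hycom_sulawesi.nc"},
    variables={"U": "water_u", "V": "water_v"},
    dimensions={"lat": "lat", "lon": "lon", "time": "time"},
)

# Release particles along a hypothetical line source near the Sulawesi coast
pset = ParticleSet.from_line(
    fieldset=fieldset, pclass=JITParticle,
    start=(119.0, -5.5), finish=(125.0, 1.5), size=100,
)

# Advect for ~90 days, saving daily positions for later visualization
output = pset.ParticleFile(name="plastic_tracks.zarr",
                           outputdt=timedelta(days=1))
pset.execute(AdvectionRK4, runtime=timedelta(days=90),
             dt=timedelta(minutes=10), output_file=output)
```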