Oriented matroids appear throughout discrete geometry, with applications in algebra, topology, physics, and data analysis. This introduction to oriented matroids is intended for graduate students, scientists wanting to apply oriented matroids, and researchers in pure mathematics. The presentation is geometrically motivated and largely self-contained, and no knowledge of matroid theory is assumed. Beginning with geometric motivation grounded in linear algebra, the first chapters prove the major cryptomorphisms and the Topological Representation Theorem. From there the book uses basic topology to go directly from geometric intuition to rigorous discussion, avoiding the need for wider background knowledge. Topics include strong and weak maps, localizations and extensions, the Euclidean property and non-Euclidean properties, the Universality Theorem, convex polytopes, and triangulations. Themes that run throughout include the interplay between combinatorics, geometry, and topology, and the idea of oriented matroids as analogs to vector spaces over the real numbers and how this analogy plays out topologically.
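To make the analogy with real vector spaces concrete: any finite configuration of real vectors determines an oriented matroid through the signs of its maximal minors (its chirotope). The minimal Python sketch below computes that sign data for an illustrative configuration of four vectors in the plane; the vectors and variable names are chosen only for demonstration and are not taken from the book.

```python
# Minimal sketch: the chirotope (basis orientations) of a small vector
# configuration, illustrating how real vectors determine an oriented matroid.
# The example configuration is illustrative only.
from itertools import combinations
import numpy as np

vectors = np.array([[1.0, 0.0],    # v1
                    [0.0, 1.0],    # v2
                    [1.0, 1.0],    # v3
                    [-1.0, 2.0]])  # v4
rank = vectors.shape[1]            # rank 2: bases are pairs of indices

chirotope = {}
for basis in combinations(range(len(vectors)), rank):
    # Sign of the determinant of the chosen pair of vectors: +1, -1, or 0.
    det = np.linalg.det(vectors[list(basis)])
    chirotope[basis] = int(np.sign(round(det, 12)))

for basis, sign in sorted(chirotope.items()):
    print(basis, sign)
```

Two configurations related by an invertible linear map with positive determinant produce the same chirotope, which is one way the combinatorial abstraction forgets the metric while retaining orientation data.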
The mechanisms underlying generalized forms of dissociative (‘psychogenic’) amnesia are poorly understood. One theory suggests that memory retrieval is inhibited via prefrontal control. Findings from cognitive neuroscience offer a candidate mechanism for this proposed retrieval inhibition. By applying predictions based on these experimental findings, we examined the putative role of retrieval suppression in dissociative amnesia.
Methods
We analyzed fMRI data from two previously reported cases of dissociative amnesia. Patients had been shown reminders from forgotten and remembered time periods (colleagues and school friends). We examined the neuroanatomical overlap between regions engaged in the unrecognized compared to the recognized condition, and the regions engaged during retrieval suppression in laboratory-based tasks. Effective connectivity analyses were performed to test the hypothesized modulatory relationship between the right anterior dorsolateral prefrontal cortex (raDLPFC) and the hippocampus. Both patients were scanned again following treatment, and analyses were repeated.
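For readers unfamiliar with effective connectivity analysis, the bilinear dynamic causal modelling (DCM) framework commonly used for such hypotheses expresses how an experimental input can modulate the coupling between regions. The generic state equation is shown below; this is the standard formulation, not necessarily the exact model specification used in these two cases.

$$\dot{x}(t) = \Big(A + \sum_{j} u_j(t)\, B^{(j)}\Big)\, x(t) + C\, u(t)$$

Here $x$ is the neuronal activity of the modelled regions (in this study, raDLPFC and hippocampus), $A$ encodes baseline connectivity, each $B^{(j)}$ encodes how input $u_j$ (e.g., presentation of an unrecognized reminder) modulates specific connections, and $C$ encodes direct driving inputs.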
Results
We observed substantial functional alignment between the inhibitory regions engaged during laboratory-based retrieval suppression tasks, and those engaged when patients failed to recognize their current colleagues. This included significant activation in the raDLPFC and right ventrolateral prefrontal cortex, and a corresponding deactivation across autobiographical memory regions (hippocampus, medial PFC). Dynamic causal modeling confirmed the hypothesized modulatory relationship between the raDLPFC and the hippocampus. This pattern was no longer evident following memory recovery in the first patient, but persisted in the second patient who remained amnesic.
Conclusions
Findings are consistent with an inhibitory mechanism driving down activity across core memory regions to prevent the recognition of personally relevant stimuli.
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1098 participants from the EU-GEI study and 143600 from the UK Biobank. Both datasets had information on cannabis use.
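As background, a polygenic risk score of this kind is typically an additive, GWAS-weighted sum over the included variants; a generic formulation (not the study's exact scoring pipeline) is:

$$\mathrm{PRS}_i = \sum_{j=1}^{M} \hat{\beta}_j \, G_{ij},$$

where $G_{ij} \in \{0, 1, 2\}$ is individual $i$'s dosage of the risk allele at variant $j$ and $\hat{\beta}_j$ is the corresponding effect size from the discovery GWAS summary statistics.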
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (OR for daily use of high-potency cannabis adjusted for PRS = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
Background:
Presenteeism when ill among healthcare personnel (HCP) can contribute to the spread of respiratory illness to other HCP and to patients. However, both during the COVID-19 pandemic and now, substantial challenges prevent HCP from staying home when ill. We examined these challenges using the Systems Engineering Initiative for Patient Safety (SEIPS) framework.
Methods:
As part of a larger anonymous electronic survey conducted between 3/11/2022 and 4/12/2022 at an academic tertiary referral center, respondents in inpatient and ambulatory settings were asked to describe factors impacting presenteeism when ill. We analyzed the free-text responses using the SEIPS categories of tasks, tools/technology, person, organization, and physical environment.
Results:
In total, 522 comments were received in response to the open-ended survey question asking individuals to describe any factors that would assist them in remaining home and/or help them get tested for COVID-19 when they have symptoms of a respiratory illness; 21 were excluded due to absent or incomplete responses. Of the remaining responses (N = 501, Figure 1), 82% were associated with a single SEIPS component such as organization (N = 409), while the other responses involved two SEIPS components, in no particular order (N = 92). A majority of the responses (N = 324, 55%) reported organizational barriers, frequently citing a strict sick call-in policy as well as a lack of protected time off for COVID-19 testing or related absences. The next two most commonly identified components were physical environment (N = 88, 15%) and tasks (N = 72, 12%), with respondents mentioning barriers such as long distances to testing centers and prolonged waits for testing. The person and tools/technology components were less commonly identified, with a frequency of 9% each.
Conclusions:
A number of systems-level factors were identified that may impact the ability of HCP to stay home when ill. Interventions to help HCP overcome perceived barriers to staying home when experiencing respiratory symptoms should focus on the policies and practices within an organization. Communication from leadership should support staying home with respiratory symptoms by consistently creating plans for coverage and backup across all employee types in direct care.
To estimate the impact of 20 % flat-rate and tiered sugary drink tax structures on the consumption of sugary drinks, sugar-sweetened beverages and 100 % juice by age, sex and socio-economic position.
Design:
We modelled the impact of price changes – for each tax structure – on the demand for sugary drinks by applying own- and cross-price elasticities to self-report sugary drink consumption measured using single-day 24-h dietary recalls from the cross-sectional, nationally representative 2015 Canadian Community Health Survey-Nutrition. For both 20 % flat-rate and tiered sugary drink tax scenarios, we used linear regression to estimate differences in mean energy intake and proportion of energy intake from sugary drinks by age, sex, education, food security and income.
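To illustrate the core of the demand model, the sketch below applies an own-price elasticity (and, optionally, cross-price terms) to a baseline consumption estimate under a tax-induced price change. The elasticity values, energy density and variable names are placeholders for illustration, not those estimated in the study.

```python
# Illustrative application of price elasticities to baseline sugary drink
# consumption under a tax-induced price change. All numbers are placeholders.

def post_tax_consumption(baseline_ml, own_elasticity, price_change_pct,
                         cross_terms=()):
    """Percentage change in quantity = elasticity * percentage change in price.

    cross_terms: iterable of (cross_elasticity, other_price_change_pct) pairs
    capturing substitution toward or away from related beverages.
    """
    pct_change = own_elasticity * price_change_pct
    pct_change += sum(e * dp for e, dp in cross_terms)
    return baseline_ml * (1 + pct_change / 100)

# Example: 20 % price increase with a hypothetical own-price elasticity of -1.2.
baseline_ml = 250                   # self-reported intake on a given day
new_ml = post_tax_consumption(baseline_ml, own_elasticity=-1.2,
                              price_change_pct=20)
KCAL_PER_ML = 0.42                  # approximate energy density, placeholder
print(f"Estimated reduction: {(baseline_ml - new_ml) * KCAL_PER_ML:.0f} kcal/d")
```

In the study, changes of this kind would then be aggregated by age, sex, education, food security and income strata via regression, rather than computed per person as in this toy example.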
Setting:
Canada.
Participants:
19 742 respondents aged 2 and over.
Results:
In the 20 % flat-rate scenario, we estimated mean energy intake and proportion of daily energy intake from sugary drinks on a given day would be reduced by 29 kcal/d (95 % UI: 18, 41) and 1·3 % (95 % UI: 0·8, 1·8), respectively. Similarly, in the tiered tax scenario, additional small, but meaningful reductions were estimated in mean energy intake (40 kcal/d, 95 % UI: 24, 55) and proportion of daily energy intake (1·8 %, 95 % UI: 1·1, 2·5). Both tax structures reduced, but did not eliminate, inequities in mean energy intake from sugary drinks despite larger consumption reductions in children/adolescents, males and individuals with lower education, food security and income.
Conclusions:
Sugary drink taxation, including the additional benefit of taxing 100 % juice, could reduce both overall mean energy intake from sugary drinks and inequities in that intake in Canada.
A review of hospital-onset COVID-19 cases revealed 8 definite, 106 probable, and 46 possible cases. Correlations between hospital-onset cases and both HCW and inpatient cases were noted in 2021. Rises in community measures were associated with rises in hospital-onset cases. Measures of community COVID-19 activity might predict hospital-onset cases.
Incidence of first-episode psychosis (FEP) varies substantially across geographic regions. Phenotypes of subclinical psychosis (SP), such as psychotic-like experiences (PLEs) and schizotypy, present several similarities with psychosis. We aimed to examine whether SP measures varied across different sites and whether this variation was comparable with FEP incidence within the same areas. We further examined the contribution of environmental and genetic factors to SP.
Methods
We used data from 1497 controls recruited in 16 different sites across 6 countries. Factor scores for several psychopathological dimensions of schizotypy and PLEs were obtained using multidimensional item response theory models. Variation of these scores was assessed using multi-level regression analysis to estimate individual and between-sites variance adjusting for age, sex, education, migrant, employment and relational status, childhood adversity, and cannabis use. In the final model we added local FEP incidence as a second-level variable. Association with genetic liability was examined separately.
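A minimal sketch of the between-sites variance decomposition is given below, using a random-intercept model in statsmodels. The data file, column names and covariate set are placeholders for illustration and do not reproduce the authors' full model.

```python
# Sketch: random-intercept model of a schizotypy factor score with site as the
# grouping level, then the share of variance attributable to sites (ICC).
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("controls.csv")  # one row per participant

model = smf.mixedlm("schizotypy_score ~ age + sex + education + cannabis_use",
                    data=df, groups=df["site"])
result = model.fit()

between_site_var = result.cov_re.iloc[0, 0]   # random-intercept (site) variance
residual_var = result.scale                   # within-site residual variance
icc = between_site_var / (between_site_var + residual_var)
print(f"Proportion of variance at the site level: {icc:.1%}")
```

Adding a site-level predictor (here, local FEP incidence) to such a model and observing a drop in the random-intercept variance is the logic behind the "considerably reduced unexplained variance" result reported below.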
Results
Schizotypy showed large between-sites variation, with up to 15% of variance attributable to site-level characteristics. Adding local FEP incidence to the model considerably reduced the unexplained between-sites schizotypy variance. PLEs did not show as much variation. Overall, SP was associated with younger age, migrant status, being unmarried, unemployed or less educated, cannabis use, and childhood adversity. Both phenotypes were associated with genetic liability to schizophrenia.
Conclusions
Schizotypy showed substantial between-sites variation, being more represented in areas where FEP incidence is higher. This supports the hypothesis that shared contextual factors shape the between-sites variation of psychosis across the spectrum.
Birthweight has been associated with diabetes in a reverse J-shape (highest risk at low birthweight and moderately high risk at high birthweight) and inversely associated with hypertension in adulthood, with inconsistent evidence for cardiovascular disease. There is a lack of population-based studies examining the incidence of cardiometabolic outcomes in young adults born with low and high birthweights. To evaluate the association between birthweight and diabetes, hypertension, and ischemic heart disease (IHD) in young adulthood, we conducted a retrospective cohort study of 874,904 singletons born in Ontario, Canada, from 1994 to 2002, identified from population-based health administrative data. Separate Cox regression models examined birthweight in association with diabetes, hypertension, and IHD, adjusting for confounders. Among adults 18–26 years, the diabetes incidence rate was 18.15 per 100,000 person-years, hypertension was 15.80 per 100,000 person-years, and IHD was 1.85 per 100,000 person-years. Adjusted hazard ratios (AHRs) for diabetes with low (<2500 g) and high (>4000 g) birthweight, compared with normal (2500–4000 g) birthweight, were 1.46 (95% CI 1.28, 1.68) and 1.09 (0.99, 1.21), respectively. AHRs for hypertension with low and high birthweight were 1.34 (1.15, 1.56) and 0.86 (0.77, 0.97), respectively. AHRs for IHD with low and high birthweight were 1.28 (0.80, 2.05) and 0.97 (0.71, 1.33), respectively. Overall, birthweight was associated with diabetes in young adults in a reverse J-shape and inversely with hypertension. There was insufficient evidence of an association with IHD. Further evidence is needed to understand the causal mechanisms linking birthweight to cardiometabolic diseases in young adults.
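For readers less familiar with the modelling, the sketch below shows how adjusted hazard ratios for categorical birthweight could be obtained with the lifelines package. The data file, covariates and column names are illustrative placeholders, not the study's analytic code.

```python
# Sketch: Cox proportional hazards model for diabetes with birthweight
# categories (normal as reference). Data and column names are placeholders;
# covariates are assumed to be numerically coded.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical cohort extract

# Dummy-code birthweight with 'normal' (2500-4000 g) as the reference level.
df["low_bw"] = (df["birthweight_g"] < 2500).astype(int)
df["high_bw"] = (df["birthweight_g"] > 4000).astype(int)

cph = CoxPHFitter()
cph.fit(df[["followup_years", "diabetes", "low_bw", "high_bw",
            "maternal_age", "sex"]],
        duration_col="followup_years", event_col="diabetes")

# Hazard ratios (exp of the coefficients) with 95% confidence intervals.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```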
Despite infection control guidance, sporadic nosocomial coronavirus disease 2019 (COVID-19) outbreaks occur. We describe a complex severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) cluster with interfacility spread during the SARS-CoV-2 δ (delta) pandemic surge in the Midwest.
Setting:
This study was conducted in (1) a hematology-oncology ward in a regional academic medical center and (2) a geographically distant acute rehabilitation hospital.
Methods:
We conducted contact tracing for each COVID-19 case to identify healthcare exposures within 14 days prior to diagnosis. Liberal testing was performed for asymptomatic carriage for patients and staff. Whole-genome sequencing was conducted for all available clinical isolates from patients and healthcare workers (HCWs) to identify transmission clusters.
Results:
In the immunosuppressed ward, 19 cases (4 patients, 15 HCWs) shared a genetically related SARS-CoV-2 isolate. Of these 4 patients, 3 died in the hospital or within 1 week of discharge. The suspected index case was a patient with new dyspnea, diagnosed during preprocedure screening. In the rehabilitation hospital, 20 cases (5 patients and 15 HCWs) tested positive for COVID-19, of whom 2 patients and 3 HCWs had an isolate genetically related to the above cluster. The suspected index case was a patient from the immunosuppressed ward whose positive status was not detected at admission to the rehabilitation facility. Our response to this cluster included the following interventions in both settings: restricting visitors, restricting learners, restricting overflow admissions, enforcing strict compliance with escalated PPE, providing staff with free and frequent on-site testing, and testing all patients prior to hospital discharge and transfer to other facilities.
Conclusions:
Stringent infection control measures can prevent nosocomial COVID-19 transmission in healthcare facilities with high-risk patients during pandemic surges. These interventions were successful in ending these outbreaks.
The purpose of this investigation was to examine neuropsychological functioning after frontal lobectomy (FL) or anterior temporal lobectomy (ATL) in patients with localization-related epilepsy. Few studies have compared cognitive changes following FL and ATL. Past research found improvement on measures of verbal and visual memory, along with confrontation naming, after FL (Busch et al., 2017). In contrast, a number of studies have reported verbal memory and naming declines in those undergoing left ATL. The current study examined post-operative cognitive changes in epilepsy patients who underwent either a left or right FL or ATL.
Participants and Methods:
Subjects included 430 patients (204 men, 225 women, 1 gender not specified) who underwent surgical resection: right FL = 25, left FL = 26, right ATL = 211, left ATL = 168. Patients had a mean FSIQ of 90, ages ranging from 18 to 71 years (mean age = 37 years), right (n = 359), left (n = 50), or mixed (n = 18) handedness, and education ranging from 3 to 22 years (mean = 12.9 years). Change from pre- to post-surgery for the FL and ATL groups was examined in the following domains: learning and memory [Long-Term Storage from the Selective Reminding Test (SRT); Wechsler Memory Scale (WMS) Logical Memory Delayed Recall (LM) and Visual Reproduction Delayed Recall (VR)] and language [Boston Naming Test (BNT)].
Results:
A one-way ANOVA was used to examine changes in language and memory. Our findings revealed statistically significant differences between resection groups for LM, SRT, and BNT. Left ATL patients showed significant declines (p<.001) relative to right ATL patients on LM, SRT, and BNT. Left ATL patients also showed significant declines on LM and BNT relative to the gains seen in both left FL (p<.001; p=.002) and right FL (p=.018; p=.008) patients, and significant declines on SRT relative to the gains seen in right FL patients (p<.001). Left FL patients showed significant declines on SRT relative to right ATL patients (p=.007). Lastly, right FL patients showed significant gains on SRT relative to left FL patients (p=.020).
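A minimal sketch of this kind of group comparison is shown below, using scipy's one-way ANOVA on pre-to-post change scores. The data file and column names are hypothetical, not the authors' analytic code.

```python
# Sketch: one-way ANOVA comparing pre-to-post change scores (e.g., Logical
# Memory delayed recall) across the four resection groups. File and column
# names are placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("epilepsy_outcomes.csv")       # hypothetical file
df["lm_change"] = df["lm_post"] - df["lm_pre"]  # delayed-recall change score

# Groups: left FL, right FL, left ATL, right ATL.
groups = [grp["lm_change"].dropna()
          for _, grp in df.groupby("resection_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```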
Conclusions:
The pre- to post-surgical neuropsychological change in learning, memory, and language is understudied in frontal lobe epilepsy (FLE), although several investigators have reported some learning and memory impairments in FLE at either pre- or post-surgical time points (Johnson-Markve et al., 2011; Incisa Della Rocchetta et al., 1993). The current study suggests that resections of the frontal lobes are associated with better outcomes for naming and verbal memory (LM) when compared to left ATLs. Interestingly, verbal list learning declined more in left than right FL and right ATL patients, suggesting a possible language-based executive functioning component to this memory measure. As expected, our study further supports that left ATLs are associated with material-specific memory declines. This pattern was not seen for those undergoing a right ATL (i.e., nonverbal memory did not decline in patients with right ATL).
This study assessed outcomes prior to and after electronic medical record-based clinical decision support implementation combined with prospective audit in patients with COVID-19. This multimodal stewardship intervention was associated with a decrease in antibiotic exposure for patients with COVID-19 (44.4% vs 61.8%, p = 0.002) within the first 7 days of hospitalization.
Objective:
To characterize hospital-onset COVID-19 cases and to investigate the associations between these rates and population- and hospital-level measures, including trends in healthcare worker (HCW) infections, community cases, and COVID-19 wastewater data.
Design:
Retrospective cohort study from January 1, 2021, to November 23, 2022.
Setting:
This study was conducted at a 589-bed urban Midwestern tertiary-care hospital system.
Participants and interventions:
The infection prevention team reviewed the electronic medical records (EMR) of patients who were admitted for >48 hours and subsequently tested positive for SARS-CoV-2 to determine whether COVID-19 was likely to be hospital-onset illness. Each case was further categorized as definite, probable, or possible based on viral sequencing, caregiver tracing analysis, symptoms, and cycle threshold values. Patients were excluded if there was a known exposure prior to admission. Clinical data, including vaccination status, were collected from the EMR. HCW case data were collected via our institution’s employee health services. Community case and wastewater data were collected via the Wisconsin Department of Health Services database. Additionally, we evaluated the timing of changes in infection prevention guidance such as visitor restrictions.
Results:
In total, 156 patients met criteria for hospital-onset COVID-19. Overall, 6% of cases were categorized as definite, 24% as probable, and 70% as possible hospital-onset illness. Most patients were tested prior to a procedure (31%), for new symptoms (30%), or for discharge planning (30%). Also, 53% were symptomatic and 41% received treatment for their COVID-19. Overall, 38% of patients were immunocompromised and 27% were unvaccinated. Overall, 12% of patients died within 1 month of their positive SARS-CoV-2 test, and 11% required ICU admission during their hospital stay. Hospital-onset COVID-19 increased in fall of 2022: October 2022 had 16 cases, whereas fall of 2021 (September–November) had only 3 cases in total. Finally, similar peaks were observed in total cases by week among healthcare workers, county cases, and COVID-19 wastewater levels. These peaks correspond with the SARS-CoV-2 delta and omicron variant surges, respectively.
Conclusions:
Hospital-onset cases followed similar trends as population- and hospital-level data throughout the study period. However, the hospital-onset rate did not correlate as strongly in the second half of 2022, when cases were disproportionately high. Given that hospital-onset cases can result in significant morbidity, continued enhanced infection prevention efforts and a low threshold for testing are warranted in the inpatient environment.
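A minimal sketch of the kind of trend comparison described here, correlating weekly hospital-onset counts with community cases and wastewater SARS-CoV-2 levels, is shown below; the file and column names are placeholders, not the study's data.

```python
# Sketch: correlate weekly hospital-onset COVID-19 counts with community
# cases and wastewater SARS-CoV-2 levels. File and column names are placeholders.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("covid_timeseries.csv", parse_dates=["week_start"])
df = df.set_index("week_start").dropna()
# Expected columns: hospital_onset, county_cases, wastewater

for measure in ["county_cases", "wastewater"]:
    r, p = pearsonr(df["hospital_onset"], df[measure])
    print(f"hospital-onset vs {measure}: r = {r:.2f}, p = {p:.3f}")
```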
Potato producers in Canada’s Atlantic provinces of Prince Edward Island (PE) and New Brunswick rely on photosystem II (PSII)-inhibiting herbicides to provide season-long weed control. Despite this fact, a high proportion of common lambsquarters populations in the region have been identified as resistant to this class of herbicides. Crop-topping is a late-season weed management practice that exploits the height differential between weeds and a developing crop canopy. Two field experiments were conducted in Harrington, PE, in 2020 and 2021, each evaluating the efficacy of a different crop-topping strategy (above-canopy mowing or wick-applied glyphosate), applied at two potato phenological stages, on common lambsquarters viable seed production and on potato yield and quality. Mowing common lambsquarters postflowering decreased viable seed production (72% to 91%) in 2020 but increased seed production (78% to 278%) in 2021. Mowing had minimal impact on potato marketable yield across cultivars in both years. In contrast, treating common lambsquarters with wick-applied glyphosate had variable impacts on seed output in 2020 but dramatically reduced seed production (up to 95%) in 2021 when treatments were applied preflowering. Glyphosate damage to potato tubers was not influenced by timing and resulted in a 14% to 15% increase in culled tubers due to black spotting and rot. Our results highlight the importance of potato and common lambsquarters phenology when selecting a crop-topping strategy and demonstrate that above-canopy mowing and wick-applied glyphosate can be utilized for seedbank management of herbicide-resistant common lambsquarters in potato production systems.
This national pre-pandemic survey compared demand and capacity of adult community eating disorder services (ACEDS) with NHS England (NHSE) commissioning guidance.
Results
Thirteen services in England and Scotland responded (covering a population of 10.7 million). Between 2016–2017 and 2019–2020, mean referral rates increased by 18.8%, from 378 to 449 per million population. Only 3.7% of referrals came from child and adolescent eating disorder services (CEDS-CYP), but 46% of patients were aged 18–25 and 54% were aged >25. Most ACEDS had waiting lists and rationed access. Many could not provide full medical monitoring, adapt treatment for comorbidities, offer assertive outreach or provide seamless transitions. Relative to patient volume, the ACEDS workforce budget was 15% of that recommended by the NHSE workforce calculator for CEDS-CYP. Achieving parity would require an investment of £7 million per million population in ACEDS.
Clinical implications
This study highlights the severe pressure in ACEDS, which has increased since the COVID-19 pandemic. Substantial investment is required to ensure NHS ACEDS meet national guidance, offer evidence-based treatment, reduce risk and preventable deaths, and achieve parity with CEDS-CYP.
To determine risk factors for the development of long coronavirus disease 2019 (COVID-19) in healthcare personnel (HCP).
Methods:
We conducted a case–control study among HCP who had confirmed symptomatic COVID-19 working in a Brazilian healthcare system between March 1, 2020, and July 15, 2022. Cases were defined as those having long COVID according to the Centers for Disease Control and Prevention definition. Controls were defined as HCP who had documented COVID-19 but did not develop long COVID. Multiple logistic regression was used to assess the association between exposure variables and long COVID during 180 days of follow-up.
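The adjusted odds ratios reported below come from multiple logistic regression; a minimal sketch of how such estimates can be produced is given here, with hypothetical file and column names rather than the authors' code.

```python
# Sketch: multiple logistic regression for long COVID with adjusted odds
# ratios and 95% confidence intervals. Data and column names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hcp_cohort.csv")  # one row per HCP with confirmed COVID-19

model = smf.logit(
    "long_covid ~ female + age + reinfection + variant + vaccine_doses",
    data=df).fit()

odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
})
print(odds_ratios)
```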
Results:
Of 7,051 HCP diagnosed with COVID-19, 1,933 (27.4%) who developed long COVID were compared to 5,118 (72.6%) who did not. The majority of those with long COVID (51.8%) had 3 or more symptoms. Factors associated with the development of long COVID were female sex (OR, 1.21; 95% CI, 1.05–1.39), age (OR, 1.01; 95% CI, 1.00–1.02), and 2 or more SARS-CoV-2 infections (OR, 1.27; 95% CI, 1.07–1.50). Those infected with the SARS-CoV-2 δ (delta) variant (OR, 0.30; 95% CI, 0.17–0.50) or the SARS-CoV-2 o (omicron) variant (OR, 0.49; 95% CI, 0.30–0.78), and those receiving 4 COVID-19 vaccine doses prior to infection (OR, 0.05; 95% CI, 0.01–0.19) were significantly less likely to develop long COVID.
Conclusions:
Long COVID was prevalent among HCP in this cohort. Acquiring >1 SARS-CoV-2 infection was a major risk factor for long COVID, while maintenance of immunity via vaccination was highly protective.
Childhood adversity and cannabis use are considered independent risk factors for psychosis, but whether different patterns of cannabis use may be acting as mediator between adversity and psychotic disorders has not yet been explored. The aim of this study is to examine whether cannabis use mediates the relationship between childhood adversity and psychosis.
Methods
Data were utilised on 881 first-episode psychosis patients and 1231 controls from the European network of national schizophrenia networks studying Gene–Environment Interactions (EU-GEI) study. Detailed history of cannabis use was collected with the Cannabis Experience Questionnaire. The Childhood Experience of Care and Abuse Questionnaire was used to assess exposure to household discord, sexual, physical or emotional abuse and bullying in two periods: early (0–11 years), and late (12–17 years). A path decomposition method was used to analyse whether the association between childhood adversity and psychosis was mediated by (1) lifetime cannabis use, (2) cannabis potency and (3) frequency of use.
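For orientation, the product-of-coefficients logic behind such a decomposition can be sketched as below. This is a simplified linear illustration with placeholder variable names; the authors' path decomposition method for a binary outcome is more involved.

```python
# Simplified product-of-coefficients sketch of mediation: does frequency of
# cannabis use carry part of the adversity -> psychosis association?
# Variable names are placeholders and the linear set-up is illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eu_gei.csv")  # hypothetical analytic dataset

# Path a: exposure (household discord) -> mediator (frequency of cannabis use)
a = smf.ols("cannabis_frequency ~ household_discord + age + sex", df).fit()

# Paths b and c': mediator and exposure -> outcome (case status)
b = smf.ols("case_status ~ cannabis_frequency + household_discord + age + sex",
            df).fit()

indirect = a.params["household_discord"] * b.params["cannabis_frequency"]
direct = b.params["household_discord"]
print(f"indirect = {indirect:.3f}, direct = {direct:.3f}, "
      f"proportion mediated = {indirect / (indirect + direct):.1%}")
```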
Results
The association between household discord and psychosis was partially mediated by lifetime use of cannabis (indirect effect coef. 0.078, s.e. 0.022, 17%), by its potency (indirect effect coef. 0.059, s.e. 0.018, 14%) and by frequency of use (indirect effect coef. 0.117, s.e. 0.038, 29%). Similar findings were obtained when analyses were restricted to early exposure to household discord.
Conclusions
Harmful patterns of cannabis use mediated the association between specific childhood adversities, such as household discord, and later psychosis. Children exposed to particularly challenging environments in their household could benefit from psychosocial interventions aimed at preventing cannabis misuse.
While cannabis use is a well-established risk factor for psychosis, little is known about any association between reasons for first using cannabis (RFUC) and later patterns of use and risk of psychosis.
Methods
We used data from 11 sites of the multicentre European Gene-Environment Interaction (EU-GEI) case–control study, comprising 558 first-episode psychosis patients (FEPp) and 567 population controls who had used cannabis and reported their RFUC.
We ran logistic regressions to examine whether RFUC were associated with first-episode psychosis (FEP) case–control status. Path analysis then examined the relationship between RFUC, subsequent patterns of cannabis use, and case–control status.
Results
Both controls (86.1%) and FEPp (75.63%) most commonly reported ‘because of friends’ as their RFUC. However, 20.1% of FEPp, compared to 5.8% of controls, reported ‘to feel better’ as their RFUC (χ2 = 50.97; p < 0.001). The RFUC ‘to feel better’ was associated with being a FEPp (OR 1.74; 95% CI 1.03–2.95), while the RFUC ‘with friends’ was associated with being a control (OR 0.56; 95% CI 0.37–0.83). The path model indicated that the RFUC ‘to feel better’ was associated with both heavy cannabis use and FEPp–control status.
Conclusions
Both FEPp and controls usually started using cannabis with their friends, but more patients than controls had begun to use ‘to feel better’. People who reported their reason for first using cannabis to ‘feel better’ were more likely to progress to heavy use and develop a psychotic disorder than those reporting ‘because of friends’.
People often assess the reasonableness of another person’s judgments. When doing so, the evaluator should set aside knowledge that would not have been available to the evaluatee to assess whether the evaluatee made a reasonable decision, given the available information. But under what circumstances does the evaluator set aside information? On the one hand, if the evaluator fails to set aside prior information, not available to the evaluatee, they exhibit belief bias. But on the other hand, when Bayesian inference is called for, the evaluator should generally incorporate prior knowledge about relevant probabilities in decision making. The present research integrated these two perspectives in two experiments. Participants were asked to take the perspective of a fictitious evaluatee and to evaluate the reasonableness of the evaluatee’s decision. The participant was privy to information that the fictitious evaluatee did not have. Specifically, the participant knew whether the evaluatee’s decision judgment was factually correct. Participants’ judgments were biased (Experiments 1 and 2) by the factuality of the conclusion as they assessed the evaluatee’s reasonableness. We also found that the format of information presentation (Experiment 2) influenced the degree to which participants’ reasonableness ratings were responsive to the evaluatee’s Bayesian rationality. Specifically, responsivity was greater when the information was presented in an icon-based, graphical, natural-frequency format than when presented in either a numerical natural-frequency format or a probability format. We interpreted the effects of format to suggest that graphical presentation can help organize information into nested sets, which in turn enhances Bayesian rationality.
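To make the Bayesian standard concrete, consider Bayes' rule in the natural-frequency style used in such tasks; the numbers below are purely illustrative and are not taken from the experiments.

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

For instance, if 10 of 100 cases have the target condition, 8 of those 10 show the evidence, and 18 of the remaining 90 also show it, the posterior is $8/(8+18) \approx 0.31$. A normatively reasonable evaluatee should act on this posterior computed from the information available to them, regardless of whether the conclusion later turns out to be factually correct, which is the standard against which participants' reasonableness ratings were compared.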
Child maltreatment (CM) and migrant status are independently associated with psychosis. We examined prevalence of CM by migrant status and tested whether migrant status moderated the association between CM and first-episode psychosis (FEP). We further explored whether differences in CM exposure contributed to variations in the incidence rates of FEP by migrant status.
Methods
We included FEP patients aged 18–64 years in 14 European sites and recruited controls representative of the local populations. Migrant status was operationalized according to generation (first/further) and region of origin (Western/non-Western countries). The reference population was composed of individuals of the host country's ethnicity. CM was assessed with the Childhood Trauma Questionnaire. Prevalence ratios of CM were estimated using Poisson regression. We examined the moderating effect of migrant status on the association between CM and the odds of FEP by fitting adjusted logistic regressions with interaction terms. Finally, we calculated the population attributable fractions (PAFs) for CM by migrant status.
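For reference, a population attributable fraction of the kind computed here can be written with Levin's formula; this is a generic form, not necessarily the exact estimator used by the authors.

$$\mathrm{PAF} = \frac{p_e\,(\mathrm{RR} - 1)}{1 + p_e\,(\mathrm{RR} - 1)},$$

where $p_e$ is the prevalence of childhood maltreatment in the (sub)population and $\mathrm{RR}$ is the relative risk (approximated by the odds ratio) of FEP given CM. Because the PAF rises with exposure prevalence, a group with higher CM prevalence can have a larger PAF even when its effect estimate is somewhat smaller.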
Results
We examined 849 FEP cases and 1142 controls. CM prevalence was higher among migrants, their descendants and migrants of non-Western heritage. Migrant status, whether classified by generation (likelihood ratio test: χ2 = 11.3, p = 0.004) or by region of origin (likelihood ratio test: χ2 = 11.4, p = 0.003), attenuated the association between CM and FEP. PAFs for CM were higher among all migrant groups compared with the reference populations.
Conclusions
The higher exposure to CM, despite a smaller effect on the odds of FEP, accounted for a greater proportion of incident FEP cases among migrants. Policies aimed at reducing CM should consider the increased vulnerability of specific subpopulations.