Moderates are often overlooked in contemporary research on American voters. Many scholars who have examined moderates argue that these individuals are only classified as such due to a lack of political sophistication or conflicted views across issues. We develop a method to distinguish three ways an individual might be classified as moderate: having genuinely moderate views across issues, being inattentive to politics or political surveys, or holding views poorly summarized by a single liberal–conservative dimension. We find that a single ideological dimension accurately describes most, but not all, Americans’ policy views. Using the classifications from our model, we demonstrate that moderates and those whose views are not well explained by a single dimension are especially consequential for electoral selection and accountability. These results suggest a need for renewed attention to the middle of the American political spectrum.
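The classification idea can be illustrated with a stylized Python sketch. This is not the authors' model, which uses an explicit mixture over response processes; the data, names and thresholds below are all hypothetical. The sketch projects a respondent-by-issue matrix onto a single dimension, then separates respondents whom that dimension fits well and places near the centre from those it fits poorly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stylized data: 300 respondents x 20 issues, positions roughly in [-1, 1].
n_resp, n_issues = 300, 20
ideology = rng.uniform(-1, 1, n_resp)        # latent left-right score
loadings = rng.uniform(0.5, 1.0, n_issues)   # how strongly each issue maps onto it
X = np.outer(ideology, loadings) + rng.normal(0, 0.2, (n_resp, n_issues))

# Rank-1 approximation: the single liberal-conservative dimension.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
score = U[:, 0] * s[0]                       # estimated ideal points
resid = np.sqrt(((Xc - np.outer(score, Vt[0])) ** 2).mean(axis=1))

# Respondents the dimension fits poorly are candidates for the
# multidimensional (or inattentive) group; well-fit respondents near
# the centre are candidates for genuine moderates.
poor_fit = resid > np.quantile(resid, 0.9)
centrist = ~poor_fit & (np.abs(score) < 0.3 * np.abs(score).max())
print(f"{poor_fit.sum()} poorly fit, {centrist.sum()} well-fit centrists")
```

Distinguishing poor fit caused by inattentiveness from genuinely multidimensional views requires extra structure, such as an explicit noise component; this sketch conflates the two.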
Subsidised or cost-offset community-supported agriculture (CO-CSA) connects farms directly to low-income households and can improve fruit and vegetable intake. This analysis identifies factors associated with participation in CO-CSA.
Design:
Farm Fresh Foods for Healthy Kids (F3HK) provided a half-price summer CO-CSA plus healthy eating classes to low-income households with children. Community characteristics (population, socio-demographics and health statistics) and CO-CSA operational practices (share sizes, pick-up sites, payment options and produce selection) are described, and associations with participation levels are examined.
Setting:
Ten communities in New York (NY), North Carolina (NC), Vermont and Washington states in the USA.
Participants:
Caregiver–child dyads enrolled in spring 2016 or 2017.
Results:
Residents of micropolitan communities had more education and less poverty than residents of small towns. The one rural location (NC2) had the fewest college graduates (10 %), the most poverty (23 %) and the poorest health statistics. Most F3HK participants were white, except in NC, where 45·2 % were African American. CO-CSA participation varied significantly across communities, from 33 % (NC2) to 89 % (NY1) of weeks picked up. Most CO-CSA farms (69·2 %) offered multiple share sizes, and participation was higher where they did (76·8 % v. 57·7 % of weeks). Just over half (53·8 %) offered a community pick-up location, and participation in these communities was lower than elsewhere (64·7 % v. 78·2 % of weeks).
Conclusion:
CO-CSA programmes should consider offering a choice of share sizes and innovate to address potential barriers such as rural location and limited education and income among residents. Future research is needed to better understand barriers to participation, particularly among participants utilising community pick up locations.
Community-supported agriculture (CSA) is an alternative food marketing model in which community members subscribe to receive regular shares of a farm's harvest. Although CSA has the potential to improve access to fresh produce, certain features of CSA membership may prohibit low-income families from participating. A ‘cost-offset’ CSA (CO-CSA) model provides low-income families with purchasing support with the goal of making CSA more affordable. As a first step toward understanding the potential of CO-CSA to improve access to healthy foods among low-income households, we interviewed 24 CSA farmers and 20 full-pay CSA members about their experiences and perceptions of the cost-offset model and specific mechanisms for offsetting the cost of CSA. Audio recordings were transcribed verbatim and coded using a thematic approach. Ensuring that healthy food was accessible to everyone, regardless of income level, was a major theme expressed by both farmers and members. In general, CSA farmers and CSA members favored member donations over other mechanisms for funding the CO-CSA. The potential time burden on farmers of administering a cost offset was a commonly mentioned barrier. Future research should investigate various CO-CSA operational models to determine which are most economically viable and sustainable.
Coronavirus disease (COVID-19) is a “disaster of uncertainty” with ambiguity about its nature and trajectory. These features amplify its psychological toxicity and increase the number of psychological casualties it inflicts. Uncertainty is fueled by lack of knowledge about the disaster's lethality and duration, and by ambiguity in messaging from leaders and health care authorities. Human resilience can have a buffering effect on the psychological impact. Experts have advocated “flattening the curve” to slow the spread of the infection. Our strategy for crisis leadership is focused on flattening the rise in psychological casualties by increasing resilience among health care workers. This paper describes an approach employed at Johns Hopkins to promote and enhance crisis leadership. The approach is based on 4 factors: vision for the future, decisiveness, effective communication, and following a moral compass. We make specific actionable recommendations for implementing these factors that are being disseminated to frontline leaders and managers. The COVID-19 pandemic is destined to have a strong psychological impact that extends far beyond the end of quarantine. Following these guidelines has the potential to build resilience and thus reduce the number of psychological casualties and speed the return to normal, or at least the new normal, in the post-COVID world.
Clinical intuition suggests that personality disorders hinder the treatment of depression, but research findings are mixed. One reason for this might be the way in which current assessment measures conflate general aspects of personality disorders, such as overall severity, with specific aspects, such as stylistic tendencies. The goal of this study was to clarify the unique contributions of the general and specific aspects of personality disorders to depression outcomes.
Methods
Patients admitted to the Menninger Clinic, Houston, between 2012 and 2015 (N = 2352) were followed over a 6–8-week course of multimodal inpatient treatment. Personality disorder symptoms were assessed with the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders, 4th edition Axis II Personality Screening Questionnaire at admission, and depression severity was assessed using the Patient Health Questionnaire-9 every fortnight. General and specific personality disorder factors estimated with a confirmatory bifactor model were used to predict latent growth curves of depression scores in a structural equation model.
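A minimal sketch of this modelling strategy, assuming semopy's lavaan-style syntax and hypothetical column names (pd1–pd6 for screener items, phq0–phq3 for the biweekly PHQ-9 totals); the bifactor structure and the linear growth curve are simplified relative to the paper's model:

```python
# pip install semopy
import pandas as pd
from semopy import Model

# Measurement model: every screener item loads on a general personality
# disorder factor; item subsets also load on specific style factors that
# are constrained orthogonal to it. A linear latent growth curve spans
# four biweekly PHQ-9 assessments, and the factors predict its intercept
# (initial severity) and slope (rate of change).
desc = """
general    =~ pd1 + pd2 + pd3 + pd4 + pd5 + pd6
borderline =~ pd1 + pd2 + pd3
antisocial =~ pd4 + pd5 + pd6
general ~~ 0*borderline
general ~~ 0*antisocial
borderline ~~ 0*antisocial
intercept =~ 1*phq0 + 1*phq1 + 1*phq2 + 1*phq3
slope     =~ 0*phq0 + 1*phq1 + 2*phq2 + 3*phq3
intercept ~ general + borderline + antisocial
slope     ~ general + borderline + antisocial
"""

data = pd.read_csv("menninger.csv")  # hypothetical file name
model = Model(desc)
model.fit(data)
print(model.inspect())
```

The U-shaped antisocial trajectory reported in the Results below implies an additional quadratic growth factor, omitted here for brevity.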
Results
The general factor predicted higher initial depression scores but not different rates of change. By contrast, the specific borderline factor predicted slower rates of decline in depression scores, while the specific antisocial factor predicted a U-shaped pattern of change.
Conclusions
Personality disorder symptoms are best represented by a general factor that reflects overall personality disorder severity, plus specific factors that reflect unique personality styles. The general factor predicts overall depression severity, while the specific factors predict poorer prognosis, an effect that may be masked in prior studies that did not separate the two.
Treatment resistance causes significant burden in psychosis. Clozapine is the only evidence-based pharmacologic intervention available for people with treatment-resistant schizophrenia; current guidelines recommend commencement after two unsuccessful trials of standard antipsychotics.
Aims
This paper aims to explore the prevalence of treatment resistance and pathways to commencement of clozapine in UK early intervention in psychosis (EIP) services.
Method
Data were taken from the National Evaluation of the Development and Impact of Early Intervention Services study (N = 1027) and included demographics, medication history and psychosis symptoms measured by the Positive and Negative Syndrome Scale (PANSS) at baseline, 6 months and 12 months. Prescribing patterns and pathways to clozapine were examined. We adopted a strict criterion for treatment resistance, defined as persistent elevated positive symptoms (a PANSS positive score ≥16, equating to at least two items of at least moderate severity), across three time points.
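The treatment-resistance rule itself reduces to a simple check; a minimal sketch (function and variable names are ours, not the study's):

```python
def is_treatment_resistant(panss_positive_scores, threshold=16):
    """True if the PANSS positive score stayed at or above threshold at
    every assessment (here: baseline, 6 months and 12 months)."""
    return all(score >= threshold for score in panss_positive_scores)

# Elevated at baseline and 6 months but remitted by 12 months: not resistant.
print(is_treatment_resistant([22, 18, 12]))  # False
# Persistently elevated across all three time points: resistant.
print(is_treatment_resistant([22, 18, 16]))  # True
```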
Results
A total of 143 (18.1%) participants met the definition of treatment resistance of having continuous positive symptoms over 12 months, despite treatment in EIP services. Sixty-one (7.7%) participants were treatment resistant and eligible for clozapine, having had two trials of standard antipsychotics; however, only 25 (2.4%) were prescribed clozapine over the 12-month study period. Treatment-resistant participants were more likely to be prescribed additional antipsychotic medication and polypharmacy, instead of clozapine.
Conclusions
Treatment resistance was prevalent in UK EIP services, but prescription of polypharmacy was much more common than prescription of clozapine. Significant delays in the commencement of clozapine may reflect a missed opportunity to promote recovery in this critical period.
The coronavirus disease 2019 (COVID-19) has greatly impacted health-care systems worldwide, leading to an unprecedented rise in demand for health-care resources. In anticipation of an acute strain on established medical facilities in Dallas, Texas, federal officials worked in conjunction with local medical personnel to convert a convention center into a Federal Medical Station capable of caring for patients affected by COVID-19. A 200,000 square foot event space was designated as a direct patient care area, with surrounding spaces repurposed to house ancillary services. Given the highly transmissible nature of the novel coronavirus, the donning and doffing of personal protective equipment (PPE) was of particular importance for personnel staffing the facility. Furthermore, nationwide shortages in the availability of PPE necessitated the reuse of certain protective materials. This article seeks to delineate the procedures implemented regarding PPE in the setting of a COVID-19 disaster response shelter, including workspace flow, donning and doffing procedures, PPE conservation, and exposure event protocols.
This paper presents new data on the metasomatic development of zoned ultramafic balls from Fiskenaesset, West Greenland. Field and petrographic evidence indicate retrogression of original ultrabasic inclusions in acid country rock gneisses to serpentine–magnesite assemblages, followed by regional metamorphism and consequent development of the zonal structure upon metasomatic re-equilibration via a supercritical aqueous fluid phase confined to grain boundaries. The balls show varying degrees of deviation from an ideal sequence (antigorite–talc–tremolite–hornblende–chlorite–country rock) intimated by non-equilibrium thermodynamics, the currently accepted conceptual framework for the discussion of diffusion metasomatism.
Major and trace-element variations behave in a similar manner to those reported from other zoned ultramafic balls but fail to define the original country rock–ultrabasic discontinuity unambiguously. It is tentatively suggested that the apparently systematic deviations from ideality may provide additional evidence for locating this discontinuity.
The presence of a hornblende zone and absence of a continuous biotite zone are two significant differences from other zoned ultramafic bodies. The former may suggest increased mobility of the aluminium species at Fiskenaesset, the latter a lower temperature of formation or smaller K2O content of the gneissic hosts.
The distribution of uranium in a suite of variably deformed and metamorphosed rocks from the leucocratic member of the Glendessarry syenite has been determined using the fission track method. The uranium content of the magma increased during crystallization and uranium was concentrated in accessory minerals such as monazite, zircon, sphene, allanite, apatite, and microinclusions of a Zr- and Ti-rich phase. Contamination of the magma by pelitic metasediment enhanced the uranium content and monazite and zircon formed instead of sphene, allanite, and apatite.
Evidence of subsolidus uranium mobility in late stage magmatic or metamorphic fluids is presented here and shows: (a) Intracrystalline redistribution of uranium, especially in grains of sphene. (b) Intergranular mobility in a fluid phase, which affected the uraniferous accessory minerals in several ways.
The distribution of REE in a zoned ultramafic pod formed by incomplete re-equilibration of ultrabasic and quartzofeldspathic reactant compositions has been studied. Transport of the heavy REE (HREE) as well as the light REE (LREE) over several metres has occurred during the diffusion-controlled metasomatism of the protolith mineral assemblages. The largest resultant concentration range (Eu) exceeds two orders of magnitude. In general, REE abundances increase towards the marginal zones, and differences between the behaviour of LREE, middle REE (MREE) and HREE subgroups are observed. LREE are least mobile in the aqueous transporting medium. Complexing by carbonate ligands is probably not an important factor in this system, and the final REE distribution is thought to be governed largely by the crystal structure of the major zonal minerals.
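The plausibility of metre-scale diffusive transport can be checked against the standard diffusion length scale, x ≈ 2√(Dt); the effective diffusivity below is an assumed order of magnitude for fluid-assisted grain-boundary transport, not a value from the paper:

```python
import math

D = 1e-12                     # m^2/s, assumed effective diffusivity
SECONDS_PER_MYR = 3.156e13

for myr in (0.1, 1.0, 10.0):
    x = 2 * math.sqrt(D * myr * SECONDS_PER_MYR)
    print(f"{myr:5.1f} Myr -> diffusion length ~ {x:.1f} m")
```

With these assumptions the length scale is metres over 0.1–10 Myr, consistent with the several-metre transport inferred above.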
The widespread evolution of resistance in rigid ryegrass populations to the highly effective, in-crop, selective herbicides used within southern Australian grain-crop production systems has severely diminished the available herbicide resource. A new PRE grass-selective herbicide, pyroxasulfone, may offer Australian grain producers a new option for rigid ryegrass control in wheat crops. The efficacy and level of selectivity of rigid ryegrass control with pyroxasulfone were investigated for a range of annual crop species in potted-plant, dose–response studies. In comparison with other currently available PRE herbicides, pyroxasulfone provided effective control of both resistant and susceptible rigid ryegrass populations. Additionally, control of these populations was achieved at rates that had little or no effect on the growth and survival of wheat. Wheat was also the most tolerant of the cereal species, with triticale, barley, and oat more injured at higher pyroxasulfone rates. In general, though, pulse-crop species were more tolerant of high pyroxasulfone rates than cereal-crop species. Soil type had subtle effects on the efficacy of pyroxasulfone: higher rates were required to achieve effective control on soils with higher clay or organic matter contents. The ability of pyroxasulfone to selectively control resistant and susceptible rigid ryegrass populations, as identified in these studies, clearly indicates the potential for widespread use and success of this herbicide in Australian cropping systems.
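Dose–response studies of this kind are conventionally analysed by fitting a four-parameter log-logistic curve (as in the R drc package); a minimal Python sketch with invented data, where ED50 is the dose producing a half-maximal response:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic4(dose, lower, upper, ed50, slope):
    """Four-parameter log-logistic: response falls from `upper` to `lower`."""
    return lower + (upper - lower) / (1.0 + (dose / ed50) ** slope)

dose = np.array([0.5, 1, 2, 4, 8, 16, 32, 64])       # g a.i./ha (invented)
survival = np.array([98, 95, 85, 60, 30, 12, 5, 2])  # % of untreated control

params, _ = curve_fit(log_logistic4, dose, survival, p0=[0, 100, 5, 2])
print(f"Estimated ED50: {params[2]:.1f} g a.i./ha")
```

Comparing ED50 estimates across populations, crops and soils is what underpins the selectivity claims in the abstract.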
Recession of the Laurentide Ice Sheet from northern New Hampshire was interrupted by the Littleton-Bethlehem (L-B) readvance and deposition of the extensive White Mountain Moraine System (WMMS). Our mapping of this moraine belt and related glacial lake sequence has refined the deglaciation history of the region. The age of the western part of the WMMS is constrained to ~14.0–13.8 cal ka BP by glacial Lake Hitchcock varves that occur beneath and above L-B readvance till and were matched to a revised calibration of the North American Varve Chronology presented here. Using this age for when boulders were deposited on the moraines has enabled calibration of regional cosmogenic-nuclide production rates to improve the precision of exposure dating in New England. The L-B readvance coincided with the Older Dryas (OD) cooling documented by workers in Europe and the equivalent GI-1d cooling event in the Greenland Ice Core Chronology 2005 (GICC05) time scale. The readvance and associated moraines provide the first well-documented and dated evidence of the OD event in the northeastern United States. Our lake sediment cores show that the Younger Dryas cooling was likewise prominent in the White Mountains, thus extending the record of this event westward from Maine and Maritime Canada.
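The calibration logic is straightforward: on a surface of independently known age, the measured nuclide concentration fixes the local production rate, which can then date other surfaces. A minimal sketch with an invented Be-10 concentration, ignoring erosion, burial and the latitude/elevation scaling that real calibrations require:

```python
import math

HALF_LIFE_BE10 = 1.387e6                  # years
LAMBDA = math.log(2) / HALF_LIFE_BE10     # decay constant, 1/yr

def production_rate(concentration, age):
    """Solve N = (P / lambda) * (1 - exp(-lambda * t)) for P, the surface
    production rate (atoms/g/yr), given concentration N and exposure age t."""
    return concentration * LAMBDA / (1.0 - math.exp(-LAMBDA * age))

def exposure_age(concentration, rate):
    """Invert the same equation to date a surface from a calibrated rate."""
    return -math.log(1.0 - concentration * LAMBDA / rate) / LAMBDA

N = 6.0e4                                 # invented atoms Be-10 / g quartz
P = production_rate(N, 13_900)            # varve-constrained moraine age
print(f"Calibrated rate: {P:.2f} atoms/g/yr")
print(f"Recovered age:   {exposure_age(N, P):.0f} yr")
```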
Voters in US elections receive markedly different representation depending on which candidate they elect, and because of incumbent advantages, the effects of this choice persist for many years. What are the long-term consequences of these two phenomena? Combining electoral and legislative roll-call data in a dynamic regression discontinuity design, this study assesses the long-term consequences of election results for representation. Across the US House, the US Senate and state legislatures, the effects of ‘coin-flip’ elections persist for at least a decade in all settings, and for as long as three decades in some. Further results suggest that elected officials do not adapt their roll-call voting to their districts’ preferences over time, and that voters do not systematically respond by replacing incumbents.
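A stylized sketch of the regression discontinuity logic on simulated data (not the study's specification, and omitting the dynamic extension that tracks effects over subsequent terms): compare outcomes where a candidate barely won against where one barely lost, using a local linear fit on each side of the cutoff.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
margin = rng.uniform(-0.5, 0.5, n)   # vote share minus 0.5 (the cutoff)
won = (margin > 0).astype(float)
# Outcome: later roll-call behaviour jumps discontinuously at the cutoff.
outcome = 0.4 * won + 0.8 * margin + rng.normal(0, 0.3, n)

h = 0.1                              # bandwidth around the cutoff
near = np.abs(margin) < h
X = sm.add_constant(np.column_stack(
    [won[near], margin[near], won[near] * margin[near]]))
fit = sm.OLS(outcome[near], X).fit()
print(f"RD estimate of winning: {fit.params[1]:.3f}")  # approx. 0.4
```

Because the margin is as-good-as-random near the cutoff, the jump at zero identifies the causal effect of winning; the dynamic design reuses this comparison for outcomes measured years later.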
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about associations between negative cognition, metacognitive beliefs and negative emotions and paranoid ideation and the belief that persecution is deserved (deservedness).
Method
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
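A minimal sketch of such a multi-level analysis, assuming statsmodels, a long-format data file, and hypothetical column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("edie2_long.csv")   # hypothetical long-format file

# Random intercept per participant; person-level predictors enter as
# fixed effects. Fitting the same model to each outcome lets the
# coefficients be compared across paranoia and deservedness.
for outcome in ["paranoia", "deservedness"]:
    fit = smf.mixedlm(f"{outcome} ~ neg_self + neg_others + depression",
                      df, groups=df["participant_id"]).fit()
    print(outcome, fit.params[["neg_self", "neg_others", "depression"]])
```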
Results
Our sample of at-risk mental state participants was less paranoid than the psychiatric in-patients but reported higher levels of ‘bad me’ deservedness. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively related to deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
Conclusions
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
This study examined whether a family-based preventive intervention for inner-city children entering the first grade could alter the developmental course of attention-deficit/hyperactivity disorder (ADHD) symptoms. Participants were 424 families randomly selected and randomly assigned to a control condition (n = 192) or to Schools and Families Educating Children (SAFE Children; n = 232). SAFE Children combined family-focused prevention with academic tutoring to address multiple developmental–ecological needs. A booster intervention provided in the 4th grade to randomly assigned children from the initial intervention (n = 101) evaluated the potential to increase preventive effects. Follow-up occurred over 5 years, with parents and teachers reporting on attention problems. Growth mixture models identified multiple developmental trajectories of ADHD symptoms. The initial phase of intervention placed children on more positive developmental trajectories for impulsivity and hyperactivity, demonstrating the potential for ADHD prevention in at-risk youth, but the SAFE Children booster had no additional effect on trajectory or change in ADHD indicators.
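Growth mixture models estimate latent trajectory classes jointly; a crude two-stage stand-in (fit each child's growth curve, then cluster the coefficients) conveys the idea on simulated data. sklearn's GaussianMixture here replaces a proper growth mixture estimator:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
years = np.arange(5.0)               # five annual follow-ups
# Two simulated latent classes: stable-low vs. escalating symptom scores.
stable = 2 + 0.1 * years + rng.normal(0, 0.5, (200, 5))
rising = 3 + 0.8 * years + rng.normal(0, 0.5, (60, 5))
scores = np.vstack([stable, rising])

# Stage 1: per-child intercept and slope via least squares.
design = np.column_stack([np.ones_like(years), years])
coefs, *_ = np.linalg.lstsq(design, scores.T, rcond=None)

# Stage 2: a mixture over (intercept, slope) recovers the two trajectories.
gmm = GaussianMixture(n_components=2, random_state=0).fit(coefs.T)
print(gmm.means_)                    # approx. [[2, 0.1], [3, 0.8]]
```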
Internalised stigma in young people meeting criteria for at-risk mental states (ARMS) has been highlighted as an important issue, and it has been suggested that provision of cognitive therapy may increase such stigma.
Aims
To investigate the effects of cognitive therapy on internalised stigma using a secondary analysis of data from the EDIE-2 trial.
Method
Participants meeting criteria for ARMS were recruited as part of a multisite randomised controlled trial of cognitive therapy for prevention and amelioration of psychosis. Participants were assessed at baseline and at 6, 12, 18 and 24 months using measures of psychotic experiences, symptoms and internalised stigma.
Results
Negative appraisals of experiences were significantly reduced in the group assigned to cognitive therapy (estimated difference at 12 months: −1.36, 95% CI −2.69 to −0.02, P = 0.047). There was no difference in social acceptability of experiences (estimated difference at 12 months: 0.46, 95% CI −0.05 to 0.98, P = 0.079).
Conclusions
These findings suggest that, rather than increasing internalised stigma, cognitive therapy decreases negative appraisals of unusual experiences in young people at risk of psychosis; as such, it is a non-stigmatising intervention for this population.