U.S. veterans report high rates of traumatic experiences and mental health symptomology [e.g. posttraumatic stress disorder (PTSD)]. The stress sensitization hypothesis posits that experiences of adversity sensitize individuals to later stress reactions, which can lead to greater psychiatric problems. We extend this hypothesis by exploring how multiple adversities, such as early childhood adversity, combat-related trauma, and military sexual trauma, relate to heterogeneity in stress over time and, subsequently, to greater risk for PTSD.
Methods
A total of 1230 veterans were recruited for an observational, longitudinal study. Veterans responded to questionnaires on PTSD, stress, and traumatic experiences five times over an 18-month study period. We used latent transition analysis to understand how heterogeneity in adverse experiences is related to transitions into stress trajectory classes. We also explored how transition patterns related to PTSD symptomology.
Results
Across all models, we found support for stress sensitization. In general, combat trauma in combination with other types of adverse experiences, namely early childhood adversity and military sexual trauma, conferred a greater probability of transitioning into higher-risk stress profiles. We also showed differential effects of early childhood and military-specific adversity on PTSD symptomology.
Conclusion
The present study rigorously integrates both military-specific and early-life adversity into the analysis of stress sensitization, and is the first to examine how sensitization might affect trajectories of stress over time. Our study provides a nuanced and specific look at who is at risk for sensitization to stress based on previous traumatic experiences, as well as which transition patterns are associated with greater PTSD symptomology.
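Latent transition analysis of this kind is typically fitted in specialised software (e.g. Mplus); purely as an illustrative stand-in, the sketch below fits a Gaussian hidden Markov model to repeated stress scores with hmmlearn, which likewise assigns each assessment wave to a latent stress class and estimates transition probabilities between classes. The data layout, the three-class solution, and all variable names are assumptions for illustration, not the study's model.

```python
# Illustrative stand-in for latent transition analysis: a Gaussian HMM over
# repeated stress scores (hypothetical data; not the study's actual model).
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)
n_veterans, n_waves = 200, 5
stress = rng.normal(loc=20, scale=5, size=(n_veterans, n_waves))  # placeholder scores

# hmmlearn expects one long sequence plus per-person sequence lengths.
X = stress.reshape(-1, 1)
lengths = [n_waves] * n_veterans

model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=200, random_state=0)
model.fit(X, lengths)

print("Estimated transition matrix between latent stress classes:")
print(np.round(model.transmat_, 2))
print("Class means:", np.round(model.means_.ravel(), 1))
```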
A hedonic model was employed to examine factors that influence the resale price of row crop planters on the used machinery market. Planter sale data from 2016 to 2018 were utilized to conduct the analysis. Results suggested that the primary factors impacting planter resale prices were make, age, condition, planter configuration, row number, and row spacing. As a function of age (depreciation), planter values were generally determined to decrease at a decreasing rate. Finally, it was determined that there was a significant interaction between the variables make and age, suggesting that different planter makes depreciate differently.
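A hedonic model of this kind is usually estimated as a regression of (log) sale price on machine attributes. The sketch below is a minimal Python version with statsmodels, using hypothetical column names; the make-by-age interaction allows depreciation to differ across makes, and the quadratic age term allows values to decline at a decreasing rate.

```python
# Hedonic regression sketch (hypothetical data file and column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("planter_sales.csv")  # assumed: one row per observed sale

# Log price on attributes; C() treats make, condition, etc. as categorical.
# The C(make)*age term lets depreciation rates differ across makes.
model = smf.ols(
    "np.log(price) ~ C(make) * age + I(age ** 2) + C(condition)"
    " + C(configuration) + row_number + C(row_spacing)",
    data=df,
).fit()
print(model.summary())
```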
Immediate posttreatment irrigation has been proposed as a method to reduce hybrid bermudagrass [Cynodon dactylon (L.) Pers. × Cynodon transvaalensis Burtt Davy] phytotoxicity from topramezone. Immediate irrigation is impractical, because it would take a turfgrass sprayer 10 to 15 min to cover an average golf course fairway or athletic field. There is also insufficient evidence regarding how posttreatment irrigation, immediate or otherwise, influences mature goosegrass [Eleusine indica (L.) Gaertn.] control from topramezone or low-dose topramezone plus metribuzin programs. We sought to investigate bermudagrass and E. indica response to immediate, 15-min, and 30-min posttreatment irrigation compared with no irrigation following topramezone at 12.3 g ae ha−1, the lowest labeled rate, or topramezone at 6.1 g ha−1 plus metribuzin at 210 g ai ha−1. We also evaluated placement of each herbicide and their combination on soil, foliage, and soil plus foliage to help elucidate the mechanisms involved in differential responses between species and herbicide mixtures. Responses were largely dependent on trial due to bermudagrass injury from high-dose topramezone being nearly eliminated by immediate irrigation in one trial and only slightly affected in another. When posttreatment irrigation was postponed for 15 or 30 min, topramezone alone injured bermudagrass unacceptably in both trials. Bermudagrass was injured less by low-dose topramezone plus metribuzin than by high-dose topramezone. All posttreatment irrigation timings reduced E. indica control compared with no posttreatment irrigation. The herbicide placement study suggested that topramezone control of E. indica is highly dependent on foliar uptake and that phytotoxicity of both bermudagrass and E. indica is greater from topramezone than metribuzin. Thus, posttreatment irrigation likely reduces topramezone rate load with a concomitant effect on plant phytotoxicity of both species. Metribuzin reduced 21-d cumulative clipping weight and tiller production of plants, and this may be a mechanism by which it reduces foliar white discoloration from topramezone.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
This study assessed the cost-effectiveness of the Centers for Disease Control and Prevention’s (CDC’s) Sodium Reduction in Communities Program (SRCP).
Design:
We collected implementation costs and performance measure indicators from SRCP recipients and their partner food service organisations. We estimated the cost per person reached, the cost per food service organisation reached, and the cost per menu item impacted. We estimated the short-term effectiveness of SRCP in reducing sodium consumption and used it as an input to the Prevention Impact Simulation Model to project the long-term impact on medical cost savings and quality-adjusted life-years gained due to a reduction in CVD, and to estimate the cost-effectiveness of SRCP if sustained through 2025 and 2040.
Setting:
CDC funded eight recipients as part of the 2016–2021 round of the SRCP to work with food service organisations in eight settings to increase the availability and purchase of lower-sodium food options.
Participants:
Eight SRCP recipients and twenty of their partners.
Results:
At the recipient level, average cost per person reached was $10, and average cost per food service organisation reached was $42 917. At the food service organisation level, median monthly cost per food item impacted by recipe modification or product substitution was $684. Cost-effectiveness analyses showed that, if sustained, the programme is cost saving (i.e. the reduction in medical costs is greater than the implementation costs) in the target population by $1·82 through 2025 and $2·09 through 2040.
Conclusions:
By providing evidence of the cost-effectiveness of a real-world sodium reduction initiative, this study can help inform decisions by public health organisations about related CVD prevention interventions.
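The reach and cost measures reported above are simple ratios of programme cost to counts of people, organisations, and menu items. The sketch below shows that arithmetic with placeholder inputs only; the published figures come from the SRCP cost data and the Prevention Impact Simulation Model, not from this toy calculation.

```python
# Toy cost-metric arithmetic with placeholder inputs (not the SRCP data).
total_recipient_cost = 1_200_000.0   # hypothetical programme cost, USD
persons_reached = 120_000            # hypothetical reach
organisations_reached = 28           # hypothetical partner count

cost_per_person = total_recipient_cost / persons_reached
cost_per_org = total_recipient_cost / organisations_reached

# "Cost saving" means projected medical savings exceed implementation cost.
projected_medical_savings = 1_450_000.0   # hypothetical simulation output
net_saving_per_person = (projected_medical_savings - total_recipient_cost) / persons_reached

print(f"Cost per person reached:        ${cost_per_person:,.2f}")
print(f"Cost per organisation reached:  ${cost_per_org:,.2f}")
print(f"Net saving per person reached:  ${net_saving_per_person:,.2f}")
```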
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
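The PGS coefficients reported above are, in effect, regressions of age at onset on standardised polygenic scores with covariate adjustment. A minimal sketch of such an association model is below, with hypothetical column names and the covariates reduced to sex and a few ancestry principal components.

```python
# PGS-on-AAO association sketch (hypothetical data and covariates).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bd_cohort.csv")  # assumed: one row per patient with bipolar disorder

# Standardise the score so the coefficient reads as "years of AAO per SD of PGS".
df["pgs_scz_z"] = (df["pgs_scz"] - df["pgs_scz"].mean()) / df["pgs_scz"].std()

model = smf.ols("aao ~ pgs_scz_z + sex + pc1 + pc2 + pc3 + pc4", data=df).fit()
print(model.params["pgs_scz_z"], model.bse["pgs_scz_z"])  # beta (years) and s.e.
```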
ABSTRACT IMPACT: This project seeks to identify unique host responses that are biomarkers for specific urethral pathogens, and which can be used in the development of point-of-care (POC) STI diagnostics. OBJECTIVES/GOALS: How Chlamydia trachomatis (CT) and other common STIs, e.g. Neisseria gonorrhoeae, evade immunity and elicit pathology in the male urethra is poorly understood. Our objective is to determine how STI-infected urethral epithelial cells, as well as the uninfected ‘bystander’ cells with which infected cells communicate, respond to CT and other STIs. METHODS/STUDY POPULATION: We evaluated how immortalized urethral cell lines - including transduced human urethral epithelial cells (THUECs) - respond to increasing doses of CT infectious particles using in vitro one-step progeny assays performed in the presence or absence of cycloheximide, a drug that inhibits eukaryotic protein synthesis. We will perform concurrent single-cell RNA sequencing (scRNA-seq) and multiplex cytokine analyses to determine how different CT doses impact the transcriptomes of infected and bystander urethral epithelial cells and modulate cytokine production of the overall monolayer. Results of these experiments will inform the feasibility of performing similar analyses in situ using urethral swabs from men with clinically diagnosed urethritis. RESULTS/ANTICIPATED RESULTS: Our results demonstrate that immune-competent urethral cell monolayers strongly resist CT infection, unless most of the cells are simultaneously infected. This suggests that uninfected bystander cells sense CT-infected cells and secrete soluble factors that may act to limit CT proliferation in infected cells and to inform remaining uninfected cells that a potential pathogen is present. We anticipate that our scRNA-seq and cytokine analyses will identify both specific effector pathways that protect against CT and intracellular signals that modulate them. We speculate that these pathways and signals may differ during infection with CT and other STIs. Importantly, we anticipate that our in vitro model of CT infection will be highly representative of in situ immune responses observed in urethras of infected men. DISCUSSION/SIGNIFICANCE OF FINDINGS: In men, common STIs including CT are usually managed syndromically due to a lack of POC diagnostics. By determining how STIs elicit urethral inflammation and identifying countermeasures that STIs use to evade urethral immunity, we can identify host responses that serve as biomarkers for urethritis, generally, and for specific urethral pathogens.
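The finding that monolayers resist infection "unless most of the cells are simultaneously infected" is naturally framed in terms of multiplicity of infection (MOI): under the standard Poisson assumption, the expected fraction of cells receiving at least one infectious particle is 1 − exp(−MOI). A quick illustrative calculation (not tied to the study's actual doses):

```python
# Expected fraction of cells infected at a given MOI (Poisson assumption).
import math

for moi in (0.1, 0.5, 1, 2, 5):
    frac_infected = 1 - math.exp(-moi)
    print(f"MOI {moi:>4}: ~{frac_infected:.0%} of cells receive at least one particle")
```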
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r2 = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h2 ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly-emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
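Twin heritability of the kind reported here is estimated with quantitative genetic (e.g. ACE) models; purely as a back-of-the-envelope illustration, Falconer's formula approximates h² as twice the difference between monozygotic and dizygotic twin correlations. The correlations below are placeholders, not the study's estimates.

```python
# Falconer's approximation of heritability from twin correlations
# (illustrative placeholder values, not the reported h2 >= .61).
r_mz = 0.70  # hypothetical MZ twin correlation for an RSB factor
r_dz = 0.38  # hypothetical DZ twin correlation

h2 = 2 * (r_mz - r_dz)   # additive genetic variance (A)
c2 = r_mz - h2           # shared environment (C)
e2 = 1 - r_mz            # nonshared environment + measurement error (E)
print(f"h2 ~ {h2:.2f}, c2 ~ {c2:.2f}, e2 ~ {e2:.2f}")
```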
To understand how exposure to victimization during adolescence and the presence of comorbid psychological conditions influence substance use treatment entry and substance use disorder diagnosis from 14 to 25 years old among serious juvenile offenders, this study included 1,354 serious juvenile offenders who were prospectively followed over 7 years. Growth mixture modeling was used to assess profiles of early victimization during adolescence (14–17 years). Discrete time survival mixture analysis was used to assess time to treatment entry and substance use disorder diagnosis. Posttraumatic stress disorder (PTSD) and major depressive disorder (MDD) were used as predictors of survival time. Mixture models revealed three profiles of victimization: sustained poly-victimization, moderate/decreasing victimization, and low victimization. Youth in the sustained poly-victimization class were more likely to enter treatment earlier and have a substance use diagnosis earlier than other classes. PTSD was a significant predictor of treatment entry for youth in the sustained poly-victimization class, and MDD was a significant predictor of substance use disorder diagnosis for youth in the moderate/decreasing victimization class. Therefore, substance use prevention programming targeted at youth experiencing poly-victimization in early adolescence—especially those who have PTSD or MDD—is needed.
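Discrete-time survival mixture analysis is typically run in specialised software; as a simplified stand-in for its survival component, the sketch below fits a person-period logistic regression of treatment entry on age period, victimization class, and diagnosis indicators. The data file and column names are hypothetical.

```python
# Discrete-time survival sketch: person-period logistic regression
# (simplified stand-in for the survival mixture model; hypothetical columns).
import pandas as pd
import statsmodels.formula.api as smf

pp = pd.read_csv("person_period.csv")  # one row per youth per age (14-25);
                                       # 'event' = 1 in the period treatment entry occurs

model = smf.logit(
    "event ~ C(age) + C(victimization_class) + ptsd + mdd",
    data=pp,
).fit()
print(model.summary())
```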
According to a well-rehearsed media trope, the ‘Alt-Right’ (‘alternative right’) burst into a shocked public consciousness in the run-up to the 2016 US presidential election (Caldwell, 2016; Collins, 2016). Curiously, this phenomenon materialised in media consciousness as a nebulous interconnectivity of white supremacists incubated in the obscure ‘dark web’ before spreading into the minds of poorly educated people via the unfiltered medium of the mainstream Internet (Cook, 2016; Caldwell, 2016). This definition was rapidly deployed by presidential candidate Hillary Clinton to attack Donald Trump's supporters as a ‘paranoid fringe’ (Ohlheiser and Dewey, 2016). Explaining the ability of this racist ‘fringe’ to somehow capture the White House similarly defaulted to the deeply classist, technophobic and socially convenient narrative of uneducated poor people exposed to bad ideas on the Internet (Ember, 2016; Weigel, 2016; Marwick and Lewis, 2017; Bartlett, 2018; Daniels, 2018).
A series of books and articles exploring the Alt-Right have been published since 2016. Perhaps due to the different publication timelines involved in academic output, this material is primarily journalistic. This literature provides detailed empirical data critical to understanding the underpinning social networks of the Alt-Right. However, intensive media focus on young, working-class – usually American – white supremacists (Neiwert, 2017) sharing extremist material over the Internet (Nagel, 2017) masks incidences of closely related racist, conspiracist (ie belief in/promotion of conspiracy theories), misogynist and ‘anti-elitist’ ideology in wider, often middle-class, mainstream media, politics and social policy discourse (Mondon and Winter, 2018). This mainstream extremism – or, as we will call it, ‘mainstremeism’ – is often couched in terms of ‘refreshingly un-politically correct’ hard truths (see Harris, 2015). This article problematises current narratives surrounding the ‘Alt-Right’. We then draw on the work of anthropologist Mary Douglas – who argues that ideologies of purity, impurity and purge recur in numerous cultures, helping to mask social inequalities, shore up group identity and legitimise and rationalise access to group resources (Douglas, 2003) – and Antonio Gramsci's insights about the role of ‘organic intellectuals’ to contribute to ongoing national and international ‘Alt-Right’ debates by presenting an interdisciplinary, political-anthropological understanding of ‘mainstremeist’ belief and action. This approach highlights the links between ‘fringe’ and ‘centre’.
OBJECTIVES/SPECIFIC AIMS: This study (1) investigated the presence and severity of autonomic nervous system (ANS) dysfunction in patients with pre-symptomatic Huntington Disease (HD) and (2) determined if pharmacologic manipulation of the ANS could modify the progression of HD. METHODS/STUDY POPULATION: Using a unique data set of children at risk for HD (the Kids-HD study), markers of autonomic function (resting heart rate [rHR], blood pressure [BP], and core body temperature [CBT]) were compared between pre-symptomatic, gene-expanded children (psGE) and healthy developing children using mixed-model analyses controlling for sex, age, and body mass index. Included participants had to be < 18 years old and at least 10 years from their predicted motor diagnosis of HD. Using the Enroll-HD database, inverse-propensity-score-weighted Cox regression analyses investigated the effects of beta-blockers on the timing of motor diagnosis in presymptomatic adult patients with HD. RESULTS/ANTICIPATED RESULTS: Compared with healthy controls, psGE participants had significantly (p<0.05) higher mean rHR, systolic BP percentile, and CBT (elevated by 4.01 bpm, 5.96 percentile points, and 0.19°C, respectively). Participants from Enroll-HD who were using a beta-blocker prior to motor diagnosis (n=65) demonstrated a significantly lower annualized risk of motor diagnosis [HR=0.56, p=0.03] compared with other participants with HD (n=1972). DISCUSSION/SIGNIFICANCE OF IMPACT: Sympathetic nervous system activity is elevated in patients with HD decades prior to their predicted motor diagnosis. Furthermore, modulation of the sympathetic nervous system with beta-blockers significantly lowers the annualized risk of motor diagnosis of HD.
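The Enroll-HD analysis combines inverse-propensity weighting with Cox regression. A minimal sketch of that general approach using scikit-learn and lifelines is below; the column names are hypothetical and the propensity model is reduced to a few numerically coded covariates.

```python
# Inverse-propensity-weighted Cox regression sketch (hypothetical columns).
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("enroll_hd.csv")  # assumed: one row per presymptomatic participant,
                                   # covariates already numerically encoded

# Propensity of beta-blocker use from baseline covariates.
covs = ["age", "sex", "cag_repeats"]
ps = LogisticRegression(max_iter=1000).fit(df[covs], df["beta_blocker"]).predict_proba(df[covs])[:, 1]
df["ipw"] = df["beta_blocker"] / ps + (1 - df["beta_blocker"]) / (1 - ps)

cph = CoxPHFitter()
cph.fit(
    df[["years_to_dx_or_censor", "motor_dx", "beta_blocker", "ipw"]],
    duration_col="years_to_dx_or_censor",
    event_col="motor_dx",
    weights_col="ipw",
    robust=True,
)
cph.print_summary()  # hazard ratio for beta_blocker use
```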
Based on data from the Next Generation Virgo cluster Survey (NGVS), we statistically study the photometric properties of globular clusters (GCs), ultra-compact dwarfs (UCDs) and dwarf nuclei in the Virgo core (M87) region. We found a clear negative color (g − z) gradient in the GC system associated with M87, i.e. GCs in the outer regions are bluer. However, no such color gradient exists in the UCD system, nor in the dwarf nuclei system around M87. In addition, we found that many UCDs are surrounded by extended, low-surface-brightness envelopes. The dwarf nuclei and UCDs show spatial distributions different from that of the GCs, with dwarf nuclei and UCDs (especially the UCDs with visible envelopes) lying at larger distances from the Virgo center. These results support the view that UCDs (at least a fraction of them) are more closely tied to dwarf nuclei than to GCs.
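A radial colour gradient of this kind is often quantified as the slope of a linear fit of g − z colour against the logarithm of projected distance from M87. A minimal sketch with synthetic placeholder data is below; a negative slope corresponds to bluer clusters at larger radii.

```python
# Quantifying a radial colour gradient: slope of (g - z) vs. log10(R)
# (synthetic placeholder catalogue, not the NGVS data).
import numpy as np

rng = np.random.default_rng(1)
r_kpc = rng.uniform(5, 200, size=500)                                # projected distance from M87
g_minus_z = 1.3 - 0.15 * np.log10(r_kpc) + rng.normal(0, 0.1, 500)   # placeholder colours

slope, intercept = np.polyfit(np.log10(r_kpc), g_minus_z, 1)
print(f"colour gradient: d(g-z)/dlog10(R) = {slope:.3f} mag per dex")
```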
Let $E$ be an elliptic curve over a field $k$. Let $R:=\operatorname{End}E$. There is a functor $\mathscr{H}\!\mathit{om}_{R}(-,E)$ from the category of finitely presented torsion-free left $R$-modules to the category of abelian varieties isogenous to a power of $E$, and a functor $\operatorname{Hom}(-,E)$ in the opposite direction. We prove necessary and sufficient conditions on $E$ for these functors to be equivalences of categories. We also prove a partial generalization in which $E$ is replaced by a suitable higher-dimensional abelian variety over $\mathbb{F}_{p}$.
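For orientation, a simple instance not stated above: the free module $R^{n}$ and the abelian variety $E^{n}$ correspond under these functors, since $\mathscr{H}\!\mathit{om}_{R}(R^{n},E)\cong E^{n}$ and $\operatorname{Hom}(E^{n},E)\cong R^{n}$; the substance of the result lies in determining exactly when this correspondence extends to all finitely presented torsion-free $R$-modules and all abelian varieties isogenous to a power of $E$.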
Workfare increases requirements on welfare claimants: a major shift in UK social welfare policy post-1980s. Political, academic and cultural debates surround the ethical basis, and practical operations, of workfare schemes. Moreover, the UK government has claimed that workfare provides value for money in an age of austerity, ‘help and support’ for the long-term unemployed, and ‘incentives’ for increased claimant job-seeking. This article presents results gathered from sociological research into the UK's ‘Work Programme’ workfare scheme in order to contextualise these debates and contribute to wider academic and social policy workfare analyses. It finds a complex picture: a largely pointless scheme, resented by many participants, but providing a basic social service for some others.
Obesity is a major risk factor for osteoarthritis (OA), whilst there is some evidence that diabetes also increases risk. Metformin is a common oral treatment for those with diabetes.
Objective
The aim is to investigate whether metformin reduces the risk of OA.
Methods
This was a cohort study set within the Consultations in Primary Care Archive, including 3217 patients with type 2 diabetes. Patients at 13 general practices with recorded type 2 diabetes in the baseline period (2002–2003) and no prior record of OA were identified. Exposure was a prescription for metformin. The outcome was a record of OA during follow-up. Cox proportional hazards models with a gamma frailty term were fitted, adjusted for age, gender, deprivation, and comorbidity.
Results
There was no association between prescribed metformin treatment at baseline and OA (adjusted HR: 1.02; 95% CI: 0.91, 1.15). A similar non-significant association was found when allowing the exposure status of metformin prescription to vary over time.
Driving in persons with dementia poses risks that must be balanced against the importance of autonomy and mobility. Physicians often face substantial challenges in assessing and reporting driving safety for persons with dementia. This paper describes a driving in dementia decision tool (DD-DT) developed to aid physicians in deciding when to report older drivers with either mild dementia or mild cognitive impairment to local transportation administrators.
Methods:
A multi-faceted, computerized decision support tool was developed, drawing on a systematic literature and guideline review, expert opinion from an earlier Delphi study, and qualitative interviews and focus groups with physicians, caregivers of former drivers with dementia, and transportation administrators. The tool integrates inputs from the physician-user about the patient's clinical and driving history as well as cognitive findings, and it produces a recommendation for reporting to transportation administrators. This recommendation is translated into a customized reporting form for the transportation authority, if applicable, and additional resources are provided for the patient and caregiver.
Conclusions:
An innovative approach was needed to develop the DD-DT. The literature and guideline review confirmed the algorithm derived from the earlier Delphi study, and barriers identified in the qualitative research were incorporated into the design of the tool.
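At its core, a tool of this kind maps structured inputs (diagnosis, driving history, cognitive findings) to a reporting recommendation. The skeleton below illustrates that architecture only; the inputs, thresholds and rules are entirely placeholder values and do not reproduce the DD-DT algorithm or constitute clinical guidance.

```python
# Structural skeleton of a reporting decision aid (placeholder logic only;
# this is NOT the published DD-DT algorithm or clinical guidance).
from dataclasses import dataclass

@dataclass
class PatientInput:
    diagnosis: str            # e.g. "mild dementia" or "MCI" (hypothetical coding)
    crash_or_violation: bool  # recent collision or traffic violation
    caregiver_concern: bool   # caregiver reports unsafe driving
    cognitive_flags: int      # count of flagged cognitive findings

def recommendation(p: PatientInput) -> str:
    """Return a hypothetical recommendation string for illustration."""
    if p.crash_or_violation or (p.caregiver_concern and p.cognitive_flags >= 2):
        return "recommend reporting to the transportation administrator"
    if p.cognitive_flags >= 1:
        return "recommend on-road assessment / specialist referral"
    return "no report now; re-assess at next visit"

print(recommendation(PatientInput("MCI", False, True, 2)))
```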