New machine-vision technologies like the John Deere See & Spray™ could provide the opportunity to reduce herbicide use by detecting weeds and target-spraying herbicides simultaneously. Experiments were conducted for 2 yr in Keiser, AR, and Greenville, MS, to compare residual herbicide timings and targeted spray applications versus traditional broadcast herbicide programs in glyphosate/glufosinate/dicamba-resistant soybean. Treatments utilized consistent herbicides and rates, with a preemergence (PRE) application followed by an early postemergence (EPOST) dicamba application followed by a mid-postemergence (MPOST) glufosinate application. All treatments included a residual herbicide at PRE and either included or excluded a residual at EPOST and MPOST. Additionally, the herbicide application method was varied: traditional broadcast applications, broadcast residual herbicides plus targeted applications of postemergence herbicides (dual tank), or targeted applications of all herbicides (single tank). Targeted applications provided control comparable to broadcast applications, with a ≤1% decrease in efficacy and overall control ≥93% for Palmer amaranth, broadleaf signalgrass, morningglory species, and purslane species. Additionally, targeted sprays slightly reduced soybean injury, by at most 5 percentage points across all evaluations, although these effects did not translate to a yield increase at harvest. The relationship between weed area and targeted sprayed area also indicates that nozzle angle can influence potential herbicide savings, with narrower nozzle angles spraying less area. On average, targeted sprays reduced postemergence herbicide use by 28.4% to 62.4%. On the basis of these results, with specific machine settings, targeted application programs could reduce the amount of herbicide applied while providing weed control comparable to that of traditional broadcast applications.
The Reading the Mind in the Eyes Test (RMET) – which assesses the theory of mind component of social cognition – is often used to compare social cognition between patients with schizophrenia and healthy controls. There is, however, no systematic review integrating the results of these studies. We identified 198 studies published before July 2020 that administered RMET to patients with schizophrenia or healthy controls from three English-language and two Chinese-language databases. These studies included 41 separate samples of patients with schizophrenia (total n = 1836) and 197 separate samples of healthy controls (total n = 23 675). The pooled RMET score was 19.76 (95% CI 18.91–20.60) in patients and 25.53 (95% CI 25.19–25.87) in controls (z = 12.41, p < 0.001). After excluding small-sample outlier studies, this difference in RMET performance was greater in studies using non-English v. English versions of RMET (χ²[Q] = 8.54, p < 0.001). Meta-regression analyses found a negative association of age with RMET score and a positive association of years of schooling with RMET score in both patients and controls. A secondary meta-analysis using a spline construction of 180 healthy control samples identified a non-monotonic relationship between age and RMET score – RMET scores increased with age before age 31 and decreased with age after age 31. These results indicate that patients with schizophrenia have substantial deficits in theory of mind compared with healthy controls, supporting the construct validity of RMET as a measure of social cognition. The different results for English v. non-English versions of RMET and the non-monotonic relationship between age and RMET score highlight the importance of the language of administration of RMET and the possibility that the relationship of aging with theory of mind differs from the relationship of aging with other types of cognitive functioning.
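As background for how a pooled score such as the values reported above is obtained, the following is a minimal, self-contained sketch of inverse-variance random-effects pooling (DerSimonian–Laird) in Python. The per-study means and standard errors are invented for illustration only; this is not the review's data or analysis code, and the review may have used a different pooling method.

```python
import numpy as np

# Hypothetical per-study summary statistics (NOT the review's data):
# mean RMET score and its standard error for each study sample.
means = np.array([19.2, 20.5, 18.8, 21.0])
ses = np.array([0.9, 0.7, 1.1, 0.8])

v = ses ** 2                      # within-study variances
w = 1.0 / v                       # fixed-effect (inverse-variance) weights
theta_fixed = np.sum(w * means) / np.sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2
Q = np.sum(w * (means - theta_fixed) ** 2)
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(means) - 1)) / C)

# Random-effects pooled mean and 95% confidence interval
w_re = 1.0 / (v + tau2)
theta_re = np.sum(w_re * means) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled = {theta_re:.2f}, "
      f"95% CI {theta_re - 1.96 * se_re:.2f} to {theta_re + 1.96 * se_re:.2f}")
```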
Traumatic brain injury is one of several recognized risk factors for cognitive decline and neurodegenerative disease. Currently, risk scores involving modifiable risk/protective factors for dementia have not incorporated head injury history as part of their overall weighted risk calculation. We investigated the association of the LIfestyle for BRAin Health (LIBRA) risk score with the odds of mild cognitive impairment (MCI) diagnosis and with cognitive function in older former National Football League (NFL) players, both with and without the influence of concussion history.
Participants and Methods:
Former NFL players, ages ≥ 50 years (N=1050; mean age=61.1±5.4 years), completed a general health survey including self-reported medical history and ratings of function across several domains. LIBRA factors (weighted value) included cardiovascular disease (+1.0), hypertension (+1.6), hyperlipidemia (+1.4), diabetes (+1.3), kidney disease (+1.1), cigarette use history (+1.5), obesity (+1.6), depression (+2.1), social/cognitive activity (-3.2), physical inactivity (+1.1), low/moderate alcohol use (-1.0), and healthy diet (-1.7). Within Group 1 (n=761), logistic regression models assessed the association of LIBRA scores and the independent contribution of concussion history with the odds of MCI diagnosis. A modified-LIBRA score incorporated concussion history at the level at which planned contrasts showed significant associations across concussion history groups (0, 1-2, 3-5, 6-9, 10+). The weighted value for concussion history (+1.9) within the modified-LIBRA score was based on its proportional contribution to dementia relative to other LIBRA risk factors, as proposed by the 2020 Lancet Commission Report on Dementia Prevention. Associations of the modified-LIBRA score with odds of MCI and cognitive function were assessed via logistic and linear regression, respectively, in a subset of the sample (Group 2; n=289) who also completed the Brief Test of Adult Cognition by Telephone (BTACT). Race was included as a covariate in all models. (A schematic of the weighted-sum scoring is sketched after this abstract.)
Results:
The median LIBRA score in Group 1 was 1.6 (IQR = -1, 3.6). Standard and modified-LIBRA median scores were 1.1 (IQR = -1.3, 3.3) and 2.0 (IQR = -0.4, 4.6), respectively, within Group 2. In Group 1, LIBRA score was significantly associated with odds of MCI diagnosis (odds ratio [95% confidence interval] = 1.27 [1.19, 1.28], p < .001). Concussion history provided additional information beyond LIBRA scores and was independently associated with odds of MCI; specifically, odds of MCI were higher among those with 6-9 (OR = 2.54 [1.21, 5.32], p < .001) and 10+ (OR = 4.55 [2.21, 9.36], p < .001) concussions, compared with those with no prior concussions. Within Group 2, the modified-LIBRA score was associated with higher odds of MCI (OR = 1.61 [1.15, 2.25]) and incrementally improved model information (0.04 increase in Nagelkerke R²) above standard LIBRA scores in the same model. Modified-LIBRA scores were inversely associated with BTACT Executive Function (B = -0.53 [0.08], p = .002) and Episodic Memory scores (B = -0.53 [0.08], p = .002).
Conclusions:
Numerous modifiable dementia risk/protective factors are reported by former professional football players, and incorporating concussion history may aid the multifactorial appraisal of cognitive decline risk and the identification of areas for prevention and intervention. Integration of multi-modal biomarkers will advance this person-centered, holistic approach toward dementia reduction, detection, and intervention.
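The weighted-sum construction of the LIBRA and modified-LIBRA scores used in the abstract above can be outlined as follows. This is an illustrative sketch based only on the weights listed in the Methods; the function and factor names are hypothetical, and it is not the authors' scoring code.

```python
# LIBRA factor weights as listed in the abstract; negative weights are
# protective factors, positive weights are risk factors.
LIBRA_WEIGHTS = {
    "cardiovascular_disease": 1.0,
    "hypertension": 1.6,
    "hyperlipidemia": 1.4,
    "diabetes": 1.3,
    "kidney_disease": 1.1,
    "cigarette_use_history": 1.5,
    "obesity": 1.6,
    "depression": 2.1,
    "social_cognitive_activity": -3.2,
    "physical_inactivity": 1.1,
    "low_moderate_alcohol_use": -1.0,
    "healthy_diet": -1.7,
}

# Weight given to concussion history in the modified-LIBRA score.
CONCUSSION_WEIGHT = 1.9


def libra_score(present_factors, concussion_history=False):
    """Sum the weights of the factors endorsed by one participant."""
    score = sum(LIBRA_WEIGHTS[factor] for factor in present_factors)
    if concussion_history:
        score += CONCUSSION_WEIGHT
    return score


# Example: hypertension + depression, protective social/cognitive activity,
# and a qualifying concussion history: 1.6 + 2.1 - 3.2 + 1.9 = 2.4
print(libra_score({"hypertension", "depression", "social_cognitive_activity"},
                  concussion_history=True))
```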
It has been posited that alcohol use may confound the association between greater concussion history and poorer neurobehavioral functioning. However, while greater alcohol use is positively correlated with neurobehavioral difficulties, the association between alcohol use and concussion history is not well understood. Therefore, this study investigated the cross-sectional and longitudinal associations between cumulative concussion history, years of contact sport participation, and health-related/psychological factors with alcohol use in former professional football players across multiple decades.
Participants and Methods:
Former professional American football players completed general health questionnaires in 2001 and 2019, including demographic information, football history, concussion/medical history, and health-related/psychological functioning. Alcohol use frequency and amount were reported for three timepoints: during the professional career (collected retrospectively in 2001), in 2001, and in 2019. For the professional-career and 2001 timepoints, frequency categories were none, 1-2, 3-4, or 5-7 days/week, and amount categories were none, 1-2, 3-5, 6-7, or 8+ drinks/occasion. For 2019, frequency categories were never, monthly or less, 2-4 times/month, 2-3 times/week, or >4 times/week, and amount categories were none, 1-2, 3-4, 5-6, 7-9, or 10+ drinks/occasion. Scores on a screening measure for Alcohol Use Disorder (CAGE) were also available for the professional-career and 2001 timepoints. Concussion history was recorded in 2001 and binned into five groups: 0, 1-2, 3-5, 6-9, 10+. Depression and pain interference were assessed via PROMIS measures at all timepoints. Sleep disturbance was assessed via a separate instrument in 2001 and with PROMIS Sleep Disturbance in 2019. Spearman’s rho correlations tested associations of concussion history and years of sport participation with alcohol use across timepoints, and whether poorer health functioning (depression, pain interference, sleep disturbance) in 2001 and 2019 was associated with alcohol use both within and between timepoints.
Results:
Among the 351 participants (Mage = 47.86 [SD = 10.18] in 2001), there were no significant associations between concussion history or years of contact sport participation and CAGE scores or alcohol use frequency/amount during the professional career, in 2001, or in 2019 (rhos = -.072 to .067, ps > .05). In 2001, greater depressive symptomology and sleep disturbance were related to higher CAGE scores (rho = .209, p < .001; rho = .176, p < .001, respectively), while greater depressive symptomology, pain interference, and sleep disturbance were related to higher alcohol use frequency (rho = .176, p = .002; rho = .109, p = .045; rho = .132, p = .013, respectively) and amount/occasion (rho = .215, p < .001; rho = .127, p = .020; rho = .153, p = .004, respectively). In 2019, depressive symptomology, pain interference, and sleep disturbance were not related to alcohol use (rhos = -.047 to .087, ps > .05). Between timepoints, more sleep disturbance in 2001 was associated with higher alcohol amount/occasion in 2019 (rho = .115, p = .036).
Conclusions:
Increased alcohol intake has been theorized to be a consequence of greater concussion history, and as such, thought to confound associations between concussion history and neurobehavioral function later in life. Our findings indicate concussion history and years of contact sport participation were not significantly associated with alcohol use cross-sectionally or longitudinally, regardless of alcohol use characterization. While higher levels of depression, pain interference, and sleep disturbance in 2001 were related to greater alcohol use in 2001, they were not associated cross-sectionally in 2019. Results support the need to concurrently address health-related and psychological factors in the implementation of alcohol use interventions for former NFL players, particularly earlier in the sport discontinuation timeline.
Traumatic brain injury and cardiovascular disease (CVD) are modifiable risk factors for cognitive decline and dementia. Greater concussion history can potentially increase risk for cerebrovascular changes associated with cognitive decline and may compound effects of CVD. We investigated the independent and dynamic effects of CVD/risk factor burden and concussion history on cognitive function and odds of mild cognitive impairment (MCI) diagnoses in older former National Football League (NFL) players.
Participants and Methods:
Former NFL players, ages 50-70 (N=289; mean age=61.02±5.33 years), reported medical history and completed the Brief Test of Adult Cognition by Telephone (BTACT). CVD/risk factor burden was characterized as an ordinal variable (0-3+) based on the sum of the following conditions: coronary artery disease/myocardial infarction, chronic obstructive pulmonary disease, hypertension, hyperlipidemia, sleep apnea, and type 1 and type 2 diabetes. Cognitive outcomes included BTACT Executive Function and Episodic Memory Composite Z-scores (standardized on age- and education-based normative data) and the presence of physician-diagnosed (self-reported) MCI. Concussion history was discretized into five groups: 0, 1-2, 3-5, 6-9, 10+. Linear and logistic regression models were fit to test independent and joint effects of concussion history and CVD burden on cognitive outcomes and odds of MCI. Race (dichotomized as White and Non-white due to the sample distribution) was included in models as a covariate.
Results:
Greater CVD burden (unstandardized beta [standard error]; B = -0.10 [0.42], p = .013) and race (B = 0.622 [0.09], p < .001) were associated with lower executive functioning. Compared with those with 0 prior concussions, no significant differences in executive functioning were observed for those with 1-2, 3-5, 6-9, or 10+ prior concussions (ps > .05). Race (B = 0.61 [.13], p < .001), but not concussion history or CVD burden, was associated with episodic memory. There was a trend for lower episodic memory scores among those with 10+ prior concussions compared with those with no prior concussions (B = -0.49 [.25], p = .052). There were no significant differences in episodic memory among those with 1-2, 3-5, or 6-9 prior concussions compared with those with 0 prior concussions (ps > .05). CVD burden (B = 0.35 [.13], p = .008), race (greater odds in the Non-white group; B = 0.82 [.29], p = .005), and greater concussion history (higher odds of diagnosis in the 10+ group compared with those with 0 prior concussions; B = 2.19 [0.78], p < .005) were associated with higher odds of MCI diagnosis. Significant interaction effects between concussion history and CVD burden were not observed for any outcome (ps > .05).
Conclusions:
Lower executive functioning and higher odds of MCI diagnosis were associated with higher CVD burden and race. Very high concussion history (10+) was selectively associated with higher odds of MCI diagnosis. Reduction of these modifiable factors may mitigate adverse outcomes in older contact sport athletes. In former athletes, consideration of CVD burden is particularly pertinent when assessing executive dysfunction, considered to be a common cognitive feature of traumatic encephalopathy syndrome, as designated by the recent diagnostic criteria. Further research should investigate the social and structural determinants contributing to racial disparities in long-term health outcomes within former NFL players.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are approximately −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
The IntCal family of radiocarbon (14C) calibration curves is based on research spanning more than three decades. The IntCal group have collated the 14C and calendar age data (mostly derived from primary publications with other types of data and meta-data) and, since 2010, made them available for other sorts of analysis through an open-access database. This has ensured transparency in terms of the data used in the construction of the ratified calibration curves. As the IntCal database expands, work is underway to facilitate best practice for new data submissions, make more of the associated metadata available in a structured form, and help those wishing to process the data with programming languages such as R, Python, and MATLAB. The data and metadata are complex because of the range of different types of archives. A restructured interface, based on the “IntChron” open-access data model, includes tools which allow the data to be plotted and compared without the need for export. The intention is to include complementary information which can be used alongside the main 14C series to provide new insights into the global carbon cycle, as well as facilitating access to the data for other research applications. Overall, this work aims to streamline the generation of new calibration curves.
Prior trials suggest that intravenous racemic ketamine is highly effective for treatment-resistant depression (TRD), but phase 3 trials of racemic ketamine are needed.
Aims
To assess the acute efficacy and safety of a 4-week course of subcutaneous racemic ketamine in participants with TRD. Trial registration: ACTRN12616001096448 at www.anzctr.org.au.
Method
This phase 3, double-blind, randomised, active-controlled multicentre trial was conducted at seven mood disorders centres in Australia and New Zealand. Participants received twice-weekly subcutaneous racemic ketamine or midazolam for 4 weeks. Initially, the trial tested fixed-dose ketamine 0.5 mg/kg versus midazolam 0.025 mg/kg (cohort 1). Dosing was revised, after a Data Safety Monitoring Board recommendation, to flexible-dose ketamine 0.5–0.9 mg/kg or midazolam 0.025–0.045 mg/kg, with response-guided dosing increments (cohort 2). The primary outcome was remission (Montgomery–Åsberg Depression Rating Scale score ≤10) at the end of week 4.
Results
The final analysis set (participants who received at least one treatment) comprised 68 participants in cohort 1 (fixed dose) and 106 in cohort 2 (flexible dose). Ketamine was more efficacious than midazolam in cohort 2 (remission rate 19.6% v. 2.0%; OR = 12.1, 95% CI 2.1–69.2, P = 0.005), but not in cohort 1 (remission rate 6.3% v. 8.8%; OR = 1.3, 95% CI 0.2–8.2, P = 0.76). Ketamine was well tolerated. Acute adverse effects (psychotomimetic effects, blood pressure increases) resolved within 2 h.
Conclusions
Adequately dosed subcutaneous racemic ketamine was efficacious and safe in treating TRD over a 4-week treatment period. The subcutaneous route is practical and feasible.
Children with congenital heart disease (CHD) can face neurodevelopmental, psychological, and behavioural difficulties beginning in infancy and continuing through adulthood. Despite overall improvements in medical care and a growing focus on neurodevelopmental screening and evaluation in recent years, neurodevelopmental disabilities, delays, and deficits remain a concern. The Cardiac Neurodevelopmental Outcome Collaborative was founded in 2016 with the goal of improving neurodevelopmental outcomes for individuals with CHD and paediatric heart disease. This paper describes the establishment of a centralised clinical data registry to standardise data collection across member institutions of the Cardiac Neurodevelopmental Outcome Collaborative. The goal of this registry is to foster collaboration for large, multi-centre research and quality improvement initiatives that will benefit individuals and families with CHD and improve their quality of life. We describe the components of the registry, initial research projects proposed using data from the registry, and lessons learned in the development of the registry.
Precision Medicine is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Autoimmune diseases are those in which the body’s natural defense system loses discriminating power between its own cells and foreign cells, causing the body to mistakenly attack healthy tissues. These conditions are very heterogeneous in their presentation and therefore difficult to diagnose and treat. Achieving precision medicine in autoimmune diseases has been challenging due to the complex etiologies of these conditions, involving an interplay between genetic, epigenetic, and environmental factors. However, recent technological and computational advances in molecular profiling have helped identify patient subtypes and molecular pathways which can be used to improve diagnostics and therapeutics. This review discusses the current understanding of the disease mechanisms, heterogeneity, and pathogenic autoantigens in autoimmune diseases gained from genomic and transcriptomic studies and highlights how these findings can be applied to better understand disease heterogeneity in the context of disease diagnostics and therapeutics.
Illicit substance use is dangerous in both acute and chronic forms, frequently resulting in lethal poisoning, addiction, and other negative consequences. Similar to research in other psychiatric conditions, whose ultimate goal is to enable effective prevention and treatment, studies in substance use are focused on factors elevating the risk for the disorder. The rapid growth of the substance use problem despite the effort invested in fighting it, however, suggests the need to change the research approach. Instead of attempting to identify risk factors, whose neutralization is often infeasible if not impossible, it may be more promising to systematically reverse the perspective to the factors enhancing the aspect of liability to disorder that shares the same dimension as risk but is opposite to it, that is, resistance to substance use. Resistance factors, which enable the majority of the population to remain unaffected despite the ubiquity of psychoactive substances, may be more amenable to translation. While the resistance aspect of liability is symmetric to risk, the resistance approach requires substantial changes in sampling (high-resistance rather than high-risk samples) and the use of quantitative indices of liability. This article provides an overview of and a practical approach to research in resistance to substance use/addiction, currently implemented in an NIH-funded project. The project benefits from unique opportunities afforded by data originating from two longitudinal twin studies, the Virginia Twin Study of Adolescent Behavioral Development and the Minnesota Twin Family Study. The methodology described is also applicable to other psychiatric disorders.
Consumption of unpasteurised milk in the United States has presented a public health challenge for decades because of the increased risk of pathogen transmission causing illness outbreaks. We analysed Foodborne Disease Outbreak Surveillance System data to characterise unpasteurised milk outbreaks. Using Poisson and negative binomial regression, we compared the number of outbreaks and outbreak-associated illnesses between jurisdictions grouped by the legal status of unpasteurised milk sale, based on a May 2019 survey of state laws. During 2013–2018, 75 outbreaks linked to unpasteurised milk occurred, resulting in 675 illnesses; 325 (48%) of these illnesses were among people aged 0–19 years. Of 74 single-state outbreaks, 58 (78%) occurred in states where the sale of unpasteurised milk was expressly allowed. Compared with jurisdictions where retail sales were prohibited (n = 24), those where sales were expressly allowed (n = 27) were estimated to have 3.2 (95% CI 1.4–7.6) times the number of outbreaks; of these, jurisdictions where sale was allowed in retail stores (n = 14) had 3.6 (95% CI 1.3–9.6) times the number of outbreaks compared with those where sale was allowed on-farm only (n = 13). This study supports findings of previously published reports indicating that state laws resulting in increased availability of unpasteurised milk are associated with more outbreak-associated illnesses and outbreaks.
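As an illustration of the comparison described above (outbreak counts modelled with Poisson and negative binomial regression across groups defined by the legal status of unpasteurised milk sales), a minimal sketch in Python with statsmodels follows. The data frame, column names, and counts are hypothetical; this is not the authors' analysis code, and the negative binomial dispersion parameter is fixed rather than estimated.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per jurisdiction: outbreak count and legal status of unpasteurised
# milk sales (values invented for illustration).
df = pd.DataFrame({
    "outbreaks": [0, 1, 0, 2, 1, 3, 2, 4, 5, 6, 2, 1],
    "legal_status": ["prohibited", "prohibited", "prohibited", "prohibited",
                     "on_farm_only", "on_farm_only", "on_farm_only", "on_farm_only",
                     "retail_allowed", "retail_allowed", "retail_allowed", "retail_allowed"],
})

formula = "outbreaks ~ C(legal_status, Treatment(reference='prohibited'))"

# Poisson regression: exponentiated coefficients are rate ratios relative
# to the 'prohibited' reference group.
poisson_fit = smf.glm(formula, data=df, family=sm.families.Poisson()).fit()
print(np.exp(poisson_fit.params))

# Negative binomial regression allows for overdispersion (alpha=1.0 fixed
# here for simplicity; in practice the dispersion would be estimated).
negbin_fit = smf.glm(formula, data=df,
                     family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(np.exp(negbin_fit.params))
```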
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ( ( 1154 \pm 121) - ( 0.81 \pm 0.14) \, ( \nu /{\rm MHz}) ) \,{\rm m}$ for frequencies $\nu \in [145,\ 350]$ MHz.
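For orientation, the quoted linear fit can be evaluated directly at a few frequencies. The sketch below uses only the central values of the fit (ignoring the quoted ±121 m and ±0.14 uncertainties) and is purely illustrative.

```python
def attenuation_length_m(freq_mhz):
    """Central value of the fitted average field attenuation length (metres)
    for the upper ~1500 m of ice: <L_alpha> = 1154 - 0.81 * (nu / MHz)."""
    if not 145 <= freq_mhz <= 350:
        raise ValueError("fit quoted only for 145-350 MHz")
    return 1154.0 - 0.81 * freq_mhz

for nu in (145, 250, 350):
    print(f"{nu} MHz -> {attenuation_length_m(nu):.1f} m")
# Central values run from roughly 1040 m at 145 MHz down to roughly 870 m
# at 350 MHz, consistent with attenuation lengths approaching 1 km in band.
```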
Approximately one-third of individuals in a major depressive episode will not achieve sustained remission despite multiple, well-delivered treatments. These patients experience prolonged suffering and disproportionately utilize mental and general health care resources. The recently proposed clinical heuristic of ‘difficult-to-treat depression’ (DTD) aims to broaden our understanding and focus attention on the identification, clinical management, treatment selection, and outcomes of such individuals. Clinical trial methodologies developed to detect short-term therapeutic effects in treatment-responsive populations may not be appropriate in DTD. This report reviews three essential challenges for clinical intervention research in DTD: (1) how to define and subtype this heterogeneous group of patients; (2) how, when, and by what methods to select, acquire, compile, and interpret clinically meaningful outcome metrics; and (3) how to choose among alternative clinical trial design options to promote causal inference and generalizability. The boundaries of DTD are uncertain, and an evidence-based taxonomy and reliable assessment tools are preconditions for clinical research and subtyping. Traditional outcome metrics in treatment-responsive depression may not apply to DTD, as they largely reflect only short-term symptomatic change and do not incorporate durability of benefit, side effect burden, or sustained impact on quality of life or daily function. Trial methodology will also require modification, as trials will likely be of longer duration to examine sustained impact, raising complex issues regarding control group selection, blinding and its integrity, and concomitant treatments.
Academic discovery in biomedicine is a growing enterprise with tens of billions of dollars in research funding available to universities and hospitals. Protecting and optimizing the resultant intellectual property is required in order for the discoveries to have an impact on society. To achieve that, institutions must create a multidisciplinary, collaborative system of review and support, and utilize connections to industry partners. In this study, we outline the efforts of Case Western Reserve University, coordinated through its Clinical and Translational Science Collaborative (CTSC), to promote entrepreneurial culture, and achieve goals of product development and startup formation for biomedical and population health discoveries arising from the academic ecosystem in Cleveland. The CTSC Office of Translation and Innovation, with the university’s Technology Transfer Office (TTO), helps identify and derisk promising IP while building interdisciplinary project teams to optimize the assets through key preclinical derisking steps. The benefits of coordinating funding across multiple programs, assuring dedicated project management to oversee optimizing the IP, and ensuring training to help improve proposals and encourage an entrepreneurial culture, are discussed in the context of a case study of therapeutic assets, the Council to Advance Human Health. This case study highlights best practices in academic innovation.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.