Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables might help clinicians better predict which patients will respond to lithium treatment.
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
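The leave-site-out scheme described above can be sketched in a few lines. This is a minimal illustration of the splitting logic only, not the ConLi+Gen pipeline; the function name and toy site labels are hypothetical:

```python
def leave_site_out_splits(sites):
    """Yield one (held_out_site, train_idx, test_idx) fold per study site.

    Each fold trains on patients from all sites except one and validates
    on the held-out site, so performance estimates reflect generalisation
    to an unseen site rather than to unseen patients from a known site.
    """
    for held_out in sorted(set(sites)):
        test_idx = [i for i, s in enumerate(sites) if s == held_out]
        train_idx = [i for i, s in enumerate(sites) if s != held_out]
        yield held_out, train_idx, test_idx


# Toy example: six patients from three sites.
site_labels = ["A", "A", "B", "B", "B", "C"]
folds = list(leave_site_out_splits(site_labels))
```

With the ten study sites above this yields ten folds; within each training fold, ridge, elastic net, or random forest regressors would then be fit on the clinical and PRS features.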
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
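The a priori stratification step can be sketched as follows. The equal-weight combination rule and median cut-point here are illustrative assumptions; the study's actual meta-polygenic loading is not reproduced:

```python
def stratify_by_prs(prs_mdd, prs_scz):
    """Assign each patient to a genetic stratum before model training.

    Patients are split on a combined (meta-) polygenic loading for major
    depressive disorder and schizophrenia; a separate clinical model would
    then be trained within each stratum.  The equal-weight sum and median
    cut-point below are illustrative assumptions.
    """
    combined = [m + s for m, s in zip(prs_mdd, prs_scz)]
    cut = sorted(combined)[len(combined) // 2]  # median as cut-point
    return ["high" if c >= cut else "low" for c in combined]


# Toy example: three patients with standardised PRS values.
example = stratify_by_prs([0.1, 0.9, 0.4], [0.2, 0.8, 0.1])
```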
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future, this approach may help inform which patients are most likely to respond to lithium treatment.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Emergency department of a university teaching hospital.
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Among 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
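The reported negative predictive value and its interval can be reproduced from the counts given (661 of 690). The Wilson score method used below is an assumption, since the abstract does not name the interval method:

```python
import math


def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return p, (centre - margin) / denom, (centre + margin) / denom


# NPV of FebriDx for COVID-19: 661 true negatives out of 690 test-negatives.
npv, lo, hi = wilson_ci(661, 690)
```

This gives roughly 95.8% (94.0%–97.1%), consistent with the reported 96% (95% CI, 94%–97%).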
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
Monoclonal antibody therapeutics to treat coronavirus disease 2019 (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist to deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess what is needed to incorporate monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staff, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
The present study aimed to clarify the neuropsychological profile of the emergent diagnostic category of Mild Cognitive Impairment with Lewy bodies (MCI-LB) and determine whether domain-specific impairments such as in memory were related to deficits in domain-general cognitive processes (executive function or processing speed).
Patients (n = 83) and healthy age- and sex-matched controls (n = 34) underwent clinical and imaging assessments. Probable MCI-LB (n = 44) and MCI-Alzheimer’s disease (AD) (n = 39) were diagnosed following National Institute on Aging-Alzheimer’s Association (NIA-AA) and dementia with Lewy bodies (DLB) consortium criteria. Neuropsychological measures included cognitive and psychomotor speed, executive function, working memory, and verbal and visuospatial recall.
MCI-LB scored significantly lower than MCI-AD on processing speed [Trail Making Test B: p = .03, g = .45; Digit Symbol Substitution Test (DSST): p = .04, g = .47; DSST Error Check: p < .001, g = .68] and executive function [Trail Making Test Ratio (A/B): p = .04, g = .52] tasks. MCI-AD performed worse than MCI-LB on memory tasks, specifically visuospatial (Modified Taylor Complex Figure: p = .01, g = .46) and verbal (Rey Auditory Verbal Learning Test: p = .04, g = .42) delayed recall measures. Stepwise discriminant analysis correctly classified the subtype in 65.1% of MCI patients (72.7% specificity, 56.4% sensitivity). Processing speed accounted for more group-associated variance in visuospatial and verbal memory in both MCI subtypes than executive function, while no significant relationships between measures were observed in controls (all ps > .05).
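The discriminant-analysis figures are internally consistent, which a quick arithmetic check confirms; the assumption below is that specificity refers to the MCI-LB group (n = 44) and sensitivity to the MCI-AD group (n = 39):

```python
# Correctly classified patients per MCI subtype (group sizes from the study).
n_lb, n_ad = 44, 39
correct_lb = round(0.727 * n_lb)   # specificity: 72.7% of MCI-LB correct
correct_ad = round(0.564 * n_ad)   # sensitivity: 56.4% of MCI-AD correct
overall = (correct_lb + correct_ad) / (n_lb + n_ad)
```

This works out to 54 of 83 patients, or about 65.1%, matching the reported overall classification rate.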
MCI-LB was characterized by executive dysfunction and slowed processing speed but did not show the visuospatial dysfunction expected, while MCI-AD displayed an amnestic profile. However, there was considerable neuropsychological profile overlap and processing speed mediated performance in both MCI subtypes.
Electroencephalographic (EEG) abnormalities are greater in mild cognitive impairment (MCI) with Lewy bodies (MCI-LB) than in MCI due to Alzheimer’s disease (MCI-AD) and may anticipate the onset of dementia. We aimed to assess whether quantitative EEG (qEEG) slowing would predict a higher annual hazard of dementia in MCI across these etiologies. MCI patients (n = 92) and healthy comparators (n = 31) provided qEEG recordings and underwent longitudinal clinical and cognitive follow-up. Associations between qEEG slowing, measured by increased theta/alpha ratio, and clinical progression from MCI to dementia were estimated with a multistate transition model to account for death as a competing risk, while controlling for age, cognitive function, and etiology classified by an expert consensus panel.
Over a mean follow-up of 1.5 years (SD = 0.5), 14 cases of incident dementia and 5 deaths were observed. Increased theta/alpha ratio on qEEG was associated with increased annual hazard of dementia (hazard ratio = 1.84, 95% CI: 1.01–3.35). This extends previous findings that MCI-LB features early functional changes, showing that qEEG slowing may anticipate the onset of dementia in prospectively identified MCI.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Ecosystem modeling, a pillar of the systems ecology paradigm (SEP), addresses questions such as: How much carbon and nitrogen are cycled within ecological sites, landscapes, or indeed the earth system? How are human activities modifying these flows? Modeling, when coupled with field and laboratory studies, represents the essence of the SEP in that models embody accumulated knowledge and generate hypotheses to test understanding of ecosystem processes and behavior. Initially, ecosystem models were used primarily to improve our understanding of how the biophysical aspects of ecosystems operate. However, current ecosystem models are widely used to predict how large-scale phenomena such as climate change and management practices impact ecosystem dynamics, and to assess potential effects of these changes on economic activity and policy making. In sum, ecosystem models embedded in the SEP remain our best mechanism to integrate diverse types of knowledge regarding how the earth system functions and to make quantitative predictions that can be confronted with observations of reality. Modeling efforts discussed are the Century ecosystem model, the DayCent ecosystem model, the Grassland Ecosystem Model (ELM), food web models, the Savanna model, agent-based and coupled systems modeling, and Bayesian modeling.
There is mounting evidence for the potential for the natural dietary antioxidant and anti-inflammatory amino acid l-Ergothioneine (ERGO) to prevent or mitigate chronic diseases of aging. This has led to the suggestion that it could be considered a ‘longevity vitamin.’ ERGO is produced in nature only by certain fungi and a few other microbes. Mushrooms are, by far, the leading dietary source of ERGO, but it is found in small amounts throughout the food chain, most likely due to soil-borne fungi passing it on to plants. Because some common agricultural practices can disrupt beneficial fungus–plant root relationships, ERGO levels in foods grown under those conditions could be compromised. Thus, research is needed to further analyse the role agricultural practices play in the availability of ERGO in the human diet and its potential to improve our long-term health.
Mechanistic endophenotypes can inform process models of psychopathology and aid interpretation of genetic risk factors. Smaller total brain and subcortical volumes are associated with attention-deficit hyperactivity disorder (ADHD) and provide clues to its development. This study evaluates whether common genetic risk for ADHD is associated with total brain volume (TBV) and hypothesized subcortical structures in children.
Children 7–15 years old were recruited for a case–control study (N = 312, N = 199 ADHD). Children were assessed with a multi-informant, best-estimate diagnostic procedure and motion-corrected MRI measured brain volumes. Polygenic scores were computed based on discovery data from the Psychiatric Genomics Consortium (N = 19 099 ADHD, N = 34 194 controls) and the ENIGMA + CHARGE consortium (N = 26 577).
ADHD was associated with smaller TBV, and altered volumes of caudate, cerebellum, putamen, and thalamus after adjustment for TBV; however, effects were larger and statistically reliable only in boys. TBV was associated with an ADHD polygenic score [β = −0.147 (−0.27 to −0.03)], and mediated a small proportion of the effect of polygenic risk on ADHD diagnosis (average ACME = 0.0087, p = 0.012). This finding was stronger in boys (average ACME = 0.019, p = 0.008). In addition, we confirm genetic variation associated with whole brain volume, via an intracranial volume polygenic score.
Common genetic risk for ADHD is not expressed primarily as developmental alterations in subcortical brain volumes, but appears to alter brain development in other ways, as evidenced by TBV differences. This is among the first demonstrations of this effect using molecular genetic data. Potential sex differences in these effects warrant further examination.
Kochia is one of the most problematic weeds in the United States. Field studies were conducted in five states (Wyoming, Colorado, Kansas, Nebraska, and South Dakota) over 2 yr (2010 and 2011) to evaluate kochia control with selected herbicides registered in five common crop scenarios (winter wheat, fallow, corn, soybean, and sugar beet), to provide insight for diversifying kochia management in crop rotations. Kochia control varied by experimental site, such that more variation in kochia control and biomass production was explained by experimental site than by herbicide choice within a crop. Kochia control with herbicides currently labeled for use in sugar beet averaged 32% across locations. Kochia control was greatest and most consistent from corn herbicide programs (99%), followed by soybean (96%) and fallow (97%) herbicide programs. Kochia control from wheat herbicide programs was 93%. With respect to the availability of effective herbicide options, glyphosate-resistant kochia was easiest to control in corn, soybean, and fallow, followed by wheat, and difficult to manage with herbicides in sugar beet.
Wildlife conservation in the Anthropocene means there is a pressing need to find ways for wildlife and humans to share landscapes. However, this is challenging due to the complex interactions that occur within social-ecological systems (SES). This challenge is exemplified by grey wolf management in the American West, where human governance systems influence where and at what densities carnivores persist, thereby regulating and limiting the impacts of carnivores on both human and ecological communities. Here, we build an SES conceptual framework to disentangle the interdependencies between wolves and humans, including the ecological impacts of wolves and people in anthropogenic landscapes and the socio-economic forces shaping human–wolf interactions now and in the future. A key lesson is that coexistence rests not only on the biophysical capacity of a landscape to be shared by humans and wolves, but also on the capacity for human societies to adjust to and accept some level of conflict with wolves. As such, a holistic view that recognizes humans, our social systems and institutions as key actors and attributes of ecological systems can advance the theory and practice of coexistence.