Ecosystem modeling, a pillar of the systems ecology paradigm (SEP), addresses questions such as: How much carbon and nitrogen are cycled within ecological sites, landscapes, or indeed the Earth system? How are human activities modifying these flows? Modeling, when coupled with field and laboratory studies, represents the essence of the SEP in that models embody accumulated knowledge and generate hypotheses to test understanding of ecosystem processes and behavior. Initially, ecosystem models were used primarily to improve our understanding of how the biophysical aspects of ecosystems operate. Current ecosystem models, however, are widely used to predict how large-scale phenomena such as climate change and management practices affect ecosystem dynamics, and to assess the potential effects of these changes on economic activity and policy making. In sum, ecosystem models embedded in the SEP remain our best mechanism for integrating diverse types of knowledge about how the Earth system functions and for making quantitative predictions that can be confronted with observations of reality. The modeling efforts discussed are the Century and DayCent ecosystem models, the Grassland Ecosystem Model ELM, food web models, the Savanna model, agent-based and coupled-systems modeling, and Bayesian modeling.
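Models in the Century/DayCent lineage represent soil organic matter as compartments that turn over with first-order kinetics. As a rough illustration of that general idea (not the actual Century model; the pool structure, rate constants, and input fluxes below are hypothetical), here is a minimal two-pool carbon sketch:

```python
# Minimal two-pool soil-carbon sketch in the spirit of first-order
# compartment models such as Century/DayCent. All pool names, rate
# constants, and input fluxes are hypothetical illustrations, not
# parameters from the actual models.

def step(fast_c, slow_c, litter_in, dt=1.0,
         k_fast=0.2, k_slow=0.01, transfer=0.3):
    """Advance the two carbon pools by one time step (years).

    fast_c, slow_c : pool sizes (g C m^-2)
    litter_in      : plant litter input (g C m^-2 yr^-1)
    k_fast, k_slow : first-order decomposition rates (yr^-1)
    transfer       : fraction of decomposed fast C stabilized in the slow pool
    """
    fast_loss = k_fast * fast_c * dt          # first-order decay
    slow_loss = k_slow * slow_c * dt
    fast_c += (litter_in * dt) - fast_loss
    slow_c += transfer * fast_loss - slow_loss
    respired = (1 - transfer) * fast_loss + slow_loss  # CO2 to atmosphere
    return fast_c, slow_c, respired

# Start away from equilibrium and watch the pools approach steady state.
fast, slow = 100.0, 1000.0
for year in range(100):
    fast, slow, co2 = step(fast, slow, litter_in=100.0)
print(f"after 100 yr: fast={fast:.0f}, slow={slow:.0f} g C m^-2")
```

The real models add nitrogen coupling, several more pools, and climate and soil-texture controls on the rate constants, but the core logic remains coupled first-order compartments of this kind.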
Clarifying the relationship between depression symptoms and cardiometabolic and related health conditions could identify risk factors and treatment targets. The objective of this study was to assess whether depression symptoms in midlife are associated with the subsequent onset of cardiometabolic health problems.
Methods
The study sample comprised 787 male twin veterans with polygenic risk score data who participated in the Harvard Twin Study of Substance Abuse (‘baseline’) and the longitudinal Vietnam Era Twin Study of Aging (‘follow-up’). Depression symptoms were assessed at baseline [mean age 41.42 years (s.d. = 2.34)] using the Diagnostic Interview Schedule, Version III, Revised. The onset of eight cardiometabolic conditions (atrial fibrillation, diabetes, erectile dysfunction, hypercholesterolemia, hypertension, myocardial infarction, sleep apnea, and stroke) was assessed via self-reported doctor diagnosis at follow-up [mean age 67.59 years (s.d. = 2.41)].
Results
Total depression symptoms were longitudinally associated with incident diabetes (OR 1.29, 95% CI 1.07–1.57), erectile dysfunction (OR 1.32, 95% CI 1.10–1.59), hypercholesterolemia (OR 1.26, 95% CI 1.04–1.53), and sleep apnea (OR 1.40, 95% CI 1.13–1.74) over 27 years after controlling for age, alcohol consumption, smoking, body mass index, C-reactive protein, and polygenic risk for specific health conditions. In sensitivity analyses that excluded somatic depression symptoms, only the association with sleep apnea remained significant (OR 1.32, 95% CI 1.09–1.60).
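For readers less familiar with how such covariate-adjusted odds ratios are produced, the sketch below fits a logistic regression to simulated data and exponentiates a coefficient to obtain an OR with a 95% CI. All variable names and effect sizes are hypothetical; this does not reproduce the study's actual models.

```python
# Illustrative sketch of a covariate-adjusted odds ratio and 95% CI,
# as reported above (e.g., OR 1.29 for incident diabetes). Data are
# simulated; variables are hypothetical, not the actual study measures.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 787
depression = rng.normal(size=n)          # standardized symptom score
age = rng.normal(41.4, 2.3, size=n)      # covariate (baseline age)
logit_p = -1.0 + 0.25 * depression + 0.01 * (age - age.mean())
diabetes = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([depression, age]))
res = sm.Logit(diabetes, X).fit(disp=False)

or_dep = np.exp(res.params[1])                 # OR per 1-SD of symptoms
ci_low, ci_high = np.exp(res.conf_int()[1])    # 95% CI on the OR scale
print(f"OR {or_dep:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```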
Conclusions
A history of depression symptoms by early midlife is associated with an elevated risk for subsequent development of several self-reported health conditions. When isolated, non-somatic depression symptoms are associated with incident self-reported sleep apnea. Depression symptom history may be a predictor or marker of cardiometabolic risk over decades.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Results
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These processes remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Associations of socioenvironmental features like urbanicity and neighborhood deprivation with psychosis are well-established. An enduring question, however, is whether these associations are causal. Genetic confounding could occur due to downward mobility of individuals at high genetic risk for psychiatric problems into disadvantaged environments.
Methods
We examined correlations of five indices of genetic risk [polygenic risk scores (PRS) for schizophrenia and depression, maternal psychotic symptoms, family psychiatric history, and zygosity-based latent genetic risk] with multiple area-, neighborhood-, and family-level risks during upbringing. Data were from the Environmental Risk (E-Risk) Longitudinal Twin Study, a nationally-representative cohort of 2232 British twins born in 1994–1995 and followed to age 18 (93% retention). Socioenvironmental risks included urbanicity, air pollution, neighborhood deprivation, neighborhood crime, neighborhood disorder, social cohesion, residential mobility, family poverty, and a cumulative environmental risk scale. At age 18, participants were privately interviewed about psychotic experiences.
Results
Higher genetic risk on all indices was associated with riskier environments during upbringing. For example, participants with higher schizophrenia PRS (OR = 1.19, 95% CI = 1.06–1.33), depression PRS (OR = 1.20, 95% CI = 1.08–1.34), family history (OR = 1.25, 95% CI = 1.11–1.40), and latent genetic risk (OR = 1.21, 95% CI = 1.07–1.38) had accumulated more socioenvironmental risks for schizophrenia by age 18. However, associations between socioenvironmental risks and psychotic experiences mostly remained significant after covariate adjustment for genetic risk.
Conclusion
Genetic risk is correlated with socioenvironmental risk for schizophrenia during upbringing, but the associations between socioenvironmental risk and adolescent psychotic experiences appear, at present, to exist above and beyond this gene-environment correlation.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
In the past decade, network analysis (NA) has been applied to psychopathology to quantify complex symptom relationships. This statistical technique has demonstrated much promise, as it allows researchers to identify relationships across many symptoms in one model and to pinpoint central symptoms that may predict important clinical outcomes. However, network models are highly influenced by node selection, which could limit the generalizability of findings. The current study (N = 6850) tests a comprehensive, cognitive–behavioral model of eating-disorder symptoms using items from two widely used measures (the Eating Disorder Examination Questionnaire and the Eating Pathology Symptoms Inventory).
Methods
We used NA to identify central symptoms and compared networks across the duration of illness (DOI), as chronicity is one of the only known predictors of poor outcome in eating disorders (EDs).
Results
Our results suggest that eating when not hungry and feeling fat were the most central symptoms across groups. There were no significant differences in network structure across DOI, meaning that the connections between symptoms remained relatively consistent. However, differences emerged in central symptoms: cognitive symptoms related to overvaluation of weight/shape were central in individuals with shorter DOI, whereas behavioral symptoms were more central in individuals with medium and long DOI.
Conclusions
Our results have important implications for the treatment of individuals with enduring EDs, as these individuals may have a different set of core maintaining symptoms. Additionally, our findings highlight the importance of using comprehensive, theoretically or empirically derived models for NA.
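As a sketch of the NA workflow these conclusions rest on: symptom networks are usually estimated as partial-correlation networks (typically with regularized methods in R packages such as qgraph or bootnet), and central symptoms are ranked by strength centrality. The unregularized Python version below illustrates only the core idea; the data and item names are simulated, not actual EDE-Q or EPSI items.

```python
# Minimal sketch of a symptom network via partial correlations and
# strength centrality. Published ED networks use regularized estimation
# (e.g., qgraph/bootnet in R); this unregularized version only shows
# the idea. Data and item names are simulated/hypothetical.
import numpy as np

rng = np.random.default_rng(1)
items = ["eat_not_hungry", "feel_fat", "restrict", "binge", "weigh_self"]
common = rng.normal(size=(500, 1))                  # shared severity factor
data = 0.8 * common + rng.normal(size=(500, len(items)))

# Partial-correlation network from the precision (inverse covariance)
# matrix: rho_ij = -p_ij / sqrt(p_ii * p_jj)
prec = np.linalg.inv(np.cov(data, rowvar=False))
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)
np.fill_diagonal(pcor, 0.0)

# Strength centrality: sum of absolute edge weights per node
strength = np.abs(pcor).sum(axis=0)
for name, s in sorted(zip(items, strength), key=lambda t: -t[1]):
    print(f"{name:15s} strength={s:.2f}")
```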
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, more than 60 observing programs have recorded 20,000 h of data, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited to rapid-response follow-up of transients because of their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and a 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc/cm³. The MWA has previously been used to provide fast follow-up for transient events, including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event (VOEvent) standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well suited to searching for prompt coherent radio emission from GRBs, studying FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system can trigger observations both in the regular correlator mode (limited to ≥0.5 s integrations) and with the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and represents a new mode of operation for the MWA. The upgraded standard-correlator triggering capability has been in use since MWA observing semester 2018B (July–December 2018), and the VCS and buffered-mode triggers will become available for observing in a future semester.
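The quoted delay follows from the standard cold-plasma dispersion relation, under which a pulse at frequency ν lags by approximately 4.15 ms × DM × ν⁻² (ν in GHz, DM in pc/cm³). A quick check of the 1–10 min figure:

```python
# The rapid-response window quoted above follows from the standard
# cold-plasma dispersion delay. Quick check of the numbers:
K_DM = 4.148808e-3  # dispersion constant, in s GHz^2 pc^-1 cm^3

def dispersion_delay(dm, f_low_ghz, f_high_ghz):
    """Arrival-time delay (s) of a low-frequency pulse relative to the
    same pulse at a higher frequency, for dispersion measure dm (pc/cm^3)."""
    return K_DM * dm * (f_low_ghz**-2 - f_high_ghz**-2)

for dm in (100, 2000):
    delay = dispersion_delay(dm, 0.1, 1.0)   # 100 MHz vs 1 GHz
    print(f"DM={dm:4d} pc/cm^3 -> delay {delay:6.1f} s ({delay/60:.1f} min)")
# DM=100 gives ~0.7 min and DM=2000 gives ~13.7 min: the 1-10 min scale quoted.
```

This minutes-long head start at 100 MHz is what makes a sub-8 s slew time sufficient to catch prompt emission from events first detected at higher energies.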
A new fossil site in a previously unexplored part of western Madagascar (the Beanka Protected Area) has yielded remains of many recently extinct vertebrates, including giant lemurs (Babakotia radofilai, Palaeopropithecus kelyus, Pachylemur sp., and Archaeolemur edwardsi), carnivores (Cryptoprocta spelea), the aardvark-like Plesiorycteropus sp., and giant ground cuckoos (Coua). Many of these represent considerable range extensions. Extant species that were extirpated from the region (e.g., Prolemur simus) are also present. Calibrated radiocarbon ages for 10 bones from extinct primates span the last three millennia. The largely undisturbed taphonomy of bone deposits supports the interpretation that many specimens fell in from a rock ledge above the entrance. Some primates and other mammals may have been prey items of avian predators, but human predation is also evident. Strontium isotope ratios (⁸⁷Sr/⁸⁶Sr) suggest that the fossils were local to the area. Pottery sherds and bones of extinct and extant vertebrates with cut and chop marks indicate human activity in previous centuries. Scarcity of charcoal and human artifacts suggests only occasional visitation to the site by humans. The fossil assemblage from this site is unusual in that, while it contains many sloth lemurs, it lacks the ratites, hippopotami, and crocodiles typical of nearly all other Holocene subfossil sites on Madagascar.
TwinsUK is the largest cohort of community-dwelling adult twins in the UK. The registry comprises over 14,000 volunteer twins (14,838 including mixed, single and triplets); it is predominantly female (82%) and middle-aged (mean age 59). In addition, over 1800 parents and siblings of twins are registered volunteers. During the last 27 years, TwinsUK has collected numerous questionnaire responses, physical/cognitive measures and biological measures on over 8500 subjects. Data were collected alongside four comprehensive phenotyping clinical visits to the Department of Twin Research and Genetic Epidemiology, King’s College London. Such collection methods have resulted in very detailed longitudinal clinical, biochemical, behavioral, dietary and socioeconomic cohort characterization, providing a multidisciplinary platform for the study of complex disease during the adult life course, including the process of healthy aging. The major strength of TwinsUK is the availability of several ‘omic’ technologies for a range of sample types from participants, including genome-wide scans of single-nucleotide variants, next-generation sequencing, metabolomic profiles, microbiomics, exome sequencing, epigenetic markers, gene expression arrays, RNA sequencing and telomere length measures. TwinsUK facilitates and actively encourages sharing of the TwinsUK resource with the scientific community; interested researchers may request data via the TwinsUK website (http://twinsuk.ac.uk/resources-for-researchers/access-our-data/) for their own use or for future collaboration with the study team. In addition, further cohort data collection is planned via the Wellcome Open Research gateway (https://wellcomeopenresearch.org/gateways). The current article presents an up-to-date report on the application of technological advances, new study procedures in the cohort and the future direction of TwinsUK.
Objectives: Maintaining two active languages may increase cognitive and brain reserve among bilingual individuals. We explored whether such a neuroprotective effect was manifested in the performance of memory tests for participants with amnestic mild cognitive impairment (aMCI). Methods: We compared 42 bilinguals to 25 monolinguals on verbal and nonverbal memory tests. We used: (a) the Loewenstein-Acevedo Scales for Semantic Interference and Learning (LASSI-L), a sensitive test that taps into proactive, retroactive, and recovery from proactive semantic interference (verbal memory), and (b) the Benson Figure delayed recall (nonverbal memory). A subsample had volumetric MRI scans. Results: The bilingual group significantly outperformed the monolingual group on two LASSI-L cued recall measures (Cued A2 and Cued B2). A measure of maximum learning (Cued A2) showed a correlation with the volume of the left hippocampus in the bilingual group only. Cued B2 recall (sensitive to recovery from proactive semantic interference) was correlated with the volume of the hippocampus and the entorhinal cortex of both cerebral hemispheres in the bilingual group, as well as with the left and right hippocampus in the monolingual group. The memory advantage in bilinguals on these measures was associated with higher inhibitory control as measured by the Stroop Color-Word test. Conclusions: Our results demonstrated a superior performance of aMCI bilinguals over aMCI monolinguals on selected verbal memory tasks. This advantage was not observed in nonverbal memory. Superior memory performance of bilinguals over monolinguals suggests that bilinguals develop a different and perhaps more efficient semantic association system that influences verbal recall. (JINS, 2019, 25, 15–28)
Palaeochannels of lowland rivers provide a means of investigating the sensitivity of river response to climate-driven hydrologic change. About 80 palaeochannels of the lower Macquarie River of southeastern Australia record the evolution of this distributive fluvial system. Six Macquarie palaeochannels were dated by single-grain optically stimulated luminescence. The largest of the palaeochannels (Quombothoo, median age 54 ka) was on average 284 m wide, 12 times wider than the modern river (24 m), with a meander wavelength 21 times greater. Palaeo-discharge then declined, resulting in a younger, narrower group of palaeochannels: Bibbijibbery (125 m wide, 34 ka), Billybingbone (92 m, 20 ka), Milmiland (112 m, 22 ka), and Mundadoo (86 m, 5.6 ka). Yet these channels were still much larger than the modern river and were continuous downstream to the confluence with the Barwon-Darling River. At 5.5 ka, a further decrease in river discharge led to the formation of the narrow modern river, the ecologically important Macquarie Marshes, and the Marra Creek palaeochannel (31 m, 2.1 ka), and diminished sediment delivery to the Barwon-Darling River. The hydrologic changes suggest that precipitation, in addition to a temperature-driven runoff response, was a driving force on catchment discharge.
The triazines are one of the most widely used herbicide classes ever developed and are critical for managing weed populations that have developed herbicide resistance. These herbicides are traditionally valued for their residual weed control in more than 50 crops. The scientific literature suggests that atrazine, and perhaps other s-triazines, may no longer remain persistent in soils due to enhanced microbial degradation. Experiments examined the degradation rates of atrazine and two other triazine herbicides, simazine and metribuzin, in atrazine-adapted and non-history Corn Belt soils, with paired soils from each state used to compare potential triazine degradation. In three soils with no history of atrazine use, the half-life (t1/2) of atrazine was at least four times greater than in three soils with a history of atrazine use. Simazine degradation in the same three sets of soils was 2.4 to 15 times more rapid in history soils than in non-history soils. Metribuzin in history soils degraded at 0.6, 0.9, and 1.9 times the rate seen in the same three non-history soils. These results indicate enhanced degradation of the symmetrical triazine simazine, but not of the asymmetrical triazine metribuzin.
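Soil dissipation of herbicides like these is conventionally summarized with first-order kinetics, C(t) = C0·e^(−kt), so the half-life is t1/2 = ln(2)/k. A small sketch of the fourfold contrast reported above, using hypothetical rate constants (not the measured values from these experiments):

```python
# First-order decay: C(t) = C0 * exp(-k t), half-life t_1/2 = ln(2)/k.
# The rate constants below are hypothetical, chosen only to illustrate
# the "at least four times greater half-life" contrast reported above.
import math

def half_life(k):
    """Half-life in days for a first-order rate constant k (day^-1)."""
    return math.log(2) / k

k_adapted, k_naive = 0.046, 0.0115    # hypothetical day^-1 rates (4x apart)
t_adapted, t_naive = half_life(k_adapted), half_life(k_naive)
print(f"adapted soil: t1/2 = {t_adapted:.0f} d; "
      f"no-history soil: t1/2 = {t_naive:.0f} d "
      f"({t_naive / t_adapted:.1f}x longer)")
```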
The enormous economic burden of dementia in the United States of America falls disproportionately on families coping with this devastating disease. Black Americans, who are at greater risk of developing dementia than white Americans, hold on average less than one-eighth of the wealth of white Americans. This study explores whether dementia exacerbates this wealth disparity by examining dementia's effect on wealth trajectories of black versus non-black Americans over an eight-year period preceding death, using five waves of data (beginning in 2002 or 2004) on decedents in the 2012 and 2014 waves of the Health and Retirement Study (N = 2,429). Dementia is associated with a loss of 97 per cent of wealth among black Americans, compared with 42 per cent among non-black Americans, while wealth loss among black and non-black Americans without dementia did not differ substantially (15% versus 19%). Dementia appears to increase the probability of wealth exhaustion among both black and non-black Americans, although the estimate is no longer significant after adjusting for all covariates (for blacks, odds ratio (OR) = 2.04, 95% confidence interval (CI) = 0.83, 5.00; for non-blacks, OR = 1.47, 95% CI = 0.95, 2.27). Dementia has a negative association with home-ownership, and the loss or sale of a home may play a mediating role in the exhaustion of wealth among black Americans with dementia.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
OBJECTIVES/SPECIFIC AIMS: Describe the process used to develop leveled competencies and associated examples. Discuss the final leveled competencies and their potential use in clinical research professional workforce initiatives. METHODS/STUDY POPULATION: The revised JTFCTC Framework 2.0 has 51 competency statements representing 8 domains. Each competency statement has now been refined to delineate fundamental, skilled, or advanced levels of knowledge and capability. Typically, the fundamental level describes the competency for a professional who requires some coaching and oversight but is able to understand and identify basic concepts. The skilled level reflects the professional’s solid understanding of the competency and use of the information to act independently in most situations. The advanced level embodies high-level thinking, problem solving, and the ability to guide others in the competency. The process for developing the three levels and examples involved 5 workgroups, each chaired by a content expert and comprising national and international clinical research experts, including representatives from research sites, professional associations, government, and industry and academic sponsors. RESULTS/ANTICIPATED RESULTS: The committee developed 51 specific competencies arrayed across 3 levels, with examples of each to demonstrate an appropriate application of the competency. The competencies, examples, and their potential utilization will be described. DISCUSSION/SIGNIFICANCE OF IMPACT: The use of competencies in the context of workforce development and training initiatives is helping to create standards for the clinical research profession. These leveled competencies allow for an important refinement of the standards that can be used to enhance the quality and safety of the clinical research enterprise and guide workforce development.