A recent genome-wide association study (GWAS) identified 12 independent loci significantly associated with attention-deficit/hyperactivity disorder (ADHD). Polygenic risk scores (PRS), derived from the GWAS, can be used to assess genetic overlap between ADHD and other traits. Using ADHD samples from several international sites, we derived PRS for ADHD from the recent GWAS to test whether genetic variants that contribute to ADHD also influence two cognitive functions that show strong association with ADHD: attention regulation and response inhibition, captured by reaction time variability (RTV) and commission errors (CE).
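As a rough illustration (not the authors' pipeline), a polygenic risk score is typically a weighted sum of an individual's risk-allele dosages, with the weights taken from the discovery GWAS effect sizes. The variant IDs, weights and dosages below are hypothetical.

```python
# Minimal PRS sketch: PRS = sum over variants of (GWAS effect size x risk-allele dosage)

# Hypothetical GWAS effect sizes (log-odds per risk allele)
gwas_weights = {"rs0001": 0.08, "rs0002": -0.05, "rs0003": 0.12}

# Hypothetical genotype dosages (0, 1 or 2 risk alleles) for one individual
dosages = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

prs = sum(gwas_weights[snp] * dosages[snp] for snp in gwas_weights)
print(f"Polygenic risk score: {prs:.3f}")
```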
The discovery GWAS included 19 099 ADHD cases and 34 194 control participants. The combined target sample included 845 people with ADHD (age: 8–40 years). RTV and CE were available from reaction time and response inhibition tasks. ADHD PRS were calculated from the GWAS using a leave-one-study-out approach. Regression analyses were run to investigate whether ADHD PRS were associated with CE and RTV. Results across sites were combined via random-effects meta-analyses.
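A minimal sketch of how per-site regression estimates might be pooled with a DerSimonian–Laird random-effects meta-analysis; the betas and standard errors below are hypothetical, not the study's values.

```python
import numpy as np

def random_effects_meta(betas, ses):
    """DerSimonian-Laird random-effects pooling of per-site estimates."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses**2                               # inverse-variance (fixed-effect) weights
    beta_fe = np.sum(w * betas) / np.sum(w)
    q = np.sum(w * (betas - beta_fe) ** 2)         # Cochran's Q heterogeneity statistic
    df = len(betas) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (ses**2 + tau2)                   # random-effects weights
    beta_re = np.sum(w_re * betas) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return beta_re, se_re

# Hypothetical per-site PRS-on-RTV betas and their standard errors
beta, se = random_effects_meta([0.10, 0.05, 0.12], [0.04, 0.06, 0.05])
print(f"pooled beta = {beta:.3f} (SE {se:.3f})")
```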
When combining the studies in meta-analyses, results were significant for RTV (R² = 0.011, β = 0.088, p = 0.02) but not for CE (R² = 0.011, β = 0.013, p = 0.732). No significant association was found between ADHD PRS and RTV or CE in any sample individually (p > 0.10).
We detected a significant association between PRS for ADHD and RTV (but not CE) in individuals with ADHD, suggesting that common genetic risk variants for ADHD influence attention regulation.
Childhood exposure to interpersonal violence (IPV) may be linked to distinct manifestations of mental illness, yet the nature of this link remains poorly understood. Network analysis can provide unique insights by contrasting the interrelatedness of symptoms underlying psychopathology across exposed and non-exposed youth, with potential clinical implications for a treatment-resistant population. We anticipated marked differences in symptom associations among IPV-exposed youth, particularly in terms of ‘hub’ symptoms holding outsized influence over the network, as well as the formation and influence of communities of highly interconnected symptoms.
Participants from a population-representative sample of youth (n = 4433; ages 11–18 years) completed a comprehensive structured clinical interview assessing mental health symptoms, diagnostic status, and history of violence exposure. Network analytic methods were used to model the pattern of associations between symptoms, quantify differences across diagnosed youth with (IPV+) and without (IPV–) IPV exposure, and identify transdiagnostic ‘bridge’ symptoms linking multiple disorders.
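Network studies of this kind often quantify ‘bridge’ symptoms with a bridge-centrality measure. The sketch below uses a simplified bridge-strength statistic (the summed weight of a symptom's edges to other communities) on a tiny hypothetical symptom network; it is not the study's data or its exact method.

```python
import networkx as nx
from networkx.algorithms import community

# Hypothetical symptom association network: (symptom_i, symptom_j, edge weight)
edges = [
    ("sad_mood", "anhedonia", 0.5), ("anhedonia", "fatigue", 0.5), ("sad_mood", "fatigue", 0.4),
    ("worry", "restlessness", 0.5), ("worry", "irritability", 0.5), ("restlessness", "irritability", 0.4),
    ("sad_mood", "worry", 0.2),   # cross-community edge linking the two clusters
]
G = nx.Graph()
G.add_weighted_edges_from(edges)

# Detect symptom communities, then score each node's "bridge strength":
# the summed weight of its edges to nodes outside its own community.
communities = community.greedy_modularity_communities(G, weight="weight")
membership = {node: i for i, comm in enumerate(communities) for node in comm}
bridge_strength = {
    node: sum(d["weight"] for _, nbr, d in G.edges(node, data=True)
              if membership[nbr] != membership[node])
    for node in G
}
print(sorted(bridge_strength.items(), key=lambda kv: -kv[1]))
```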
Symptoms organized into six ‘disorder’ communities (e.g. Intrusive Thoughts/Sensations, Depression, Anxiety) that exhibited considerably greater interconnectivity in IPV+ youth. Five symptoms emerged in IPV+ youth as highly trafficked ‘bridges’ between symptom communities (11 in IPV– youth).
IPV exposure may alter mutually reinforcing symptom co-occurrence in youth, thus contributing to greater psychiatric comorbidity and treatment resistance. The presence of a condensed and unique set of bridge symptoms suggests trauma-enriched nodes which could be therapeutically targeted to improve outcomes in violence-exposed youth.
The pharmacotherapy of epilepsy is a complex process guided by evidence-based research and clinical experience. Some patients achieve seizure freedom upon treatment with the first anti-seizure medication (ASM) prescribed, whereas others may be treated with two or three medications before one (or a combination) is found that reduces seizure frequency and/or severity with minimal side effects. Many patients demonstrate a partial response to treatment, leading to reduced seizure frequency and/or severity, but do not become completely seizure free. It is often stated that ~30% of epilepsy patients have seizures that cannot be controlled pharmacologically, and these patients are defined as having medication-resistant epilepsy (MRE). The International League Against Epilepsy (ILAE) published the following definition of MRE: ‘drug resistant epilepsy may be defined as failure of adequate trials of two tolerated and appropriately chosen and used ASM schedules (whether as monotherapies or in combination) to achieve sustained seizure freedom’. Treatment success or sustained seizure freedom is defined as one year without seizures or three times the longest pre-treatment inter-seizure interval (whichever is longer). The ILAE definition provides a useful standard from which to work, and MRE can be clinically identified in patients who fail to achieve seizure freedom after multiple ASM trials. However, the ILAE definition of successful treatment does not account for partial response to pharmacotherapy. Indeed, many partial responders have improved quality of life, even if they are not seizure-free for one year or more.
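The ‘whichever is longer’ rule translates directly into a small calculation. The sketch below is an illustration of that rule only, not a clinical tool; the example intervals are hypothetical.

```python
from datetime import timedelta

def seizure_freedom_target(pre_treatment_inter_seizure_days: float) -> timedelta:
    """Sustained seizure freedom requires the longer of one year or
    three times the pre-treatment inter-seizure interval."""
    return max(timedelta(days=365), 3 * timedelta(days=pre_treatment_inter_seizure_days))

# A patient seizing roughly every 150 days must remain seizure free for 450 days,
# longer than the one-year minimum; a patient seizing monthly needs the one-year floor.
print(seizure_freedom_target(150))   # 450 days
print(seizure_freedom_target(30))    # 365 days
```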
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recordings spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
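A weighted root-mean-square residual of the kind quoted here is typically computed by weighting each residual by the inverse square of its time-of-arrival uncertainty and measuring scatter about the weighted mean. The sketch below shows that calculation on hypothetical residuals; it is not the PPTA processing pipeline.

```python
import numpy as np

def weighted_rms(residuals_us, sigmas_us):
    """Weighted RMS of timing residuals about the weighted mean, weights = 1 / sigma^2."""
    r = np.asarray(residuals_us, float)
    w = 1.0 / np.asarray(sigmas_us, float) ** 2
    mean_w = np.sum(w * r) / np.sum(w)
    return np.sqrt(np.sum(w * (r - mean_w) ** 2) / np.sum(w))

# Hypothetical residuals and ToA uncertainties, both in microseconds
print(weighted_rms([0.3, -0.5, 0.1, 0.8], [0.2, 0.4, 0.3, 0.6]))
```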
The deviation from thermodynamic equilibrium of the ion velocity distribution functions (VDFs), as measured by the Magnetospheric Multiscale (MMS) mission in the Earth’s turbulent magnetosheath, is quantitatively investigated. Making use of the unprecedented high-resolution MMS ion data, together with Vlasov–Maxwell simulations, this analysis investigates the relationship between the deviation from Maxwellian equilibrium and typical plasma parameters. Correlations of the non-Maxwellian features with plasma quantities such as electric fields, ion temperature, current density and ion vorticity are found to be similar in magnetosheath data and numerical experiments, with a poor correlation between distortions of the ion VDFs and current density, evidence that questions whether VDFs depart from Maxwellian at current density peaks. Moreover, strong correlation has been observed with the magnitude of the electric field in the turbulent magnetosheath, while a certain degree of correlation has been found in the numerical simulations and during a magnetopause crossing by MMS. This work could help shed light on the influence of electrostatic waves on the distortion of the ion VDFs in space turbulent plasmas.
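One common way to quantify departure from Maxwellian equilibrium is to build the Maxwellian that shares the measured distribution's density, bulk velocity and temperature, and then measure the normalised difference between the two. The 1-D sketch below illustrates that idea on a synthetic distribution; it is not the MMS processing chain, and the distribution is invented.

```python
import numpy as np

# Synthetic 1-D velocity grid and a distorted (double-peaked) distribution function
v = np.linspace(-5, 5, 1001)
f = np.exp(-(v - 0.5) ** 2) + 0.3 * np.exp(-((v + 1.5) ** 2) / 0.2)

# Moments of the measured distribution: density, bulk velocity, thermal spread
n = np.trapz(f, v)
u = np.trapz(v * f, v) / n
vth2 = np.trapz((v - u) ** 2 * f, v) / n

# Equivalent Maxwellian with the same moments
g = n / np.sqrt(2 * np.pi * vth2) * np.exp(-(v - u) ** 2 / (2 * vth2))

# Normalised deviation from Maxwellian (larger value => stronger non-thermal features)
epsilon = np.sqrt(np.trapz((f - g) ** 2, v)) / n
print(f"non-Maxwellianity ~ {epsilon:.3f}")
```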
The evolution of agriculture improved food security and enabled significant increases in the size and complexity of human groups. Despite these positive effects, some societies never adopted these practices, became only partially reliant on them, or even reverted to foraging after temporarily adopting them. Given the critical importance of climate and biotic interactions for modern agriculture, it seems likely that ecological conditions could have played a major role in determining the degree to which different societies adopted farming. However, this seemingly simple proposition has been surprisingly difficult to prove and is currently controversial. Here, we investigate how recent agricultural practices relate both to contemporary ecological opportunities and the suitability of local environments for the first species domesticated by humans. Leveraging a globally distributed dataset on 1,291 traditional societies, we show that after accounting for the effects of cultural transmission and more current ecological opportunities, levels of reliance on farming continue to be predicted by the opportunities local ecologies provided to the first human domesticates even after centuries of cultural evolution. Based on the details of our models, we conclude that ecology probably helped shape the geography of agriculture by biasing both human movement and the human-assisted dispersal of domesticates.
Although death by neurologic criteria (brain death) is legally recognized throughout the United States, state laws and clinical practice vary concerning three key issues: (1) the medical standards used to determine death by neurologic criteria, (2) management of family objections before determination of death by neurologic criteria, and (3) management of religious objections to declaration of death by neurologic criteria. The American Academy of Neurology and other medical stakeholder organizations involved in the determination of death by neurologic criteria have undertaken concerted action to address variation in clinical practice in order to ensure the integrity of brain death determination. To complement this effort, state policymakers must revise legislation on the use of neurologic criteria to declare death. We review the legal history and current laws regarding neurologic criteria to declare death and offer proposed revisions to the Uniform Determination of Death Act (UDDA) and the rationale for these recommendations.
Early-life stress (ELS) has previously been identified as a risk factor for cognitive decline, but this work has predominantly focused on clinical groups and indexed traditional cognitive domains. It therefore remains unclear whether ELS is related to cognitive function in healthy community-dwelling older adults, as well as whether any effects of ELS also extend to social cognition. To test each of these questions, the Childhood Trauma Questionnaire (CTQ) was administered to 484 older adults along with a comprehensive neuropsychological test battery and a well-validated test of social cognitive function. The results revealed no differences in global cognition according to overall experiences of ELS. However, a closer examination of the different ELS subscales showed that global cognition was poorer in those who had experienced physical neglect (relative to those who had not). Social cognitive function did not differ according to experiences of ELS. These results indicate that the relationship between ELS and cognition in older age may be dependent on the nature of the trauma experienced.
How landscapes respond to, and evolve from, large jökulhlaups (glacial outburst floods) is poorly constrained due to limited observations and detailed monitoring. We investigate how melt of glacier ice transported and deposited by multiple jökulhlaups during the 2010 eruption of Eyjafjallajökull, Iceland, modified the volume and surface elevation of jökulhlaup deposits. Jökulhlaups generated by the eruption deposited large volumes of sediment and ice, causing significant geomorphic change in the Gígjökull proglacial basin over a 4-week period. Observation of these events enabled robust constraints on the physical properties of the floods, which inform our understanding of the deposits. Using ground-based LiDAR, GPS observations and the satellite-image-derived ArcticDEMs, we quantify the post-depositional response of the 60 m-thick Gígjökull sediment package to the meltout of buried ice and other geomorphic processes. Between 2010 and 2016, total deposit volume decreased at a rate of 0.95 × 106 m3 a−1, with significant surface lowering of up to 1.88 m a−1. Surface lowering and volumetric loss of the deposits is attributed to three factors: (i) meltout of ice deposited by the jökulhlaups; (ii) rapid melting of the buried Gígjökull glacier snout; and (iii) incision of the proglacial meltwater system into the jökulhlaup deposits.
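Surface-lowering and volume-change rates of this kind are typically obtained by differencing elevation models of the deposit between epochs. The sketch below shows the basic arithmetic on a hypothetical pair of DEM grids; the cell size, grid extent and elevation values are invented, not the study's data.

```python
import numpy as np

cell_area = 2.0 * 2.0        # hypothetical DEM cell size: 2 m x 2 m
years = 6.0                  # 2010-2016 interval

# Hypothetical elevation grids (metres) over the deposit at the two epochs
dem_2010 = np.full((500, 400), 120.0)
dem_2016 = dem_2010 - np.random.uniform(0.0, 11.3, dem_2010.shape)   # up to ~1.88 m/yr lowering

dz = dem_2016 - dem_2010                      # elevation change per cell (negative = lowering)
volume_change = dz.sum() * cell_area          # total volume change in m^3
print(f"mean lowering rate : {(-dz.mean() / years):.2f} m/yr")
print(f"volume change rate : {(volume_change / years) / 1e6:.2f} x 10^6 m^3/yr")
```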
In Cameroon, there is a national programme engaged in the control of schistosomiasis and soil-transmitted helminthiasis. In certain locations, the programme is transitioning from morbidity control towards local interruption of parasite transmission. The volcanic crater lake villages of Barombi Mbo and Barombi Kotto are well-known transmission foci and are excellent context-specific locations to assess appropriate disease control interventions. Most recently they have served as exemplars of expanded access to deworming medications and increased environmental surveillance. In this paper, we review infection dynamics through time, beginning with data from 1953, and comment on the short- and long-term success of disease control. We show how intensification of local control is needed to push towards elimination and that further environmental surveillance, with targeted snail control, is needed to consolidate gains in preventive chemotherapy as well as empower local communities to take ownership of interventions.
The study of parasites typically crosses into other research disciplines and spans diverse scales, from the molecular to the population level, while also promoting an understanding of parasites set within evolutionary time. Today, the 2030 Sustainable Development Goals (SDGs) help frame much of contemporary parasitological research, since parasites can be found in all ecosystems, blighting human, animal and plant health. In recognition of the multi-disciplinary nature of parasitological research, the 2017 Autumn Symposium of the British Society for Parasitology was held in London to provide a forum for novel exchange across medical, veterinary and wildlife fields of study. Whilst the meeting was devoted to the topic of parasitism, it sought to foster mutualism, the antithesis perhaps of parasitism, by forging new academic connections and social networks to exchange novel ideas. The meeting also celebrated the longstanding career of Professor David Rollinson, FLS, with the award of the International Federation for Tropical Medicine Medal for his efforts spanning 40 years of parasitological research. Indeed, David has done so much to explore and promote the fascinating biology of parasitism, as exemplified by the 15 manuscripts contained within this Special Issue.
Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88) presented a critique of our recently published paper in Cell Reports entitled ‘Large-Scale Cognitive GWAS Meta-Analysis Reveals Tissue-Specific Neural Expression and Potential Nootropic Drug Targets’ (Lam et al., Cell Reports, Vol. 21, 2017, 2597–2613). Specifically, Hill offered several interrelated comments suggesting potential problems with our use of a new analytic method called Multi-Trait Analysis of GWAS (MTAG) (Turley et al., Nature Genetics, Vol. 50, 2018, 229–237). In this brief article, we respond to each of these concerns. Using empirical data, we conclude that our MTAG results do not suffer from ‘inflation in the FDR [false discovery rate]’, as suggested by Hill (Twin Research and Human Genetics, Vol. 21, 2018, 84–88), and are not ‘more relevant to the genetic contributions to education than they are to the genetic contributions to intelligence’.
Human fascioliasis infection sources are analysed for the first time in light of the new worldwide scenario of this disease. These infection sources include foods, water and combinations of both. Ingestion of freshwater wild plants is the main source, with watercress and, secondarily, other vegetables involved. The problem of vegetables sold in uncontrolled urban markets is discussed. A distinction is made between infection sources involving freshwater cultivated plants, terrestrial wild plants, and terrestrial cultivated plants. The risks posed by traditional local dishes made from sylvatic plants and by raw liver ingestion are considered. Drinking of contaminated water, beverages and juices, ingestion of dishes and soups, and washing of vegetables, fruits, tubercles and kitchen utensils with contaminated water are increasingly involved. Three methods to assess infection sources are noted: detection of metacercariae attached to plants or floating in freshwater, anamnesis in individual patients, and questionnaire surveys in endemic areas. The infectivity of metacercariae is reviewed both under field conditions and experimentally under the effects of physicochemical agents. Individual and general preventive measures appear to be more complicated than those considered in the past. The high diversity of infection sources and their heterogeneity in different countries underlie the large epidemiological heterogeneity of human fascioliasis worldwide.
Monitoring vectors is relevant for ascertaining transmission of lymphatic filariasis (LF). This requires a sampling method that can capture high numbers of the relevant species to give an indication of transmission. Gravid anophelines are good indicators for assessing transmission because of their close contact with humans through blood meals. This study compared the efficiency of an Anopheles gravid trap (AGT) with other mosquito collection methods, including the box trap, the Centers for Disease Control and Prevention gravid, light, exit and BioGent-sentinel traps, indoor resting collection (IRC) and pyrethrum spray catches, across two endemic regions of Ghana. The AGT showed high trapping efficiency, collecting the highest mean number of anophelines per night in the Western (4.6) and Northern (7.3) regions compared with the outdoor collection methods. Additionally, IRC was similarly efficient in the Northern region (8.9), where vectors exhibit a high degree of endophily. The AGT also showed good trapping potential for collecting Anopheles melas, which is usually difficult to catch with existing methods. Screening of mosquitoes for infection revealed Wuchereria bancrofti rates of 0.80–3.01% and Plasmodium spp. rates of 2.15–3.27% in Anopheles gambiae. The AGT has been shown to be appropriate for surveying Anopheles populations and can be useful for xenomonitoring of both LF and malaria.
Adult schistosomes live in the blood vessels and cannot easily be sampled from humans, so archived miracidia, larvae hatched from eggs expelled in feces or urine, are commonly used for population genetic studies. Large collections of archived miracidia on FTA cards are now available through the Schistosomiasis Collection at the Natural History Museum (SCAN). Here we describe protocols for whole genome amplification of Schistosoma mansoni and Schistosoma haematobium miracidia from these cards, as well as real-time PCR quantification of the amplified schistosome DNA. The microgram quantities of DNA obtained were used for exome capture and sequencing of single miracidia, generating dense polymorphism data across the exome. These methods will facilitate the transition from population genetics, using limited numbers of markers, to population genomics, using genome-wide marker information, maximising the value of collections such as SCAN.
Cardiomyopathy develops in >90% of Duchenne muscular dystrophy (DMD) patients by the second decade of life. We assessed the associations between DMD gene mutations, as well as latent transforming growth factor-beta-binding protein 4 (LTBP4) haplotypes, and age at onset of myocardial dysfunction in DMD. DMD patients with baseline normal left ventricular systolic function and genotyping between 2004 and 2013 were included. Patients were grouped in multiple ways: specific DMD mutation domains, true loss-of-function mutations (group A) versus possible residual gene expression (group B), and LTBP4 haplotype. Age at onset of myocardial dysfunction was defined as the age at the first echocardiogram with an ejection fraction <55% and/or a shortening fraction <28%. Of 101 DMD patients, 40 developed cardiomyopathy. There was no difference in age at onset of myocardial dysfunction among DMD genotype mutation domains (13.7±4.8 versus 14.3±1.0 versus 14.3±2.9 versus 13.8±2.5 years, p=0.97), groups A and B (14.4±2.8 versus 12.1±4.4, p=0.09), or LTBP4 haplotypes (14.5±3.2 versus 13.1±3.2 versus 11.0±2.8, p=0.18). DMD gene mutations involving the hinge 3 region, actin-binding domain, and exons 45–49, as well as the LTBP4 IAAM haplotype, were not associated with age of left ventricular dysfunction onset in DMD.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
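The odds ratios here come from mixed models fitted to the pooled data, but the basic relation between an odds ratio and its Wald confidence interval can be illustrated on a simple 2x2 table. The counts below are hypothetical, not the study's data.

```python
import math

# Hypothetical 2x2 table: major depression classification by two interview types
#                 depressed   not depressed
mini_yes, mini_no = 40, 160          # MINI-interviewed participants (hypothetical)
cidi_yes, cidi_no = 25, 200          # CIDI-interviewed participants (hypothetical)

or_ = (mini_yes * cidi_no) / (mini_no * cidi_yes)                   # cross-product odds ratio
se_log_or = math.sqrt(1/mini_yes + 1/mini_no + 1/cidi_yes + 1/cidi_no)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```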
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant of Astra Zeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honorarium for giving lectures and providing consultancy and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; and decision to submit the manuscript for publication.
Inflammation of the mammary gland following bacterial infection, commonly known as mastitis, affects all mammalian species. Although the aetiology and epidemiology of mastitis in the dairy cow are well described, the genetic factors mediating resistance to mammary gland infection are not well known, due in part to the difficulty in obtaining robust phenotypic information from sufficiently large numbers of individuals. To address this problem, an experimental mammary gland infection study was undertaken using a Friesian-Jersey crossbred F2 herd. A total of 604 animals received an intramammary infusion of Streptococcus uberis in one gland, and the clinical response over 13 milkings was used for linkage mapping and genome-wide association analysis. A quantitative trait locus (QTL) was detected on bovine chromosome 11 for clinical mastitis status using microsatellite and Affymetrix 10 K SNP markers, and exome and genome sequence data from the six F1 sires of the experimental animals were then used to examine this region in more detail. A total of 485 sequence variants were typed in the QTL interval, and association mapping using these and an additional 37 986 genome-wide markers from the Illumina SNP50 bovine SNP panel revealed association with markers encompassing the interleukin-1 gene cluster locus. This study highlights a region on bovine chromosome 11, consistent with earlier studies, as conferring resistance to experimentally induced mammary gland infection, and newly prioritises the IL1 gene cluster for further analysis in genetic resistance to mastitis.
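Single-marker association tests of the kind used in such mapping are often simple regressions of case/control status on genotype dosage. The sketch below runs one such logistic test on simulated data; it is not the study's linkage or mixed-model pipeline, and the effect sizes and sample size are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Simulated genotype dosages (0/1/2 copies of the alternate allele) and a
# binary mastitis phenotype with a modest genotype effect
n_animals = 600
dosage = rng.binomial(2, 0.3, n_animals)
logit_p = -0.5 + 0.4 * dosage
phenotype = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Per-SNP logistic regression: phenotype ~ intercept + dosage
X = sm.add_constant(dosage.astype(float))
fit = sm.Logit(phenotype, X).fit(disp=0)
print(f"beta = {fit.params[1]:.3f}, p = {fit.pvalues[1]:.3g}")
```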
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
We hypothesized that a prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Over the study period, 794 patients met inclusion criteria. A MACE at 30 days was present in 10.7% (85/794) of patients, with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE were 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
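Those figures can be reproduced from the counts reported above: treating a non-low-risk modified HEART score as the ‘positive’ test, 80 of the 85 MACE patients were flagged and 259 of the 264 low-risk patients were event-free. A quick check of the arithmetic (exact confidence intervals aside):

```python
# Counts reported in the abstract
n_total, n_mace, n_low_risk, mace_in_low_risk = 794, 85, 264, 5

tp = n_mace - mace_in_low_risk        # MACE patients not classified low risk  = 80
fn = mace_in_low_risk                 # MACE patients classified low risk      = 5
tn = n_low_risk - mace_in_low_risk    # event-free patients classified low risk = 259

sensitivity = tp / (tp + fn)          # 80 / 85
npv = tn / (tn + fn)                  # 259 / 264
print(f"sensitivity = {sensitivity:.1%}, NPV = {npv:.1%}")   # 94.1%, 98.1%
```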
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.