The National Neuropsychology Network (NNN) is a multicenter clinical research initiative funded by the National Institute of Mental Health (NIMH; R01 MH118514) to facilitate neuropsychology’s transition to contemporary psychometric assessment methods with resultant improvement in test validation and assessment efficiency.
The NNN includes four clinical research sites (Emory University; Medical College of Wisconsin; University of California, Los Angeles (UCLA); University of Florida) and Pearson Clinical Assessment. Pearson Q-interactive (Q-i) is used for data capture for Pearson published tests; web-based data capture tools programmed by UCLA, which serves as the Coordinating Center, are employed for remaining measures.
NNN is acquiring item-level data from 500–10,000 patients across 47 widely used Neuropsychology (NP) tests and sharing these data via the NIMH Data Archive. Modern psychometric methods (e.g., item response theory) will specify the constructs measured by different tests and determine their positive/negative predictive power regarding diagnostic outcomes and relationships to other clinical, historical, and demographic factors. The Structured History Protocol for NP (SHiP-NP) helps standardize acquisition of relevant history and self-report data.
NNN is a proof-of-principle collaboration: by addressing logistical challenges, NNN aims to engage other clinics to create a national and ultimately an international network. The mature NNN will provide mechanisms for data aggregation enabling shared analysis and collaborative research. NNN promises ultimately to enable robust diagnostic inferences about neuropsychological test patterns and to promote the validation of novel adaptive assessment strategies that will be more efficient, more precise, and more sensitive to clinical contexts and individual/cultural differences.
Horseshoe crabs within Austrolimulidae represent the extreme limits to which the xiphosurid Bauplan could be modified. Recent interest in this group has uncovered an unprecedented diversity of these oddball xiphosurids and led to suggestions that Austrolimulidae arose during the Permian Period and had become extinct by the end of the Triassic Period. Here, we extend the temporal record of Austrolimulidae by documenting a new horseshoe crab from the Lower Jurassic (Hettangian) Bayreuth Formation, Franconiolimulus pochankei gen. et sp. nov. The novel specimen displays hypertrophied genal spines, a key feature indicative of Austrolimulidae, but does not show comparably prominent accentuation or reduction of other exoskeletal sections. In considering this family, we explore possible origins of, and explanations for, the bizarre morphologies exhibited by the Austrolimulidae and present hypotheses regarding the extinction of the group. Further examination of horseshoe crab fossils with unique features will undoubtedly continue to increase the known diversity and disparity of these curious xiphosurids.
A recent genome-wide association study (GWAS) identified 12 independent loci significantly associated with attention-deficit/hyperactivity disorder (ADHD). Polygenic risk scores (PRS), derived from the GWAS, can be used to assess genetic overlap between ADHD and other traits. Using ADHD samples from several international sites, we derived PRS for ADHD from the recent GWAS to test whether genetic variants that contribute to ADHD also influence two cognitive functions that show strong association with ADHD: attention regulation and response inhibition, captured by reaction time variability (RTV) and commission errors (CE).
The discovery GWAS included 19 099 ADHD cases and 34 194 control participants. The combined target sample included 845 people with ADHD (age: 8–40 years). RTV and CE were available from reaction time and response inhibition tasks. ADHD PRS were calculated from the GWAS using a leave-one-study-out approach. Regression analyses were run to investigate whether ADHD PRS were associated with CE and RTV. Results across sites were combined via random effect meta-analyses.
When combining the studies in meta-analyses, results were significant for RTV (R² = 0.011, β = 0.088, p = 0.02) but not for CE (R² = 0.011, β = 0.013, p = 0.732). No significant association was found between ADHD PRS and RTV or CE in any sample individually (p > 0.10).
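The site-level regression estimates were pooled with random-effects meta-analysis. As a minimal sketch of that pooling step (using the DerSimonian–Laird estimator, a common choice for random-effects models; the per-site betas and standard errors below are hypothetical illustrations, not the study's data):

```python
import math

def dersimonian_laird(betas, ses):
    """Pool per-site effect estimates with a DerSimonian-Laird
    random-effects meta-analysis; returns (pooled beta, SE, tau^2)."""
    w = [1.0 / se ** 2 for se in ses]                 # fixed-effect weights
    beta_fe = sum(wi * b for wi, b in zip(w, betas)) / sum(w)
    q = sum(wi * (b - beta_fe) ** 2 for wi, b in zip(w, betas))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(betas) - 1)) / c)       # between-site variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]     # random-effect weights
    beta_re = sum(wi * b for wi, b in zip(w_re, betas)) / sum(w_re)
    return beta_re, math.sqrt(1.0 / sum(w_re)), tau2

# Hypothetical per-site PRS->RTV estimates (standardised betas and SEs)
site_betas = [0.05, 0.12, 0.09, 0.02]
site_ses = [0.06, 0.05, 0.07, 0.08]
beta, se, tau2 = dersimonian_laird(site_betas, site_ses)
```

When between-site heterogeneity is absent (Q below its degrees of freedom), tau² truncates to zero and the estimator reduces to the fixed-effect pooled mean.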
We detected a significant association between PRS for ADHD and RTV (but not CE) in individuals with ADHD, suggesting that common genetic risk variants for ADHD influence attention regulation.
Little is known about practices used to disseminate findings to non-research, practitioner audiences. This study describes the perspectives, experience and activities of dissemination & implementation (D&I) scientists around disseminating their research findings.
The study explored D&I scientists’ experiences and recommendations for assessment of dissemination activities to non-research audiences. Existing listservs were used to recruit scientists. Respondents were asked three open-ended questions on an Internet survey about dissemination activities, recommendations for changing evaluation systems and suggestions to improve their own dissemination of their work.
Surveys were completed by 159 scientists reporting some training, funding and/or publication history in D&I. Three themes emerged across each of the three open-ended questions. Question 1 on evaluation generated the themes of: 1a) promotional review; 1b) funding requirements and 1c) lack of acknowledgement of dissemination activities. Question 2 on recommended changes generated the themes of: 2a) dissemination as a requirement of the academic promotion process; 2b) requirement of dissemination plan and 2c) dissemination metrics. Question 3 on personal changes to improve dissemination generated the themes of: 3a) allocation of resources for dissemination activities; 3b) emerging dissemination channels and 3c) identify and address issues of priority for stakeholders.
Our findings revealed different types of issues D&I scientists encounter when disseminating findings to clinical, public health or policy audiences and their suggestions to improve the process. Future research should consider key requirements which determine academic promotion and grant funding as an opportunity to expand dissemination efforts.
Childhood exposure to interpersonal violence (IPV) may be linked to distinct manifestations of mental illness, yet the nature of these alterations remains poorly understood. Network analysis can provide unique insights by contrasting the interrelatedness of symptoms underlying psychopathology across exposed and non-exposed youth, with potential clinical implications for a treatment-resistant population. We anticipated marked differences in symptom associations among IPV-exposed youth, particularly in terms of ‘hub’ symptoms holding outsized influence over the network, as well as the formation and influence of communities of highly interconnected symptoms.
Participants from a population-representative sample of youth (n = 4433; ages 11–18 years) completed a comprehensive structured clinical interview assessing mental health symptoms, diagnostic status, and history of violence exposure. Network analytic methods were used to model the pattern of associations between symptoms, quantify differences across diagnosed youth with (IPV+) and without (IPV–) IPV exposure, and identify transdiagnostic ‘bridge’ symptoms linking multiple disorders.
Symptoms organized into six ‘disorder’ communities (e.g. Intrusive Thoughts/Sensations, Depression, Anxiety) that exhibited considerably greater interconnectivity in IPV+ youth. Five symptoms emerged in IPV+ youth as highly trafficked ‘bridges’ between symptom communities (compared with 11 in IPV– youth).
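Bridge symptoms are typically identified with centrality measures that count a symptom's connections to communities other than its own. A minimal sketch of one such measure (bridge strength) on a toy network; the symptom names, community assignments, and edge weights below are invented for illustration and are not the study's data:

```python
def bridge_strength(edges, membership):
    """Bridge strength: for each symptom, the summed weight of its edges
    to symptoms outside its own community."""
    scores = {node: 0.0 for node in membership}
    for a, b, w in edges:
        if membership[a] != membership[b]:   # cross-community edge
            scores[a] += w
            scores[b] += w
    return scores

# Toy network: two hypothetical communities linked by one bridge edge
membership = {"sad_mood": "depression", "anhedonia": "depression",
              "worry": "anxiety", "restless": "anxiety"}
edges = [("sad_mood", "anhedonia", 0.6),
         ("worry", "restless", 0.5),
         ("sad_mood", "worry", 0.3)]         # the bridge
scores = bridge_strength(edges, membership)
# 'sad_mood' and 'worry' carry all of the cross-community strength
```

In practice the edge weights would come from a regularised partial-correlation network and the communities from a detection algorithm, but the bridge logic is the same.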
IPV exposure may alter mutually reinforcing symptom co-occurrence in youth, thus contributing to greater psychiatric comorbidity and treatment resistance. The presence of a condensed and unique set of bridge symptoms suggests trauma-enriched nodes which could be therapeutically targeted to improve outcomes in violence-exposed youth.
Coronavirus disease 2019 (COVID-19) has migrated to regions that were initially spared, and it is likely that different populations are currently at risk for illness. Herein, we present our observations of the change in characteristics and resource use of COVID-19 patients over time in a national system of community hospitals to help inform those managing surge planning, operational management, and future policy decisions.
To determine risk factors for mortality among COVID-19 patients admitted to a system of community hospitals in the United States.
Retrospective analysis of patient data collected from the routine care of COVID-19 patients.
System of >180 acute-care facilities in the United States.
All admitted patients with positive identification of COVID-19 and a documented discharge as of May 12, 2020.
Determination of demographic characteristics, vital signs at admission, patient comorbidities, and recorded discharge disposition in this population to construct a logistic regression estimating the odds of mortality, particularly for those patients characterized as not critically ill at admission.
In total, 6,180 COVID-19+ patients were identified as of May 12, 2020. Most COVID-19+ patients (4,808, 77.8%) were admitted directly to a medical-surgical unit with no documented critical care or mechanical ventilation within 8 hours of admission. After adjusting for demographic characteristics, comorbidities, and vital signs at admission in this subgroup, the largest driver of the odds of mortality was patient age (OR, 1.07; 95% CI, 1.06–1.08; P < .001). Decreased oxygen saturation at admission was associated with increased odds of mortality (OR, 1.09; 95% CI, 1.06–1.12; P < .001) as was diabetes (OR, 1.57; 95% CI, 1.21–2.03; P < .001).
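The odds ratios above come from exponentiating logistic-regression coefficients. A minimal sketch of that conversion with a Wald confidence interval; the coefficient and standard error below are illustrative values chosen to mirror the form of the reported per-year age OR, not the study's fitted model:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Turn a logistic-regression coefficient and its standard error into
    an odds ratio with a Wald 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative only: a per-year age coefficient of ~0.068 corresponds to
# an OR of ~1.07 per year of age, the same form as the result above.
or_age, lo, hi = odds_ratio_ci(0.068, 0.005)
print(f"OR {or_age:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # OR 1.07 (95% CI 1.06-1.08)
```

The same transform applies to any covariate in the model: the OR for a binary comorbidity such as diabetes is exp of its coefficient, with the CI bounds exponentiated likewise.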
The identification of factors observable at admission that are associated with mortality in COVID-19 patients who are initially admitted to non-critical care units may help care providers, hospital epidemiologists, and hospital safety experts better plan for the care of these patients.
The pharmacotherapy of epilepsy is a complex process guided by evidence-based research and clinical experience. Some patients achieve seizure freedom upon treatment with the first anti-seizure medication (ASM) prescribed, whereas others may be treated with two or three medications before one (or a combination) is found that reduces seizure frequency and/or severity with minimal side effects. Many patients demonstrate a partial response to treatment, leading to reduced seizure frequency and/or severity, but do not become completely seizure free. It is often stated that ~30% of epilepsy patients have seizures that cannot be controlled pharmacologically, and these patients are defined as having medication-resistant epilepsy (MRE). The International League Against Epilepsy (ILAE) published the following definition of MRE: ‘drug resistant epilepsy may be defined as failure of adequate trials of two tolerated and appropriately chosen and used ASM schedules (whether as monotherapies or in combination) to achieve sustained seizure freedom’. Treatment success or sustained seizure freedom is defined as one year without seizures or three times the inter-seizure interval (whichever is longer). The ILAE definition provides a useful standard from which to work, and MRE can be clinically identified in patients that fail to achieve seizure freedom after multiple ASM trials. However, the ILAE definition of successful treatment does not account for partial response to pharmacotherapy. Indeed, many partial responders have improved quality of life, even if they are not seizure-free for one year or more.
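The ILAE "whichever is longer" criterion for sustained seizure freedom can be stated compactly. A minimal sketch (expressing both quantities in days is an assumption for illustration):

```python
def seizure_freedom_threshold(inter_seizure_interval_days):
    """ILAE criterion: sustained seizure freedom is the longer of one year
    or three times the pre-treatment inter-seizure interval."""
    return max(365, 3 * inter_seizure_interval_days)

# A patient seizing monthly must be seizure-free for a full year,
# while one seizing every ~6 months must reach 3 x 182 = 546 days.
threshold_monthly = seizure_freedom_threshold(30)    # 365 days
threshold_biannual = seizure_freedom_threshold(182)  # 546 days
```

The three-fold multiplier matters for patients with infrequent seizures, for whom a single seizure-free year would not be statistically distinguishable from their baseline pattern.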
Early administration of blood products to patients with hemorrhagic shock has a positive impact on morbidity and mortality. Smaller hospitals may have a limited supply of blood, and air medical systems may not carry it. The primary objective was to quantify the number of patients meeting established physiologic criteria for blood product administration and to identify which patients received blood and which did not because of a lack of local availability.
Electronic patient care records were used to identify a retrospective cohort of patients undergoing emergent air medical transport in Ontario, Canada, who were likely to require blood. Presenting problems associated with blood product administration were identified. Physiologic data were extracted, with transfusion criteria used to identify patients in whom blood product administration was indicated.
There were 11,520 emergent patient transports during the study period, with 842 (7.3%) in which blood product administration was considered. Of these, 290 met established physiologic criteria for blood products; 167 received blood, of whom 57 received it at a hospital with a limited supply. The mean number of units administered per patient was 3.5. The remaining 123 patients meeting criteria did not receive blood products because none were available.
Indications for blood product administration were present in 2.5% of patients undergoing time-sensitive air medical transport. Air medical services can improve access to this potentially lifesaving therapy for patients with hemorrhagic shock by carrying blood products, because blood is unavailable or in limited supply locally for the majority of patients in whom it is indicated.
The role of air medical and land-based critical care transport services is not always clear to traditional emergency medical service providers or hospital-based health care practitioners. Some of this confusion is historical, dating to when air medical services were in their infancy and their role within the broader health care system was limited. Despite their evolution within the regionalized health care system, some myths remain regarding air medical services in Canada. Our goal is to clarify several commonly held but erroneous beliefs regarding the role, impact, and practices of air medical transport.
Catatonia is a psychomotor dysregulation syndrome of diverse aetiology, increasingly recognised as a prominent feature of N-methyl-d-aspartate receptor antibody encephalitis (NMDARE) in adults. No study to date has systematically assessed the prevalence and symptomatology of catatonia in children with NMDARE. We analysed 57 paediatric patients with NMDARE from the literature using the Bush-Francis Catatonia Rating Scale. Catatonia was common (occurring in 86% of patients), manifesting as complex clusters of positive and negative features within individual patients. It was both underrecognised and undertreated. Immunotherapy was the only effective intervention, highlighting the importance of prompt recognition and treatment of the underlying cause of catatonia.
We present English translations of two French documents to show that the main reason for the rejection of Semmelweis's theory of the cause of childbed (puerperal) fever was that his proof relied on the post hoc ergo propter hoc fallacy, not that Joseph Skoda referred only to cadaveric particles as the cause in his lecture on Semmelweis's discovery to the Academy of Science. Friedrich Wieger (1821–1890), an obstetrician from Strasbourg, published an accurate account of Semmelweis's theory six months before Skoda's lecture and reported a case in which the causative agent originated from a source other than cadavers. Wieger also presented data showing that chlorine hand disinfection reduced the annual maternal mortality rate (MMR) from childbed fever from more than 7 per cent for the years 1840–1846 to 1.27 per cent in 1848, the first full year in which chlorine hand disinfection was practised. An editorial in the Gazette médicale de Paris, however, rejected the data as proof of the effectiveness of chlorine hand disinfection, arguing that the fact that the MMR fell after chlorine hand disinfection was implemented did not mean that this innovation had caused the fall. This previously unrecognized objection to Semmelweis's proof was also the reason why Semmelweis's chief rejected his evidence.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
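The weighted root-mean-square residual quoted per pulsar weights each timing residual by the inverse variance of its arrival-time uncertainty. A minimal sketch of that statistic (the residuals and TOA uncertainties below are hypothetical values, not PPTA data):

```python
import math

def weighted_rms(residuals, sigmas):
    """Weighted RMS of timing residuals, weighting each residual by the
    inverse variance of its time-of-arrival (TOA) uncertainty."""
    w = [1.0 / s ** 2 for s in sigmas]
    mean = sum(wi * r for wi, r in zip(w, residuals)) / sum(w)  # weighted mean
    var = sum(wi * (r - mean) ** 2 for wi, r in zip(w, residuals)) / sum(w)
    return math.sqrt(var)

# Hypothetical residuals and TOA uncertainties, both in microseconds
res_us = [0.8, -0.5, 0.2, -0.1]
sig_us = [0.3, 0.2, 0.1, 0.1]
wrms = weighted_rms(res_us, sig_us)
```

Inverse-variance weighting keeps a handful of poorly constrained TOAs from dominating the quoted residual, which is why it is the conventional summary for timing-array data quality.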
The deviation from thermodynamic equilibrium of the ion velocity distribution functions (VDFs), as measured by the Magnetospheric Multiscale (MMS) mission in the Earth’s turbulent magnetosheath, is quantitatively investigated. Making use of the unprecedented high-resolution MMS ion data, and together with Vlasov–Maxwell simulations, this analysis aims at investigating the relationship between deviation from Maxwellian equilibrium and typical plasma parameters. Correlations of the non-Maxwellian features with plasma quantities such as electric fields, ion temperature, current density and ion vorticity are found to be similar in magnetosheath data and numerical experiments, with a poor correlation between distortions of ion VDFs and current density, evidence that questions the occurrence of VDF departure from Maxwellian at the current density peaks. Moreover, strong correlation has been observed with the magnitude of the electric field in the turbulent magnetosheath, while a certain degree of correlation has been found in the numerical simulations and during a magnetopause crossing by MMS. This work could help shed light on the influence of electrostatic waves on the distortion of the ion VDFs in space turbulent plasmas.
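Deviation from Maxwellian equilibrium is commonly quantified by comparing a measured VDF against the Maxwellian that shares its density, bulk velocity, and temperature. A one-dimensional sketch of such an epsilon-style measure (the grid, test VDFs, and normalisation here are illustrative assumptions, not the MMS analysis pipeline):

```python
import math

def maxwellian(v, n, u, vth):
    """1-D Maxwellian with density n, bulk speed u and thermal speed vth."""
    return n / (math.sqrt(math.pi) * vth) * math.exp(-((v - u) / vth) ** 2)

def non_maxwellianity(v_grid, f):
    """Scalar deviation of a 1-D VDF from the Maxwellian sharing its
    density, bulk speed and temperature (an epsilon-style measure)."""
    dv = v_grid[1] - v_grid[0]
    n = sum(f) * dv                                            # density moment
    u = sum(vi * fi for vi, fi in zip(v_grid, f)) * dv / n     # bulk speed
    vth = math.sqrt(2.0 * sum((vi - u) ** 2 * fi
                              for vi, fi in zip(v_grid, f)) * dv / n)
    f_m = [maxwellian(vi, n, u, vth) for vi in v_grid]
    return math.sqrt(sum((fi - gi) ** 2 for fi, gi in zip(f, f_m)) * dv) / n

# An equilibrium VDF gives epsilon ~ 0; a two-beam VDF does not
v = [0.05 * i - 5.0 for i in range(201)]
f_eq = [maxwellian(vi, 1.0, 0.0, 1.0) for vi in v]
f_beams = [0.5 * (maxwellian(vi, 1.0, -1.5, 0.7)
                  + maxwellian(vi, 1.0, 1.5, 0.7)) for vi in v]
```

The real measurement is three-dimensional and the associated Maxwellian is built from the full velocity moments, but the comparison principle is the same.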
Giant miscanthus has the potential to move beyond cultivated fields and invade noncrop areas, but this can be overshadowed by aesthetic appeal and monetary value as a biofuel crop. Most research on giant miscanthus has focused on herbicide tolerance for establishment and production rather than terminating an existing stand. This study was conducted to evaluate herbicide options for controlling or terminating a stand of giant miscanthus. In 2013 and 2014, field experiments were conducted on established stands of the giant miscanthus cultivars ‘Nagara’ and ‘Freedom.’ Herbicides evaluated in both years included glyphosate, hexazinone, imazapic, imazapyr, clethodim, fluazifop, and glyphosate plus fluazifop. All treatments were applied in summer (June or July) and September. For both years, biomass reduction ranged from 85% to 100% when glyphosate was applied in June or July at 4.5 or 7.3 kg ae ha−1. No other treatment applied at this timing provided more than 50% giant miscanthus biomass reduction 1 yr after application. September applications of glyphosate were not consistent: treatments in 2013 reduced biomass by 40% or less, whereas in 2014 all rates provided at least 78% biomass reduction. Glyphosate applied in June or July was the only treatment that provided effective and consistent control of giant miscanthus 1 yr after treatment.
The evolution of agriculture improved food security and enabled significant increases in the size and complexity of human groups. Despite these positive effects, some societies never adopted these practices, became only partially reliant on them, or even reverted to foraging after temporarily adopting them. Given the critical importance of climate and biotic interactions for modern agriculture, it seems likely that ecological conditions could have played a major role in determining the degree to which different societies adopted farming. However, this seemingly simple proposition has been surprisingly difficult to prove and is currently controversial. Here, we investigate how recent agricultural practices relate both to contemporary ecological opportunities and the suitability of local environments for the first species domesticated by humans. Leveraging a globally distributed dataset on 1,291 traditional societies, we show that after accounting for the effects of cultural transmission and more current ecological opportunities, levels of reliance on farming continue to be predicted by the opportunities local ecologies provided to the first human domesticates even after centuries of cultural evolution. Based on the details of our models, we conclude that ecology probably helped shape the geography of agriculture by biasing both human movement and the human-assisted dispersal of domesticates.
Considerable progress in explaining cultural evolutionary dynamics has been made by applying rigorous models from the natural sciences to historical and ethnographic information collected and accessed using novel digital platforms. Initial results have clarified several long-standing debates in cultural evolutionary studies, such as population origins, the role of religion in the evolution of complex societies and the factors that shape global patterns of language diversity. However, future progress requires recognition of the unique challenges posed by cultural data. To address these challenges, standards for data collection, organisation and analysis must be improved and widely adopted. Here, we describe some major challenges to progress in the construction of large comparative databases of cultural history, including recognising the critical role of theory, selecting appropriate units of analysis, data gathering and sampling strategies, winning expert buy-in, achieving reliability and reproducibility in coding, and ensuring interoperability and sustainability of the resulting databases. We conclude by proposing a set of practical guidelines to meet these challenges.