Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r² = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h² ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure generalizes across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
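The heritability estimates above come from formal quantitative genetic twin models; as a rough illustration of the logic behind them, Falconer's formula approximates h² from the gap between monozygotic and dizygotic co-twin correlations. The sketch below is a simplified stand-in for the study's models, with hypothetical variable names and simulated data.

```python
# Falconer's formula: h2 = 2 * (r_MZ - r_DZ). A simplified illustration,
# not the authors' analysis; inputs are (n, 2) arrays of co-twin scores
# on one RSB factor.
import numpy as np

def falconer_h2(mz_pairs: np.ndarray, dz_pairs: np.ndarray) -> float:
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz)

# Toy data: MZ twins share all of the additive genetic effect g,
# DZ twins share half of its variance.
rng = np.random.default_rng(0)
n = 500
g = rng.normal(size=(n, 1))
mz = g + 0.7 * rng.normal(size=(n, 2))
dz = (np.sqrt(0.5) * g + np.sqrt(0.5) * rng.normal(size=(n, 2))
      + 0.7 * rng.normal(size=(n, 2)))
print(f"h2 ~ {falconer_h2(mz, dz):.2f}")   # roughly 0.67 by construction
```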
OBJECTIVES/GOALS: We sought to examine: 1) variability in center acceptance patterns for heart allografts offered to the highest-priority candidates, 2) the impact of this acceptance behavior on candidate survival, and 3) post-transplantation outcomes in candidates who accepted a first-rank offer vs. a previously declined offer. METHODS/STUDY POPULATION: In this retrospective cohort study, the US national transplant registry was queried for all match runs of adult candidates listed for isolated heart transplantation between 2007 and 2017. We examined center acceptance rates for heart allografts offered to the highest-priority candidates and accounted for covariates in multivariable logistic regression. Competing risks analysis was performed to assess the relationship between center acceptance rate and waitlist mortality. Post-transplantation outcomes (patient survival and graft failure) between candidates who accepted their first-rank offers vs those who accepted previously declined offers were compared using the Fine-Gray subdistribution hazards model. RESULTS/ANTICIPATED RESULTS: Among 19,703 unique organ offers, 6,302 (32%) were accepted for first-ranked candidates. After adjustment for donor, recipient, and geographic covariates, transplant centers varied markedly in acceptance rates (12%-62%) of offers made to first-ranked candidates. Centers with the lowest acceptance rates (<25%) were associated with the highest cumulative incidence of waitlist mortality. For every 10% increase in adjusted center acceptance rate, waitlist mortality risk decreased by 27% (SHR 0.73, 95% CI 0.67-0.80). No significant difference was observed in 5-year adjusted post-Tx survival or graft failure between hearts accepted at the first-rank vs lower-rank positions. DISCUSSION/SIGNIFICANCE OF IMPACT: Wide variability in heart acceptance rates exists among centers, with candidates listed at low acceptance rate centers more likely to die waiting. Similar post-Tx survival suggests previously declined allografts function as well as those accepted at first offer. Center-level decision-making is a modifiable behavior associated with waitlist mortality.
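The cumulative incidence side of a competing-risks analysis like this one can be sketched with the Aalen-Johansen estimator; the Fine-Gray subdistribution hazards model itself is typically fit with specialised packages (e.g., R's cmprsk). A minimal sketch with toy data and hypothetical column names, not the authors' code:

```python
# Cumulative incidence of waitlist death with transplantation as a
# competing event. Event coding is an assumption: 0 = censored,
# 1 = waitlist death, 2 = transplanted.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "days_waiting": [30, 120, 45, 200, 15, 90],   # toy durations
    "event":        [1,   2,   0,  1,   2,  0],
})

ajf = AalenJohansenFitter()
ajf.fit(df["days_waiting"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)   # cumulative incidence of waitlist death
```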
Governments and conservation organizations worldwide are motivated to manage invasive species because of the quantified and perceived negative ecological and economic impacts those species impose. Thus, determining which species cause significant negative impacts, and clearly articulating those impacts, is critical to meeting conservation priorities. Determining which species warrant management can be straightforward when there are clear negative impacts, such as dramatic reductions in native diversity. However, the majority of changes to ecosystem pools and fluxes cannot be readily categorized as ecologically negative or positive (e.g., lower soil pH). Additionally, diverse stakeholders may not all agree that a given impact is negative. This complexity challenges our ability to simply and uniformly determine which species cause negative impact, and thus which species merit management, especially as we expand invader impacts to encompass a more holistic ecosystem perspective beyond biodiversity and consider stakeholder perspectives and priorities. Thus, we suggest that impact be evaluated in a context dictated by governing policies or conservation/land management missions, with the support of scientists. In other words, within each jurisdiction, populations are identified as causing negative impact based on the hierarchical governing policies and mission of that parcel. Framing negative impact in a management context has the advantages of (1) easily scaling from individual landscapes to geopolitical states, (2) better representing how managers operate in practice, (3) reflecting that invasive species are spatially contextual rather than universal, and (4) allowing for flexibility with dynamic ecosystems undergoing global change. We hope that framing negative impact in an applied context aids management prioritization and the achievement of conservation goals.
Staff members of psychiatric facilities are at high risk of secondhand smoke exposure. Smoking exposure was assessed in 41 nonsmoking employees of a psychiatry department before and after a smoking ban. Subjective exposure measures decreased in 76% of the subjects. Salivary cotinine decreased in the subsample of seven subjects with high pre-ban levels (32 ± 8 vs 40 ± 17 ng/ml post- vs pre-ban, p = .045).
Online learning has become an increasingly expected and popular component of education for the modern-day adult learner, including the medical provider. In light of the recent coronavirus pandemic, there has never been more urgency to establish opportunities for supplemental online learning. Heart University aims to be “the go-to online resource” for e-learning in CHD and paediatric-acquired heart disease. It is a carefully curated open access library of pedagogical material for all providers of care to children and adults with CHD or children with acquired heart disease, whether a trainee or a practising provider. In this manuscript, we review the aims, development, current offerings and standing, and future goals of Heart University.
The aim of the current study was to replicate, in a sample of adolescents and young adults, findings in adults indicating that higher sensitivity to stressful events is predictive of both the onset and the persistence of psychopathological symptoms. In addition, we tested the hypothesis that sensitivity to mild stressors in particular is predictive of the developmental course of psychopathology.
We analyzed experience sampling and questionnaire data collected at baseline and one-year follow-up of 445 adolescent and young adult twins and non-twin siblings (age range: 15–34). Linear multilevel regression was used for the replication analyses. To test if affective sensitivity to mild stressors in particular was associated with follow-up symptoms, we used a categorical approach adding variables on affective sensitivity to mild, moderate and severe daily stressors to the model.
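A minimal sketch of the categorical multilevel model described above, assuming hypothetical column names, simulated data, and a random intercept per family to account for twin/sibling clustering; the authors' actual specification may differ:

```python
# Mixed-effects regression of follow-up symptoms on affective sensitivity
# to mild, moderate, and severe daily stressors (all names illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "family_id": np.repeat(np.arange(100), 2),   # siblings nested in families
    "sens_mild": rng.normal(size=n),
    "sens_moderate": rng.normal(size=n),
    "sens_severe": rng.normal(size=n),
    "symptoms_baseline": rng.normal(size=n),
    "mean_na": rng.normal(size=n),               # average negative affect
})
df["symptoms_followup"] = (0.5 * df["symptoms_baseline"]
                           + 0.25 * df["sens_mild"]
                           + rng.normal(scale=0.5, size=n))

model = smf.mixedlm(
    "symptoms_followup ~ sens_mild + sens_moderate + sens_severe"
    " + symptoms_baseline + mean_na",
    data=df,
    groups=df["family_id"],   # random intercept per family
)
print(model.fit().summary())
```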
Linear analyses showed that emotional stress reactivity was not associated with onset (β = .02; P = .56) or persistence (β = -.01; P = .78) of symptoms. There was a significant effect of baseline symptom score (β = .53; P < .001) and average negative affect (NA: β = .19; P < .001) on follow-up symptoms. Using the categorical approach, we found that affective sensitivity to mild (β = .25; P < .001), but not moderate (β = -.03; P = .65) or severe (β = -.06; P = .42), stressors was associated with symptom persistence one year later.
We were unable to replicate previous findings relating stress sensitivity linearly to symptom onset or persistence in a younger sample. Whereas sensitivity to more severe stressors may reflect adaptive coping, high sensitivity to the mildest of daily stressors may indicate an increased risk for psychopathology.
Mechanistic models (MMs) have served as causal pathway analysis and ‘decision-support’ tools within animal production systems for decades. Such models quantitatively define how a biological system works based on causal relationships and use that cumulative biological knowledge to generate predictions and recommendations (in practice) and to generate and evaluate hypotheses (in research). Their limitations revolve around obtaining sufficiently accurate inputs, user training, and the accuracy/precision of predictions on-farm. The new wave in digitalization technologies may negate some of these challenges. New data-driven (DD) modelling methods such as machine learning (ML) and deep learning (DL) examine patterns in data to produce accurate predictions (forecasting, classification of animals, etc.). The deluge of sensor data and new self-learning modelling techniques may address some of the limitations of traditional MM approaches – access to input data (e.g. sensors) and on-farm calibration. However, most of these new methods lack transparency in the reasoning behind predictions, in contrast to MM that have historically been used to translate knowledge into wisdom. The objective of this paper is to propose means to hybridize these two seemingly divergent methodologies to advance the models we use in animal production systems and support movement towards truly knowledge-based precision agriculture. To identify potential niches for models in animal production of the future, a cross-species (dairy, swine and poultry) examination of the current state of the art in MM and new DD methodologies (ML, DL analytics) is undertaken. We hypothesize that there are several ways in which synergy may be achieved to advance both our predictive capabilities and system understanding, namely: (1) building and utilizing data streams (e.g. intake, rumination behaviour, rumen sensors, activity sensors, environmental sensors, cameras and near IR) to apply MM in real-time and/or with new resolution and capabilities; (2) hybridization of MM and DD approaches where, for example, a ML framework is augmented by MM-generated parameters or predicted outcomes; and (3) hybridization of the MM and DD approaches, where biological bounds are placed on parameters within a MM framework, and the DD system parameterizes the MM for individual animals, farms or other such clusters of data. As animal systems modellers, we should expand our toolbox to explore new DD approaches and big data to find opportunities to increase understanding of biological systems, find new patterns in data and move the field towards intelligent, knowledge-based precision agriculture systems.
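As a concrete illustration of hybridization option (3), the sketch below has a data-driven model predict a growth-rate parameter for an individual animal, clips it to a plausible biological range, and then runs a mechanistic growth model forward. The Gompertz curve, the feature set, and all parameter values are assumptions for illustration, not taken from the paper.

```python
# Hybrid MM/DD sketch: a random forest (DD) parameterizes a Gompertz
# growth curve (MM) per animal, with biological bounds on the parameter.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def gompertz(t, w_max, b, k):
    """Mechanistic model: body weight (kg) at time t (days)."""
    return w_max * np.exp(-b * np.exp(-k * t))

rng = np.random.default_rng(42)

# DD step: learn to predict the animal-specific growth-rate parameter k
# from sensor-derived features (e.g., intake, activity). In practice,
# X_train/k_train would come from historical per-animal fits of the MM.
X_train = rng.random((200, 3))            # toy sensor features
k_train = 0.02 + 0.02 * X_train[:, 0]     # toy targets
dd_model = RandomForestRegressor(n_estimators=100, random_state=0)
dd_model.fit(X_train, k_train)

# Hybrid prediction for a new animal: clip k to a plausible biological
# range, then run the mechanistic model forward in time.
k_new = float(np.clip(dd_model.predict(rng.random((1, 3))), 0.01, 0.05)[0])
weights = gompertz(np.arange(365.0), w_max=700.0, b=4.0, k=k_new)
print(weights[[0, 180, 364]])             # weight at days 0, 180, 364
```

Option (2) simply reverses the arrow: MM outputs (e.g., a predicted nutrient balance) become additional input features for the ML model.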
Cover crop residue can act as a mulch that suppresses weeds, but as the residue degrades, weed suppression diminishes. Biomass of cover crop residue is positively correlated with weed suppression, but little research is available regarding the composition of cover crop residue and its effect on weed suppression. Field experiments were conducted to determine the impact of cover crop residue properties (i.e., total carbon, total nitrogen, lignin, cellulose, and hemicellulose) on summer annual weed suppression and cash crop yield. Cover crop monocultures and mixtures were planted in the fall and designed to provide a range of biomass and residue properties. Cover crops were followed by corn (Zea mays L.) or soybean [Glycine max (L.) Merr.]. At termination, cover crop biomass and residue components were determined. Biomass ranged from 3,640 to 8,750 kg ha⁻¹, and the carbon-to-nitrogen (C:N) ratio ranged from 12:1 to 36:1. As both cover crop biomass and C:N ratio increased, weed suppression and the duration of suppression increased. For example, a C:N ratio of 9:1 is needed to suppress redroot pigweed (Amaranthus retroflexus L.) by 50% at 4 wk after termination (WAT), and that increases to 16:1 and 20:1 to achieve 50% suppression at 6 and 8 WAT, respectively. Similarly, with biomass, 2,800 kg ha⁻¹ is needed for 50% A. retroflexus suppression at 4 WAT, which increases to 5,280 and 6,610 kg ha⁻¹ for 50% suppression at 6 and 8 WAT, respectively. In general, similar trends were observed for pitted morningglory (Ipomoea lacunosa L.) and large crabgrass [Digitaria sanguinalis (L.) Scop.]. Corn and soybean yield increased as both cover crop biomass and C:N ratio increased where no weed control measures were implemented beyond the cover crop. The same trend was observed with cash crop yield in the weed-free subblocks, with one exception. This research indicates that cover crop residue composition, in addition to biomass, is important for weed control.
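Thresholds like "biomass needed for 50% suppression at a given week" are typically read off a fitted dose-response curve. The sketch below assumes a logistic functional form and toy data; the study's actual regression may differ.

```python
# Fit a logistic suppression-vs-biomass curve and invert it at 50%.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, top, ec50, slope):
    """Weed suppression (%) as a function of residue biomass (kg/ha)."""
    return top / (1.0 + np.exp(-slope * (x - ec50)))

biomass = np.array([1000, 2000, 3000, 4000, 6000, 8000], dtype=float)
suppression = np.array([10, 30, 55, 70, 88, 95], dtype=float)  # toy data

(top, ec50, slope), _ = curve_fit(logistic, biomass, suppression,
                                  p0=[100.0, 3000.0, 0.001])

# Solve logistic(x) = 50 for x:
x50 = ec50 - np.log(top / 50.0 - 1.0) / slope
print(f"~{x50:.0f} kg/ha of residue for 50% suppression")
```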
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. All together, these science programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
To develop a physiological data-driven model for early identification of impending cardiac arrest in neonates and infants with cardiac disease hospitalised in the cardiovascular ICU.
We performed a single-institution retrospective cohort study (11 January 2013–16 September 2015) of patients ≤1 year old with cardiac disease who were hospitalised in the cardiovascular ICU at a tertiary care children’s hospital. Demographics and diagnostic codes of cardiac arrest were obtained via the electronic health record. Diagnosis of cardiac arrest was validated by expert clinician review. Minute-to-minute physiological monitoring data were recorded via bedside monitors. A generalized linear model was used to compute a minute-by-minute risk score. Training and test data sets both included data from patients who did and did not develop cardiac arrest. An optimal risk-score threshold was derived based on the model’s discriminatory capacity for impending arrest versus non-arrest. Model performance measures included sensitivity, specificity, accuracy, likelihood ratios, and post-test probability of arrest.
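A minimal sketch of this pipeline, assuming a logistic GLM and a Youden-style threshold (the abstract does not name the exact criterion); the features, data, and wiring of the post-test probability calculation are illustrative only:

```python
# GLM-based minute-by-minute risk score with a ROC-derived threshold,
# plus a post-test probability computed via the positive likelihood ratio.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 3))          # toy minute-level vitals features
y = (X[:, 0] + rng.normal(size=5000) > 1.5).astype(int)  # 1 = pre-arrest window

glm = LogisticRegression().fit(X, y)    # the generalized linear model
scores = glm.predict_proba(X)[:, 1]     # minute-by-minute risk score

fpr, tpr, thresholds = roc_curve(y, scores)
best = np.argmax(tpr - fpr)             # Youden's J = sens + spec - 1
threshold = thresholds[best]

# Post-test probability of arrest given a positive score; the pre-test
# probability here echoes the abstract's 9.6% figure.
sens, spec, pretest = tpr[best], 1 - fpr[best], 0.096
lr_pos = sens / max(1.0 - spec, 1e-9)
post_odds = pretest / (1 - pretest) * lr_pos
print(f"threshold={threshold:.3f}, post-test p={post_odds / (1 + post_odds):.3f}")
```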
The final model, consisting of multiple clinical parameters, was able to identify impending cardiac arrest at least 2 hours prior to the event with an overall accuracy of 75% (sensitivity = 61%, specificity = 80%), increasing the probability of detecting cardiac arrest from a pre-test probability of 9.6% to a post-test probability of 21.2%.
Our findings demonstrate that a predictive model using physiologic monitoring data in neonates and infants with cardiac disease hospitalised in the paediatric cardiovascular ICU can identify impending cardiac arrest on average 17 hours prior to arrest.
Angiostrongylus cantonensis is a pathogenic nematode and the cause of neuroangiostrongyliasis, an eosinophilic meningitis more commonly known as rat lungworm disease. Transmission is thought to be primarily due to ingestion of infective third stage larvae (L3) in gastropods, on produce, or in contaminated water. The gold standard to determine the effects of physical and chemical treatments on the infectivity of A. cantonensis L3 larvae is to infect rodents with treated L3 larvae and monitor for infection, but animal studies are laborious and expensive and also raise ethical concerns. This study demonstrates propidium iodide (PI) to be a reliable marker of parasite death and loss of infective potential without adversely affecting the development and future reproduction of live A. cantonensis larvae. PI staining allows evaluation of the efficacy of test substances in vitro, an improvement upon the use of lack of motility as an indicator of death. Some potential applications of this assay include determining the effectiveness of various anthelmintics, vegetable washes, electromagnetic radiation and other treatments intended to kill larvae in the prevention and treatment of neuroangiostrongyliasis.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck/arm pain post-operatively in CSM. Methods: This ambispective study included 402 patients identified through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the Neck Disability Index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6 ± 2.9 to 3.8 ± 2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8 ± 2.9 to 3.5 ± 3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significant proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% of those with mild/no pain worsened to moderate/severe pain (P < 0.001). At 12 months, 17.4% of those with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
Horseweed is a problematic weed that is difficult to control, especially in no-tillage production. Increasing cases of herbicide resistance have exacerbated the problem, necessitating alternative control options and an integrated weed management approach. Field experiments were conducted to evaluate horseweed suppression from fall-planted cover crop monocultures and mixtures as well as two fall-applied residual herbicide treatments. Prior to cover crop termination, horseweed density was reduced by 88% to 96% by cover crops. At cover crop termination in late spring, cereal rye biomass was 7,671 kg ha⁻¹, which was similar to cereal rye–containing mixtures (7,720 kg ha⁻¹) but greater than legumes in monoculture (3,335 kg ha⁻¹). After cover crops were terminated in late spring using a roller crimper, corn and soybeans were planted, and horseweed was evaluated using density counts, visible ratings, and biomass collection until harvest. Forage radish winterkilled, offering no competition in late winter and no biomass to contribute to horseweed suppression after termination. Excluding forage radish in monoculture, no difference in horseweed suppression was detected between cereal rye–containing cover crops and legumes (crimson clover and hairy vetch) in monoculture. Likewise, horseweed suppression was similar between monocultures and mixtures, with the exception of one site-year in which mixtures provided better suppression. In this experiment, the cover crop treatments performed as well as or better than the fall-applied residual herbicides, flumioxazin + paraquat and metribuzin + chlorimuron-ethyl. These results indicate that fall-planted cover crops are a viable option to suppress horseweed and can be an effective part of an integrated weed management program. Furthermore, cover crop mixtures can be used to gain the benefits of legume or brassica cover crop species without sacrificing horseweed suppression.
OBJECTIVES/SPECIFIC AIMS: The purpose of the study was to describe patient characteristics associated with subsequent development of bowel ischemia. Primary outcomes were survival to discharge and 30-day and 1-year survival in patients with LVAD who subsequently develop bowel ischemia. Secondary outcomes included characteristics of patients who survive to discharge after bowel ischemia and those who do not. These included markers of patient condition prior to surgical/endoscopic intervention such as lactate levels, ICU admission, ventilator dependence, vasopressor and renal replacement requirements, as well as presence of sepsis. Of these, we predicted that lactate levels and white blood cell count would be significantly elevated pre- and post-operatively in patients who did not recover from the bowel ischemic event. We used the Mann-Whitney U test to examine lactate levels between the two groups, as our sample size was <30 and therefore necessitated non-parametric testing. METHODS/STUDY POPULATION: In this single-center retrospective study, we analyzed all patients who underwent durable CF-LVAD implantation at Duke University Medical Center (DUMC) between January 2008 and November 2018. Patients were screened using CPT codes for abdominal surgical exploration or ICD codes for intestinal vascular insufficiency. The final cohort was selected based on a confirmed diagnosis of intestinal ischemia on surgical exploration or endoscopic intervention. Patient characteristics including pre-LVAD comorbidities, indication for LVAD implant, and clinical picture prior to the bowel ischemic event were collected. Specific characteristics related to bowel ischemia were summarized, including diagnostic imaging, time from imaging study to operative intervention, and intraoperative details. Patient outcomes including survival to discharge and 30-day and 1-year survival were summarized. Patients were stratified based on survival-to-discharge status. Continuous variables were reported as median and interquartile range and compared using the Mann-Whitney U test. Categorical variables were reported as proportions and compared using Fisher’s exact test as appropriate. RESULTS/ANTICIPATED RESULTS: A total of 754 patients underwent durable CF-LVAD implant at DUMC, of whom 21 subsequently developed intestinal ischemia (incidence 2.8%). The majority were male (81%) and treated as destination therapy (76.2%). Ten patients (50%) survived to discharge (one remains hospitalized). The proportions of patients receiving HeartMate II (60% vs. 50%, p=1.0), HeartMate III (20% vs. 10%, p=1.0), and HeartWare (20% vs. 40%, p=0.6) devices were not significantly different between patients who survived to discharge and patients who did not. Median time from LVAD implant to diagnosis of bowel ischemia did not vary significantly between the patient groups (11.5 days, IQR 34.75 vs. 16.5 days, IQR 173.8; p=0.40), nor did median time from diagnosis to surgical intervention (264.5 minutes, IQR 497.8 vs. 323 minutes, IQR 440; p=0.82). In the 48 hours leading to diagnosis and intervention, renal replacement therapy (50% vs. 0%, p=0.033) was more prevalent in patients who did not survive to discharge. Pre- and post-operative lactate levels did not differ significantly between the groups. A similar pattern of diagnostic study preference emerged in both groups, with CT being the most common (76.2%) followed by KUB (42.9%). Upper endoscopy/colonoscopy was performed in 7 patients (33.3%), of whom 5 also had operative exploration.
A total of 19 patients underwent abdominal exploration (90.5%). Nine had large bowel resection (42.9%), while 14 had small bowel resection (66.7%, with an average of 75 cm removed). Overall 1-year survival was 33%. For those surviving to discharge (n = 10), 1-year survival was 60%. DISCUSSION/SIGNIFICANCE OF IMPACT: This is the first institutional study, to our knowledge, to describe intestinal ischemia in patients receiving CF-LVAD therapy. Intestinal ischemia in patients receiving CF-LVAD therapy is associated with high mortality and morbidity. Diagnosis of bowel ischemia should be considered in patients presenting with clinical symptoms of bowel ischemia, particularly those requiring renal replacement therapy. Imaging modalities used were dependent on the clinical situation and were not always necessary prior to intervention. Further investigation is warranted to identify predictors of this morbid complication.
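A hedged sketch of the comparisons described in the methods: the Mann-Whitney U test for continuous variables (appropriate given the small sample) and Fisher's exact test for proportions between survivors and non-survivors. All values below are illustrative, not the study's data.

```python
from scipy.stats import mannwhitneyu, fisher_exact

# Toy pre-operative lactate values (mmol/L) by survival-to-discharge status.
lactate_survived = [2.1, 3.4, 1.8, 2.9, 4.0]
lactate_died = [3.8, 5.1, 2.7, 6.0, 4.4]
u_stat, p_lactate = mannwhitneyu(lactate_survived, lactate_died)

# 2x2 table: renal replacement therapy (yes/no) x survival to discharge.
table = [[0, 10],   # survived: 0 on RRT, 10 not
         [5, 5]]    # did not survive: 5 on RRT, 5 not
odds_ratio, p_rrt = fisher_exact(table)
print(f"lactate p={p_lactate:.3f}, RRT p={p_rrt:.3f}")
```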
Biological invasions are one of the grand challenges facing society, as exotic species introductions continue to rise and can result in dramatic changes to native ecosystems and economies. The scale of the “biological invasions crisis” spans from hyperlocal to international, involving a myriad of actors focused on mitigating and preventing biological invasions. However, the level of engagement among stakeholders and the opportunities to collaboratively solve invasive species issues in transdisciplinary ways are poorly understood. The Biological Invasions: Confronting a Crisis workshop engaged a broad group of actors working on various aspects of biological invasions in Virginia, USA—researchers, Extension personnel, educators, local, state, and federal agencies, nongovernmental organizations, and land managers—to discuss their respective roles and how they interact with other groups. Through a series of activities, it became clear that despite shared goals, most groups are not engaging with one another, and that enhanced communication and collaboration among groups is key to designing effective solutions. There is strong support for a multistakeholder coalition to effect change in policy, public education/engagement, and solution design. Confronting the biological invasions crisis will increasingly require engagement among stakeholders.
Historians and some scholars of international relations have long argued that historical contingencies play a critical role in the evolution of the international system, but have not explained whether they do so to a greater extent than in other domains or why such differences may exist. The authors address these lacunae by identifying stable differences between war and other policy domains that render the evolution of the international system more subject to chance events than those other domains. The selection environment of international politics has produced tightly integrated organizations (militaries) as the domain’s key players to a much greater degree than other policy domains. Because there are few players, no law of large numbers holds, and because militaries are tightly integrated, microshocks can reverberate up to macro-organizational levels. The anarchic character of the international system amplifies the impact of these shocks. The authors explore these phenomena in a range of historical examples.
Optimising short- and long-term outcomes for children and patients with CHD depends on continued scientific discovery and translation to clinical improvements in a coordinated effort by multiple stakeholders. Several challenges remain for clinicians, researchers, administrators, patients, and families seeking continuous scientific and clinical advancements in the field. We describe a new integrated research and improvement network – Cardiac Networks United – that seeks to build upon the experience and success achieved to date to create a new infrastructure for research and quality improvement that will serve the needs of the paediatric and congenital heart community in the future. Existing gaps in data integration and barriers to improvement are described, along with the mission and vision, organisational structure, and early objectives of Cardiac Networks United. Finally, representatives of key stakeholder groups – heart centre executives, research leaders, learning health system experts, and parent advocates – offer their perspectives on the need for this new collaborative effort.