The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling collection and processing of the seed at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter phenology of thirteen economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after physiological maturity at multiple sites spread across fourteen states in the southern, northern, and mid-Atlantic U.S. Weeds at southern latitudes retained greater proportions of their seeds, and shatter rates increased at more northern latitudes. Seed shatter in Amaranthus species was low (0 to 2%), whereas shatter in common ragweed (Ambrosia artemisiifolia L.) varied widely (2 to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
The San Pedro de Atacama oases, located in northern Chile’s hyperarid Atacama Desert, have been occupied for at least 3000 years. Here, we examine cemetery use in the oases, with emphasis on the Middle Period (ca. AD 400–1000). By modeling a large corpus (n = 243) of radiocarbon dates, over 90% of which are direct AMS assays of human bone collagen, we attempt to establish a temporal framework by which to explore the establishment of formalized social inequality in this period. Modeling these dates at three locally defined scales (all ayllus, inter-ayllu, and intra-ayllu) permits heretofore unavailable insights into the chronological and spatial dimensions of life and mortuary activity in the oases and allows us to better contextualize patterns of social inequality during the dynamic Middle Period. The results of this modeling indicate two distinct peaks of occupation during the Middle Period in San Pedro and document significant temporal variability in cemetery use patterns on both inter- and intra-ayllu scales. These results stress the importance of local social and environmental factors to the occupation of the oases and provide crucial chronological structure for future archaeological and bioarchaeological research in the region.
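The abstract does not name the modeling framework; Bayesian calibration and summed or kernel density models (e.g., in OxCal) are typical for radiocarbon corpora of this kind. Purely as orientation to the underlying idea, the hypothetical sketch below calibrates a single 14C determination on a calendar-year grid under a uniform prior; the function name, inputs, and use of an SHCal-style curve are illustrative assumptions, not the authors’ pipeline.

```python
import numpy as np

def calibrate_14c(age_bp, age_err, cal_year, curve_mu, curve_err):
    """Grid-based Bayesian calibration of one radiocarbon date.

    age_bp, age_err : measured 14C age and lab error (years BP)
    cal_year        : evenly spaced calendar-year grid
    curve_mu/err    : calibration-curve 14C mean and error on that grid
                      (e.g., interpolated from SHCal20)

    Returns the posterior over calendar years under a uniform prior:
    p(t | m) is proportional to N(m; mu(t), sigma^2 + sigma_cal(t)^2).
    """
    var = age_err**2 + curve_err**2
    log_like = -0.5 * (age_bp - curve_mu)**2 / var - 0.5 * np.log(var)
    post = np.exp(log_like - log_like.max())  # stabilize before normalizing
    return post / post.sum()                  # discrete posterior on the grid
```

Corpus-level models such as those described in the abstract combine many such single-date likelihoods under a shared prior structure (e.g., phases or density models).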
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), the index trauma for most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and was associated with several MVC characteristics (being a passenger v. a driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the goal of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after maturity at multiple sites spread across eleven states in the southern, northern, and mid-Atlantic U.S. From soybean maturity to four weeks after maturity, cumulative percent seed shatter was lowest in the southern U.S. and increased at more northern sites. At soybean maturity, percent seed shatter ranged from 1 to 70%; by 25 days after soybean maturity, that range had shifted to 5 to 100% (mean: 42%). There were considerable differences in seed shatter onset and rate of progression between sites and years for some species, which could affect their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Establishment of alfalfa by interseeding into corn planted for silage can enhance crop productivity, but weed management remains a challenge to adoption. Although a glyphosate-based herbicide program could be a simple and effective approach, concerns about herbicide resistance and limitations in available alfalfa varieties exist. Field experiments were conducted to compare the efficacy and selectivity of preemergence (PRE), postemergence (POST), and PRE followed by POST herbicide programs with a glyphosate-only strategy when interseeding alfalfa into corn. Experiment 1 compared PRE applications of acetochlor, mesotrione, S-metolachlor, metribuzin, and flumetsulam; both rates of acetochlor and metribuzin, as well as S-metolachlor at 1.1 kg ha−1, were the most effective and selective PRE herbicides 4 weeks after treatment (WAT), but each resulted in greater overall weed cover than glyphosate by 8 WAT. Experiment 2 evaluated POST applications of bentazon, bromoxynil, 2,4-DB, and mesotrione at early and late timings. Several POST herbicides exhibited effectiveness and selectivity similar to glyphosate, including early applications of bromoxynil (0.14 kg ha−1) and 2,4-DB (0.84 or 1.68 kg ha−1), as well as late applications of bromoxynil (0.42 kg ha−1), 2,4-DB (0.84 kg ha−1), and mesotrione (0.05 or 0.11 kg ha−1). A third experiment compared applications of acetochlor PRE, bromoxynil POST, and the combination of acetochlor PRE with bromoxynil POST. All treatments were effective and safe for use in this interseeded system, although interseeded alfalfa alone provided 65–70% weed suppression in corn planted for silage without any herbicide. Herbicide treatments had no observable impacts on corn or alfalfa yields, so weed management was likely of limited economic importance. However, weed competitiveness can vary with weed species, density, and site-specific factors, so further investigation under different environments and conditions is needed.
Reward Deficiency Syndrome (RDS) is an umbrella term for all drug and non-drug addictive behaviors attributable to a dopamine deficiency, “hypodopaminergia.” There is an opioid-overdose epidemic in the USA, which may result in or worsen RDS. A paradigm shift is needed to combat a system that is not working. This shift involves the recognition of dopamine homeostasis as the ultimate treatment of RDS via precision, genetically guided KB220 variants, an approach called Precision Behavioral Management (PBM). Recognition of RDS as an endophenotype and an umbrella term in the future DSM-6, following the Research Domain Criteria (RDoC), would assist in shifting this paradigm.
Prolonged survival of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) on environmental surfaces and personal protective equipment may allow these surfaces to transmit the pathogen to others. We sought to determine the effectiveness of a pulsed-xenon ultraviolet (PX-UV) disinfection system in reducing the load of SARS-CoV-2 on hard surfaces and N95 respirators.
Chamber slides and N95 respirator material were directly inoculated with SARS-CoV-2 and were exposed to different durations of PX-UV.
For hard surfaces, disinfection for 1, 2, and 5 minutes resulted in 3.53 log10, >4.54 log10, and >4.12 log10 reductions in viral load, respectively. For N95 respirators, disinfection for 5 minutes resulted in >4.79 log10 reduction in viral load. PX-UV significantly reduced SARS-CoV-2 on hard surfaces and N95 respirators.
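As a point of clarification (the standard definition, not data from this study), a log10 reduction expresses the fold decrease in recoverable virus on a base-10 scale:

$$\log_{10}\text{-reduction} = \log_{10}\!\left(\frac{N_0}{N_t}\right),$$

where $N_0$ and $N_t$ are the viral loads before and after exposure. A 3.53 log10 reduction therefore leaves a surviving fraction of $10^{-3.53} \approx 0.03\%$, i.e., roughly 99.97% inactivation; the “>” values typically indicate that no virus was detected, so the true reduction is at least as large as the assay’s detection limit allows one to measure.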
With the potential to rapidly disinfect environmental surfaces and N95 respirators, PX-UV devices are a promising technology for reducing environmental and personal protective equipment bioburden and enhancing both healthcare worker and patient safety by reducing the risk of exposure to SARS-CoV-2.
Heavy alcohol consumption is associated with poorer cognitive function in older adults. Although understudied in middle-aged adults, the relationship between alcohol and cognition may also be influenced by genetic factors such as the apolipoprotein E (ApoE) ε4 allele, a risk factor for Alzheimer’s disease. We examined the relationship between alcohol consumption, ApoE genotype, and cognition in middle-aged adults and hypothesized that light and/or moderate drinkers (≤2 drinks per day) would show better cognitive performance than heavy drinkers or non-drinkers. Additionally, we hypothesized that the association between alcohol use and cognitive function would differ by ApoE genotype (ε4+ vs. ε4−).
Participants were 1266 men from the Vietnam Era Twin Study of Aging (VETSA; M age = 56; range 51–60) who completed a neuropsychological battery assessing seven cognitive abilities: general cognitive ability (GCA), episodic memory, processing speed, executive function, abstract reasoning, verbal fluency, and visuospatial ability. Alcohol consumption was categorized into five groups: never, former, light, moderate, and heavy.
In fully adjusted models, there was no significant main effect of alcohol consumption on cognitive function. However, there was a significant interaction between alcohol consumption and ApoE ε4 status for GCA and episodic memory, such that the relationship between alcohol consumption and cognition was stronger in ε4 carriers. The ε4+ heavy-drinking subgroup had the poorest GCA and episodic memory.
Presence of the ε4 allele may increase vulnerability to the deleterious effects of heavy alcohol consumption. Beneficial effects of light or moderate alcohol consumption were not observed.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance-management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics: (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies with weed life-history traits. Additional work is needed to describe proactive resistance-management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
UK Biobank is a well-characterised cohort of over 500 000 participants with genetic, environmental and imaging data. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
We aim to describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 (53% were ≥65 years) and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was a common finding, with 24% (37 434) of participants meeting criteria; criteria for current hazardous/harmful alcohol use were met by 21% (32 602), whereas each of the other criteria was met by less than 8% of participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in its own right, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
Non-invasive prenatal testing (NIPT) for the detection of foetal aneuploidy through analysis of cell-free DNA (cfDNA) in maternal blood is offered routinely by many healthcare providers across the developed world. This testing has recently been recommended for evaluative implementation in the UK National Health Service (NHS) foetal anomaly screening pathway as a contingent screen following an increased-risk result for trisomy 21, 18 or 13. In preparation for delivering a national service, we have implemented cfDNA-based NIPT in our Regional Genetics Laboratory. Here, we describe our validation and verification processes and initial experiences with the technology prior to rollout of a national screening service.
Data are presented from more than 1000 patients (215 retrospective and 840 prospective) from ‘high- and low-risk pregnancies’ with outcome data following birth or confirmatory invasive prenatal sampling. NIPT was performed using the Illumina Verifi® test.
Our data confirm a high-fidelity service with a failure rate of ~0.24% and high sensitivity and specificity for the detection of foetal trisomies 13, 18 and 21. The data also show that a significant proportion of patients continue their pregnancies without prenatal invasive testing or intervention after receiving a high-risk cfDNA-based result. A total of 46.5% of patients referred to date were referred for reasons other than a high screen risk. Ten percent (76/840 clinical service referrals) of patients were referred with an ultrasonographic finding of a foetal structural anomaly, and data analysis indicates both high- and low-risk scan indications for NIPT.
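A general note on screening statistics (standard theory, not a result of this study) helps explain why high-risk cfDNA results are confirmed by invasive testing: the positive predictive value of even a highly sensitive and specific test depends strongly on prevalence:

$$\mathrm{PPV} = \frac{\mathrm{sens}\cdot\pi}{\mathrm{sens}\cdot\pi + (1-\mathrm{spec})(1-\pi)},$$

where $\pi$ is the prevalence of the trisomy in the tested population. For a rare trisomy, even a small false-positive rate $(1-\mathrm{spec})$ can account for a meaningful share of high-risk calls, which is why cfDNA-based NIPT is treated as a screen rather than a diagnostic test.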
NIPT can be successfully implemented into NHS regional genetics laboratories to provide high-quality services. NHS provision of NIPT in patients with high-risk screen results will allow for a reduction of invasive testing and partially improve equality of access to cfDNA-based NIPT in the pregnant population. Patients at low risk for a classic trisomy or with other clinical indications are likely to continue to access cfDNA-based NIPT as a private test.
Recent years have seen an exponential increase in the variety of healthcare data captured across numerous sources. However, mechanisms to leverage these data sources to support scientific investigation have remained limited. In 2013, the Pediatric Heart Network (PHN), funded by the National Heart, Lung, and Blood Institute, developed the Integrated CARdiac Data and Outcomes (iCARD) Collaborative with the goals of leveraging available data sources to aid in efficiently planning and conducting PHN studies; supporting integration of PHN data with other sources to foster novel research otherwise not possible; and mentoring young investigators in these areas. This review describes lessons learned through the development of iCARD, its initial efforts and scientific output, challenges, and future directions. This information can aid in the use and optimisation of data integration methodologies across other research networks and organisations.
Much of the peace agreement durability literature assumes that stronger peace agreements are more likely to survive the trials of the post-conflict environment. This work does an excellent job identifying which provisions indicate that agreements are more likely to endure. However, there is no widely accepted way to directly measure the strength of agreements, and existing measures suffer from a lack of nuance or reliance on subjective weighting. We use a Bayesian item response theory model to develop a principled measure of the latent strength of peace agreements in civil conflicts from 1975 to 2005. We illustrate the measure's utility by exploring how various international factors such as sanctions and mediation contribute to the strength or weakness of agreements.
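The abstract does not give the model’s exact specification; for orientation only, a standard two-parameter logistic IRT formulation of this kind treats each provision as an “item” and each agreement’s latent strength as the “ability” parameter:

$$\Pr(y_{ij}=1 \mid \theta_i) = \operatorname{logit}^{-1}\!\bigl(\beta_j\,\theta_i - \alpha_j\bigr),$$

where $y_{ij}$ indicates whether agreement $i$ contains provision $j$, $\theta_i$ is agreement $i$’s latent strength, and $\beta_j$ and $\alpha_j$ are provision $j$’s discrimination and difficulty parameters. With priors placed on all parameters, the posterior over $\theta_i$ yields a principled strength measure of the sort described above; the authors’ actual likelihood and priors may differ.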
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also termed looting or grave-robbing) refers to unauthorized damage, removal, or trafficking of materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology, the use of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
Surgery for CHD has been slow to develop in parts of the former Soviet Union. The impact of an 8-year surgical assistance programme between an emerging centre and a multi-disciplinary international team comprising healthcare professionals from developed cardiac programmes is analysed and presented.
Material and methods
The international paediatric assistance programme included five main components – intermittent clinical visits to the site annually, medical education, biomedical engineering support, nurse empowerment, and team-based practice development. Data were analysed from visiting teams and local databases before and since commencement of assistance in 2007 (era A: 2000–2007; era B: 2008–2015). The following variables were compared between periods: annual case volume, operative mortality, case complexity based on Risk Adjustment for Congenital Heart Surgery (RACHS-1), and RACHS-adjusted standardised mortality ratio.
A total of 154 RACHS-classifiable operations were performed during era A, with a mean annual case volume by local surgeons of 19.3 (95% confidence interval: 14.3–24.2), an operative mortality of 4.6%, and a standardised mortality ratio of 2.1. In era B, surgical volume increased to a mean of 103.1 annual cases (95% confidence interval: 69.1–137.2; p<0.0001). There was a non-significant (p=0.84) increase in operative mortality (5.7%) but a decrease in the standardised mortality ratio (1.2), owing to an increase in case complexity. In era B, the proportion of surgeries led by local surgeons during visits from the international team increased from 0% (0/27) in 2008 to 98% (58/59) in the final year of analysis.
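For clarity, the RACHS-adjusted standardised mortality ratio follows the standard definition, comparing observed deaths with the number expected given the case mix:

$$\mathrm{SMR} = \frac{O}{E}, \qquad E = \sum_{c} n_c\, r_c,$$

where $O$ is observed deaths, $n_c$ is the number of operations in RACHS-1 category $c$, and $r_c$ is that category’s reference mortality rate. An SMR above 1 indicates more deaths than the case mix predicts; this is why era B shows a higher raw operative mortality (5.7% vs. 4.6%) yet a lower SMR (1.2 vs. 2.1): expected mortality rose with case complexity faster than observed deaths did.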
The model of assistance described in this report led to improved risk-adjusted mortality, increased case volume and complexity, and greater independent operating skills among local surgeons.