Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter phenology in thirteen economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after physiological maturity at multiple sites spread across fourteen states in the southern, northern, and mid-Atlantic U.S. Greater proportions of seeds were retained by weeds at southern latitudes, and the shatter rate increased at northern latitudes. Amaranthus species seed shatter was low (0 to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2 to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated through more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after maturity at multiple sites spread across eleven states in the southern, northern, and mid-Atlantic U.S. From soybean maturity to four weeks after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased in the more northern states. At soybean maturity, percent seed shatter ranged from 1 to 70%. That range had shifted to 5 to 100% (mean: 42%) by 25 days after soybean maturity. There were considerable differences in seed shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors that are believed to contribute or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. 
We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
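The core calibration operation described above, mapping a measured 14C age onto a calendar-age probability distribution through the curve, can be sketched numerically. The curve below is a toy linear stand-in (its values and uncertainties are invented), not IntCal20 data; real analyses should use the published IntCal20 curve through dedicated tools such as OxCal or CALIB.

```python
import numpy as np

def calibrate_age(c14_age, c14_err, cal_ages, curve_c14, curve_err):
    """Turn a measured 14C age into a probability distribution over
    calendar ages by comparing it against a calibration curve
    (a simplified, purely illustrative sketch)."""
    total_var = c14_err ** 2 + curve_err ** 2
    # Gaussian likelihood of each calendar age given the measured 14C age
    density = np.exp(-0.5 * (c14_age - curve_c14) ** 2 / total_var)
    density /= density.sum()  # normalise over the 1-year calendar grid
    return density

# Toy "calibration curve" on a 1-year grid -- invented numbers, not IntCal20
cal_ages = np.arange(3000, 3501)           # calendar years BP
curve_c14 = 0.95 * cal_ages + 100.0        # hypothetical 14C age per calendar year
curve_err = np.full(cal_ages.shape, 15.0)  # hypothetical curve uncertainty

density = calibrate_age(3200.0, 30.0, cal_ages, curve_c14, curve_err)
best = cal_ages[np.argmax(density)]        # most probable calendar age
```

Because the curve is not monotonic in reality (it wiggles and even reverses), the resulting distribution can be multimodal, which is why calibrated dates are reported as probability ranges rather than single years.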
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of
in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
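The weighted root-mean-square timing residual quoted above is a standard summary statistic: each residual is weighted by the inverse square of its time-of-arrival uncertainty, so precisely measured arrivals dominate. A minimal sketch, with residual and uncertainty values invented purely for illustration:

```python
import numpy as np

def weighted_rms(residuals, errors):
    """Weighted root-mean-square of timing residuals, weighting each
    pulse time of arrival by the inverse square of its uncertainty."""
    r = np.asarray(residuals, dtype=float)
    w = 1.0 / np.asarray(errors, dtype=float) ** 2
    mean = np.sum(w * r) / np.sum(w)  # weighted mean residual
    return np.sqrt(np.sum(w * (r - mean) ** 2) / np.sum(w))

# Illustrative residuals (microseconds) and TOA uncertainties
res = np.array([0.8, -0.5, 0.3, -0.2, 0.6])
err = np.array([0.4, 0.3, 0.5, 0.2, 0.6])
rms = weighted_rms(res, err)
```

With equal uncertainties the statistic reduces to the ordinary standard deviation of the residuals; with unequal uncertainties it down-weights noisy observations, which is why it is the preferred figure of merit for pulsar timing datasets.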
This systematic review examines the effectiveness and cost-effectiveness of behavioural health integration into primary healthcare in the management of depression and unhealthy alcohol use in low- and middle-income countries. Following PRISMA guidelines, this review included research that studied patients aged ≥18 years with unhealthy alcohol use and/or depression of any clinical severity. An exploration of the models of integration was used to characterise a typology of behavioural health integration specific for low- and middle-income countries.
Fifty-eight articles met inclusion criteria. Studies evidenced increased effectiveness of integrated care over treatment as usual for both conditions. The economic evaluations found increased direct health costs but cost-effective estimates. The included studies used six distinct behavioural health integration models.
Behavioural health integration may yield improved health outcomes, although it may require additional resources. The proposed typology can assist decision-makers to advance the implementation of integrated models.
People with ‘prodromal’ symptoms have a very high risk of developing psychosis. We used functional MRI to examine the neurocognitive basis of this vulnerability.
Cross-sectional comparison of subjects with an at-risk mental state (ARMS; n=17), subjects with a first-episode schizophreniform psychosis (n=10) and healthy volunteers (n=15). Subjects were studied using functional MRI while they performed an overt verbal fluency task, a random movement generation paradigm and an N-Back working memory task.
During an N-Back task the ARMS group engaged inferior frontal and posterior parietal cortex less than controls but more than the first episode group. During a motor generation task, the ARMS group showed less activation in the left inferior parietal cortex than controls, but greater activation than the first episode group. During verbal fluency using ‘Easy’ letters, the ARMS group demonstrated intermediate activation in the left inferior frontal cortex, with first episode groups showing least, and controls most, activation. When processing ‘Hard’ letters, differential activation was evident in two left inferior frontal regions. In its dorsolateral portion, the ARMS group showed less activation than controls but more than the first episode group, while in the opercular part of the left inferior frontal gyrus / anterior insula activation was greatest in the first episode group, weakest in controls and intermediate in the ARMS group.
The ARMS is associated with abnormalities of regional brain function that are qualitatively similar to those in patients who have just developed psychosis but less severe.
Lumateperone (ITI-007) is in late-phase clinical development for schizophrenia. Lumateperone has a unique mechanism of action that modulates serotonin, dopamine, and glutamate neurotransmission. This pooled analysis of lumateperone in 3 randomized, double-blind, placebo-controlled studies was conducted to evaluate the safety and tolerability of lumateperone 42mg (ITI-007 60mg).
Data were pooled from the 3 controlled late-phase studies of lumateperone 42mg in patients with acute exacerbation of schizophrenia. Safety assessments of all patients who received at least one dose of any treatment included treatment-emergent adverse events (TEAEs), changes in laboratory parameters, extrapyramidal symptoms (EPS), and vital signs.
The safety population comprised 1,073 patients (placebo [n=412], lumateperone 42mg [n=406], risperidone [n=255]). TEAEs that occurred in the lumateperone 42mg group at a rate of ≥5% and at least twice the placebo rate were somnolence/sedation (24.1% vs 10.0%) and dry mouth (5.9% vs 2.2%). Rates of discontinuation due to TEAEs with lumateperone 42mg (0.5%) were similar to placebo (0.5%) and lower than risperidone (4.7%). Mean change in weight and rates of EPS-related TEAEs were lower for lumateperone 42mg and placebo patients than for risperidone patients. Mean changes from baseline in metabolic parameters were similar or smaller for lumateperone 42mg vs placebo. Mean changes were notably higher in risperidone patients vs lumateperone 42mg and placebo for glucose, cholesterol, triglycerides, and prolactin.
In this pooled analysis, lumateperone 42mg showed good tolerability with potential benefits over risperidone for metabolic, prolactin, and EPS risks. The only TEAE that occurred in >10% of lumateperone patients was somnolence/sedation, which was impacted by morning administration; in subsequent studies that administered lumateperone in the evening, somnolence/sedation rates were markedly reduced. These results suggest that lumateperone 42mg may be a promising new treatment for schizophrenia.
Supported by funding from Intra-Cellular Therapies, Inc.
UK Biobank is a well-characterised cohort of over 500 000 participants including genetics, environmental data and imaging. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
Describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 years (53% were ≥65 years), and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was a common finding, with 24% (37 434) of participants meeting criteria; current hazardous/harmful alcohol use criteria were met by 21% (32 602), whereas other criteria were met by less than 8% of the participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
Organic grain producers are interested in interseeding cover crops into corn (Zea mays L.) in regions that have a narrow growing season window for post-harvest establishment of cover crops. A field experiment was replicated across 2 years on three commercial organic farms in Pennsylvania to compare the effects of drill- and broadcast-interseeding to standard grower practices, which included post-harvest seeding cereal rye (Secale cereale L.) at the more southern location and winter fallow at the more northern locations. Drill- and broadcast-interseeding treatments occurred just after last cultivation and used a cover crop mixture of annual ryegrass [Lolium perenne L. ssp. multiflorum (Lam.) Husnot] + orchardgrass (Dactylis glomerata L.) + forage radish (Raphanus sativus L. ssp. longipinnatus). Higher mean fall cover crop biomass and forage radish abundance (% of total) were observed in drill-interseeding treatments compared with broadcast-interseeding. However, corn grain yield, weed suppression and N retention in late fall and spring were similar among interseeding treatments, which suggests that broadcast-interseeding at last cultivation has the potential to produce similar production and conservation benefits at lower labor and equipment costs in organic systems. Post-harvest seeding cereal rye resulted in greater spring biomass production and N retention compared with interseeded cover crops at the southern location, whereas variable interseeding establishment success and dominance of winter-killed forage radish produced conditions that increased the likelihood of N loss at more northern locations. Additional research is needed to contrast conservation benefits and management tradeoffs between interseeding and post-harvest establishment methods.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean [Glycine max (L.) Merr.] sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.) and forage radish (Raphanus sativus L.) interseeded into corn grain (Z. mays L.) was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance compared with the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
Resource allocation planning for emergency medical services (EMS) systems determines appropriate resources including what paramedic qualification and how rapidly to respond to patients for optimal outcomes. The British Columbia Emergency Health Services implemented a revised response plan in 2013.
A pre- and post-implementation methodology was used to evaluate the effect of the resource allocation plan revision on 24-hour mortality. All adult cases with evaluable outcome data (obtained through linked provincial health administrative data) were analyzed. Multivariable logistic regression was used to adjust for variations in other significant associated factors. Interrupted time series analysis was used to estimate immediate changes in level or trend of outcome after the start of the revised resource allocation plan implementation, while simultaneously controlling for pre-existing trends.
The derived cohort comprised 562,546 cases (April 2012–March 2015). When adjusted for age, sex, urban/metro region, season, day, hour, and dispatch determinant, the probability of dying within 24 hours of an EMS call was 7% lower in the post-resource allocation plan-revision cohort (OR = 0.936; 95% CI: 0.886–0.989; p = 0.018). A subgroup analysis of immediately life-threatening cases demonstrated similar effect (OR = 0.890; 95% CI: 0.808–0.981; p = 0.019). Using time series analysis, the descending changes in overall 24-hour mortality trend and the 24-hour mortality trend in immediately life-threatening cases, were both statistically significant (p < 0.001).
Comprehensive, evidence-informed reconstruction of a provincial EMS resource allocation plan is feasible. Despite change in crew level response and resource allocation, there was significant decrease in 24-hour mortality in this pan-provincial population-based cohort.
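The adjusted odds ratios reported above are obtained by exponentiating logistic-regression coefficients and their Wald confidence limits. A minimal sketch follows; the coefficient and standard error are back-calculated from the reported overall result (OR = 0.936; 95% CI: 0.886–0.989) purely for illustration, not taken from the study data.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald
    95% confidence limits to obtain an odds ratio with a CI."""
    return (math.exp(beta),            # point estimate
            math.exp(beta - z * se),   # lower limit
            math.exp(beta + z * se))   # upper limit

# Hypothetical coefficient/SE chosen to reproduce the reported interval
or_, lo, hi = odds_ratio_ci(-0.0661, 0.028)
```

An OR below 1 with an upper confidence limit below 1 is what licenses the "7% lower probability of dying" interpretation in the abstract, since the interval excludes no effect (OR = 1).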
Current surveillance for healthcare-associated (HA) urinary tract infection (UTI) is focused on catheter-associated infection with hospital onset (HO-CAUTI), yet this surveillance does not represent the full burden of HA-UTI to patients. Our objective was to measure the incidence of potentially HA, community-onset (CO) UTI in a retrospective cohort of hospitalized patients.
Design: Retrospective cohort study.
Setting: Academic, quaternary care, referral center.
Patients: Hospitalized adults at risk for HA-UTI from May 2009 to December 2011 were included.
Methods: Patients who did not experience a UTI during the index hospitalization were followed for 30 days post discharge to identify cases of potentially HA-CO UTI.
We identified 3,273 patients at risk for potentially HA-CO UTI. The incidence of HA-CO UTI in the 30 days post discharge was 29.8 per 1,000 patients. Independent risk factors of HA-CO UTI included paraplegia or quadriplegia (adjusted odds ratio [aOR], 4.6; 95% confidence interval [CI], 1.2–18.0), indwelling catheter during index hospitalization (aOR, 1.5; 95% CI, 1.0–2.3), prior piperacillin-tazobactam prescription (aOR, 2.3; 95% CI, 1.1–4.5), prior penicillin class prescription (aOR, 1.7; 95% CI, 1.0–2.8), and private insurance (aOR, 0.6; 95% CI, 0.4–0.9).
HA-CO UTI may be common within 30 days following hospital discharge. These data suggest that surveillance efforts may need to be expanded to capture the full burden to patients and better inform antibiotic prescribing decisions for patients with a history of hospitalization.
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also looting, graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology—uses of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
Invasions can be genetically diverse, and that diversity may have implications for invasion management in terms of resistance or tolerance to control methods. We analyzed the population genetics of Russian-olive (Elaeagnus angustifolia L.), an ecologically important and common invasive tree found in many western U.S. riparian areas. We found three cpDNA haplotypes and, using 11 microsatellite loci, identified three genetic clusters in the 460 plants from 46 populations in the western United States. We found high levels of polymorphism in the microsatellites (5 to 15 alleles per locus; 106 alleles total). Our native-range sampling was limited, and we did not find a genetic match for the most common cpDNA invasive haplotype or a strong confirmation of origin for the most common microsatellite genetic cluster. We did not find geographic population structure (isolation by distance) across the U.S. invasion, but we did identify invasive populations that had the most diversity, and we suggest these as choices for initial biological control–release monitoring. Accessions from each genetic cluster, which coarsely represent the range of genetic diversity found in the invasion, are now included in potential classical biological control agent efficacy testing.
Proactive integrated weed management (IWM) is critically needed in no-till production to reduce the intensity of selection pressure for herbicide-resistant weeds. Reducing the density of emerged weed populations and the number of larger individuals within the population at the time of herbicide application are two practical management objectives when integrating cover crops as a complementary tactic in herbicide-based production systems. We examined the following demographic questions related to the effects of alternative cover-cropping tactics following small grain harvest on preplant, burndown management of horseweed (Erigeron canadensis L.) in no-till commodity-grain production: (1) Do cover crops differentially affect E. canadensis density and size inequality at the time of herbicide exposure? (2) Which cover crop response traits are drivers of E. canadensis suppression at time of herbicide exposure? Interannual variation in growing conditions (study year) and intra-annual variation in soil fertility (low vs. high nitrogen) were the primary drivers of cover crop response traits and significantly affected E. canadensis density at the time of herbicide exposure. In comparison to the fallow control, cover crop treatments reduced E. canadensis density 52% to 86% at the time of a preplant, burndown application. Cereal rye (Secale cereale L.) alone or in combination with forage radish (Raphanus sativus L.) provided the most consistent E. canadensis suppression. Fall and spring cover crop biomass production was negatively correlated with E. canadensis density at the preplant burndown application timing. Our results also show that winter-hardy cover crops reduce the size inequality of E. canadensis populations at the time of herbicide exposure by reducing the number of large individuals within the population. 
Finally, we advocate for advancement in our understanding of complementarity between cover crop– and herbicide-based management tactics in no-till systems to facilitate development of proactive herbicide-resistance management strategies.