Salt marshes have been lost or degraded as the intensity of human impacts on coastal landscapes has increased due to agriculture, transportation, urban and industrial development, and climate change. Because salt marshes have limited distribution and embody a variety of ecological functions that are important to humans (see ecosystem services, Chapter 15), many societies have recognized the need to preserve remaining marshes, restore those that have been degraded, and create new marshes in areas where they have been lost. An emerging and critical threat to tidal marshes across the globe is the increasing rate of sea level rise, along with other aspects of climate change, which complicates restoration but also heightens its urgency. By restoration we mean re-establishing the natural conditions and processes needed to support marsh functions, especially self-maintenance (see Box 17.1). Typically, salt marshes are self-maintaining, with salt-tolerant plants, mineral sediments, and tidal flooding interacting to maintain elevation and ecological functions under dynamic conditions (Chapters 4, 7, 8).
Emerson and colleagues (2020) provide new isotopic evidence on directly dated human bone from the Greater Cahokia region. They conclude that maize was not adopted in the region prior to AD 900. Placing this result within the larger context of maize histories in northeastern North America, they suggest that evidence from the lower Great Lakes and St. Lawrence River valley for earlier maize is “enigmatic” and “perplexing.” Here, we review that evidence, accumulated over the course of several decades, and question why Emerson and colleagues felt the need to offer opinions on that evidence without providing any new contradictory empirical evidence for the region.
Ecosystem modeling, a pillar of the systems ecology paradigm (SEP), addresses questions such as: How much carbon and nitrogen are cycled within ecological sites, landscapes, or indeed the Earth system? How are human activities modifying these flows? Modeling, when coupled with field and laboratory studies, represents the essence of the SEP, in that models embody accumulated knowledge and generate hypotheses to test understanding of ecosystem processes and behavior. Initially, ecosystem models were used primarily to improve our understanding of how the biophysical aspects of ecosystems operate. Current ecosystem models, however, are widely used to predict how large-scale phenomena such as climate change and management practices affect ecosystem dynamics, and to assess the potential effects of these changes on economic activity and policy making. In sum, ecosystem models embedded in the SEP remain our best mechanism for integrating diverse types of knowledge about how the earth system functions and for making quantitative predictions that can be confronted with observations of reality. Modeling efforts discussed include the Century and DayCent ecosystem models, the Grassland Ecosystem Model (ELM), food web models, the Savanna model, agent-based and coupled-systems modeling, and Bayesian modeling.
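Models of the Century/DayCent family track carbon through coupled pools that decay with first-order kinetics scaled by climate. The sketch below illustrates only that core idea; the two-pool structure, rate constants, climate scalar, and transfer fraction are assumptions for illustration, not the published Century parameterization.

```python
# Minimal two-pool soil carbon model in the spirit of first-order
# decomposition models such as Century. Illustrative sketch only:
# pool structure and parameter values are assumed, not Century's.

def step(active, slow, litter_in, dt=1.0, k_active=0.7, k_slow=0.02,
         climate_scalar=1.0, transfer_frac=0.3):
    """Advance both carbon pools by one time step (years).

    Decomposition follows first-order kinetics dC/dt = -k * f * C;
    a fixed fraction of decomposed active-pool C is stabilized into
    the slow pool, and the remainder is respired as CO2.
    """
    decomp_active = k_active * climate_scalar * active * dt
    decomp_slow = k_slow * climate_scalar * slow * dt
    active_next = active + litter_in * dt - decomp_active
    slow_next = slow + transfer_frac * decomp_active - decomp_slow
    respired = (1 - transfer_frac) * decomp_active + decomp_slow
    return active_next, slow_next, respired

# Example: spin a grassland-like site toward steady state.
active, slow = 100.0, 2000.0   # g C m^-2 (assumed initial stocks)
for year in range(500):
    active, slow, co2 = step(active, slow, litter_in=150.0)
print(f"active={active:.0f}, slow={slow:.0f}, respired={co2:.0f} g C m^-2 yr^-1")
```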
During the Randomized Assessment of Rapid Endovascular Treatment (EVT) of Ischemic Stroke (ESCAPE) trial, patient-level micro-costing data were collected. We report a cost-effectiveness analysis of EVT, using ESCAPE trial data and Markov simulation, conducted within a universal, single-payer system and taking a societal perspective over a patient’s lifetime.
Primary data collection alongside the ESCAPE trial provided a 3-month, trial-specific (non-model-based) cost per quality-adjusted life year (QALY). A Markov model using ongoing lifetime costs and life-expectancy estimates from the literature was built to simulate the cost per QALY over a lifetime horizon. Health states were defined using modified Rankin Scale (mRS) scores. Uncertainty was explored using scenario analysis and probabilistic sensitivity analysis.
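To make the model structure concrete, the sketch below runs a small Markov cohort model and computes an incremental cost per QALY. The mRS-based health states are collapsed to three for brevity, and every transition probability, cost, and utility is an invented placeholder rather than an ESCAPE estimate.

```python
import numpy as np

# Hypothetical 3-state Markov cohort model ("independent", "dependent",
# "dead"); all numbers are placeholders, not ESCAPE trial estimates.
P_evt = np.array([[0.92, 0.05, 0.03],
                  [0.00, 0.90, 0.10],
                  [0.00, 0.00, 1.00]])   # annual transitions under EVT
P_soc = np.array([[0.90, 0.06, 0.04],
                  [0.00, 0.88, 0.12],
                  [0.00, 0.00, 1.00]])   # under standard of care
annual_cost = np.array([2_000.0, 30_000.0, 0.0])   # cost per state-year
utility = np.array([0.85, 0.35, 0.0])              # QALY weights

def run(P, start, upfront, years=30, disc=0.03):
    """Accumulate discounted costs and QALYs over a lifetime horizon."""
    dist, cost, qaly = start.copy(), upfront, 0.0
    for t in range(years):
        df = 1.0 / (1.0 + disc) ** t
        cost += df * dist @ annual_cost
        qaly += df * dist @ utility
        dist = dist @ P                  # advance the cohort one year
    return cost, qaly

# EVT shifts more of the cohort into the independent state at 3 months.
c_evt, q_evt = run(P_evt, np.array([0.53, 0.37, 0.10]), upfront=20_000)
c_soc, q_soc = run(P_soc, np.array([0.29, 0.52, 0.19]), upfront=5_000)
print(f"Incremental cost per QALY: ${(c_evt - c_soc) / (q_evt - q_soc):,.0f}")
```

If the incremental cost is negative while incremental QALYs are positive, the intervention dominates, which is the lifetime-horizon result reported below.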
The 3-month trial-based analysis yielded a cost per QALY of $201,243 for EVT compared with the best standard of care. In the model-based analysis, using a societal perspective and a lifetime horizon, EVT dominated the standard of care: it was both more effective and less costly (−$91). When the time horizon was shortened to 1 year, EVT remained cost-saving compared with standard of care (∼$15,376 per QALY gained with EVT). However, if the estimate of clinical effectiveness is 4% lower than that demonstrated in ESCAPE, EVT is no longer cost-saving compared with standard of care.
Results support the adoption of EVT as a treatment option for acute ischemic stroke: the increased cost of caring for EVT patients was recouped within the first year after stroke, and EVT continued to provide cost savings over a patient’s lifetime.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed retention by the target weed species at crop maturity, which enables collection and processing of weed seed at crop harvest. Seed retention, however, is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
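To make this mediation framing concrete, here is a minimal two-equation sketch (exposure to mediator, then mediator plus exposure to outcome) fitted on synthetic data. Variable names and effect sizes are invented; this is not the AURORA analysis code.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic example: does a 2-week symptom score mediate the association
# between an MVC characteristic (riding as a passenger) and 8-week
# depression? All variables and coefficients are invented.
rng = np.random.default_rng(0)
n = 2000
passenger = rng.binomial(1, 0.3, n)               # exposure
wk2 = 0.8 * passenger + rng.normal(size=n)        # candidate mediator
logit = -1.2 + 0.3 * passenger + 0.9 * wk2
dep8 = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 8-week depression

# Path a: exposure -> mediator (linear model).
path_a = sm.OLS(wk2, sm.add_constant(passenger)).fit()
# Paths b and c': mediator and exposure -> outcome (logistic model).
X = sm.add_constant(np.column_stack([passenger, wk2]))
path_bc = sm.Logit(dep8, X).fit(disp=False)
print("a (exposure -> mediator):", path_a.params[1])
print("c' (direct), b (mediator) log-odds:", path_bc.params[1], path_bc.params[2])
```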
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. a driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, for depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors believed to contribute to or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
Radiocarbon (14C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric 14C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international 14C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable 14C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the 14C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine 14C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
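To illustrate the calibration step itself, the sketch below converts a single 14C determination into a probability distribution over calendar ages by comparing it with a calibration curve. The short linear ‘curve’ is a toy stand-in; real applications would interpolate the published IntCal20 table of calendar age, curve 14C age, and curve uncertainty.

```python
import numpy as np

# Toy calibration of one radiocarbon date against a stand-in curve.
cal_age = np.arange(3000, 3501)              # calendar-age grid (cal BP)
curve_c14 = 2900 + 0.6 * (cal_age - 3000)    # toy curve (14C yr BP)
curve_sig = np.full_like(cal_age, 15.0, dtype=float)

r, sigma = 3100.0, 30.0                      # measured 14C age and error

# Probability of each calendar age, combining measurement and curve
# uncertainty in quadrature, then normalizing over the grid.
var = sigma**2 + curve_sig**2
dens = np.exp(-0.5 * (r - curve_c14) ** 2 / var) / np.sqrt(var)
dens /= dens.sum()

print(f"most probable calendar age: {cal_age[np.argmax(dens)]} cal BP")
```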
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics: (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides but reduced the number of large individuals (>10 cm) at POST application. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies with weed life-history traits. Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so that producers can apply these concepts in site-specific, within-field management practices.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide a full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of <1 μs in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio-frequency interference mitigation as is required when analysing raw data files.
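For reference, the weighted root-mean-square residual quoted for each pulsar and band can be computed as below, assuming inverse-variance weights from the TOA uncertainties; the values are made up for illustration.

```python
import numpy as np

# Weighted rms of timing residuals with inverse-variance weights.
residuals = np.array([0.8, -1.2, 0.3, 2.1, -0.5])   # residuals (us)
errors = np.array([0.5, 1.0, 0.4, 2.0, 0.6])        # TOA uncertainties (us)

w = 1.0 / errors**2
wrms = np.sqrt(np.sum(w * residuals**2) / np.sum(w))
print(f"weighted rms residual: {wrms:.2f} us")
```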
This systematic review examines the effectiveness and cost-effectiveness of integrating behavioural health into primary healthcare for the management of depression and unhealthy alcohol use in low- and middle-income countries. Following PRISMA guidelines, the review included research on patients aged ≥18 years with unhealthy alcohol use and/or depression of any clinical severity. We explored models of integration to characterise a typology of behavioural health integration specific to low- and middle-income countries.
Fifty-eight articles met the inclusion criteria. The studies evidenced increased effectiveness of integrated care over treatment as usual for both conditions. Economic evaluations found increased direct health costs but estimates consistent with cost-effectiveness. The included studies used six distinct behavioural health integration models.
Behavioural health integration may yield improved health outcomes, although it may require additional resources. The proposed typology can assist decision-makers to advance the implementation of integrated models.
People with ‘prodromal’ symptoms, indicative of an at-risk mental state (ARMS), have a very high risk of developing psychosis. We used functional MRI to examine the neurocognitive basis of this vulnerability.
We performed a cross-sectional comparison of subjects with an ARMS (n=17), subjects with a first-episode schizophreniform psychosis (n=10) and healthy volunteers (n=15). Subjects were studied using functional MRI while they performed an overt verbal fluency task, a random movement generation paradigm and an N-Back working memory task.
During the N-Back task, the ARMS group engaged the inferior frontal and posterior parietal cortex less than controls but more than the first-episode group. During the motor generation task, the ARMS group showed less activation in the left inferior parietal cortex than controls, but greater activation than the first-episode group. During verbal fluency using ‘Easy’ letters, the ARMS group demonstrated intermediate activation in the left inferior frontal cortex, with the first-episode group showing the least, and controls the most, activation. When processing ‘Hard’ letters, differential activation was evident in two left inferior frontal regions. In the dorsolateral portion, the ARMS group showed less activation than controls but more than the first-episode group, while in the opercular part of the left inferior frontal gyrus/anterior insula, activation was greatest in the first-episode group, weakest in controls and intermediate in the ARMS group.
The ARMS is associated with abnormalities of regional brain function that are qualitatively similar to, but less severe than, those in patients who have just developed psychosis.
Lumateperone (ITI-007) is in late-phase clinical development for schizophrenia. Lumateperone has a unique mechanism of action that modulates serotonin, dopamine, and glutamate neurotransmission. This pooled analysis of lumateperone in 3 randomized, double-blind, placebo-controlled studies was conducted to evaluate the safety and tolerability of lumateperone 42mg (ITI-007 60mg).
Data were pooled from the 3 controlled late-phase studies of lumateperone 42mg in patients with acute exacerbation of schizophrenia. Safety assessments of all patients who received at least one dose of any treatment included treatment-emergent adverse events (TEAEs), changes in laboratory parameters, extrapyramidal symptoms (EPS), and vital signs.
The safety population comprised 1,073 patients (placebo [n=412], lumateperone 42mg [n=406], risperidone [n=255]). TEAEs that occurred in the lumateperone 42mg group at a rate of ≥5% and at least twice the placebo rate were somnolence/sedation (24.1% vs 10.0%) and dry mouth (5.9% vs 2.2%). The rate of discontinuation due to TEAEs with lumateperone 42mg (0.5%) was similar to placebo (0.5%) and lower than risperidone (4.7%). Mean change in weight and rates of EPS-related TEAEs were lower for lumateperone 42mg and placebo patients than for risperidone patients. Mean changes from baseline in metabolic parameters were similar or smaller for lumateperone 42mg vs placebo. Mean changes were notably higher in risperidone patients vs lumateperone 42mg and placebo for glucose, cholesterol, triglycerides, and prolactin.
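A minimal sketch of the reporting rule applied above, using the two rate pairs quoted in the text (the ≥5%-and-at-least-twice-placebo convention):

```python
# Flag TEAEs reported at >=5% in the drug arm and at least twice the
# placebo rate. Rates are the two pairs quoted in the text.
teae_rates = {                      # (lumateperone 42mg %, placebo %)
    "somnolence/sedation": (24.1, 10.0),
    "dry mouth": (5.9, 2.2),
}
for event, (drug, placebo) in teae_rates.items():
    flagged = drug >= 5.0 and drug >= 2.0 * placebo
    print(f"{event}: reportable={flagged}")
```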
In this pooled analysis, lumateperone 42mg showed good tolerability with potential benefits over risperidone for metabolic, prolactin, and EPS risks. The only TEAE that occurred in >10% of lumateperone patients was somnolence/sedation, which was impacted by morning administration; in subsequent studies that administered lumateperone in the evening, somnolence/sedation rates were markedly reduced. These results suggest that lumateperone 42mg may be a promising new treatment for schizophrenia.
Supported by funding from Intra-Cellular Therapies, Inc.
UK Biobank is a well-characterised cohort of over 500 000 participants, with data including genetics, environmental measures and imaging. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
To describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 years (53% were ≥65 years), and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was a common finding, with 24% (37 434) of participants meeting criteria; current hazardous/harmful alcohol use criteria were met by 21% (32 602), whereas other criteria were met by less than 8% of participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
Organic grain producers are interested in interseeding cover crops into corn (Zea mays L.) in regions with a narrow growing-season window for post-harvest establishment of cover crops. A field experiment was replicated across 2 years on three commercial organic farms in Pennsylvania to compare the effects of drill- and broadcast-interseeding with standard grower practices, which included post-harvest seeding of cereal rye (Secale cereale L.) at the more southern location and winter fallow at the more northern locations. Drill- and broadcast-interseeding treatments occurred just after last cultivation and used a cover crop mixture of annual ryegrass [Lolium perenne L. ssp. multiflorum (Lam.) Husnot] + orchardgrass (Dactylis glomerata L.) + forage radish (Raphanus sativus L. ssp. longipinnatus). Higher mean fall cover crop biomass and forage radish abundance (% of total) were observed in drill-interseeding treatments compared with broadcast-interseeding. However, corn grain yield, weed suppression, and N retention in late fall and spring were similar among interseeding treatments, which suggests that broadcast-interseeding at last cultivation has the potential to produce similar production and conservation benefits at lower labor and equipment costs in organic systems. Post-harvest seeding of cereal rye resulted in greater spring biomass production and N retention compared with interseeded cover crops at the southern location, whereas variable interseeding establishment success and dominance of winter-killed forage radish produced conditions that increased the likelihood of N loss at the more northern locations. Additional research is needed to contrast conservation benefits and management tradeoffs between interseeding and post-harvest establishment methods.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean (Glycine max L. Merr.) sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.) and forage radish (Raphanus sativus L.) interseeded into corn grain was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield, by 0.28 Mg ha−1, compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were the primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance than the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence had lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
Resource allocation planning for emergency medical services (EMS) systems determines the appropriate resources for optimal patient outcomes, including which paramedic qualification level to dispatch and how rapidly to respond. The British Columbia Emergency Health Services implemented a revised response plan in 2013.
A pre- and post-implementation methodology was used to evaluate the effect of the resource allocation plan revision on 24-hour mortality. All adult cases with evaluable outcome data (obtained through linked provincial health administrative data) were analyzed. Multivariable logistic regression was used to adjust for variations in other significant associated factors. Interrupted time series analysis was used to estimate immediate changes in the level or trend of the outcome after the start of the revised resource allocation plan implementation, while controlling for pre-existing trends.
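As a sketch of the interrupted time series approach, the segmented regression below estimates a level and slope change at the plan revision on simulated monthly data; none of the coefficients or variable names come from the BC EHS dataset.

```python
import numpy as np
import statsmodels.api as sm

# Segmented regression on a simulated monthly 24-hour mortality series
# with an intervention (plan revision) after month 12.
rng = np.random.default_rng(1)
months = np.arange(36)                       # e.g., Apr 2012 - Mar 2015
post = (months >= 12).astype(float)          # indicator: post-revision
time_since = np.maximum(months - 12, 0)      # months since revision

# Simulated outcome: small pre-trend, level drop, and steeper decline.
rate = (5.0 + 0.01 * months - 0.3 * post - 0.02 * time_since
        + rng.normal(0, 0.1, months.size))   # deaths per 100 calls

X = sm.add_constant(np.column_stack([months, post, time_since]))
fit = sm.OLS(rate, X).fit()
print(fit.params)  # [baseline, pre-trend, level change, trend change]
```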
The derived cohort comprised 562,546 cases (April 2012–March 2015). When adjusted for age, sex, urban/metro region, season, day, hour, and dispatch determinant, the probability of dying within 24 hours of an EMS call was 7% lower in the post-revision cohort (OR = 0.936; 95% CI: 0.886–0.989; p = 0.018). A subgroup analysis of immediately life-threatening cases demonstrated a similar effect (OR = 0.890; 95% CI: 0.808–0.981; p = 0.019). Time series analysis showed that the declines in the overall 24-hour mortality trend and in the trend for immediately life-threatening cases were both statistically significant (p < 0.001).
Comprehensive, evidence-informed reconstruction of a provincial EMS resource allocation plan is feasible. Despite changes in crew-level response and resource allocation, there was a significant decrease in 24-hour mortality in this pan-provincial, population-based cohort.