Arid regions are especially vulnerable to climate change and land-use change. More than one-third of Earth's population relies on these ecosystems. Modern observations lack the temporal depth to determine vegetation responses to climate and human activity, but paleoecological and archaeological records can be used to investigate these relationships. Decreasing rainfall across the Late Holocene provides a case study for vegetation response to changing hydroclimate. Rock hyrax (Procavia capensis) middens preserve paleoenvironmental indicators in arid environments where traditional archives are unavailable. Pollen from modern middens collected in Dhofar, Oman, demonstrates the reliability of this archive. Pollen, stable isotope (δ13C, δ15N), and microcharcoal data from fossil middens reveal changes in vegetation, relative moisture, and fire from 4000 cal yr BP to the present. Trees today limited to moister areas (e.g., Terminalia) existed farther inland at ~3100 cal yr BP. After ~2900 cal yr BP, taxa with more xeric affiliations (e.g., Senegalia) increased. Coprophilous fungal spores (Sporormiella) and grazing-indicator pollen revealed an amplified signal of domesticate grazing at ~1000 cal yr BP. These results indicate that trees associated with semiarid environments were maintained in the interior desert during ~3000–4000 yr of decreasing rainfall and that the impacts of human activity intensified after the transition to a drier environment.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
The causal impacts of recreational cannabis legalization are not well understood due to the number of potential confounds. We sought to quantify possible causal effects of recreational cannabis legalization on substance use, substance use disorder, and psychosocial functioning, and whether vulnerable individuals are more susceptible to the effects of cannabis legalization than others.
We used a longitudinal, co-twin control design in 4043 twins (N = 240 pairs discordant on residence), first assessed in adolescence and now age 24–49, currently residing in states with different cannabis policies (40% resided in a recreationally legal state). We tested the effect of legalization on outcomes of interest and whether legalization interacts with established vulnerability factors (age, sex, or externalizing psychopathology).
In the co-twin control design accounting for earlier cannabis frequency and alcohol use disorder (AUD) symptoms respectively, the twin living in a recreational state used cannabis on average more often (βw = 0.11, p = 1.3 × 10⁻³), and had fewer AUD symptoms (βw = −0.11, p = 6.7 × 10⁻³) than their co-twin living in a non-recreational state. Cannabis legalization was associated with no other adverse outcome in the co-twin design, including cannabis use disorder. No risk factor significantly interacted with legalization status to predict any outcome.
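The within-pair (co-twin control) estimator described above can be sketched as follows. The data here are simulated, and the 0.11 effect size is reused only for illustration; this is not the study's analysis code.

```python
# Hypothetical sketch of a co-twin control ("within-pair") estimate:
# for each twin pair discordant on legal-state residence, regress the
# within-pair outcome difference on the within-pair exposure difference.
# Differencing removes anything the twins share (genes, rearing environment).
import numpy as np

rng = np.random.default_rng(0)
n_pairs = 240                       # pairs discordant on residence (as in the study)

# exposure: twin A lives in a recreational state (1), twin B does not (0)
legal_a, legal_b = np.ones(n_pairs), np.zeros(n_pairs)

# simulate cannabis-use frequency with a shared family component plus a
# small true within-pair effect of legalization (0.11, illustrative)
family = rng.normal(0, 1, n_pairs)
use_a = family + 0.11 * legal_a + rng.normal(0, 1, n_pairs)
use_b = family + 0.11 * legal_b + rng.normal(0, 1, n_pairs)

# within-pair estimator: the shared family component cancels in the difference
d_outcome = use_a - use_b
d_exposure = legal_a - legal_b      # always 1 here (discordant pairs)
beta_w = np.sum(d_exposure * d_outcome) / np.sum(d_exposure ** 2)
print(f"within-pair estimate of legalization effect: {beta_w:.3f}")
```

Because every pair is discordant, the estimator reduces to the mean within-pair outcome difference; the simulated estimate recovers the true effect up to sampling noise.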
Recreational legalization was associated with increased cannabis use and decreased AUD symptoms but was not associated with other maladaptations. These effects were maintained within twin pairs discordant for residence. Moreover, vulnerabilities to cannabis use were not exacerbated by the legal cannabis environment. Future research may investigate causal links between cannabis consumption and outcomes.
The rise of jawed vertebrates (gnathostomes) and extinction of nearly all jawless vertebrates (agnathans) is one of the most important transitions in vertebrate evolution, but the causes are poorly understood. Competition between agnathans and gnathostomes during the Devonian period is the most commonly hypothesized cause; however, no formal attempts to test this hypothesis have been made. Generally, competition between species increases as morphological similarity increases; therefore, this study uses the largest morphometric comparison to date of Silurian and Devonian agnathan and gnathostome groups to determine which groups were most and least likely to have competed. Five agnathan groups (Anaspida, Heterostraci, Osteostraci, Thelodonti, and Furcacaudiformes) were compared with five gnathostome groups (Acanthodii, Actinopterygii, Chondrichthyes, Placodermi, and Sarcopterygii) including taxa from most major orders. Morphological dissimilarity was measured by Gower's dissimilarity coefficient, and the differences between agnathan and gnathostome body forms across early vertebrate morphospace were compared using principal coordinate analysis. Our results indicate that competition between some agnathans and gnathostomes is plausible, but not all agnathan groups were similar to gnathostomes. Furcacaudiformes (fork-tailed thelodonts) are distinct from other early vertebrate groups and the least likely to have competed with other groups.
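Gower's dissimilarity coefficient, used above to quantify morphological similarity, averages per-trait dissimilarities over mixed numeric and categorical traits. A minimal sketch follows; the taxa and trait values are invented for illustration and only the formula follows Gower (1971).

```python
# Gower's dissimilarity for mixed trait data: numeric traits contribute a
# range-scaled absolute difference, categorical traits contribute 0 (match)
# or 1 (mismatch), and the results are averaged over traits compared.
def gower(a, b, ranges):
    """Mean per-trait dissimilarity between trait vectors a and b.
    `ranges` gives each numeric trait's observed range, or None to mark
    a categorical trait. None trait values are treated as missing."""
    total, used = 0.0, 0
    for x, y, r in zip(a, b, ranges):
        if x is None or y is None:
            continue                  # skip missing values
        if r is None:                 # categorical trait
            total += 0.0 if x == y else 1.0
        else:                         # numeric trait with known range r
            total += abs(x - y) / r
        used += 1
    return total / used

# two illustrative taxa: (body length cm, fin aspect ratio, tail type)
taxon_a = (12.0, 1.8, "forked")
taxon_b = (30.0, 1.2, "heterocercal")
trait_ranges = (50.0, 3.0, None)      # None flags the categorical trait
print(round(gower(taxon_a, taxon_b, trait_ranges), 3))
```

A pairwise matrix of such coefficients is what a principal coordinate analysis, as used in the study, would then ordinate into a morphospace.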
Despite extensive paleoenvironmental research on the postglacial history of the Kenai Peninsula, Alaska, uncertainties remain regarding the region's deglaciation, vegetation development, and past hydroclimate. To elucidate this complex environmental history, we present new proxy datasets from Hidden and Kelly lakes, located in the eastern Kenai lowlands at the foot of the Kenai Mountains, including sedimentological properties (magnetic susceptibility, organic matter, grain size, and biogenic silica), pollen and macrofossils, diatom assemblages, and diatom oxygen isotopes. We use a simple hydrologic and isotope mass balance model to constrain interpretations of the diatom oxygen isotope data. Results reveal that glacier ice retreated from Hidden Lake's headwaters by ca. 13.1 cal ka BP, and that groundwater was an important component of Kelly Lake's hydrologic budget in the Early Holocene. As the forest developed and the climate became wetter in the Middle to Late Holocene, Kelly Lake reached or exceeded its modern level. In the last ca. 75 years, rising temperature caused rapid changes in biogenic silica content and diatom oxygen isotope values. Our findings demonstrate the utility of mass balance modeling to constrain interpretations of paleolimnologic oxygen isotope data, and that groundwater can exert a strong influence on lake water isotopes, potentially confounding interpretations of regional climate.
To determine the incidence of severe acute respiratory coronavirus virus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
To describe the cumulative seroprevalence of severe acute respiratory coronavirus virus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory coronavirus virus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Incumbent city councillors have an almost insurmountable advantage in Canadian municipal elections. This article aims to improve our understanding of the municipal incumbency advantage by considering the ability of electors to correctly identify the two most competitive candidates in one's ward and the factors associated with being able to do so. Using survey data from the Canadian Municipal Election Study (CMES), we consider the case of the 2018 elections in Mississauga, a city with typically high rates of incumbent re-election. Survey respondents were asked to identify the two most competitive candidates in their local ward races. We find that comparatively few electors are able to recognize which challenger serves as the strongest threat to a sitting councillor, a finding that suggests that coordination problems may help to contribute to high rates of incumbent success. We identify several individual-level and ward-level correlates of correctly identifying the first-place and second-place finishers. We do note, however, that there is a significant amount of variation among the thousands of municipalities in Canada, so findings from this case should be tested in other settings, including larger or smaller cities where levels of information might be different.
To determine the effect of an electronic medical record (EMR) nudge in reducing total and inappropriate orders for hospital-onset Clostridioides difficile infection (HO-CDI) testing.
An interrupted time series analysis of HO-CDI orders 2 years before and 2 years after the implementation of an EMR intervention designed to reduce inappropriate HO-CDI testing. Orders for C. difficile testing were considered inappropriate if the patient had received a laxative or stool softener in the previous 24 hours.
Four hospitals in an academic healthcare network.
All patients with a C. difficile order after hospital day 3.
Orders for C. difficile testing in patients administered a laxative or stool softener in <24 hours triggered an EMR alert defaulting to cancellation of the order (“nudge”).
Of the 17,694 HO-CDI orders, 7% were inappropriate (8% preintervention vs 6% postintervention; P < .001). Monthly HO-CDI orders decreased by 21% postintervention (level-change rate ratio [RR], 0.79; 95% confidence interval [CI], 0.73–0.86), and the rate continued to decrease (postintervention trend change RR, 0.99; 95% CI, 0.98–1.00). The intervention was not associated with a level change in inappropriate HO-CDI orders (RR, 0.80; 95% CI, 0.61–1.05), but the postintervention inappropriate order rate decreased over time (RR, 0.95; 95% CI, 0.93–0.97).
An EMR nudge to minimize inappropriate ordering for C. difficile was effective at reducing HO-CDI orders, and likely contributed to decreasing the inappropriate HO-CDI order rate after the intervention.
To evaluate long-term efficacy of deutetrabenazine in patients with tardive dyskinesia (TD) by examining response rates from baseline in Abnormal Involuntary Movement Scale (AIMS) scores. Preliminary results of the responder analysis are reported in this analysis.
In the 12-week ARM-TD and AIM-TD studies, the odds of response to deutetrabenazine treatment were higher than the odds of response to placebo at all response levels, and there were low rates of overall adverse events and discontinuations associated with deutetrabenazine.
Patients with TD who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration and a long-term maintenance phase. The cumulative proportion of AIMS responders from baseline was assessed. Response was defined as a percent improvement from baseline for each patient from 10% to 90% in 10% increments. AIMS score was assessed by local site ratings for this analysis.
343 patients enrolled in the extension study (111 patients received placebo in the parent study and 232 patients received deutetrabenazine). At Week 54 (n=145; total daily dose [mean±standard error]: 38.1±0.9 mg), 63% of patients receiving deutetrabenazine achieved ≥30% response, 48% of patients achieved ≥50% response, and 26% achieved ≥70% response. At Week 80 (n=66; total daily dose: 38.6±1.1 mg), 76% of patients achieved ≥30% response, 59% of patients achieved ≥50% response, and 36% achieved ≥70% response. Treatment was generally well tolerated.
Patients who received long-term treatment with deutetrabenazine achieved response rates higher than those observed in positive short-term studies, indicating clinically meaningful long-term treatment benefit.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
To evaluate the long-term safety and tolerability of deutetrabenazine in patients with tardive dyskinesia (TD) at 2 years.
In the 12-week ARM-TD and AIM-TD studies, deutetrabenazine showed clinically significant improvements in Abnormal Involuntary Movement Scale scores compared with placebo, and there were low rates of overall adverse events (AEs) and discontinuations associated with deutetrabenazine.
Patients who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. Safety measures included incidence of AEs, serious AEs (SAEs), and AEs leading to withdrawal, dose reduction, or dose suspension. Exposure-adjusted incidence rates (EAIRs; incidence/patient-years) were used to compare AE frequencies for long-term treatment with those for short-term treatment (ARM-TD and AIM-TD). This analysis reports results up to 2 years (Week 106).
343 patients were enrolled (111 patients received placebo in the parent study and 232 received deutetrabenazine). There were 331.4 patient-years of exposure in this analysis. Through Week 106, EAIRs of AEs were comparable to or lower than those observed with short-term deutetrabenazine and placebo, including AEs of interest (akathisia/restlessness [long-term EAIR: 0.02; short-term EAIR range: 0–0.25], anxiety [0.09; 0.13–0.21], depression [0.09; 0.04–0.13], diarrhea [0.06; 0.06–0.34], parkinsonism [0.01; 0–0.08], somnolence/sedation [0.09; 0.06–0.81], and suicidality [0.02; 0–0.13]). The frequency of SAEs (EAIR 0.15) was similar to those observed with short-term placebo (0.33) and deutetrabenazine (range 0.06–0.33) treatment. AEs leading to withdrawal (0.08), dose reduction (0.17), and dose suspension (0.06) were uncommon.
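The exposure-adjusted incidence rates (EAIRs) reported above are simply event counts divided by total patient-years of follow-up. A minimal sketch, using the 331.4 patient-years figure from the abstract and a hypothetical event count:

```python
# Illustrative exposure-adjusted incidence rate (EAIR) calculation:
# events per patient-year of follow-up. The event count below is
# hypothetical; only the patient-years total is taken from the abstract.
def eair(n_events, patient_years):
    # incidence rate = events / total follow-up time
    return n_events / patient_years

patient_years = 331.4          # total exposure reported in the extension study
n_events = 20                  # hypothetical event count for illustration
print(round(eair(n_events, patient_years), 2))   # events per patient-year
```

Normalizing by exposure time, rather than by patient count, is what makes a 2-year open-label extension comparable with the 12-week parent studies.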
These results confirm the safety outcomes seen in the ARM-TD and AIM-TD parent studies, demonstrating that deutetrabenazine is well tolerated for long-term use in TD patients.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
A shattercane biotype resistant to primisulfuron was identified during greenhouse evaluations of seed obtained from southeastern and south-central Nebraska fields previously treated with primisulfuron or nicosulfuron a minimum of 3 consecutive yr. Absorption, translocation, metabolism, and acetolactate synthase (ALS) assay experiments were conducted to determine resistance mechanism(s) by comparing ALS-susceptible forage sorghum (ROX) to resistant shattercane (RS). The ROX had 10 and 12% greater 14C absorption than RS 24 and 96 hours after treatment (HAT), respectively. Absorption of 14C increased over time for both ROX and RS, while 14C translocation from the treated leaf to the shoots and roots was similar for ROX and RS. Unmetabolized primisulfuron accounted for > 80% of the radioactivity recovered 24 h after application for both ROX and RS. The ROX and RS had similar ALS-specific activities and total protein concentrations. Km values for RS and ROX were 10.4 and 5.8 mM pyruvate, respectively. The ALS isolated from RS was less sensitive than ROX to inhibition by primisulfuron. The I50 values for RS and ROX were 231 and 0.025 μM primisulfuron, respectively. The mechanism of primisulfuron resistance in this RS biotype is an altered ALS with decreased sensitivity to primisulfuron.
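The magnitude of resistance implied by the I50 values above can be computed directly as a resistance ratio. A minimal sketch using the reported values:

```python
# Resistance ratio from the reported I50 values: the primisulfuron
# concentration needed to inhibit ALS activity by 50% in resistant
# shattercane (RS) relative to susceptible forage sorghum (ROX).
i50_rs = 231.0        # uM primisulfuron, resistant shattercane (from abstract)
i50_rox = 0.025       # uM primisulfuron, susceptible forage sorghum

resistance_ratio = i50_rs / i50_rox
print(f"RS is ~{resistance_ratio:.0f}-fold less sensitive than ROX")
```

A ratio of roughly 9,000-fold, far beyond what differential absorption or metabolism could explain, is consistent with the abstract's conclusion that the mechanism is an altered, less sensitive ALS enzyme.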
Tephra-fall deposits from Cook Inlet volcanoes were detected in sediment cores from Tustumena and Paradox Lakes, Kenai Peninsula, Alaska, using magnetic susceptibility and petrography. The ages of tephra layers were estimated using 21 14C ages on macrofossils. Tephra layers are typically fine, gray ash, 1–5 mm thick, and composed of varying proportions of glass shards, pumice, and glass-coated phenocrysts. Of the two lakes, Paradox Lake contained a higher frequency of tephra (0.8 tephra/100 yr; 109 over the 13,200-yr record). The unusually large number of tephra in this lake relative to others previously studied in the area is attributed to the lake's physiography, sedimentology, and limnology. The frequency of ash fall was not constant through the Holocene. In Paradox Lake, tephra layers are absent between ca. 800–2200, 3800–4800, and 9000–10,300 cal yr BP, despite continuously layered lacustrine sediment. In contrast, between 5000 and 9000 cal yr BP, an average of 1.7 tephra layers are present per 100 yr. The peak period of tephra fall (7000–9000 cal yr BP; 2.6 tephra/100 yr) in Paradox Lake is consistent with the increase in volcanism between 7000 and 9000 yr ago recorded in the Greenland ice cores.
High-resolution pollen and magnetic susceptibility (MS) analyses have been carried out on a sediment core taken from a high-elevation alpine bog located in the Sierra Nevada, southern Spain. The earliest part of the record, from 8200 to about 7000 cal yr BP, is characterized by the highest abundance of arboreal pollen and Pediastrum, indicating the warmest and wettest conditions in the area at that time. The pollen record shows a progressive aridification since 7000 cal yr BP that occurred in two steps, first shown by a decrease in Pinus, replaced by Poaceae from 7000 to 4600 cal yr BP and then by Cyperaceae, Artemisia and Amaranthaceae from 4600 to 1200 cal yr BP. Pediastrum also decreased progressively and totally disappeared at ca. 3000 yr ago. The progressive aridification is punctuated by periodically enhanced drought at ca. 6500, 5200 and 4000 cal yr BP that coincide in timing and duration with well-known dry events in the Mediterranean and other areas. Since 1200 cal yr BP, several changes are observed in the vegetation that probably indicate high human impact in the Sierra Nevada, with pasturing leading to nutrient enrichment and eutrophication of the bog, Pinus reforestation and Olea cultivation at lower elevations.
Conifer wood, probably spruce (Picea sp.), of middle Wisconsinan age (29,200 ± 500 yr B.P.) was recovered from late-glacial lake sediments from Upper South Branch Pond, Maine. If the wood was derived from a local source, deglaciation of part of northern New England is suggested for this time. The occurrence also has implications for understanding the problem associated with radiocarbon dating of bulk lake sediment containing small amounts of organic matter.