Commercialization of crops resistant to dicamba application is a major concern for sweetpotato producers because of potential negative impacts from herbicide drift or sprayer contamination events. A field study was initiated in 2014 and repeated in 2015 to assess the impact on sweetpotato of reduced rates of the BAPMA or DGA salt of dicamba, glyphosate, or either dicamba salt in combination with glyphosate, each evaluated in separate trials. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1000 of the 1x use rate of each dicamba formulation (0.56 kg ha-1), glyphosate (1.12 kg ha-1), and the combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial or storage root development (30 d after transplanting) in a separate trial. Injury from either salt of dicamba (BAPMA or DGA), applied alone or with glyphosate, was generally equal to or greater than that from glyphosate applied alone at equivalent rates, indicating that injury in the combination is most attributable to dicamba. A quadratic increase in crop injury and a quadratic decrease in crop yield (for most yield grades) were observed with increasing herbicide rate when dicamba was applied alone or in combination with glyphosate at storage root development. With few exceptions, neither this relationship nor a significant effect of herbicide rate on crop injury or sweetpotato yield was observed when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10x) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern upon initial observation by sweetpotato producers.
However, in some cases yield reductions of no. 1 and marketable grades were observed following 1/250, 1/100, or 1/10x application rates of dicamba alone or with glyphosate when applied at storage root development.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these observing programs have recorded 20 000 h of data, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise yet share commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with establishment of the CBRNE medical operations science support expert, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Identifying routes of transmission among hospitalized patients during a healthcare-associated outbreak can be tedious, particularly among patients with complex hospital stays and multiple exposures. Data mining of the electronic health record (EHR) has the potential to rapidly identify common exposures among patients suspected of being part of an outbreak.
We retrospectively analyzed 9 hospital outbreaks that occurred during 2011–2016 and that had previously been characterized both according to transmission route and by molecular characterization of the bacterial isolates. We determined (1) the ability of data mining of the EHR to identify the correct route of transmission, (2) how early the correct route was identified during the timeline of the outbreak, and (3) how many cases in the outbreaks could have been prevented had the system been running in real time.
Correct routes were identified for all outbreaks by the second patient, except for 1 outbreak involving >1 transmission route, which was detected at the eighth patient. Assuming initiation of an effective intervention within 7 or 14 days of identification of the transmission route, up to 40 infections (78% of possible preventable infections) or 34 infections (66%), respectively, could have been prevented had data mining been running in real time.
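The denominator behind those percentages is not stated explicitly, but it can be backed out of the reported figures; a minimal sketch of the arithmetic (the ~51-infection total is inferred here for illustration, not taken from the source):

```python
# Reported: up to 40 infections prevented (78% of possible preventable
# infections) with a 7-day intervention window, or 34 (66%) with 14 days.
prevented_7d, share_7d = 40, 0.78
prevented_14d, share_14d = 34, 0.66

# Back out the implied total of possible preventable infections.
implied_total_7d = prevented_7d / share_7d      # ~51.3
implied_total_14d = prevented_14d / share_14d   # ~51.5

# Both windows imply roughly the same total, as expected for one outbreak set.
print(f"{implied_total_7d:.1f} vs {implied_total_14d:.1f}")
```

Both windows point to roughly 51 possible preventable infections, a useful internal consistency check on the reported percentages.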
Data mining of the EHR was accurate for identifying routes of transmission among patients who were part of the outbreak. Prospective validation of this approach using routine whole-genome sequencing and data mining of the EHR for both outbreak detection and route attribution is ongoing.
Increasing evidence indicates that gut microbiota may influence colorectal cancer risk. Diet, particularly fibre intake, may modify gut microbiota composition, which may affect cancer risk. We investigated the relationship between dietary fibre intake and gut microbiota in adults. Using 16S rRNA gene sequencing, we assessed gut microbiota in faecal samples from 151 adults in two independent study populations: National Cancer Institute (NCI), n 75, and New York University (NYU), n 76. We calculated energy-adjusted fibre intake based on FFQ. For each study population, with adjustment for age, sex, race, BMI and smoking, we evaluated the relationship between fibre intake and gut microbiota community composition and taxon abundance. Total fibre intake was significantly associated with overall microbial community composition in NYU (P=0·008) but not in NCI (P=0·81). In a meta-analysis of both study populations, higher fibre intake tended to be associated with genera of class Clostridia, including higher abundance of SMB53 (fold change (FC)=1·04, P=0·04), Lachnospira (FC=1·03, P=0·05) and Faecalibacterium (FC=1·03, P=0·06), and lower abundance of Actinomyces (FC=0·95, P=0·002), Odoribacter (FC=0·95, P=0·03) and Oscillospira (FC=0·96, P=0·06). A species-level meta-analysis showed that higher fibre intake was marginally associated with greater abundance of Faecalibacterium prausnitzii (FC=1·03, P=0·07) and lower abundance of Eubacterium dolichum (FC=0·96, P=0·04) and Bacteroides uniformis (FC=0·97, P=0·05). Thus, dietary fibre intake may impact gut microbiota composition, particularly class Clostridia, and may favour putatively beneficial bacteria such as F. prausnitzii. These findings warrant further investigation of diet–microbiota relationships to inform the future development of colorectal cancer prevention strategies.
OBJECTIVES/SPECIFIC AIMS: We aimed to develop an assay to measure new protein synthesis after antisense oligonucleotide (ASO) treatment, which we hypothesized to be the earliest biochemical indication of RNA-targeting therapy efficacy. METHODS/STUDY POPULATION: We treated 2 transgenic animal models expressing proteins implicated in neurodegenerative disease, human tau protein (hTau) and human superoxide dismutase 1 (hSOD1), with ASOs targeting these mRNA transcripts. Animals received stable-isotope-labeled 13C6-leucine via drinking water to label newly synthesized proteins. We assayed target protein synthesis and concentration after ASO treatment to determine the earliest identification of ASO target engagement. RESULTS/ANTICIPATED RESULTS: hTau ASO treatment in transgenic mice lowered hTau protein concentration in cortex 23 days post-treatment (95% CI: 0.05%–64.0% reduction). In the same tissue, we observed lowering of hTau protein synthesis as early as 13 days (95% CI: 29.4%–123%). In hSOD1 transgenic rats, we observed lowering of 13C6-leucine-labeled hSOD1 in the cerebrospinal fluid 30 days after ASO treatment compared with an inactive ASO control (95% CI: 12.0%–48.4%). DISCUSSION/SIGNIFICANCE OF IMPACT: In progressive neurodegenerative diseases, it is crucial to develop measurements that identify treatment efficacy early in order to improve patient outcomes. These data support the use of stable isotope labeling of amino acids to measure new protein synthesis as an early pharmacodynamic measurement for therapies that target RNA and inhibit protein translation.
The Cambrian-Ordovician Diversity Plateau, between the Cambrian Explosion and the Ordovician Radiation, is punctuated by a series of well-documented Laurentian trilobite extinction events. These events define the bounding surfaces of trilobite ‘biomeres’ that correspond to North American stages, including the Sunwaptan and Skullrockian. Trilobites show a consistent pattern of recovery across these boundaries, and each extinction and replacement of taxa is commonly interpreted as a single event, in which changing environmental conditions spurred shoreward migration of shelf or oceanic faunas that displaced established cratonic faunas. Linguliform brachiopods are also abundant in strata of this interval, and we investigate their distribution across the Sunwaptan–Skullrockian Stage boundary in Texas through high-resolution stratigraphic sampling of subtidal sediments. We document complete genus- and species-level turnover of the linguliform brachiopod fauna coincident with the trilobite extinction events, suggesting that these brachiopods were affected by the same factors that affected trilobites. The Skullrockian replacement fauna was cosmopolitan, with ties to Gondwana and Kazakhstan and to the Laurentian shelf environment. The timing of appearances of taxa suggests that the fauna migrated onto the Laurentian shelf from elsewhere during a transgression. The disappearance of the Sunwaptan fauna and the arrival of the Skullrockian fauna are distinct events. We suggest that ‘biomere’ events may be complex, and that the cause of an extinction is not necessarily the same event that facilitates the appearance of a replacement fauna. We describe one new species, Schizambon langei.
While previous work showed that the Centers for Disease Control and Prevention toolkit for carbapenem-resistant Enterobacteriaceae (CRE) can reduce regional spread, these interventions are costly, and decision makers want to know whether and when economic benefits occur.
Orange County, California
Using our Regional Healthcare Ecosystem Analyst (RHEA)-generated agent-based model of all inpatient healthcare facilities, we simulated the implementation of the CRE toolkit (active screening of interfacility transfers) in different ways and estimated their economic impacts under various circumstances.
Compared to routine control measures, screening generated cost savings by year 1 when hospitals implemented screening after identifying ≤20 CRE cases (saving $2,000–$9,000) and by year 7 if all hospitals implemented screening in a coordinated regional manner after 1 hospital identified a CRE case (hospital perspective). From the third-party payer perspective, cost savings were achieved by year 1 only if hospitals independently screened after identifying 10 cases. From the societal perspective, cost savings were achieved by year 1 if hospitals independently screened after identifying 1 CRE case and by year 3 if all hospitals coordinated and screened after 1 hospital identified 1 case. After a few years, all strategies cost less and had better health effects than routine control measures, and most strategies generated a positive cost benefit each year.
Active screening of interfacility transfers garnered cost savings in year 1 of implementation when hospitals acted independently and by year 3 if all hospitals collectively implemented the toolkit in a coordinated manner. Despite taking longer to manifest, coordinated regional control resulted in greater savings over time.
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
A prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Over the study period, 794 patients met inclusion criteria. A MACE at 30 days was present in 10.7% (85/794) of patients with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE was 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
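The reported sensitivity and NPV follow directly from the counts given above; a minimal sketch of that arithmetic (variable names are ours, all figures from the text):

```python
# Counts reported in the study
total_patients = 794     # met inclusion criteria
mace = 85                # patients with MACE at 30 days
low_risk = 264           # classified low risk by the modified HEART score
false_negatives = 5      # low-risk patients who nonetheless had MACE

# Sensitivity: MACE patients correctly classified as not low risk
true_positives = mace - false_negatives       # 80
sensitivity = true_positives / mace           # 80/85

# Negative predictive value: low-risk patients who were truly event-free
true_negatives = low_risk - false_negatives   # 259
npv = true_negatives / low_risk               # 259/264

print(f"Sensitivity: {sensitivity:.1%}")      # Sensitivity: 94.1%
print(f"NPV: {npv:.1%}")                      # NPV: 98.1%
```

Both values match the 94.1% and 98.1% point estimates reported above.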
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.
Extinction is the complete loss of a species, but the accuracy of that status depends on the overall information about the species. Dracaena umbraculifera was described in 1797 from a cultivated plant attributed to Mauritius, but repeated surveys failed to relocate it and it was categorized as Extinct on the IUCN Red List. However, several individuals labelled as D. umbraculifera grow in botanical gardens, suggesting that the species’ IUCN status may be inaccurate. The goal of this study was to understand (1) where D. umbraculifera originated, (2) which species are its close relatives, (3) whether it is extinct, and (4) the identity of the botanical garden accessions and whether they have conservation value. We sequenced a cpDNA region of Dracaena from Mauritius, botanical garden accessions labelled as D. umbraculifera, and individuals confirmed to be D. umbraculifera based on morphology, one of which is a living plant in a private garden. We included GenBank accessions of Dracaena from Madagascar and other locations and reconstructed the phylogeny using Bayesian and parsimony approaches. Phylogenies indicated that D. umbraculifera is more closely related to Dracaena reflexa from Madagascar than to Mauritian Dracaena. As anecdotal information indicated that the living D. umbraculifera originated from Madagascar, we conducted field expeditions there and located five wild populations; the species’ IUCN status should therefore be Critically Endangered because < 50 wild individuals remain. Although the identity of many botanical garden samples remains unresolved, this study highlights the importance of living collections for facilitating new discoveries and the importance of documenting and conserving the flora of Madagascar.
The Arctic marine environment is undergoing a transition from thick multi-year to first-year sea-ice cover, with a coincident lengthening of the melt season. Such changes are evident in the Baffin Bay–Davis Strait–Labrador Sea (BDL) region, where melt onset has occurred ~8 days decade−1 earlier from 1979 to 2015. A series of anomalously early events has occurred since the mid-1990s, overlapping a period of increased upper-air ridging across Greenland and the northwestern North Atlantic. We investigate an extreme early melt event observed in spring 2013 (~6σ earlier than the 1981–2010 melt climatology) with respect to preceding sub-seasonal mid-tropospheric circulation conditions, as described by a daily Greenland Blocking Index (GBI). The 40 days prior to the 2013 BDL melt onset were characterized by a persistent, strong 500 hPa anticyclone over the region (GBI >+1 on >75% of days). This circulation pattern advected warm air from northeastern Canada and the northwestern Atlantic poleward onto the thin, first-year sea ice and caused melt ~50 days earlier than normal. The episodic increase in this ridging pattern near western Greenland, exemplified in 2013 by large positive GBI values, is an important recent process affecting atmospheric circulation over a North Atlantic cryosphere undergoing accelerated regional climate change.