Chinese privet (Ligustrum sinense Lour.) is a deciduous to evergreen shrub with an expansive non-native global range. Control costs are often high, so land managers must carefully consider whether the plant’s potential negative effects warrant active management. To help facilitate this decision-making process, we reviewed and synthesized the literature on the potential ecological effects of L. sinense invasion. We also identified research gaps in need of further study. We found ample evidence of negative relationships between L. sinense invasion and native plant communities. While observational studies are not able to confirm whether L. sinense is driving these relationships, experimental evidence suggests that there is a cause-effect relationship. Of particular concern is the possibility that L. sinense could suppress forest regeneration and cause these areas to transition from forest to L. sinense-dominated shrublands. Although this outcome would obviously impact a wide variety of wildlife species, empirical evidence of negative effects of L. sinense on wildlife is limited, and some species may actually benefit from the additional cover and foraging opportunities that L. sinense can provide. Further research on the potential effects of L. sinense invasion on large-scale forest structure and wildlife populations is needed. In areas where L. sinense invasion is a concern, evidence suggests early detection and management can mitigate control costs.
The Murchison Widefield Array (MWA) is an open-access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer, with a ‘slew’ time less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid-response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc/cm³. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well-suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system has the capability to trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and using the Voltage Capture System (VCS, 0.1 ms integration) of the MWA and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–Dec 2018), and the VCS and buffered mode triggers will become available for observing in a future semester.
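The quoted 1–10 min delays follow from the standard cold-plasma dispersion relation; as an illustrative sketch (not part of the MWA triggering software), the delay between two observing frequencies can be estimated as:

```python
def dispersion_delay_s(dm, f_lo_mhz, f_hi_mhz):
    """Arrival-time delay (seconds) of the lower-frequency signal relative to
    the higher, for dispersion measure dm in pc/cm^3 and frequencies in MHz."""
    K = 4.149e3  # dispersion constant, s MHz^2 cm^3 / pc
    return K * dm * (f_lo_mhz ** -2 - f_hi_mhz ** -2)

# DM = 100 pc/cm^3 between 100 MHz and 1 GHz gives ~41 s (~0.7 min);
# DM = 2000 pc/cm^3 gives ~14 min -- consistent with the range quoted above.
print(dispersion_delay_s(100, 100, 1000))   # ~41.1 s
print(dispersion_delay_s(2000, 100, 1000))  # ~821.5 s
```

This large dispersive delay is what gives a fast-slewing low-frequency array time to repoint and still catch the prompt emission.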
Recent years have seen an exponential increase in the variety of healthcare data captured across numerous sources. However, mechanisms to leverage these data sources to support scientific investigation have remained limited. In 2013 the Pediatric Heart Network (PHN), funded by the National Heart, Lung, and Blood Institute, developed the Integrated CARdiac Data and Outcomes (iCARD) Collaborative with the goals of leveraging available data sources to aid in efficiently planning and conducting PHN studies; supporting integration of PHN data with other sources to foster novel research otherwise not possible; and mentoring young investigators in these areas. This review describes lessons learned through the development of iCARD, initial efforts and scientific output, challenges, and future directions. This information can aid in the use and optimisation of data integration methodologies across other research networks and organisations.
To evaluate the long-term efficacy of deutetrabenazine in patients with tardive dyskinesia (TD) by examining response rates from baseline in Abnormal Involuntary Movement Scale (AIMS) scores. Preliminary results of the responder analysis are reported here.
In the 12-week ARM-TD and AIM-TD studies, the odds of response to deutetrabenazine treatment were higher than the odds of response to placebo at all response levels, and there were low rates of overall adverse events and discontinuations associated with deutetrabenazine.
Patients with TD who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. The cumulative proportion of AIMS responders from baseline was assessed. Response was defined as a percent improvement from baseline for each patient from 10% to 90% in 10% increments. AIMS score was assessed by local site ratings for this analysis.
343 patients enrolled in the extension study (111 patients received placebo in the parent study and 232 patients received deutetrabenazine). At Week 54 (n=145; total daily dose [mean±standard error]: 38.1±0.9 mg), 63% of patients receiving deutetrabenazine achieved ≥30% response, 48% of patients achieved ≥50% response, and 26% achieved ≥70% response. At Week 80 (n=66; total daily dose: 38.6±1.1 mg), 76% of patients achieved ≥30% response, 59% of patients achieved ≥50% response, and 36% achieved ≥70% response. Treatment was generally well tolerated.
Patients who received long-term treatment with deutetrabenazine achieved response rates higher than those observed in positive short-term studies, indicating clinically meaningful long-term treatment benefit.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
To evaluate the long-term safety and tolerability of deutetrabenazine in patients with tardive dyskinesia (TD) at 2 years.
In the 12-week ARM-TD and AIM-TD studies, deutetrabenazine showed clinically significant improvements in Abnormal Involuntary Movement Scale scores compared with placebo, and there were low rates of overall adverse events (AEs) and discontinuations associated with deutetrabenazine.
Patients who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. Safety measures included incidence of AEs, serious AEs (SAEs), and AEs leading to withdrawal, dose reduction, or dose suspension. Exposure-adjusted incidence rates (EAIRs; incidence/patient-years) were used to compare AE frequencies for long-term treatment with those for short-term treatment (ARM-TD and AIM-TD). This analysis reports results up to 2 years (Week 106).
343 patients were enrolled (111 patients received placebo in the parent study and 232 received deutetrabenazine). There were 331.4 patient-years of exposure in this analysis. Through Week 106, EAIRs of AEs were comparable to or lower than those observed with short-term deutetrabenazine and placebo, including AEs of interest (akathisia/restlessness [long-term EAIR: 0.02; short-term EAIR range: 0–0.25], anxiety [0.09; 0.13–0.21], depression [0.09; 0.04–0.13], diarrhea [0.06; 0.06–0.34], parkinsonism [0.01; 0–0.08], somnolence/sedation [0.09; 0.06–0.81], and suicidality [0.02; 0–0.13]). The frequency of SAEs (EAIR 0.15) was similar to that observed with short-term placebo (0.33) and deutetrabenazine (range 0.06–0.33) treatment. AEs leading to withdrawal (0.08), dose reduction (0.17), and dose suspension (0.06) were uncommon.
These results confirm the safety outcomes seen in the ARM-TD and AIM-TD parent studies, demonstrating that deutetrabenazine is well tolerated for long-term use in TD patients.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding, including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. 
By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver-alloy embedded in its fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared to the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
Risk adjustment is needed to fairly compare central-line–associated bloodstream infection (CLABSI) rates between hospitals. Until 2017, the Centers for Disease Control and Prevention (CDC) methodology adjusted CLABSI rates only by type of intensive care unit (ICU). The 2017 CDC models also adjust for hospital size and medical school affiliation. We hypothesized that risk adjustment would be improved by including patient demographics and comorbidities from electronically available hospital discharge codes.
Using a cohort design across 22 hospitals, we analyzed data from ICU patients admitted between January 2012 and December 2013. Demographics and International Classification of Diseases, Ninth Edition, Clinical Modification (ICD-9-CM) discharge codes were obtained for each patient, and CLABSIs were identified by trained infection preventionists. Models adjusting only for ICU type and for ICU type plus patient case mix were built and compared using discrimination and standardized infection ratio (SIR). Hospitals were ranked by SIR for each model to examine and compare the changes in rank.
Overall, 85,849 ICU patients were analyzed and 162 (0.2%) developed CLABSI. The significant variables added to the ICU model were coagulopathy, paralysis, renal failure, malnutrition, and age. The C statistics were 0.55 (95% CI, 0.51–0.59) for the ICU-type model and 0.64 (95% CI, 0.60–0.69) for the ICU-type plus patient case-mix model. When the hospitals were ranked by adjusted SIRs, 10 hospitals (45%) changed rank when comorbidity was added to the ICU-type model.
Our risk-adjustment model for CLABSI using electronically available comorbidities demonstrated better discrimination than did the CDC model. The CDC should strongly consider comorbidity-based risk adjustment to more accurately compare CLABSI rates across hospitals.
To determine which comorbid conditions are considered causally related to central-line associated bloodstream infection (CLABSI) and surgical-site infection (SSI) based on expert consensus.
Using the Delphi method, we administered an iterative, 2-round survey to 9 infectious disease and infection control experts from the United States.
Based on our selection of components from the Charlson and Elixhauser comorbidity indices, 35 different comorbid conditions were rated from 1 (not at all related) to 5 (strongly related) by each expert separately for CLABSI and SSI, based on perceived relatedness to the outcome. To assign expert consensus on causal relatedness for each comorbid condition, all 3 of the following criteria had to be met at the end of the second round: (1) a majority (>50%) of experts rating the condition at 3 (somewhat related) or higher, (2) interquartile range (IQR) ≤ 1, and (3) standard deviation (SD) ≤ 1.
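The three consensus criteria can be expressed directly in code. This is an illustration only, not the study's analysis script, and the quartile method used for the IQR is our assumption, since the abstract does not specify one:

```python
import statistics

def has_consensus(ratings):
    """Apply the three Delphi criteria to one condition's expert ratings (1-5)."""
    # (1) majority (>50%) of experts rate the condition 3 or higher
    majority = sum(r >= 3 for r in ratings) > len(ratings) / 2
    # (2) interquartile range <= 1 (quartile method is an assumption here)
    q1, _, q3 = statistics.quantiles(ratings, n=4, method="inclusive")
    iqr_ok = (q3 - q1) <= 1
    # (3) sample standard deviation <= 1
    sd_ok = statistics.stdev(ratings) <= 1
    return majority and iqr_ok and sd_ok

# Hypothetical ratings from 9 experts:
print(has_consensus([4, 4, 3, 4, 5, 4, 3, 4, 4]))  # tight, high -> True
print(has_consensus([1, 1, 2, 5, 5, 1, 2, 3, 1]))  # split panel -> False
```

Requiring all three criteria jointly guards against a bare majority masking wide disagreement in the tails of the rating distribution.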
From round 1 to round 2, the IQR and SD, respectively, decreased for ratings of 21 of 35 (60%) and 33 of 35 (94%) comorbid conditions for CLABSI, and for 17 of 35 (49%) and 32 of 35 (91%) comorbid conditions for SSI, suggesting improvement in consensus among this group of experts. At the end of round 2, 13 of 35 (37%) and 17 of 35 (49%) comorbid conditions were perceived as causally related to CLABSI and SSI, respectively.
Our results have produced a list of comorbid conditions that should be analyzed as risk factors for and further explored for risk adjustment of CLABSI and SSI.
Epidemiological studies use georeferenced health data to identify disease clusters, but the accuracy of this georeferencing is compromised by incorrect assignment of the source of infection and by aggregation of case data to larger geographical areas. Often, place of residence (residence) is used as a proxy for the source of infection (source), which may not be accurate. Using a 21-year dataset from South Australia of human infections with the mosquito-borne Ross River virus, we found that 37% of cases were believed to have been acquired away from home. We constructed two risk maps using age-standardized morbidity ratios (SMRs) calculated using residence and patient-reported source. Both maps confirm significant inter-suburb variation in SMRs. Areas frequently named as the source (but not residence) and the highest-risk suburbs both tend to be tourist locations with vector mosquito habitat and camping or outdoor recreational opportunities. We suggest the highest-risk suburbs as places to focus on for disease control measures. We also use a novel application of ambient population data (LandScan) to improve the interpretation of these risk maps and propose how this approach can aid in implementing disease abatement measures on a smaller scale than that for which disease data are available.
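An SMR of this kind is the ratio of observed to expected cases, with expected counts obtained by indirect age standardization (applying reference age-specific rates to the local population). A minimal sketch, using hypothetical age bands and numbers rather than the study's data:

```python
def smr(observed_cases, pop_by_age, ref_rate_by_age):
    """Age-standardized morbidity ratio for one area: observed / expected,
    where expected = sum over age groups of reference rate x local population."""
    expected = sum(ref_rate_by_age[age] * n for age, n in pop_by_age.items())
    return observed_cases / expected

# Hypothetical suburb with 12 observed cases:
pop = {"0-29": 4000, "30-59": 5000, "60+": 1000}        # local population
ref = {"0-29": 0.0005, "30-59": 0.001, "60+": 0.002}    # reference rates
print(smr(12, pop, ref))  # expected = 2 + 5 + 2 = 9, so SMR = 12/9 ~ 1.33
```

An SMR above 1 indicates more cases than expected for the area's age structure, which is how the inter-suburb variation described above is quantified.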
During 1990 we surveyed the southern sky using a multi-beam receiver at frequencies of 4850 and 843 MHz. The half-power beamwidths were 4 and 25 arcmin, respectively. The finished surveys cover the declination range between +10 and −90 degrees, essentially complete in right ascension, an area of 7.30 steradians. Preliminary analysis of the 4850 MHz data indicates that we will achieve a five-sigma flux density limit of about 30 mJy. We estimate that we will find between 80 000 and 90 000 new sources above this limit. This is a revised version of the paper presented at the Regional Meeting by the first four authors; the surveys have now been completed.
Napiergrass has potential as a cellulosic biofuel crop because of its rapid growth habit in the southern United States. However, it is also listed as a potential invasive species by the Florida Exotic Pest Plant Council. For field renovation, information about napiergrass control in response to tillage and herbicides is required. Field studies were initiated to evaluate control of napiergrass established in fields for over 3 yr at Plains, GA, and Tifton, GA. For tillage and POST herbicides, imazapyr plus glyphosate consistently controlled napiergrass relative to diclosulam plus glyphosate, sulfentrazone plus glyphosate, or tillage in terms of visual injury, stem height, and dry biomass reduction. One application of imazapyr plus glyphosate controlled napiergrass 74 and 94%, and reduced plant stem height to 6 and 15% of the nontreated control. When diclosulam plus glyphosate, sulfentrazone plus glyphosate, or tillage was used alone with no sequential herbicides, napiergrass control ranged from 12 to 33%; when these control tactics were followed by two sequential applications of either sethoxydim or glyphosate, napiergrass control varied from 45 to 99%. Reductions in plant heights were reflective of injury 47 d after final herbicide applications (May/June). Napiergrass dry biomass yield was reduced by imazapyr plus glyphosate ≥86% relative to the nontreated control (NTC). Diclosulam plus glyphosate, sulfentrazone plus glyphosate, or tillage alone was not effective, reducing napiergrass dry biomass yields by only 1 to 47% compared with the NTC; when these treatments were followed by sequential applications of sethoxydim or glyphosate, napiergrass dry biomass was reduced 46 to 91% compared with the NTC. Tillage plus two applications of sethoxydim or glyphosate exhibited control potential because they provided levels of napiergrass control similar to imazapyr-based treatments.
Tillage plus multiple applications of sethoxydim or glyphosate offers flexibility to crop rotations as compared with the residual herbicide imazapyr, which has many crop rotation restrictions because of carryover concerns.
Silver Lake is the modern terminal playa of the Mojave River in southern California (USA). As a result, it is well located to record both influences from the winter-precipitation-dominated San Bernardino Mountains – the source of the Mojave River – and from the late summer to early fall North American monsoon at Silver Lake. Here, we present physical, chemical, and biological data from a new radiocarbon-dated, 8.2 m sediment core taken from Silver Lake that spans modern through 14.8 cal ka BP. Texturally, the core varies between sandy clay, clayey sand, and sand-silt-clay, often with abrupt sedimentological transitions. These grain-size changes are used to divide the core into six lake-status intervals over the past 14.8 cal ka BP. Notable intervals include a dry Younger Dryas chronozone, a wet early Holocene terminating 7.8–7.4 cal ka BP, a distinct mid-Holocene arid interval, and a late Holocene return to ephemeral lake conditions. A comparison to potential climatic forcings implicates a combination of changing summer–winter insolation and tropical and North Pacific sea-surface temperature dynamics as the primary drivers of Holocene climate in the central Mojave Desert.
Hearts in which the arterial trunks arise from the morphologically appropriate ventricles, but in a parallel manner, rather than the usual spiralling arrangement, have long fascinated anatomists. These rare entities, for quite some time, were considered embryological impossibilities, but ongoing experience has shown that they can be found in various segmental combinations. Problems still exist about how best to describe them, as the different variants are often described with esoteric terms, such as anatomically corrected malposition or isolated ventricular inversion. In this review, based on our combined clinical and morphological experience, we demonstrate that the essential feature of all hearts described in this manner is a parallel arrangement of the arterial trunks as they exit from the ventricular mass. We show that the relationship of the arterial roots needs to be described in terms of the underlying ventricular topology, rather than according to the arrangement of the atrial chambers. We then discuss the importance of determining atrial arrangement on the basis of the morphology of the appendages, following the precepts as set out in the so-called “morphological method” and distinguished according to the extent of the pectinate muscles relative to the atrioventricular junctions, as opposed to basing diagnosis on the venoatrial connections. We show that, when approached in this manner, the various combinations can be readily diagnosed in the clinical setting and described in a straightforward way.