To evaluate the long-term safety and tolerability of deutetrabenazine in patients with tardive dyskinesia (TD) at 2 years.
In the 12-week ARM-TD and AIM-TD studies, deutetrabenazine showed clinically significant improvements in Abnormal Involuntary Movement Scale scores compared with placebo, and there were low rates of overall adverse events (AEs) and discontinuations associated with deutetrabenazine.
Patients who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. Safety measures included incidence of AEs, serious AEs (SAEs), and AEs leading to withdrawal, dose reduction, or dose suspension. Exposure-adjusted incidence rates (EAIRs; incidence/patient-years) were used to compare AE frequencies for long-term treatment with those for short-term treatment (ARM-TD and AIM-TD). This analysis reports results up to 2 years (Week 106).
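For illustration, a minimal sketch of the EAIR calculation described above, using hypothetical counts rather than study data:

```python
# EAIR = number of patients with the event / total patient-years of exposure.
# Counts here are hypothetical, for illustration only.

def eair(patients_with_event: int, patient_years: float) -> float:
    """Exposure-adjusted incidence rate, in events per patient-year."""
    return patients_with_event / patient_years

# e.g., 7 patients reporting an AE over 331.4 patient-years of exposure
print(f"EAIR = {eair(7, 331.4):.2f} per patient-year")  # ~0.02
```

Expressing AE frequencies per patient-year allows a 2-year extension study to be compared directly with 12-week parent studies despite the unequal follow-up.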
A total of 343 patients were enrolled (111 had received placebo in the parent study and 232 had received deutetrabenazine). There were 331.4 patient-years of exposure in this analysis. Through Week 106, EAIRs of AEs were comparable to or lower than those observed with short-term deutetrabenazine and placebo, including AEs of interest (akathisia/restlessness [long-term EAIR: 0.02; short-term EAIR range: 0–0.25], anxiety [0.09; 0.13–0.21], depression [0.09; 0.04–0.13], diarrhea [0.06; 0.06–0.34], parkinsonism [0.01; 0–0.08], somnolence/sedation [0.09; 0.06–0.81], and suicidality [0.02; 0–0.13]). The frequency of SAEs (EAIR 0.15) was similar to that observed with short-term placebo (0.33) and deutetrabenazine (range 0.06–0.33) treatment. AEs leading to withdrawal (0.08), dose reduction (0.17), and dose suspension (0.06) were uncommon.
These results confirm the safety outcomes seen in the ARM-TD and AIM-TD parent studies, demonstrating that deutetrabenazine is well tolerated for long-term use in TD patients.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
To evaluate the long-term efficacy of deutetrabenazine in patients with tardive dyskinesia (TD) by examining response rates from baseline in Abnormal Involuntary Movement Scale (AIMS) scores. Preliminary results of this responder analysis are reported here.
In the 12-week ARM-TD and AIM-TD studies, the odds of response to deutetrabenazine treatment were higher than the odds of response to placebo at all response levels, and there were low rates of overall adverse events and discontinuations associated with deutetrabenazine.
Patients with TD who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. The cumulative proportion of AIMS responders from baseline was assessed. Response was defined as a percent improvement from baseline for each patient from 10% to 90% in 10% increments. AIMS scores were assessed by local site ratings for this analysis.
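A minimal sketch of the cumulative responder calculation defined above, using hypothetical AIMS scores:

```python
# Cumulative proportion of AIMS responders: the share of patients whose
# score improved from baseline by at least each threshold (10%-90%).
# Scores below are hypothetical, for illustration only.

baseline = [14.0, 10.0, 12.0, 9.0, 16.0]
week_54  = [ 6.0,  7.0,  4.0, 8.0,  5.0]

pct_improvement = [(b - w) / b * 100 for b, w in zip(baseline, week_54)]

for threshold in range(10, 100, 10):  # 10% to 90% in 10% increments
    responders = sum(p >= threshold for p in pct_improvement)
    print(f">={threshold}% response: {responders}/{len(baseline)}")
```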
A total of 343 patients were enrolled in the extension study (111 had received placebo in the parent study and 232 had received deutetrabenazine). At Week 54 (n=145; total daily dose [mean±standard error]: 38.1±0.9 mg), 63% of patients receiving deutetrabenazine achieved ≥30% response, 48% achieved ≥50% response, and 26% achieved ≥70% response. At Week 80 (n=66; total daily dose: 38.6±1.1 mg), 76% of patients achieved ≥30% response, 59% achieved ≥50% response, and 36% achieved ≥70% response. Treatment was generally well tolerated.
Patients who received long-term treatment with deutetrabenazine achieved response rates higher than those observed in positive short-term studies, indicating clinically meaningful long-term treatment benefit.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
Modern high-throughput molecular and analytical tools offer exciting opportunities to gain a mechanistic understanding of unique traits of weeds. During the past decade, tremendous progress has been made within the weed science discipline using genomic techniques to gain deeper insights into weedy traits such as invasiveness, hybridization, and herbicide resistance. Though the adoption of newer “omics” techniques such as proteomics, metabolomics, and physionomics has been slow, applications of these omics platforms to study plants, especially agriculturally important crops and weeds, have been increasing over the years. In weed science, these platforms are now used more frequently to understand mechanisms of herbicide resistance, weed resistance evolution, and crop–weed interactions. Use of these techniques could help weed scientists to further reduce the knowledge gaps in understanding weedy traits. Although these techniques can provide robust insights into the molecular functioning of plants, employing a single omics platform can rarely elucidate the gene-level regulation and the associated real-time expression of weedy traits, owing to the complex and overlapping nature of biological interactions. Therefore, it is desirable to integrate the different omics technologies to give a better understanding of the molecular functioning of biological systems. This multidimensional integrated approach can offer new avenues for better understanding of questions of interest to weed scientists. This review offers a retrospective and prospective examination of omics platforms employed to investigate weed physiology, as well as novel approaches and new technologies that can provide holistic and knowledge-based weed management strategies for the future.
Leafy spurge (Euphorbia esula L.) is an invasive perennial weed infesting range and recreational lands of North America. Previous research and omics projects with E. esula have helped develop it as a model for studying many aspects of perennial plant development and response to abiotic stress. However, the lack of an assembled genome for E. esula has limited the power of previous transcriptomics studies to identify functional promoter elements and transcription factor binding sites. An assembled genome for E. esula would enhance our understanding of signaling processes controlling plant development and responses to environmental stress and provide a better understanding of genetic factors impacting weediness traits, evolution, and herbicide resistance. A comprehensive transcriptome database would also assist in analyzing future RNA-seq studies and is needed to annotate and assess genomic sequence assemblies. Here, we assembled and annotated 56,234 unigenes from an assembly of 589,235 RNA-seq-derived contigs and a previously published Sanger-sequenced expressed sequence tag collection. The resulting data indicate that we now have sequence for >90% of the expressed E. esula protein-coding genes. We also assembled the gene space of E. esula by using a limited coverage (18X) genomic sequence database. In this study, the programs Velvet and Trinity produced the best gene-space assemblies based on representation of expressed and conserved eukaryotic genes. The results indicate that E. esula contains as much as 23% repetitive sequences, of which 11% are unique. Our sequence data were also sufficient for assembling a full chloroplast and partial mitochondrial genome. Further, marker analysis identified more than 150,000 high-quality variants in our E. esula L-RNA–scaffolded, whole-genome, Trinity-assembled genome. Based on these results, E. esula appears to have limited heterozygosity. This study provides a blueprint for low-cost genomic assemblies in weed species and new resources for identifying conserved and novel promoter regions among coordinately expressed genes of E. esula.
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices poses unique challenges, including time pressures from bed control personnel, the need to identify eligible rooms efficiently, negative perceptions among nurse managers, and high discharge volume. In the course of the BETR Disinfection Study, we used several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
It has been an underlying assumption in many studies that near-surface layers imaged by ground-penetrating radar (GPR) can be interpreted as depositional markers or isochrones. It has been shown that GPR layers can be approximately reproduced from the measured electrical properties of ice, but these material layers are generally narrower and more closely spaced than can be resolved by typical GPR systems operating in the range 50–400 MHz. Thus, GPR layers should be interpreted as interference patterns produced from closely spaced and potentially discontinuous material layers, and should not be assumed to be precise markers of isochrones. We present 100 MHz GPR data from Lyddan Ice Rise, Antarctica, in which near-surface (<50 m deep) layers are clearly imaged. The growth of the undulations in these layers with depth is approximately linear, implying that, rather than resulting from a pattern of vertical strain rate, they correspond to some pattern of snowfall variation. Furthermore, comparison of the GPR layers with snow-stake measurements suggests that around 80% of the rms variability in mean annual accumulation is present in the GPR layers. The observations suggest that, at least in this case, the GPR layers do approximate isochrones, and that patterns of snow accumulation over Lyddan Ice Rise are dominated by extremely persistent spatial variations with only a small residual spatial variability. If this condition is shown to be widely applicable, it may reduce the period required for measurements of surface elevation change to be taken as significant indications of mass imbalance.
To determine whether antimicrobial-impregnated textiles decrease the acquisition of pathogens by healthcare provider (HCP) clothing.
We completed a 3-arm randomized controlled trial to test the efficacy of 2 types of antimicrobial-impregnated clothing compared to standard HCP clothing. Cultures were obtained from each nurse participant, the healthcare environment, and patients during each shift. The primary outcome was the change in total contamination on nurse scrubs, measured as the sum of colony-forming units (CFU) of bacteria.
PARTICIPANTS AND SETTING
Nurses working in medical and surgical ICUs in a 936-bed tertiary-care hospital.
Nurse subjects wore standard cotton-polyester surgical scrubs (control), scrubs that contained a complex element compound with a silver alloy embedded in the fibers (Scrub 1), or scrubs impregnated with an organosilane-based quaternary ammonium and a hydrophobic fluoroacrylate copolymer emulsion (Scrub 2). Nurse participants were blinded to scrub type and randomly participated in all 3 arms during 3 consecutive 12-hour shifts in the intensive care unit.
In total, 40 nurses were enrolled and completed 3 shifts. Analyses of 2,919 cultures from the environment and 2,185 from HCP clothing showed that scrub type was not associated with a change in HCP clothing contamination (P=.70). Mean difference estimates were 0.118 for the Scrub 1 arm (95% confidence interval [CI], −0.206 to 0.441; P=.48) and 0.009 for the Scrub 2 arm (95% CI, −0.323 to 0.342; P=.96) compared with the control. HCP became newly contaminated with important pathogens during 19 of the 120 shifts (16%).
Antimicrobial-impregnated scrubs were not effective at reducing HCP contamination. However, the environment is an important source of HCP clothing contamination.
The Yellow Sea region is of high global importance for waterbird populations, but recent systematic bird count data enabling identification of the most important sites are relatively sparse for some areas. Surveys of waterbirds at three sites on the coast of southern Jiangsu Province, China, in 2014 and 2015 produced peak counts of international importance for 24 species, including seven globally threatened and six Near Threatened species. The area is of particular global importance for the ‘Critically Endangered’ Spoon-billed Sandpiper Calidris pygmaea (peak count across all three study sites: 62 in spring and 225 in autumn) and ‘Endangered’ Spotted Greenshank Tringa guttifer (peak count across all three study sites: 210 in spring and 1,110 in autumn). The southern Jiangsu coast is therefore currently the most important migratory stopover area in the world, in both spring and autumn, for both species. Several serious and acute threats to waterbirds were recorded at these study sites. Paramount is the threat of large-scale land claim, which would completely destroy intertidal mudflats of critical importance to waterbirds. Degradation of intertidal mudflat habitats through the spread of invasive Spartina, and mortality of waterbirds by entrapment in nets or deliberate poisoning, are also real and present serious threats here. Collisions with, and displacement by, wind turbines and other structures, and industrial chemical pollution may represent additional potential threats. We recommend the rapid establishment of effective protected areas for waterbirds in the study area, maintaining large areas of open intertidal mudflat, and the urgent removal of all serious threats currently faced by waterbirds here.
Hypertension following primary coarctation repair affects up to a third of subjects. A number of studies suggest that future hypertension risk is reduced if primary repair is performed at a younger age.
The objective of this study was to evaluate the risk of future medical treatment for hypertension depending on age of primary coarctation repair.
This study was carried out at a tertiary paediatric cardiology referral centre. Retrospective database evaluation of children aged <16 years undergoing primary surgical coarctation repair between October, 2005 and October, 2014 was carried out. Patients with complex heart diseases were excluded. The following age groups were considered: neonate (≤28 days), infant (>28 days and ≤12 months), and children (>12 months). The main outcome measure was the need for long-term anti-hypertensive medication. The risk for re-coarctation was also evaluated.
A total of 87 patients were analysed: 60 neonates, 17 infants, and 10 children. Among them, 6.7% of neonates, 29.4% of infants, and 40% of children required long-term anti-hypertensive medications. Group differences were statistically significant (p=0.004). After adjustment for type of repair, the risk of long-term anti-hypertensive therapy was 4.5 times (95% confidence interval 1.2–16.9, p=0.025) and 10.5 times (95% confidence interval 2.6–42.3, p=0.001) higher if primary repair was carried out in infancy and childhood, respectively, compared with the neonatal period. Overall, 13 patients developed re-coarctation: 21.7% in the neonatal group, 5.9% in the infant group, and 20% in the child group. We could not demonstrate a significant difference between these proportions or calculate a reliable risk for developing re-coarctation.
Risk of medical treatment for hypertension was lowest when primary repair was carried out during the neonatal period, rising 10-fold if first operated on as a child. Knowing the likelihood of hypertension development depending on age of primary repair is useful for long-term surveillance and counselling.
To determine which comorbid conditions are considered causally related to central-line associated bloodstream infection (CLABSI) and surgical-site infection (SSI) based on expert consensus.
Using the Delphi method, we administered an iterative, 2-round survey to 9 infectious disease and infection control experts from the United States.
Based on our selection of components from the Charlson and Elixhauser comorbidity indices, 35 different comorbid conditions were rated from 1 (not at all related) to 5 (strongly related) by each expert separately for CLABSI and SSI, based on perceived relatedness to the outcome. To assign expert consensus on causal relatedness for each comorbid condition, all 3 of the following criteria had to be met at the end of the second round: (1) a majority (>50%) of experts rating the condition at 3 (somewhat related) or higher, (2) interquartile range (IQR) ≤1, and (3) standard deviation (SD) ≤1.
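A minimal sketch of this consensus rule applied to one condition's ratings (values are hypothetical):

```python
# Consensus requires all three criteria: majority rating >=3, IQR <= 1,
# and SD <= 1. Ratings below are hypothetical (one per expert, n=9).
from statistics import quantiles, stdev

ratings = [3, 4, 4, 3, 4, 5, 3, 4, 4]

q1, _, q3 = quantiles(ratings, n=4)                  # quartile cut points
majority = sum(r >= 3 for r in ratings) / len(ratings) > 0.5
consensus = majority and (q3 - q1) <= 1 and stdev(ratings) <= 1

print(f"IQR = {q3 - q1:.1f}, SD = {stdev(ratings):.2f}, consensus = {consensus}")
```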
From round 1 to round 2, the IQR and SD, respectively, decreased for ratings of 21 of 35 (60%) and 33 of 35 (94%) comorbid conditions for CLABSI, and for 17 of 35 (49%) and 32 of 35 (91%) comorbid conditions for SSI, suggesting improvement in consensus among this group of experts. At the end of round 2, 13 of 35 (37%) and 17 of 35 (49%) comorbid conditions were perceived as causally related to CLABSI and SSI, respectively.
Our results provide a list of comorbid conditions that should be analyzed as risk factors for CLABSI and SSI and further explored for risk adjustment of these outcomes.
We aimed to compare the procedural and mid-term performance of a specifically designed self-expanding stent with balloon-expandable stents in patients undergoing hybrid palliation for hypoplastic left heart syndrome and its variants.
The lack of specifically designed stents has led to off-label use of coronary, biliary, or peripheral stents in the neonatal ductus arteriosus. Recently, a self-expanding stent, specifically designed for use in hypoplastic left heart syndrome, has become available.
We carried out a retrospective cohort comparison of 69 neonates who underwent hybrid ductal stenting with balloon-expandable and self-expanding stents from December, 2005 to July, 2014.
In total, 43 balloon-expandable stents were implanted in 41 neonates and, more recently, 47 self-expanding stents in 28 neonates. In the balloon-expandable stent group, stent-related complications occurred in nine patients (22%), compared with one patient (4%) in the self-expanding stent group. During follow-up, percutaneous re-intervention related to the ductal stent was performed in five patients (17%) in the balloon-expandable stent group and seven patients (28%) in the self-expanding stent group.
Hybrid ductal stenting with self-expanding stents produced favourable results compared with balloon-expandable stents. Immediate additional interventions and follow-up re-interventions were similar in both groups, with complications more common in those with balloon-expandable stents.
To describe the epidemiology of complex surgical site infection (SSI) following commonly performed surgical procedures in community hospitals and to characterize trends in SSI prevalence rates over time for methicillin-resistant Staphylococcus aureus (MRSA) and other common pathogens.
We prospectively collected SSI data at 29 community hospitals in the southeastern United States from 2008 through 2012. We determined the overall prevalence rates of SSI for commonly performed procedures during this 5-year study period. For each year of the study, we then calculated prevalence rates of SSI stratified by causative organism. We created log-binomial regression models to analyze trends of SSI prevalence over time for all pathogens combined and specifically for MRSA.
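As an illustration of the trend comparison reported below, a prevalence rate ratio and its 95% CI can be computed from aggregate counts with the standard log-ratio method; the counts here are hypothetical, and the study itself estimated trends with log-binomial regression models:

```python
# Prevalence rate ratio (PRR) with a 95% CI via the log-ratio method.
# Counts are hypothetical, for illustration only.
from math import exp, log, sqrt

a, n1 = 690, 100_000   # SSIs and procedures, later year
b, n2 = 760, 100_000   # SSIs and procedures, earlier year

prr = (a / n1) / (b / n2)
se = sqrt(1/a - 1/n1 + 1/b - 1/n2)   # SE of log(PRR)
ci = (exp(log(prr) - 1.96 * se), exp(log(prr) + 1.96 * se))
print(f"PRR = {prr:.2f} (95% CI, {ci[0]:.2f}-{ci[1]:.2f})")
```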
A total of 3,988 complex SSIs occurred following 532,694 procedures (prevalence rate, 0.7 infections per 100 procedures). SSIs occurred most frequently after small bowel surgery, peripheral vascular bypass surgery, and colon surgery. Staphylococcus aureus was the most common pathogen. The prevalence rate of SSI decreased from 0.76 infections per 100 procedures in 2008 to 0.69 infections per 100 procedures in 2012 (prevalence rate ratio [PRR], 0.90; 95% confidence interval [CI], 0.82–1.00). A more substantial decrease in MRSA SSI (PRR, 0.69; 95% CI, 0.54–0.89) was largely responsible for this overall trend.
The prevalence of MRSA SSI decreased from 2008 to 2012 in our network of community hospitals. This decrease in MRSA SSI prevalence led to an overall decrease in SSI prevalence over the study period.
To determine the association (1) between shorter operative duration and surgical site infection (SSI) and (2) between surgeon median operative duration and SSI risk among first-time hip and knee arthroplasties.
Retrospective cohort study
A total of 43 community hospitals located in the southeastern United States.
Adults who developed SSIs according to National Healthcare Safety Network criteria within 365 days of first-time knee or hip arthroplasties performed between January 1, 2008 and December 31, 2012.
Log-binomial regression models estimated the association (1) between operative duration and SSI outcome and (2) between surgeon median operative duration and SSI outcome. Hip and knee arthroplasties were evaluated in separate models. Each model was adjusted for American Society of Anesthesiologists (ASA) score and patient age.
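A minimal sketch of a log-binomial model of this form, fitted on simulated data; statsmodels is assumed, and all variable names and values are illustrative rather than the study's:

```python
# Log-binomial regression for SSI risk, adjusted for ASA score and age.
# Data are simulated; exponentiated coefficients are risk ratios.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "short_duration": rng.integers(0, 2, n),   # operative time < 25th percentile
    "asa_score": rng.integers(1, 5, n),
    "age": rng.normal(65, 10, n),
})
risk = 0.01 * np.exp(-0.5 * df["short_duration"] + 0.2 * (df["asa_score"] - 1))
df["ssi"] = rng.binomial(1, np.clip(risk, 0, 1))

X = sm.add_constant(df[["short_duration", "asa_score", "age"]])
fit = sm.GLM(df["ssi"], X,
             family=sm.families.Binomial(link=sm.families.links.Log())).fit()
print(np.exp(fit.params["short_duration"]))   # adjusted risk ratio, ~0.6
```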
A total of 25,531 hip arthroplasties and 42,187 knee arthroplasties were included in the study. The risk of SSI in knee arthroplasties with an operative duration shorter than the 25th percentile was 0.40 times the risk of SSI in knee arthroplasties with an operative duration between the 25th and 75th percentile (risk ratio [RR], 0.40; 95% confidence interval [CI], 0.38–0.56; P<.01). Short operative duration did not demonstrate significant association with SSI for hip arthroplasties (RR, 1.04; 95% CI, 0.79–1.37; P=.36). Knee arthroplasty surgeons with shorter median operative durations had a lower risk of SSI than surgeons with typical median operative durations (RR, 0.52; 95% CI, 0.43–0.64; P<.01).
Short operative durations were not associated with a higher SSI risk for knee or hip arthroplasty procedures in our analysis.
Infect Control Hosp Epidemiol. 2015;36(12):1431–1436.
The current Fire/Emergency Medical Services (EMS) model throughout the United States involves emergency vehicles that respond from a primary location (ie, a firehouse or municipal facility) to emergency calls. Quick response vehicles (QRVs) have been used in various Fire/EMS systems; however, their effectiveness has never been studied.
The goal of this study was to determine if patient response times would decrease by placing an Advanced Life Support (ALS) QRV in an integrated Fire/EMS system.
Response times from an integrated Fire/EMS system with an annual EMS call volume of 3,261 were evaluated over the three years prior to the implementation of this study. For a 2-month period, an ALS QRV staffed by a firefighter/paramedic responded to emergency calls during peak call volume hours of 8:00 am to 5:00 pm. The staging of this vehicle was based on historical call volume percentages using respective geocodes as well as system requirements during multiple emergency dispatches.
The citywide average response time for the twelve months preceding the study was 5.44 minutes. During the study, the citywide average response time decreased to 4.09 minutes, resulting in a 27.62% reduction in patient response time.
The implementation of an ALS QRV in an integrated Fire/EMS system reduces patient response time. Having a QRV that is not staged continuously in a traditional fire station or municipal location reduces the time needed to reach patients. Also, using predictive models of historic call volume can aid Fire and EMS administrators in reduction of call response times.
Anderson DW, Dhindsa HS, Wan W, Salot D. Does the Implementation of an Advanced Life Support Quick Response Vehicle (QRV) in an Integrated Fire/EMS System Improve Patient Contact Response Time? Prehosp Disaster Med. 2015;30(4):1–3.