Herbicides have been a primary means of managing undesirable brush on grazing lands across the southwestern United States for decades. Continued encroachment of honey mesquite and huisache on grazing lands warrants evaluation of treatment life and economics of current and experimental treatments. Treatment life is defined as the time between treatment application and when canopy cover of undesirable brush returns to a competitive level with native forage grasses (i.e., 25% canopy cover for mesquite and 30% canopy cover for huisache). Treatment life of industry-standard herbicides was compared with that of aminocyclopyrachlor plus triclopyr amine (ACP+T) from 10 broadcast-applied honey mesquite and five broadcast-applied huisache trials established from 2007 through 2013 across Texas. On average, the treatment life of industry-standard treatments (IST) for huisache was 3 yr. In comparison, huisache canopy cover was only 2.5% in plots treated with ACP+T 3 yr after treatment. The average treatment life of IST for honey mesquite was 8.6 yr, whereas plots treated with ACP+T had just 2% mesquite canopy cover at that time. The improved treatment life of ACP+T compared with IST was due to higher mortality, resulting in more consistent brush canopy reduction. The net present values (NPVs) of ACP+T and IST for both huisache and mesquite were similar until the treatment life of the IST application was reached (3 yr for huisache and 8.6 yr for honey mesquite). At that point, NPVs of the programs diverged as a result of brush competition with desirable forage grasses and additional input costs associated with theoretical follow-up IST necessary to maintain optimum livestock forage production. The ACP+T treatments did not warrant a sequential application over the 12-yr analysis for huisache or the 20-yr analysis for honey mesquite that this research covered. These results indicate that ACP+T provides cost-effective, long-term control of honey mesquite and huisache.
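The economic comparison above rests on a standard net-present-value calculation over each program's planning horizon. A minimal sketch of that arithmetic follows; the cash flows and discount rate are hypothetical illustrations, not the study's actual figures:

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, with year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical example: an IST program retreated after its 3-yr treatment
# life versus a longer-lived (pricier) treatment needing no retreatment.
# Values are illustrative $/acre, discounted at an assumed 5%.
ist = [-30] + [8] * 3 + [-30] + [8] * 3   # retreat in year 4
acp = [-45] + [8] * 7                     # single application
print(round(npv(0.05, ist), 2))
print(round(npv(0.05, acp), 2))
```

Once the shorter-lived program requires a follow-up application, its NPV falls behind, which is the divergence pattern the abstract describes.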
Kochia [Bassia scoparia (L.) A. J. Scott] is one of the most troublesome weeds throughout the North American Great Plains. Herbicides such as glyphosate and dicamba have been used widely to control B. scoparia for decades. However, many B. scoparia populations have evolved resistance to these herbicides through selection. In particular, dicamba-resistant B. scoparia populations are often also glyphosate-resistant. The objective of this research was to determine whether these two herbicide resistances are linked in B. scoparia. Reciprocal crosses were performed between glyphosate- and dicamba-resistant (GDR) and glyphosate- and dicamba-susceptible (GDS) B. scoparia to produce F1 and F2 progeny. Two F1 and seven F2 progeny families were screened with various doses of dicamba or glyphosate. All the F1 progeny survived both dicamba and glyphosate treatments. Chi-square analyses of F2 progeny suggest that (1) glyphosate and dicamba resistances in B. scoparia are inherited via single, dominant nuclear genes; and (2) the glyphosate and dicamba resistance genes are not linked. Thus, the dicamba and glyphosate resistances appear to have evolved independently under intense selection but do not seem to spread together.
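The linkage inference above comes from comparing observed F2 phenotype counts with the 9:3:3:1 ratio expected when two dominant genes assort independently. A minimal sketch of that chi-square goodness-of-fit test, using illustrative counts rather than the study's data:

```python
# Chi-square goodness-of-fit against a 9:3:3:1 dihybrid ratio, the
# expectation if two dominant resistance genes assort independently.
# Counts below are illustrative, not the study's data.
observed = [91, 29, 31, 9]           # both-R, dicamba-R, glyphosate-R, both-S
ratio = [9, 3, 3, 1]
n = sum(observed)
expected = [n * r / 16 for r in ratio]
chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
# Compare against the critical value for 3 degrees of freedom
# (7.815 at alpha = 0.05); smaller values are consistent with independence.
print(round(chi_sq, 3))
```

A non-significant statistic fails to reject independent assortment, which is the basis for concluding the two resistance genes are not linked.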
Following stage 1 palliation, delayed sternal closure may be used as a technique to enhance thoracic compliance but may also prolong the length of stay and increase the risk of infection.
We reviewed all neonates undergoing stage 1 palliation at our institution between 2010 and 2017 to describe the effects of delayed sternal closure.
During the study period, 193 patients underwent stage 1 palliation, of whom 12 died before an attempt at sternal closure. Among the 25 patients who underwent primary sternal closure, 4 (16%) had sternal reopening within 24 hours. Among the 156 infants who underwent delayed sternal closure at 4 [3,6] days post-operatively, 11 (7.1%) had one or more failed attempts at sternal closure. Patients undergoing primary sternal closure had a shorter duration of mechanical ventilation and intensive care unit length of stay. Patients who failed delayed sternal closure had a longer aortic cross-clamp time (123±42 versus 99±35 minutes, p=0.029) and circulatory arrest time (39±28 versus 19±17 minutes, p=0.0009) than those who did not fail. Failure of delayed sternal closure was also closely associated with Technical Performance Score: 1.3% of patients with a score of 1 failed sternal closure compared with 18.9% of patients with a score of 3 (p=0.0028). Among the haemodynamic and ventilatory parameters studied, only superior caval vein saturation following sternal closure was different between patients who did and did not fail sternal closure (30±7 versus 42±10%, p=0.002). All patients who failed sternal closure did so within 24 hours owing to hypoxaemia, hypercarbia, or haemodynamic impairment.
When performed according to our current clinical practice, sternal closure causes transient and mild changes in haemodynamic and ventilatory parameters. Monitoring of SvO2 following sternal closure may permit early identification of patients at risk for failure.
Background: Central neuropathic pain syndromes are a result of central nervous system injury, most commonly related to stroke, traumatic spinal cord injury, or multiple sclerosis. These syndromes are distinctly less common than peripheral neuropathic pain, and less is known regarding the underlying pathophysiology, appropriate pharmacotherapy, and long-term outcomes. The objective of this study was to determine the long-term clinical effectiveness of the management of central neuropathic pain relative to peripheral neuropathic pain at tertiary pain centers. Methods: Patients diagnosed with central (n=79) and peripheral (n=710) neuropathic pain were identified for analysis from a prospective observational cohort study of patients with chronic neuropathic pain recruited from seven Canadian tertiary pain centers. Data regarding patient characteristics, analgesic use, and patient-reported outcomes were collected at baseline and 12-month follow-up. The primary outcome measure was the composite of a reduction in average pain intensity and pain interference. Secondary outcome measures included assessments of function, mood, quality of life, catastrophizing, and patient satisfaction. Results: At 12-month follow-up, 13.5% (95% confidence interval [CI], 5.6-25.8) of patients with central neuropathic pain and complete data sets (n=52) achieved a ≥30% reduction in pain, whereas 38.5% (95% CI, 25.3-53.0) achieved a reduction of at least 1 point on the Pain Interference Scale. The proportion of patients with central neuropathic pain achieving both these measures, and thus the primary outcome, was 9.6% (95% CI, 3.2-21.0). Patients with peripheral neuropathic pain and complete data sets (n=463) were more likely to achieve this primary outcome at 12 months (25.3% of patients; 95% CI, 21.4-29.5) (p=0.012). 
Conclusion: Patients with central neuropathic pain syndromes managed in tertiary care centers were less likely to achieve a meaningful improvement in pain and function compared with patients with peripheral neuropathic pain at 12-month follow-up.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
Background: Painful diabetic neuropathy (PDN) is a frequent complication of diabetes mellitus. Current treatment recommendations are based on short-term trials, generally of ≤3 months’ duration. Limited data are available on the long-term outcomes of this chronic disease. The objective of this study was to determine the long-term clinical effectiveness of the management of chronic PDN at tertiary pain centres. Methods: From a prospective observational cohort study of patients with chronic neuropathic non-cancer pain recruited from seven Canadian tertiary pain centres, 60 patients diagnosed with PDN were identified for analysis. Data were collected according to Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials guidelines including the Brief Pain Inventory. Results: At 12-month follow-up, 37.2% (95% confidence interval [CI], 23.0-53.3) of 43 patients with complete data achieved pain reduction of ≥30%, 51.2% (95% CI, 35.5-66.7) achieved functional improvement with a reduction of ≥1 on the Pain Interference Scale (0-10, Brief Pain Inventory) and 30.2% (95% CI, 17.2-46.1) achieved both these measures. Symptom management included at least two medication classes in 55.3% and three medication classes in 25.5% (opioids, antidepressants, anticonvulsants). Conclusions: Almost one-third of patients being managed for PDN in a tertiary care setting achieve meaningful improvements in pain and function in the long term. Polypharmacy, including analgesic antidepressants and anticonvulsants, was the mainstay of effective symptom management.
New paediatric cardiology trainees are required to rapidly assimilate knowledge and gain clinical skills to which they have limited or no exposure during residency. The Pediatric Cardiology Fellowship Boot Camp (PCBC) at Boston Children’s Hospital was designed to provide incoming fellows with an intensive exposure to congenital cardiac pathology and a broad overview of major areas of paediatric cardiology practice.
The PCBC curriculum was designed by core faculty in cardiac pathology, echocardiography, electrophysiology, interventional cardiology, exercise physiology, and cardiac intensive care. Individual faculty contributed learning objectives, which were refined by fellowship directors and used to build a programme of didactics, hands-on/simulation-based activities, and self-guided learning opportunities.
A total of 16 incoming fellows participated in the 4-week boot camp, with no concurrent clinical responsibilities, over 2 years. On the basis of pre- and post-PCBC surveys, 80% of trainees strongly agreed that they felt more prepared for clinical responsibilities, and a similar percentage felt that PCBC should be offered to future incoming fellows. Fellows showed a significant increase in confidence across all specific knowledge and skills related to the learning objectives. Fellows rated hands-on learning experiences and simulation-based exercises most highly.
We describe a novel 4-week-long boot camp designed to expose incoming paediatric cardiology fellows to the broad spectrum of knowledge and skills required for the practice of paediatric cardiology. The experience increased trainee confidence and sense of preparedness to begin fellowship-related responsibilities. Given that highly interactive activities were rated most highly, boot camps in paediatric cardiology should strongly emphasise these elements.
The Fellowship Program of the Department of Cardiology at Boston Children’s Hospital seeks to train academically oriented leaders in clinical care and laboratory and clinical investigation of cardiovascular disease in the young. The core clinical fellowship involves 3 years in training, comprising 24 months of clinical rotations and 12 months of elective and research experience. Trainees have access to a vast array of research opportunities – clinical, basic, and translational. Clinical fellows interested in basic science may reverse the usual sequence and start their training in the laboratory, deferring clinical training for 1 or more years. An increasing number of clinical trainees apply to spend a fourth year as a senior fellow in one of the subspecialty areas of paediatric cardiology. From the founding of the Department to the present, we have maintained a fundamental and unwavering commitment to training and education in clinical care and research in basic science and clinical investigation, as well as to the training of outstanding young clinicians and investigators.
Monozygotic (MZ) twins provide a natural system for investigating developmental plasticity and the potential epigenetic origins of disease. A major difference in the intrauterine environment between MZ pairs is whether they share a common placenta or have separate placentas. Using DNA methylation measured at >400,000 points in the genome on the Illumina HumanMethylation450 array, we demonstrate that the co-twins of MZ pairs (average age of 14) that shared a common placenta (n = 18 pairs) have more similar DNA methylation levels in blood throughout the genome relative to those with separate placentas (n = 16 pairs). Functional annotation of the genomic regions that show significantly different correlation between monochorionic (MC) and dichorionic (DC) MZ pairs found an over-representation of genes involved in the regulation of transcription, neuronal development, and cellular differentiation. These results support the idea that prenatal environmental exposures may have a lasting effect on an individual's epigenetic landscape and suggest that these changes could have functional consequences.
Elevated levels of interleukin-6 (IL-6) have been associated with the development of common mental disorders, such as depression, but its role in symptom resolution is unclear.
We examined the association between IL-6 and symptom resolution in a non-clinical sample of participants with psychological distress.
Relative to high IL-6 levels, low levels at baseline were associated with symptom resolution at follow-up [age- and sex-adjusted risk ratio (RR) = 1.15, 95% confidence interval (CI) 1.06–1.25]. Further adjustment for covariates had little effect on the association. Symptomatic participants with repeated low IL-6 were more likely to be symptom-free at follow-up compared with those with repeated high IL-6 (RR = 1.21, 95% CI 1.03–1.41). Among the symptomatic participants with elevated IL-6 at baseline, IL-6 decreased along with symptom resolution.
IL-6 is potentially related to the mechanisms underlying recovery from symptoms of mental ill health. Further studies are needed to examine these mechanisms and to confirm the findings in relation to clinical depression.
It is rare for isolated sphenoid sinusitis to cause orbital cellulitis. We present such a case of posterior orbital cellulitis, together with a review of the relevant literature.
A 29-year-old woman presented with a 6-week history of progressive, unilateral, retro-orbital and periorbital right eye pain. On examination, the only finding was reduced visual acuity in the right eye. A computed tomography scan demonstrated right frontal and sphenoid sinus opacification. Sphenoidotomy and frontal sinus trephination were subsequently performed, following failure to respond to intravenous antibiotics. After surgery, the patient's vision returned to normal.
Isolated sphenoid sinusitis is rare but can cause significant visual disturbance and permanent loss of vision. Vague symptoms unsupported by clinical signs at presentation are a feature of posterior orbital cellulitis. The presented case highlights the problem, and the need for a high index of clinical suspicion even in the absence of firm clinical signs, in order to prevent permanent visual loss.
Two community-based density case-control studies were performed to assess risk factors for cholera transmission during inter-peak periods of the ongoing epidemic in two Haitian urban settings, Gonaives and Carrefour. The strongest associations were: close contact with cholera patients (sharing latrines, visiting cholera patients, helping someone with diarrhoea), eating food from street vendors and washing dishes with untreated water. Protective factors were: drinking chlorinated water, receiving prevention messages via television, church or training sessions, and high household socioeconomic level. These findings suggest that, in addition to contaminated water, factors related to direct and indirect inter-human contact play an important role in cholera transmission during inter-peak periods. In order to reduce cholera transmission in Haiti, intensive preventive measures such as hygiene promotion and awareness campaigns should be implemented during inter-peak lulls, when prevention activities are typically scaled back.
Diagnosis of depressive disorder using interviewer-administered instruments is expensive and frequently impractical in large epidemiological surveys. The aim of this study was to assess the validity of three self-completion measures of depressive disorder and other psychiatric disorders in older people against an interviewer-administered instrument.
A random sample stratified by sex, age and social position was selected from the Whitehall II study participants. This sample was supplemented by inclusion of depressed Whitehall II participants. Depressive disorder and other mental disorders were assessed by the interviewer-administered structured revised Clinical Interview Schedule (CIS-R) in 277 participants aged 58–80 years. Participants also completed a computerized self-completion version of the CIS-R in addition to the General Health Questionnaire (GHQ) and the Center for Epidemiologic Studies Depression Scale (CES-D).
The mean total score was similar for the interviewer-administered (4.43) and self-completion (4.35) versions of the CIS-R [95% confidence interval (CI) for difference −0.31 to 0.16]. Differences were not related to sex, age, social position or presence of chronic physical illness. Sensitivity/specificity of self-completion CIS-R was 74%/98% for any mental disorder and 75%/98% for depressive episode. The corresponding figures were 86%/87% and 78%/83% for GHQ and 77%/89% and 89%/86% for CES-D.
The self-completion computerized version of the CIS-R is feasible and has good validity as a measure of any mental disorder and depression in people aged ≥60 years. The GHQ and CES-D also have good criterion validity as measures of any mental disorder and depressive disorder, respectively.
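The validity figures quoted above are standard sensitivity and specificity calculations against the interviewer-administered CIS-R as the reference standard. A minimal sketch of that computation, with a hypothetical 2x2 table rather than the study's data:

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: self-completion result cross-tabulated against
# the interviewer-administered diagnosis (the reference standard).
sens, spec = sensitivity_specificity(tp=30, fp=4, fn=10, tn=233)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

With these illustrative counts the instrument detects 30 of 40 true cases (75% sensitivity) and correctly rules out 233 of 237 non-cases (98% specificity), the same style of figures the abstract reports.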
Gerard J. Allan, Department of Biological Sciences, Northern Arizona University
Stephen M. Shuster, Department of Biological Sciences, Northern Arizona University
Scott Woolbright, The Institute for Genomic Biology, University of Illinois
Faith Walker, Department of Biological Sciences, Northern Arizona University
Nashelly Meneses, Department of Biological Sciences, Northern Arizona University
Arthur Keith, Department of Biological Sciences, Northern Arizona University
Joseph K. Bailey, Department of Ecology and Evolutionary Biology, University of Tennessee
Thomas G. Whitham, Department of Biological Sciences, Northern Arizona University
Trait-mediated indirect interactions (TMIIs) are important mediators of community diversity and structure and associated ecosystem processes. Elucidating the genetic basis of ecologically important phenotypic traits is the first step toward understanding the complex interactions that occur among community members. Molecular markers routinely used in quantitative trait loci (QTL) analyses [e.g., amplified fragment length polymorphisms (AFLPs), simple sequence repeats (SSRs)] have provided researchers with a toolbox for investigating the genetic basis of heritable traits. A goal of this research is to link genetically based traits to community interactions and ecosystem function. Ultimately, this insight can open a window onto the evolutionary dynamics that shape community structure and associated ecosystem processes (e.g., nutrient cycling). Such an approach is important as it bears on the continued development of the field of community genetics, which seeks to understand the genetic interactions that occur between species and their abiotic environment in complex communities (e.g., Whitham et al. 2003, 2006; Johnson and Agrawal 2005; LeRoy et al. 2006; Bangert et al. 2006a, b; Schweitzer et al. 2008; Crutsinger et al. 2009; Bailey et al. 2009).