The potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, which enables seed collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter phenology of thirteen economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after physiological maturity at multiple sites spread across fourteen states in the southern, northern, and mid-Atlantic U.S. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus species seed shatter was low (0 to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2 to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to four weeks after maturity at multiple sites spread across eleven states in the southern, northern, and mid-Atlantic U.S. From soybean maturity to four weeks after maturity, cumulative percent seed shatter was lowest in the southern U.S. and increased in progressively more northern states. At soybean maturity, percent seed shatter ranged from 1 to 70%; by 25 days after soybean maturity, that range had shifted to 5 to 100% (mean: 42%). There were considerable differences in seed shatter onset and rate of progression between sites and years for some species that could affect their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
In 2019, a 42-year-old African man who works as an Ebola virus disease (EVD) researcher traveled from the Democratic Republic of Congo (DRC), near an ongoing EVD epidemic, to Philadelphia and presented to the Hospital of the University of Pennsylvania Emergency Department with altered mental status, vomiting, diarrhea, and fever. He was classified as a “wet” person under investigation for EVD, and his arrival activated our hospital emergency management command center and bioresponse teams. He was found to be in septic shock with multisystem organ dysfunction, including circulatory dysfunction, encephalopathy, metabolic lactic acidosis, acute kidney injury, acute liver injury, and disseminated intravascular coagulation. Critical care was delivered within high-risk pathogen isolation in the ED and in our Special Treatment Unit until a diagnosis of severe cerebral malaria was confirmed and EVD was definitively excluded.
This report discusses our experience activating a longitudinal preparedness program designed for rare, resource-intensive events at hospitals physically remote from any active epidemic but serving a high-volume international air travel port-of-entry.
A growing body of evidence suggests that synaptic plasticity is involved in addictive behaviour and nicotine dependence (ND). Neurotrophic factors, such as neurotrophin 3 (NT3), play a key role in modulating neuronal plasticity. Therefore, an association between nicotine, smoking, and neurotrophic factors has been suggested. However, the role of NT3 in ND has not yet been thoroughly investigated in humans.
We investigated the influence of chronic (long-term smoking) and acute nicotine administration on plasma levels of NT3. We measured plasma NT3 levels at baseline and then 15 and 45 minutes after nicotine or placebo administration using an enzyme-linked immunosorbent assay (ELISA). Smokers showed higher NT3 levels than non-smokers at baseline. Interestingly, 15 minutes after acute nicotine injection, plasma NT3 levels decreased significantly in both smokers and non-smokers, returning to baseline by the 45-minute measurement. We found that plasma nicotine and NT3 levels were positively correlated in smokers at baseline.
There is a direct interaction of nicotine with NT3 that differs between acute and chronic exposure. Interestingly, NT3 concentration is correlated with plasma nicotine and up-regulated in smokers. We propose that the neuroplasticity underlying addictive behaviour such as smoking or nicotine dependence (ND) might be mediated by these interactions of nicotine and NT3. We speculate that they might even play a part in the so-called “self-medication” with cigarettes that is often seen in patients with certain mental disorders.
Reconstructions of prehistoric vegetation composition help establish natural baselines, variability, and trajectories of forest dynamics before and during the emergence of intensive anthropogenic land use. Pollen–vegetation models (PVMs) enable such reconstructions from fossil pollen assemblages using process-based representations of taxon-specific pollen production and dispersal. However, several PVMs and variants now exist, and the sensitivity of vegetation inferences to PVM selection, variant, and calibration domain is poorly understood. Here, we compare the reconstructions, parameter estimates, and structure of a Bayesian hierarchical PVM, STEPPS, both to observations and to REVEALS, a widely used PVM, for the pre–Euro-American settlement-era vegetation in the northeastern United States (NEUS). We also compare NEUS-based STEPPS parameter estimates to those for the upper midwestern United States (UMW). Both PVMs predict the observed macroscale patterns of vegetation composition in the NEUS; however, reconstructions of minor taxa are less accurate and predictions for some taxa differ between PVMs. These differences can be attributed to intermodel differences in structure and parameter estimates. Estimates of pollen productivity from STEPPS broadly agree with estimates produced for use in REVEALS, while comparison between pollen dispersal parameter estimates shows no significant relationship. STEPPS parameter estimates are similar between the UMW and NEUS, suggesting that STEPPS parameter estimates are transferable between floristically similar regions and scales.
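To make the PVM idea concrete, here is a minimal Python sketch of the core forward relationship that both STEPPS and REVEALS build upon: pollen proportions are vegetation proportions weighted by taxon-specific pollen productivities. The taxa and productivity values below are invented for illustration; real PVMs add dispersal kernels, deposition models, and (in STEPPS) a hierarchical Bayesian error structure.

```python
import numpy as np

# Toy forward pollen-vegetation model: predicted pollen proportions are
# vegetation proportions scaled by taxon-specific pollen productivities
# (phi), then renormalised. The productivity values are hypothetical.
taxa = ["oak", "pine", "beech", "maple"]
veg = np.array([0.40, 0.30, 0.20, 0.10])   # true vegetation proportions
phi = np.array([2.0, 4.5, 1.0, 0.8])       # assumed relative pollen productivities

pollen = veg * phi
pollen /= pollen.sum()                      # predicted pollen proportions

# Inverting the model recovers vegetation from pollen, given phi:
veg_hat = (pollen / phi) / (pollen / phi).sum()

for t, p, v in zip(taxa, pollen, veg_hat):
    print(f"{t:6s} pollen={p:.3f} reconstructed veg={v:.3f}")
```

In this noise-free toy case the inversion is exact; the hard part in practice, and the focus of the comparison above, is estimating the productivity and dispersal parameters from calibration data.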
Single nucleotide polymorphisms (SNPs) contribute small increases in risk for late-onset Alzheimer's disease (LOAD). LOAD SNPs cluster around genes with similar biological functions (pathways). Polygenic risk scores (PRS) aggregate the effect of SNPs genome-wide. However, this approach has not been widely used for SNPs within specific pathways.
We investigated whether pathway-specific PRS were significant predictors of LOAD case/control status.
We mapped SNPs to genes within 8 pathways implicated in LOAD. For our polygenic analysis, the discovery sample comprised 13,831 LOAD cases and 29,877 controls. LOAD risk alleles for SNPs in our 8 pathways were identified at a P-value threshold of 0.5. Pathway-specific PRS were calculated in a target sample of 3,332 cases and 9,832 controls. SNPs in linkage disequilibrium (R2 > 0.2) were pruned, retaining the SNPs most significantly associated with LOAD. We tested whether pathway-specific PRS were associated with LOAD using logistic regression, adjusting for age, sex, country, and principal components. We report the proportion of variance in liability explained by each pathway.
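As a rough illustration of the scoring step, the sketch below computes a pathway-specific PRS as a weighted allele count over the pruned pathway SNPs and tests it with logistic regression. All data are simulated and the covariate set is reduced (the real analysis also adjusted for country and principal components).

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated stand-in for the target sample: allele dosages (0/1/2) for
# SNPs already filtered at the discovery P < 0.5 threshold and LD-pruned
# (R2 > 0.2 removed), with discovery-sample effect sizes (log odds).
n_ind, n_snp = 1000, 200
dosage = rng.integers(0, 3, size=(n_ind, n_snp)).astype(float)
beta = rng.normal(0.0, 0.05, size=n_snp)      # discovery effect sizes
in_pathway = rng.random(n_snp) < 0.25         # SNPs mapped to one pathway

# Pathway-specific PRS: weighted allele count over pathway SNPs only.
prs = dosage[:, in_pathway] @ beta[in_pathway]

# Simulate case/control status with a weak dependence on the PRS, plus
# simplified covariates.
z = (prs - prs.mean()) / prs.std()
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(-0.8 + 0.3 * z))))
age = rng.normal(75.0, 6.0, size=n_ind)
sex = rng.integers(0, 2, size=n_ind)

X = sm.add_constant(np.column_stack([z, age, sex]))
fit = sm.Logit(y, X).fit(disp=0)
print(f"PRS log-odds = {fit.params[1]:.3f}, P = {fit.pvalues[1]:.2e}")
```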
The most strongly associated pathways were the immune response (NSNPs = 9,304, P = 5.63 × 10−19, R2 = 0.04) and hemostasis (NSNPs = 7,832, P = 5.47 × 10−7, R2 = 0.015). Regulation of endocytosis, hematopoietic cell lineage, cholesterol transport, clathrin, and protein folding were also significantly associated but each accounted for less than 1% of the variance. With APOE excluded, all pathways remained significant except proteasome-ubiquitin activity and protein folding.
Genetic risk for LOAD can be split into contributions from different biological pathways. These offer a means to explore disease mechanisms and to stratify patients.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Studies of the association between blood BDNF levels and delirium are very few and have yielded mixed results.
To investigate blood BDNF levels in the occurrence of, and recovery from, delirium.
Prospective, longitudinal study. Participants were assessed twice weekly with the MoCA, DRS-R98, and APACHE-II, and blood BDNF levels were estimated at the same assessments using ELISA. Delirium was defined as per the DRS-R98 (cut-off > 16) and recovery of delirium as at least two consecutive assessments without delirium prior to discharge.
There were no differences in BDNF levels between those with delirium and those who never developed it. After excluding those who never developed delirium (n = 140), we analysed the effects of BDNF and the other variables on delirium resolution and recovery. Of the 58 patients who remained delirious at subsequent observations (maximum of eight), some continued to be delirious until discharge or death (n = 39) while others recovered (n = 19). BDNF levels and MoCA scores were significantly associated both with resolution (delirium cases becoming non-delirious during the assessments) and with overall recovery: for resolution, BDNF Wald χ2 = 11.652, df = 1, P = .001; for recovery, Wald χ2 = 7.155, df = 1, P = .007. No significant association was found for the other variables (APACHE-II, history of dementia, age, or gender).
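For readers unfamiliar with the reported statistic, a Wald χ2 with df = 1 is the squared ratio of a logistic regression coefficient to its standard error. A small simulated sketch (not the study's data) showing where such a value comes from:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)

# Toy stand-in for the delirium cohort: recovery (0/1) modelled on a
# change in BDNF level; all values here are simulated for illustration.
n = 58
bdnf_change = rng.normal(0.0, 1.0, size=n)
p_recover = 1.0 / (1.0 + np.exp(-(-0.5 + 1.2 * bdnf_change)))
recovered = rng.binomial(1, p_recover)

X = sm.add_constant(bdnf_change)
fit = sm.Logit(recovered, X).fit(disp=0)

# Wald chi-square with df = 1 is (coefficient / SE)^2, the form of the
# statistics quoted in the abstract (e.g. Wald chi2 = 11.652, df = 1).
wald = (fit.params[1] / fit.bse[1]) ** 2
p = stats.chi2.sf(wald, df=1)
print(f"Wald chi2 = {wald:.3f}, P = {p:.4f}")
```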
BDNF does not appear to have a direct effect on the occurrence of delirium, but delirious patients whose BDNF levels increase during hospitalisation are more likely to recover from delirium.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
Sleep disturbances are prevalent in cancer patients, especially those with advanced disease. There are few published intervention studies that address sleep issues in advanced cancer patients during the course of treatment. This study assesses the impact of a multidisciplinary quality of life (QOL) intervention on subjective sleep difficulties in patients with advanced cancer.
This randomized trial investigated the comparative effects of a multidisciplinary QOL intervention (n = 54) vs. standard care (n = 63) on sleep quality in patients with advanced cancer receiving radiation therapy as a secondary endpoint. The intervention group attended six intervention sessions, while the standard care group received informational material only. Sleep quality was assessed using the Pittsburgh Sleep Quality Index (PSQI) and Epworth Sleepiness Scale (ESS), administered at baseline and weeks 4 (post-intervention), 27, and 52.
The intervention group had a statistically significant improvement in the PSQI total score and in two of its components, sleep quality and daytime dysfunction, compared with the control group at week 4. At week 27, although both groups showed improvements in sleep measures from baseline, there were no statistically significant differences between groups in the PSQI total or component scores, or in the ESS. At week 52, the intervention group used less sleep medication than control patients compared to baseline (p = 0.04) and had a lower ESS score (7.6 vs. 9.3, p = 0.03).
Significance of results
A multidisciplinary intervention to improve QOL can also improve the sleep quality of advanced cancer patients undergoing radiation therapy. Patients who completed the intervention also reported using less sleep medication.
We describe the design and deployment of GREENBURST, a commensal Fast Radio Burst (FRB) search system at the Green Bank Telescope. GREENBURST uses a dedicated tap on the L-band receiver to search the 960–1 920 MHz frequency range for pulses over a wide range of dispersion measures. Due to its unique design, GREENBURST is capable of conducting searches for FRBs when the L-band receiver is not being used for scheduled observing. This makes it a sensitive single-pixel detector capable of reaching deeper in the radio sky. While single pulses from Galactic pulsars and rotating radio transients will be detectable in our observations, and will form part of the database we archive, the primary goal is to detect and study FRBs. Based on recent determinations of the all-sky rate, we predict that the system will detect approximately one FRB for every 2–3 months of continuous operation. The high sensitivity of GREENBURST means that it will also be able to probe the slope of the FRB fluence distribution, which is currently uncertain in this observing band.
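As a back-of-envelope illustration of how such a rate prediction is made, the sketch below scales an assumed all-sky FRB rate to a deeper fluence threshold using a Euclidean source-count slope, then to a single beam. Every number here (all-sky rate, thresholds, slope, beam size) is an assumption for illustration rather than the value used in the GREENBURST analysis, though with these inputs the result happens to land near one detection per 2–3 months.

```python
import math

# Back-of-envelope FRB detection-rate estimate for a single-pixel system.
# All numbers below are illustrative assumptions.
allsky_rate = 1.7e3        # assumed FRBs per sky per day above F_ref
F_ref = 2.0                # reference fluence threshold (Jy ms)
F_sys = 0.3                # assumed system fluence threshold (Jy ms)
gamma = 1.5                # Euclidean source-count slope: R(>F) ~ F^-gamma

beam_fwhm_deg = 0.15       # assumed GBT L-band beam FWHM (~9 arcmin)
beam_area = math.pi * (beam_fwhm_deg / 2) ** 2   # deg^2, top-hat approx.
sky_area = 41253.0         # full sky in deg^2

# Scale the all-sky rate to the deeper threshold, then to the beam.
rate_per_day = allsky_rate * (F_ref / F_sys) ** gamma * beam_area / sky_area
print(f"Expected detections: {rate_per_day:.4f}/day "
      f"(one per {1 / rate_per_day:.0f} days)")
```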
Grains rich in starch constitute the primary source of energy for both pigs and humans, but there is incomplete understanding of the physiological mechanisms that determine the extent of digestion of grain starch in monogastric animals, including pigs and humans. Slow digestion of starch to produce glucose in the small intestine (SI) leads to undigested starch escaping to the large intestine, where it is fermented to produce short-chain fatty acids. Glucose generated from starch provides more energy than short-chain fatty acids for normal metabolism and growth in monogastrics. While incomplete digestion of starch leads to underutilised feed and economic losses in pigs, it is desirable in human nutrition because it helps adults maintain a consistent body weight. Undigested nutrients reaching the ileum may trigger the ileal brake, and fermentation of undigested nutrients or fibre in the large intestine triggers the colonic brake. These intestinal brakes reduce the passage rate in an attempt to maximise nutrient utilisation, and lead to increased satiety that may reduce feed intake. The three physiological mechanisms that control grain digestion and feed intake are: (1) gastric emptying rate; (2) the interplay of grain digestion and passage rate in the SI, which controls activation of the ileal brake; and (3) fermentation of undigested nutrients or fibre in the large intestine, which activates the colonic brake. Fibre plays an important role in influencing these mechanisms and the extent of their effects. In this review, an account of the physiological mechanisms controlling passage rate, feed intake, and enzymatic digestion of grains is presented: (1) to evaluate the merits of recently developed methods of grain/starch digestion for application purposes; and (2) to identify opportunities for future research to advance our understanding of how controlled grain digestion and fibre content can be combined to physiologically influence satiety and food intake.
Surgery for CHD has been slow to develop in parts of the former Soviet Union. The impact of an 8-year surgical assistance programme between an emerging centre and a multi-disciplinary international team that comprised healthcare professionals from developed cardiac programmes is analysed and presented.
Material and methods
The international paediatric assistance programme included five main components – intermittent clinical visits to the site annually, medical education, biomedical engineering support, nurse empowerment, and team-based practice development. Data were analysed from visiting teams and local databases before and since commencement of assistance in 2007 (era A: 2000–2007; era B: 2008–2015). The following variables were compared between periods: annual case volume, operative mortality, case complexity based on Risk Adjustment for Congenital Heart Surgery (RACHS-1), and RACHS-adjusted standardised mortality ratio.
A total of 154 RACHS-classifiable operations were performed during era A, with a mean annual case volume by local surgeons of 19.3 (95% confidence interval: 14.3–24.2), an operative mortality of 4.6%, and a standardised mortality ratio of 2.1. In era B, surgical volume increased to a mean of 103.1 annual cases (95% confidence interval: 69.1–137.2, p<0.0001). There was a non-significant (p=0.84) increase in operative mortality (5.7%), but a decrease in the standardised mortality ratio (1.2) owing to an increase in case complexity. In era B, the proportion of surgeries led by local surgeons during visits from the international team increased from 0% (0/27) in 2008 to 98% (58/59) in the final year of analysis.
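For context, a RACHS-adjusted standardised mortality ratio compares observed deaths with the deaths expected if each RACHS-1 category carried a benchmark mortality rate; an SMR above 1 indicates more deaths than the case mix predicts. A minimal sketch with hypothetical benchmark rates and case mix:

```python
# RACHS-adjusted standardised mortality ratio (SMR): observed deaths
# divided by deaths expected from category-specific benchmark rates.
# Benchmark rates, case counts, and deaths below are hypothetical.
benchmark_rate = {1: 0.004, 2: 0.017, 3: 0.047, 4: 0.093}  # assumed rates
cases = {1: 20, 2: 45, 3: 25, 4: 13}                        # assumed case mix
observed_deaths = 6                                          # assumed

expected = sum(cases[c] * benchmark_rate[c] for c in cases)
smr = observed_deaths / expected
print(f"Expected deaths = {expected:.2f}, SMR = {smr:.2f}")
```

This is why era B could show a higher raw operative mortality yet a lower SMR: the expected deaths rise with case complexity.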
The model of assistance described in this report led to improved adjusted mortality and to increased case volume, case complexity, and independent operating skills.
Vitamin B12 is synthesised in the rumen from cobalt (Co) and has a major role in metabolism in the peri-parturient period, although few studies have evaluated the effect of dietary inclusion of Co or vitamin B12, or of injected vitamin B12, on the metabolism, health and performance of high-yielding dairy cows. A total of 56 Holstein-Friesian dairy cows received one of four treatments from 8 weeks before calving to 8 weeks post-calving: C, no added Co; DC, additional 0.2 mg Co/kg dry matter (DM); DB, additional 0.68 mg vitamin B12/kg DM; IB, intra-muscular injection of vitamin B12 to supply 0.71 mg/cow per day pre-partum and 1.42 mg/cow per day post-partum. The basal and lactation rations both contained 0.21 mg Co/kg DM. Cows were weighed and condition scored at drying off, 4 weeks before calving, within 24 h of calving, and at 2, 4 and 8 weeks post-calving, with blood samples collected at drying off, 2 weeks pre-calving, calving, and 2, 4 and 8 weeks post-calving. Liver biopsy samples were collected from all animals at drying off and 4 weeks post-calving. Live weight changed with time, but there was no effect of treatment (P>0.05), whereas cows receiving IB had the lowest mean body condition score and DB the highest (P<0.05). There was no effect of treatment on post-partum DM intake, milk yield or milk fat concentration (P>0.05), with mean values of 21.6 kg/day, 39.6 kg/day and 40.4 g/kg, respectively. Cows receiving IB had a higher plasma vitamin B12 concentration than those receiving any of the other treatments (P<0.001), but there was no effect (P>0.05) of treatment on homocysteine or succinate concentrations, although mean plasma methylmalonic acid concentrations were lower (P=0.019) for cows receiving IB than for control cows. Plasma β-hydroxybutyrate concentrations increased sharply at calving followed by a decline, but there was no effect of treatment. Similarly, there was no effect (P>0.05) of treatment on plasma non-esterified fatty acids or glucose. Whole-tract digestibility of DM and fibre measured at week 7 of lactation was similar between treatments, and there was little effect of treatment on the milk fatty acid profile except for C15:0, which was lower in cows receiving DC than IB (P<0.05). It is concluded that a basal dietary concentration of 0.21 mg Co/kg DM is sufficient to meet the requirements of high-yielding dairy cows during the transition period, and that there is little benefit from additional Co or vitamin B12.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. The greatest seed damage plus loss, averaged across seed origin, burial depth, and year, was recorded at Illinois (51.3% and 51.8%), followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%), for A. palmeri and A. tuberculatus, respectively. These site differences in seed persistence were probably due to higher volumetric water content at those sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo ranged from 4.1% to 4.3% on the soil surface compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater for seeds placed on the soil surface than for buried seeds. The greatest influences on seed viability were burial conditions, time, and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
Cardiomyopathy develops in >90% of patients with Duchenne muscular dystrophy (DMD) by the second decade of life. We assessed the associations between DMD gene mutations, as well as latent transforming growth factor-beta-binding protein 4 (LTBP4) haplotypes, and age at onset of myocardial dysfunction in DMD. DMD patients with baseline normal left ventricular systolic function and genotyping between 2004 and 2013 were included. Patients were grouped in multiple ways: by specific DMD mutation domain, by true loss-of-function mutations (group A) versus possible residual gene expression (group B), and by LTBP4 haplotype. Age at onset of myocardial dysfunction was defined as the age at the first echocardiogram with an ejection fraction <55% and/or a shortening fraction <28%. Of 101 DMD patients, 40 developed cardiomyopathy. There was no difference in age at onset of myocardial dysfunction among DMD mutation domains (13.7±4.8 versus 14.3±1.0 versus 14.3±2.9 versus 13.8±2.5 years, p=0.97), between groups A and B (14.4±2.8 versus 12.1±4.4 years, p=0.09), or among LTBP4 haplotypes (14.5±3.2 versus 13.1±3.2 versus 11.0±2.8 years, p=0.18). DMD gene mutations involving the hinge 3 region, actin-binding domain, and exons 45–49, as well as the LTBP4 IAAM haplotype, were not associated with age at onset of left ventricular dysfunction in DMD.
Radio jets are the large-scale and extragalactic footprints of accretion onto supermassive black holes, and are suggested to be the key ingredient controlling the galaxy stellar mass function. Of particular importance is their jet power, the time-averaged energetic feedback into their environment. Hence, the dynamics, energetics, and life cycles of radio-loud AGN (RLAGN) must be understood in order to build a qualitative and quantitative picture of their impact over cosmic time. Here, we present a study of the spectral ages of two powerful, cluster-center radio galaxies, and compare with an analytic model to robustly determine their jet powers. We also present recent LOFAR observations of the different phases of RLAGN activity, namely the remnant and subsequent restarting phases, which are key to understanding the dynamics of RLAGN over their total lifetimes.
The History, Electrocardiogram (ECG), Age, Risk Factors, and Troponin (HEART) score is a decision aid designed to risk stratify emergency department (ED) patients with acute chest pain. It has been validated for ED use, but it has yet to be evaluated in a prehospital setting.
We hypothesized that a prehospital modified HEART score can predict major adverse cardiac events (MACE) among undifferentiated chest pain patients transported to the ED.
A retrospective cohort study of patients with chest pain transported by two county-based Emergency Medical Service (EMS) agencies to a tertiary care center was conducted. Adults without ST-elevation myocardial infarction (STEMI) were included. Inter-facility transfers and those without a prehospital 12-lead ECG or an ED troponin measurement were excluded. Modified HEART scores were calculated by study investigators using a standardized data collection tool for each patient. All MACE (death, myocardial infarction [MI], or coronary revascularization) were determined by record review at 30 days. The sensitivity and negative predictive values (NPVs) for MACE at 30 days were calculated.
Over the study period, 794 patients met inclusion criteria. MACE at 30 days was present in 10.7% (85/794) of patients, with 12 deaths (1.5%), 66 MIs (8.3%), and 12 coronary revascularizations without MI (1.5%). The modified HEART score identified 33.2% (264/794) of patients as low risk. Among low-risk patients, 1.9% (5/264) had MACE (two MIs and three revascularizations without MI). The sensitivity and NPV for 30-day MACE were 94.1% (95% CI, 86.8-98.1) and 98.1% (95% CI, 95.6-99.4), respectively.
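The reported sensitivity and NPV follow directly from the counts in the abstract, as the short check below shows.

```python
# Reproduce the sensitivity and NPV from the counts in the abstract:
# 794 patients, 85 with 30-day MACE, 264 classified low risk by the
# modified HEART score, 5 of whom had MACE.
total, mace, low_risk, low_risk_mace = 794, 85, 264, 5

tp = mace - low_risk_mace       # MACE patients not classified low risk: 80
fn = low_risk_mace              # MACE patients classified low risk: 5
tn = low_risk - low_risk_mace   # low-risk patients without MACE: 259

sensitivity = tp / (tp + fn)    # 80/85 = 94.1%
npv = tn / (tn + fn)            # 259/264 = 98.1%
print(f"Sensitivity = {sensitivity:.1%}, NPV = {npv:.1%}")
```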
Prehospital modified HEART scores have a high NPV for MACE at 30 days. A study in which prehospital providers prospectively apply this decision aid is warranted.