Lewy body dementia, consisting of both dementia with Lewy bodies (DLB) and Parkinson's disease dementia (PDD), is considerably under-recognised clinically compared with its frequency in autopsy series.
This study investigated the clinical diagnostic pathways of patients with Lewy body dementia to assess if difficulties in diagnosis may be contributing to these differences.
We reviewed the medical notes of 74 people with DLB and 72 with non-DLB dementia matched for age, gender and cognitive performance, together with 38 people with PDD and 35 with Parkinson's disease, matched for age and gender, from two geographically distinct UK regions.
The cases of individuals with DLB took longer to reach a final diagnosis (1.2 v. 0.6 years, P = 0.017), underwent more scans (1.7 v. 1.2, P = 0.002) and had more alternative prior diagnoses (0.8 v. 0.4, P = 0.002) than the cases of those with non-DLB dementia. Individuals diagnosed in one region of the UK had significantly more core features (2.1 v. 1.5, P = 0.007) than those in the other region, and were less likely to have dopamine transporter imaging (P < 0.001). For patients with PDD, more than 1.4 years before receiving a dementia diagnosis, 46% (12 of 26) had documented impairment of activities of daily living because of cognitive impairment, 57% (16 of 28) had cognitive impairment in multiple domains, 38% (6 of 16) had both, and 39% (9 of 23) were already receiving anti-dementia drugs.
Our results show the pathway to diagnosis of DLB is longer and more complex than for non-DLB dementia. There were also marked differences between regions in the thresholds clinicians adopt for diagnosing DLB and also in the use of dopamine transporter imaging. For PDD, a diagnosis of dementia was delayed well beyond symptom onset and even treatment.
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess the impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, the same treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury is attributable mostly to the 2,4-D in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increased rate of 2,4-D applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern to sweetpotato producers. Moreover, in some cases, yield reduction of U.S. No. 1 and marketable grades was observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate at storage root development.
A major concern of sweetpotato producers is the potential negative effect of herbicide drift or sprayer contamination when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects on sweetpotato of reduced rates of the N,N-bis(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or each dicamba salt in combination with glyphosate, evaluated in separate trials. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and a combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury is mostly attributable to the dicamba in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increased rate of dicamba applied alone or in combination with glyphosate at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern to sweetpotato producers.
Moreover, in some cases, yield reduction of No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate at storage root development.
Objectives: To describe multivariate base rates (MBRs) of low scores and reliable change (decline) scores on Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) in college athletes at baseline, as well as to assess MBR differences among demographic and medical history subpopulations. Methods: Data were reported on 15,909 participants (46.5% female) from the NCAA/DoD CARE Consortium. MBRs of ImPACT composite scores were derived using published CARE normative data and reliability metrics. MBRs of sex-corrected low scores were reported at <25th percentile (Low Average), <10th percentile (Borderline), and ≤2nd percentile (Impaired). MBRs of reliable decline scores were reported at the 75%, 90%, 95%, and 99% confidence intervals. We analyzed subgroups by sex, race, attention-deficit/hyperactivity disorder and/or learning disability (ADHD/LD), anxiety/depression, and concussion history using chi-square analyses. Results: Base rates of low scores and reliable decline scores on individual composites approximated the normative distribution. Athletes obtained ≥1 low score with frequencies of 63.4% (Low Average), 32.0% (Borderline), and 9.1% (Impaired). Athletes obtained ≥1 reliable decline score with frequencies of 66.8%, 32.2%, 18%, and 3.8%, respectively. Comparatively few athletes had low scores or reliable decline on ≥2 composite scores. Black/African American athletes and athletes with ADHD/LD had higher rates of low scores, while greater concussion history was associated with lower MBRs (p < .01). MBRs of reliable decline were not associated with demographic or medical factors. Conclusions: Clinical interpretation of low scores and reliable decline on ImPACT depends on the strictness of the low score cutoff, the reliable change criterion, and the number of scores exceeding these cutoffs. Race and ADHD influence the frequency of low scores at all cutoffs cross-sectionally.
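For intuition only (this is not the CARE Consortium's normative method), the chance of obtaining at least one low score across several composites can be sketched under an independence assumption as 1 − (1 − p)^k. With four hypothetical independent composites, this predicts roughly 68%, 34%, and 8% for the three cutoffs, in the neighbourhood of the observed 63.4%, 32.0%, and 9.1%:

```python
# Hedged sketch: expected multivariate base rate of >=1 low score
# across k statistically independent tests, P(any) = 1 - (1 - p)^k.
# Real composite scores are correlated, so observed rates differ.

def mbr_at_least_one(p_low: float, k: int) -> float:
    """Probability that at least one of k independent scores falls
    below a cutoff that each score crosses with probability p_low."""
    return 1.0 - (1.0 - p_low) ** k

K_COMPOSITES = 4  # number of composites assumed for illustration

for label, p in [("Low Average (<25th)", 0.25),
                 ("Borderline (<10th)", 0.10),
                 ("Impaired (<=2nd)", 0.02)]:
    print(f"{label}: {mbr_at_least_one(p, K_COMPOSITES):.1%}")
```

The gap between these independence-based predictions and the observed rates is one reason empirically derived MBRs, rather than formulas, are used clinically.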
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognizing CBRNE science as a distinct competency and establishing the CBRNE medical operations science support expert role would inform the public of the enormous progress made, broadcast opportunities for new talent, and enhance the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Within the ENIGMA-MDD network, severity and subtypes of CM were assessed using the Childhood Trauma Questionnaire, and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
In this paper, we ask whether we can afford to realize the potential benefits of genetic testing as a screening tool for adoptees. Our method is to provide reasonable cost and savings estimates. We argue that the prospect of cost neutrality should be sufficient to justify exploring targeted screening for a population who will otherwise suffer an avoidable health disparity in access to inherited disease information. Our goal here is to establish that the investment needed to attain these benefits is not beyond our means.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
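The nested surveillance tiers used above (VAC, then AVAC, then PVAP) can be sketched as a simple classifier. This is an illustrative reading of the stated definitions, not the study's actual data schema or code:

```python
# Hedged sketch of the nested VAE surveillance hierarchy:
# VAC -> AVAC (VAC + antimicrobials for >=4 days) ->
# PVAP (AVAC + positive respiratory diagnostic test).
# Field names are hypothetical, for illustration only.

def classify_vae(is_vac: bool,
                 antimicrobial_days: int,
                 positive_respiratory_test: bool) -> str:
    """Return the most specific VAE tier a ventilation episode meets."""
    if not is_vac:
        return "no VAE"
    if antimicrobial_days < 4:
        return "VAC"
    if positive_respiratory_test:
        return "PVAP"
    return "AVAC"

# An episode meeting VAC criteria, on antimicrobials for 5 days,
# with a positive respiratory test, is the most specific tier:
print(classify_vae(True, 5, True))  # PVAP
```

Because each tier is a strict subset of the previous one, counts always satisfy PVAP ≤ AVAC ≤ VAC, consistent with the rates reported above.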
OBJECTIVES/SPECIFIC AIMS: The purpose of the present secondary data analysis was to examine the effect of moderate-severe disturbed sleep before the start of radiation therapy (RT) on subsequent RT-induced pain. METHODS/STUDY POPULATION: Analyses were performed on 676 RT-naïve breast cancer patients (mean age 58, 100% female) scheduled to receive RT from a previously completed nationwide, multicenter, phase II randomized controlled trial examining the efficacy of oral curcumin on radiation dermatitis severity. The trial was conducted at 21 community oncology practices throughout the US affiliated with the University of Rochester Cancer Center NCI’s Community Oncology Research Program (URCC NCORP) Research Base. Sleep disturbance was assessed using a single-item question from the modified MD Anderson Symptom Inventory (SI) on a 0–10 scale, with higher scores indicating greater sleep disturbance. Total subjective pain as well as the subdomains of pain (sensory, affective, and perceived) were assessed by the short-form McGill Pain Questionnaire. Pain at the treatment site (pain-Tx) was also assessed using a single-item question from the SI. These assessments were included for pre-RT (baseline) and post-RT. For the present analyses, patients were dichotomized into 2 groups: those who had moderate-severe disturbed sleep at baseline (score ≥4 on the SI; n=101) versus those who had mild or no disturbed sleep (control group; score 0–3 on the SI; n=575). RESULTS/ANTICIPATED RESULTS: Prior to the start of RT, breast cancer patients with moderate-severe disturbed sleep at baseline were younger; less likely to have had lumpectomy or partial mastectomy and more likely to have had total mastectomy and chemotherapy; more likely to be on sleep, anti-anxiety/depression, and prescription pain medications; and more likely to suffer from depression or anxiety disorder than the control group (all p’s ≤0.02).
Spearman rank correlations showed that changes in sleep disturbance from baseline to post-RT were significantly correlated with concurrent changes in total pain (r=0.38; p<0.001), sensory pain (r=0.35; p<0.001), affective pain (r=0.21; p<0.001), perceived pain intensity (r=0.37; p<0.001), and pain-Tx (r=0.35; p<0.001). In total, 92% of patients with moderate-severe disturbed sleep at baseline reported post-RT total pain compared with 79% of patients in the control group (p=0.006). Generalized linear estimating equations, after controlling for baseline pain and other covariates (baseline fatigue and distress, age, sleep medications, anti-anxiety/depression medications, prescription pain medications, and depression or anxiety disorder), showed that patients with moderate-severe disturbed sleep at baseline had significantly higher mean values of post-RT total pain (by 39%; p=0.033), post-RT sensory pain (by 41%; p=0.046), and post-RT affective pain (by 55%; p=0.035) than the control group. Perceived pain intensity (p=0.066) and pain-Tx (p=0.086) at post-RT were not significantly different between the 2 groups. DISCUSSION/SIGNIFICANCE OF IMPACT: These findings suggest that moderate-severe disturbed sleep prior to RT is an important predictor for worsening of pain at post-RT in breast cancer patients. There could be several plausible reasons for this. Sleep disturbance, such as sleep loss and sleep continuity disturbance, could result in impaired sleep related recovery and repair of tissue damage associated with cancer and its treatment; thus, resulting in the amplification of pain. Sleep disturbance may also reduce pain tolerance threshold through increased sensitization of the central nervous system. In addition, pain and sleep disturbance may share common neuroimmunological pathways. Sleep disturbance may modulate inflammation, which in turn may contribute to increased pain. 
Further research is needed to confirm these findings and to determine whether interventions targeting sleep disturbance at an early phase could be an alternative approach to reducing pain after RT.
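The Spearman rank correlations reported above can be reproduced in principle with standard tooling; below is a minimal pure-Python sketch on made-up paired change scores (tie handling omitted; the study's actual data and software are not shown here):

```python
# Minimal Spearman rank correlation for untied data:
# rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)),
# where d_i is the difference between the ranks of x_i and y_i.

def ranks(values):
    """Rank each value 1..n by ascending order (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Illustrative made-up change scores, not study data:
sleep_change = [3, -1, 0, 5, 2]
pain_change = [4, -2, 1, 3, 6]
print(round(spearman_rho(sleep_change, pain_change), 2))  # 0.6
```

In practice, library implementations (e.g. SciPy's) additionally average ranks for tied values, which matters for ordinal 0–10 symptom scales.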
Adult ventilator-associated event (VAE) definitions include ventilator-associated conditions (VAC) and subcategories for infection-related ventilator-associated complications (IVAC) and possible ventilator-associated pneumonia (PVAP). We explored these definitions for children.
Pediatric, cardiac, or neonatal intensive care units (ICUs) in 6 US hospitals.
Patients ≤18 years old ventilated for ≥1 day.
We identified patients with pediatric VAC based on previously proposed criteria. We applied adult temperature, white blood cell count, antibiotic, and culture criteria for IVAC and PVAP to these patients. We matched pediatric VAC patients with controls and evaluated associations with adverse outcomes using Cox proportional hazards models.
In total, 233 pediatric VACs (12,167 ventilation episodes) were identified. In the cardiac ICU (CICU), 62.5% of VACs met adult IVAC criteria; in the pediatric ICU (PICU), 54.2% of VACs met adult IVAC criteria; and in the neonatal ICU (NICU), 20.2% of VACs met adult IVAC criteria. Most patients had abnormal white blood cell counts and temperatures; we therefore recommend simplifying surveillance by focusing on “pediatric VAC with antimicrobial use” (pediatric AVAC). Pediatric AVAC with a positive respiratory diagnostic test (“pediatric PVAP”) occurred in 8.9% of VACs in the CICU, 13.3% of VACs in the PICU, and 4.3% of VACs in the NICU. Hospital mortality was increased, and hospital and ICU length of stay and duration of ventilation were prolonged among all pediatric VAE subsets compared with controls.
We propose pediatric AVAC for surveillance related to antimicrobial use, with pediatric PVAP as a subset of AVAC. Studies on generalizability and responsiveness of these metrics to quality improvement initiatives are needed, as are studies to determine whether lower pediatric VAE rates are associated with improvements in other outcomes.
Correlation of lower Cambrian strata is often confounded by provincialism of key fauna. The widespread occurrence of the micromollusc Watsonella crosbyi Grabau, 1900 is therefore an important biostratigraphic signpost with potential for international correlation of lower Cambrian successions. Previous correlations of W. crosbyi from Australia (Normanville Group) suggested an Atdabanian- to Botoman-equivalent age. However, in the upper part of the Mount Terrible Formation, the stratigraphic ranges of W. crosbyi and Aldanella sp. cf. A. golubevi overlap prior to the incoming of vertically burrowed ‘piperock’, which is indicative of an age no earlier than Cambrian Stage 2. The stratigraphic range of W. crosbyi in the Normanville Group, South Australia, correlates with the ranges of the taxon in China, France, Mongolia and Siberia (though not Newfoundland). The new Australian data add further support for considering the first occurrence of W. crosbyi a good potential candidate for defining the base of Cambrian Stage 2. The stratigraphic range of W. crosbyi through the lower Cambrian Normanville Group has been determined based on collections from measured sections. Although rare, W. crosbyi is part of an assemblage of micromolluscs including Bemella sp., Parailsanella sp. cf. P. murenica and a sinistral form of Aldanella (A. sp. cf. A. golubevi). Other fauna present include Australohalkieria sp., Eremactis mawsoni, chancelloriids and Cupitheca sp.
The Dark Energy Survey (DES) is undertaking an observational programme imaging 1/4 of the southern hemisphere sky with unprecedented photometric accuracy. In the process of observing millions of faint stars and galaxies to constrain the parameters of the dark energy equation of state, the Dark Energy Survey will obtain pre-discovery images of the regions surrounding an estimated 100 gamma-ray bursts over 5 yr. Once gamma-ray bursts are detected by, e.g., the Swift satellite, the DES data will be extremely useful for follow-up observations by the transient astronomy community. We describe a recently commissioned suite of software that listens continuously for automated notices of gamma-ray burst activity, collates information from archival DES data, and disseminates relevant data products back to the community in near-real-time. Of particular importance are the opportunities that non-public DES data provide for relative photometry of the optical counterparts of gamma-ray bursts, as well as for identifying key characteristics (e.g., photometric redshifts) of potential gamma-ray burst host galaxies. We provide the functional details of the DESAlert software and its data products, and we show sample results from the application of DESAlert to numerous previously detected gamma-ray bursts, including the possible identification of several heretofore unknown gamma-ray burst hosts.
Both physical activity (PA) and diet are important contributors to health and well-being; however, there is limited information on the association of these behaviours and whether observed associations differ by weight. The present study aimed to evaluate whether nutrient intake is associated with PA and if this association varies by weight in young adults.
Cross-sectional study to analyse the association between PA and nutrient intake.
Participants were stratified as normal weight (18·5 kg/m2 < BMI < 25·0 kg/m2) and overweight/obese (BMI ≥ 25·0 kg/m2). PA level (PAL) was calculated (PAL = total daily energy expenditure/RMR) and used to stratify groups (PAL < 1·6, 1·6 ≤ PAL < 1·9, PAL ≥ 1·9).
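As a worked example of this stratification (the cut points come from the text; the energy values and function names are illustrative only):

```python
# Minimal sketch of the PAL and BMI stratification described above.
# Cut points are from the abstract; inputs are made-up examples.

def pal(total_energy_expenditure: float, rmr: float) -> float:
    """Physical activity level: PAL = total daily energy expenditure / RMR."""
    return total_energy_expenditure / rmr

def pal_group(p: float) -> str:
    """Assign one of the three PAL strata used in the study."""
    if p < 1.6:
        return "PAL < 1.6"
    if p < 1.9:
        return "1.6 <= PAL < 1.9"
    return "PAL >= 1.9"

def bmi_group(bmi: float) -> str:
    """Normal weight vs overweight/obese at BMI 25.0 kg/m^2.
    Assumes BMI within the study's 20-35 kg/m^2 inclusion range,
    so the underweight category is out of scope."""
    return "overweight/obese" if bmi >= 25.0 else "normal weight"

# e.g. TEE of 2800 kcal/d and RMR of 1600 kcal/d give PAL = 1.75:
print(pal_group(pal(2800, 1600)))  # 1.6 <= PAL < 1.9
```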
Adults (n 407; age 27·6 (sd 3·8) years, 48 % male), with BMI between 20 and 35 kg/m2, having at least two 24 h diet recalls and at least 5 d (including two weekend days) of valid, objectively measured PA data were included in the analysis.
In normal-weight participants, higher PAL was associated with higher intakes of minerals (except Ca, Fe and Zn), B-vitamins and choline (P for trend <0·05). In the overweight/obese group, higher PAL was associated with higher intakes of fibre, K, Na and Cu (P for trend <0·05). These differences, however, were no longer significant after additionally controlling for total energy intake.
More active young adults have higher intakes of essential micronutrients. The benefits of PA may be predominantly due to higher overall food intake while maintaining energy balance, rather than to a healthier diet.
Relative sea-level change (RSL), from the Late Glacial through to the late Holocene, is reconstructed for the Assynt region, northwest Scotland, based on bio- and lithostratigraphical analysis. Four new radiocarbon-dated sea-level index points help constrain RSL change from the Late Glacial to the late Holocene. These new data, in addition to published material, capture the RSL fall during the Late Glacial and the rise and fall associated with the mid-Holocene highstand. Two of these index points constrain the Late Glacial RSL history in Assynt for the first time, reconstructing RSL falling from 2.47 ± 0.59 m OD to 0.15 ± 0.59 m OD at c. 14,000–15,000 cal yr BP. These new data test model predictions of glacial isostatic adjustment (GIA), particularly during the early deglacial period, which is currently poorly constrained throughout the British Isles. Whilst the empirical data from the mid- to late Holocene match the recent GIA model output quite well, there is a relatively poor fit with the timing of the Late Glacial RSL fall and early Holocene RSL rise. This mismatch, also evident elsewhere in northwest Scotland, may result from uncertainties associated with both the global and local ice components of GIA models.
New solar soft X-ray (SXR) and extreme ultraviolet (EUV) irradiance observations from the NASA Solar Dynamics Observatory (SDO) EUV Variability Experiment (EVE) provide full coverage from 0.1 to 106 nm, continuously, at a cadence of 10 seconds for spectra at 0.1 nm resolution. These observations during flares can usually be decomposed into four distinct phases: the impulsive phase, the gradual phase, coronal dimming, and the EUV late phase. Over 6000 flares have been observed during the SDO mission; some flares show all four phases, and some show only the gradual phase. The focus here is on newer results concerning the EUV late phase and coronal dimming and their relationship to coronal mass ejections (CMEs). Because these EVE flare measurements observe the Sun as a star, the results could serve as exemplars for stellar flares. Of particular interest, new coronal dimming measurements of stars could be used to estimate the mass and velocity of stellar CMEs.