To develop and validate the Discrepancy-based Evidence for Loss of Thinking Abilities (DELTA) score. The DELTA score characterizes the strength of evidence for cognitive decline on a continuous spectrum using well-established psychometric principles for improving detection of cognitive changes.
DELTA score development used neuropsychological test scores from the Alzheimer’s Disease Neuroimaging Initiative (ADNI) cohort (two tests each from Memory, Executive Function, and Language domains). We derived regression-based normative reference scores using age, gender, years of education, and word-reading ability from robust cognitively normal ADNI participants. Discrepancies between predicted and observed scores were used for calculating the DELTA score (range 0–15). We validated DELTA scores primarily against longitudinal Clinical Dementia Rating-Sum of Boxes (CDR-SOB) and Functional Activities Questionnaire (FAQ) scores (baseline assessment through Year 3) using linear mixed models and secondarily against cross-sectional Alzheimer’s biomarkers.
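The regression-based discrepancy logic described above can be sketched in a few lines. The coefficients, thresholds, and point values below are hypothetical placeholders for illustration only, not the published DELTA algorithm:

```python
# Illustrative sketch of regression-based normative discrepancy scoring.
# All coefficients and cut points are invented -- NOT the DELTA algorithm.

def predicted_score(age, years_edu, word_reading, female,
                    coef=(30.0, -0.10, 0.25, 0.05, 0.5)):
    """Normative prediction from a linear model fit in robust cognitively
    normal controls. coef = (intercept, b_age, b_edu, b_reading, b_female);
    the values here are made up for illustration."""
    b0, b_age, b_edu, b_read, b_fem = coef
    return b0 + b_age * age + b_edu * years_edu + b_read * word_reading + b_fem * female

def discrepancy_points(observed, predicted, resid_sd):
    """Convert an observed-minus-predicted discrepancy (in residual SD
    units) into points of evidence for cognitive decline."""
    z = (observed - predicted) / resid_sd
    if z <= -2.0:      # far below normative expectation: strong evidence
        return 2
    if z <= -1.0:      # moderately below expectation: some evidence
        return 1
    return 0

# Hypothetical participant whose observed memory score falls 2.5 residual
# SDs below the normative prediction.
pred = predicted_score(age=75, years_edu=16, word_reading=40, female=1)
print(discrepancy_points(observed=pred - 2.5 * 3.0, predicted=pred, resid_sd=3.0))  # 2
```

Summing discrepancy evidence of this kind across the six tests is what yields a bounded composite score; the exact point scheme above is an assumption.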
There were 1359 ADNI participants with calculable baseline DELTA scores (age 73.7 ± 7.1 years, 55.4% female, 100% white/Caucasian). Higher baseline DELTA scores (stronger evidence of cognitive decline) predicted higher baseline CDR-SOB (ΔR2 = .318) and faster rates of CDR-SOB increase over time (ΔR2 = .209). Longitudinal changes in DELTA scores tracked closely and in the same direction as CDR-SOB scores (fixed and random effects of mean + mean-centered DELTA, ΔR2 > .7). Results were similar for FAQ scores. Higher DELTA scores also predicted higher PET-Aβ SUVr (ρ = .324) and higher CSF-pTau/CSF-Aβ ratio (ρ = .460), and demonstrated PPV > .9 for positive Alzheimer’s disease biomarker classification.
Data support initial development and validation of the DELTA score through its associations with longitudinal functional changes and Alzheimer’s biomarkers. We provide several considerations for future research and include an automated scoring program for clinical use.
We sought to define the prevalence of echocardiographic abnormalities in long-term survivors of paediatric hematopoietic stem cell transplantation and to determine the utility of screening in asymptomatic patients. We analysed echocardiograms performed on survivors who underwent hematopoietic stem cell transplantation from 1982 to 2006. A total of 389 patients were alive in 2017, with 114 having an echocardiogram obtained ⩾5 years post-infusion. A total of 95 patients had an echocardiogram performed for routine surveillance. The mean time post-hematopoietic stem cell transplantation was 13 years. Of 95 patients, 77 (82.1%) had ejection fraction measured, and 10/77 (13.0%) had abnormally low ejection fraction z-scores of ⩽−2.0. Patients with an abnormal ejection fraction were significantly more likely to have been exposed to anthracyclines or total body irradiation. Among individuals who received neither anthracyclines nor total body irradiation, only 1/31 (3.2%) was found to have an abnormal ejection fraction of 51.4% (z-score −2.73). In the cohort of 77 patients, the negative predictive value of a normal ejection fraction given no exposure to total body irradiation or anthracyclines was 96.7% (95% confidence interval, 83.3–99.8%). Systolic dysfunction is relatively common in long-term survivors of paediatric hematopoietic stem cell transplantation who received anthracyclines or total body irradiation. Asymptomatic survivors who received neither radiation nor anthracyclines likely do not require surveillance echocardiograms unless otherwise indicated.
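The negative-predictive-value calculation reported above reduces to a binomial proportion with an interval estimate. The counts below are hypothetical stand-ins, and the Wilson score interval is one common choice, not necessarily the method the authors used:

```python
import math

def npv(true_negatives, false_negatives):
    """Negative predictive value: fraction of screen-negative patients
    (here: no anthracycline/TBI exposure) with a truly normal ejection
    fraction."""
    return true_negatives / (true_negatives + false_negatives)

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Hypothetical counts: 30 of 31 unexposed patients had a normal EF.
print(round(npv(30, 1), 3))                    # 0.968
print([round(x, 3) for x in wilson_ci(30, 31)])
```

Exact (Clopper-Pearson) intervals are the other standard option for small denominators like this one.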
The development of laser wakefield accelerators (LWFAs) over the past several years has led to interest in very compact sources of X-ray radiation, such as “table-top” free electron lasers. However, conventional undulators built from permanent magnets imply large system sizes. In this work, we assess the possibilities for using novel mini-undulators in conjunction with an LWFA so that the dimensions of the undulator become comparable with the acceleration distances of LWFA experiments (i.e., centimeters). The use of a prototype undulator produced by laser machining of permanent magnets is described, and the emission characteristics and limitations of such a system are determined. Preliminary electron propagation and X-ray emission measurements are taken with an LWFA electron beam at the University of Michigan.
The northern New England region includes the states of Vermont, New Hampshire, and Maine and encompasses a large degree of climate and edaphic variation across a relatively small spatial area, making it ideal for studying climate change impacts on agricultural weed communities. We sampled weed seedbanks and measured soil physical and chemical characteristics on 77 organic farms across the region and analyzed the relationships between weed community parameters and select geographic, climatic, and edaphic variables using multivariate procedures. Temperature-related variables (latitude, longitude, mean maximum and minimum temperature) were the strongest and most consistent correlates with weed seedbank composition. Edaphic variables were, for the most part, relatively weaker and inconsistent correlates with weed seedbanks. Our analyses also indicate that a number of agriculturally important weed species are associated with specific U.S. Department of Agriculture plant hardiness zones, implying that future changes in climate factors that result in geographic shifts in these zones will likely be accompanied by changes in the composition of weed communities and therefore new management challenges for farmers.
There is a well-established discrepancy between paleontological and molecular data regarding the timing of the origin and diversification of placental mammals. Molecular estimates place interordinal diversification dates in the Cretaceous, while no unambiguous crown placental fossils have been found prior to the end-Cretaceous mass extinction. Here, the completeness of the eutherian fossil record through geological time is evaluated to assess the suggestion that a poor fossil record is largely responsible for the difference in estimates of placental origins. The completeness of fossil specimens was measured using the character completeness metric, which quantifies the completeness of fossil taxa as the percentage of phylogenetic characters available to be scored for any given taxon. Our data set comprised 33 published cladistic matrices representing 445 genera, of which 333 were coded at the species level.
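The character completeness metric described above can be illustrated with a small sketch. The matrix row below is invented; “?” (or “-”) marks a character that cannot be scored for the fossil taxon:

```python
def character_completeness(character_states):
    """Character completeness metric for one taxon: the percentage of
    phylogenetic characters in the cladistic matrix that can actually be
    scored, where '?' and '-' denote missing/unscorable characters."""
    scorable = sum(1 for s in character_states if s not in ("?", "-"))
    return 100.0 * scorable / len(character_states)

# Hypothetical taxon coded for 6 of 8 characters in a cladistic matrix.
taxon_row = ["0", "1", "?", "1", "0", "?", "1", "0"]
print(character_completeness(taxon_row))  # 75.0
```

Averaging such per-taxon percentages within time bins is what allows completeness to be compared across the K/Pg boundary.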
There was no significant difference in eutherian completeness across the Cretaceous/Paleogene (K/Pg) boundary. This suggests that the lack of placental mammal fossils in the Cretaceous is not due to a poor fossil record but more likely represents a genuine absence of placental mammals in the Cretaceous. This result supports the “explosive model” of early placental evolution, whereby placental mammals originated around the time of the K/Pg boundary and diversified soon after.
No correlation was found between the completeness pattern observed in this study and those of previous completeness studies on birds and sauropodomorph dinosaurs, suggesting that different factors affect the preservation of these groups. No correlations were found with various isotope proxy measures, but Akaike information criterion analysis found that eutherian character completeness metric scores were best explained by models involving the marine-carbonate strontium-isotope ratios (87Sr/86Sr), suggesting that tectonic activity might play a role in controlling the completeness of the eutherian fossil record.
This paper presents the first major data release and survey description for the ANU WiFeS SuperNovA Programme (AWSNAP), an ongoing supernova spectroscopy campaign utilising the Wide Field Spectrograph (WiFeS) on the Australian National University 2.3-m telescope. The first and primary data release of this programme (AWSNAP-DR1) comprises 357 spectra of 175 unique objects collected over 82 equivalent full nights of observing from 2012 July to 2015 August. These spectra have been made publicly available via the WISEREP supernova spectroscopy repository.
We analyse the ANU WiFeS SuperNovA Programme sample of Type Ia supernova spectra, including measurements of narrow sodium absorption features afforded by the high spectral resolution of the Wide Field Spectrograph instrument. In some cases, we were able to use the integral-field nature of the Wide Field Spectrograph instrument to measure the rotation velocity of the SN host galaxy near the SN location in order to obtain precision sodium absorption velocities. We also present an extensive time series of SN 2012dn, including a near-nebular spectrum which both confirms its ‘super-Chandrasekhar’ status and enables measurement of the sub-solar host metallicity at the SN site.
Pigweeds are among the most abundant and troublesome weed species across Midwest and mid-South soybean production systems because of their prolific growth characteristics and ability to rapidly evolve resistance to several herbicide sites of action. This has renewed interest in diversifying weed management strategies by implementing integrated weed management (IWM) programs to efficiently manage weeds, increase soybean light interception, and increase grain yield. Field studies were conducted across 16 site-years to determine the effectiveness of soybean row width, seeding rate, and herbicide strategy as components of IWM in glufosinate-resistant soybean. Sites were grouped according to optimum adaptation zones for soybean maturity groups (MGs). Across all MG regions, pigweed density and height at the POST herbicide timing, and end-of-season pigweed density, height, and fecundity were reduced in IWM programs using a PRE followed by (fb) POST herbicide strategy. Furthermore, a PRE fb POST herbicide strategy treatment increased soybean cumulative intercepted photosynthetically active radiation (CIPAR) and subsequently, soybean grain yield across all MG regions. Soybean row width and seeding rate manipulation effects were highly variable. Narrow row width (≤ 38 cm) and a high seeding rate (470,000 seeds ha−1) reduced end-of-season height and fecundity variably across MG regions compared with wide row width (≥ 76 cm) and moderate to low (322,000 to 173,000 seeds ha−1) seeding rates. However, narrow row widths and high seeding rates did not reduce pigweed density at the POST herbicide application timing or at soybean harvest. Across all MG regions, soybean CIPAR increased as soybean row width decreased and seeding rate increased; however, row width and seeding rate had variable effects on soybean yield. Furthermore, soybean CIPAR was not associated with end-of-season pigweed growth and fecundity. 
A PRE fb POST herbicide strategy was a necessary component for an IWM program as it simultaneously managed pigweeds, increased soybean CIPAR, and increased grain yield.
To determine the effect of graft choice (allograft, bone-patellar tendon-bone autograft, or hamstring autograft) on deep tissue infections following anterior cruciate ligament (ACL) reconstructions.
Retrospective cohort study.
Patients from 6 US health plans who underwent ACL reconstruction from January 1, 2000, through December 31, 2008.
We identified ACL reconstructions and potential postoperative infections using claims data. A hierarchical stratified sampling strategy was used to identify patients for medical record review to confirm ACL reconstructions and to determine allograft vs autograft tissue implanted, clinical characteristics, and infection status. We estimated infection rates overall and by graft type. We used logistic regression to assess the association between infections and patients’ demographic characteristics, comorbidities, and choice of graft.
On review of 1,452 medical records, we found 55 deep wound infections. With correction for sampling weights, infection rates varied by graft type: 0.5% (95% CI, 0.3%–0.8%) with allografts, 0.6% (0.1%–1.5%) with bone-patellar tendon-bone autografts, and 2.5% (1.9%–3.1%) with hamstring autografts. After adjusting for potential confounders, we found an increased infection risk with hamstring autografts compared with allografts (odds ratio, 5.9; 95% CI, 2.8–12.8). However, there was no difference in infection risk between bone-patellar tendon-bone autografts and allografts (odds ratio, 1.2; 95% CI, 0.3–4.8).
The overall risk for deep wound infections following ACL reconstruction is low but it does vary by graft type. Infection risk was highest in hamstring autograft recipients compared with allograft recipients and bone-patellar tendon-bone autograft recipients.
We discuss the stellar halos of massive elliptical galaxies, as revealed by our ambitious integral-field spectroscopic survey MASSIVE. We show that metallicity drops smoothly as a function of radius out to ~ 2.5 Re, while the [α/Fe] abundance ratios stay flat. The stars in the outskirts likely formed rapidly (to explain the high ratio of alpha to Fe) but in a relatively shallow potential (to explain the low metallicities). This is consistent with expectations for a two-phase growth of massive galaxies, in which the second phase involves accretion of small satellites. We also present a preliminary study of the gas content of these most MASSIVE galaxies.
To explore the feasibility of identifying anterior cruciate ligament (ACL) allograft implantations and infections using claims.
Retrospective cohort study.
We identified ACL reconstructions using procedure codes at 6 health plans from 2000 to 2008. We then identified potential infections using claims-based indicators of infection, including diagnoses, procedures, antibiotic dispensings, specialty consultations, emergency department visits, and hospitalizations. Patients’ medical records were reviewed to determine graft type, validate infection status, and calculate sensitivity and positive predictive value (PPV) for indicators of ACL allografts and infections.
A total of 11,778 patients with codes for ACL reconstruction were identified. After chart review, PPV for ACL reconstruction was 96% (95% confidence interval [CI], 94%–97%). Of the confirmed ACL reconstructions, 39% (95% CI, 35%–42%) used allograft tissues. The deep infection rate after ACL reconstruction was 1.0% (95% CI, 0.7%–1.4%). The odds ratio of infection for allografts versus autografts was 0.41 (95% CI, 0.19–0.78). Sensitivity of individual claims-based indicators for deep infection after ACL reconstruction ranged from 0% to 75% and PPV from 0% to 100%. Claims-based infection indicators could be combined to enhance sensitivity or PPV but not both.
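The validation quantities in this abstract reduce to a 2×2 comparison of each claims-based indicator against chart-review “truth”. The counts below are hypothetical, not the study’s data:

```python
def sensitivity_ppv(tp, fp, fn):
    """Validate a claims-based infection indicator against chart review.
    sensitivity = TP / (TP + FN): share of true infections the indicator flags.
    PPV         = TP / (TP + FP): share of flagged cases that are true infections."""
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    ppv = tp / (tp + fp) if (tp + fp) else 0.0
    return sens, ppv

# Hypothetical indicator: flags 40 charts, of which 30 are truly infected,
# while missing 10 further true infections.
sens, ppv = sensitivity_ppv(tp=30, fp=10, fn=10)
print(round(sens, 2), round(ppv, 2))  # 0.75 0.75
```

The trade-off noted in the abstract follows directly: requiring several indicators to agree lowers FP (raising PPV) but raises FN (lowering sensitivity), and combining indicators with OR does the reverse.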
While claims data accurately identify ACL reconstructions, they poorly distinguish between allografts and autografts and identify infections with variable accuracy. Claims data could be useful to monitor infection trends after ACL reconstruction, with different algorithms optimized for different surveillance goals.
Laboratory data are the cornerstone in surveillance of infectious disease. We investigated whether changes in reported incidence of Campylobacter and Salmonella infection might be explained by changes in stool sampling rates. Data were extracted from a national database on 585 843 patient stool samples tested by microbiology laboratories in Wales between 1998 and 2008. Salmonella incidence fell from 43 to 19 episodes/100 000 population but Campylobacter incidence after declining from 111/100 000 in 1998 to 84/100 000 in 2003 rose to 119/100 000 in 2008. The proportion of the population sampled rose from 2·0% in 1998 to 2·8% in 2008, mostly due to increases in samples from hospital patients and older adults. The proportion of positive samples declined for both Salmonella and Campylobacter from 3·1% to 1·1% and from 8·9% to 7·5%, respectively. The decline in Salmonella incidence is so substantial that it is not masked even by increased stool sampling, but the recent rise in Campylobacter incidence may be a surveillance artefact largely due to the increase in stool sampling in older people.
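The surveillance quantities in this abstract are simple rates, sketched below; the episode and population figures are hypothetical, not the Welsh data:

```python
def incidence_per_100k(episodes, population):
    """Reported incidence: laboratory-confirmed episodes per 100 000 population."""
    return 100_000 * episodes / population

def percent_positive(positives, samples):
    """Proportion of submitted stool samples testing positive, as a percentage."""
    return 100 * positives / samples

# Hypothetical year: 3,480 Campylobacter episodes in a population of 2.9 million.
print(round(incidence_per_100k(3480, 2_900_000)))  # 120
print(round(percent_positive(75, 1000), 1))        # 7.5
```

Comparing these two quantities over time is what separates a genuine change in infection incidence from a sampling artefact: if incidence rises while percent-positive falls, increased stool sampling is the likelier explanation.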