Tree rings providing annual dates from live and deadwood Pinus flexilis at ten sites across the central Great Basin (~38°N) yielded a cumulative record spanning 4002 years (1983 BC–AD 2019). Individual site chronologies ranged in length from 861 to 4002 years; all were continuous over their sample depths. Correlations of growth with climate were positive for water relations and mostly negative for summer temperatures. Growth was generally correlated across sites, with the central Nevada stands most distinct. Although growth was low during the Late Holocene Dry Period, variability marked this interval, suggesting that it was not pervasively dry. All sites had low growth during the first half of the Medieval Climate Anomaly, high growth during the mid-interval pluvial, and low growth subsequently. Little synchrony occurred across sites during the early Little Ice Age. After AD 1650, growth was depressed until the early twentieth century. Growth at all sites declined markedly ca. AD 1985 to levels comparable to the lowest-growth periods of the full records, indicative of recent severe droughts. A small rebound in growth occurred after ca. AD 2010. A strong Atlantic Multidecadal Oscillation (AMO) signal occurred in the growth response at most sites. The persistence of all stands despite climate variability indicates high resilience in this species.
We present continuous estimates of snow and firn density, layer depth and accumulation from a multi-channel, multi-offset, ground-penetrating radar traverse. Our method uses the electromagnetic velocity, estimated from waveform travel times measured at common midpoints between sources and receivers. Previously, common-midpoint radar experiments on ice sheets have been limited to point observations. We completed radar velocity analysis in the upper ~2 m to estimate the surface and average snow density of the Greenland Ice Sheet. We parameterized the Herron and Langway (1980) firn density and age model using the radar-derived snow density, radar-derived surface mass balance (2015–2017) and reanalysis-derived temperature data. We applied structure-oriented filtering to the radar image along constant-age horizons, which increased the depth at which horizons could be reliably interpreted. We reconstructed the historical instantaneous surface mass balance, which we averaged into annual and multidecadal products along a 78 km traverse for the period 1984–2017. We found good agreement between our physically constrained parameterization and a firn core collected from the dry-snow accumulation zone, and gained insights into the spatial correlation of surface snow density.
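The Herron and Langway (1980) model invoked here has a compact closed-form, steady-state solution, so the parameterization step can be illustrated directly. Below is a minimal Python sketch, assuming a radar-derived surface density and accumulation rate and a reanalysis-derived mean annual temperature as inputs; the function and variable names are ours, and this is an illustration of the published model, not the authors' code.

```python
import numpy as np

RHO_ICE = 0.917  # ice density (Mg m^-3)
R = 8.314        # gas constant (J mol^-1 K^-1)

def herron_langway_density(z, rho0, T, A):
    """Steady-state Herron & Langway (1980) firn density profile.

    z    : depths below the surface (m)
    rho0 : surface snow density (Mg m^-3), e.g. radar-derived
    T    : mean annual temperature (K), e.g. reanalysis-derived
    A    : accumulation rate (m water equivalent per year), e.g. radar-derived

    Valid up to roughly the pore close-off density (~0.83 Mg m^-3).
    """
    z = np.asarray(z, dtype=float)

    # Stage 1 (rho < 0.55 Mg m^-3): settling-dominated densification
    k0 = 11.0 * np.exp(-10160.0 / (R * T))
    q0 = RHO_ICE * k0 * z + np.log(rho0 / (RHO_ICE - rho0))
    rho_stage1 = RHO_ICE * np.exp(q0) / (1.0 + np.exp(q0))

    # Depth of the 0.55 Mg m^-3 critical-density transition
    z550 = (np.log(0.55 / (RHO_ICE - 0.55))
            - np.log(rho0 / (RHO_ICE - rho0))) / (RHO_ICE * k0)

    # Stage 2 (rho >= 0.55 Mg m^-3): sintering, rate scales with sqrt(A)
    k1 = 575.0 * np.exp(-21400.0 / (R * T))
    q1 = RHO_ICE * k1 * (z - z550) / np.sqrt(A) + np.log(0.55 / (RHO_ICE - 0.55))
    rho_stage2 = RHO_ICE * np.exp(q1) / (1.0 + np.exp(q1))

    return np.where(z <= z550, rho_stage1, rho_stage2)

# Example: profile for rho0 = 0.35 Mg m^-3, T = 250 K, A = 0.3 m w.e. per year
depths = np.linspace(0.0, 80.0, 5)
print(herron_langway_density(depths, 0.35, 250.0, 0.3))
```

Note that in this closed-form solution the stage-1 depth profile depends only on temperature and surface density; the accumulation rate enters the depth profile in stage 2 (and the age model throughout), which is why the radar-derived surface mass balance matters for the deeper parameterization.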
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These processes remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by their baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing.

Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receipt of systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (eg, methicillin/oxacillin resistance in S. aureus) and were used to estimate the baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistant phenotypes, sepsis or septic shock, hospital type, and baseline resistance.

Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Also, 4,428 (52%) recipients of DDEAT received no antibiotics on culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among recipients of DDEAT collectively occurred among patients with S. aureus and Enterobacteriales BSIs. For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged between 25% (14%, 37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile and 40% (31%, 50%) at five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2).

Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Earlier, reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.
Funding: This study was funded in part by the National Institutes of Health Clinical Center, the National Institute of Allergy and Infectious Diseases, the National Cancer Institute (NCI contract no. HHSN261200800001E) and the Agency for Healthcare Research and Quality.
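The DDEAT definition above amounts to a day-of-culture check against the isolate's in vitro susceptibility profile. The following Python sketch illustrates that classification logic with hypothetical field names; the abstract does not describe the actual Cerner Healthfacts processing at this level of detail.

```python
def classify_ddeat(antibiotics_given_day0, isolate_susceptibilities):
    """Classify delayed or discordant empiric antibiotic therapy (DDEAT).

    antibiotics_given_day0  : set of antibiotic names given on culture sampling day
    isolate_susceptibilities: dict mapping antibiotic name -> True if the
                              bloodstream isolate was susceptible in vitro

    Returns "delayed" (no antibiotics given), "discordant" (none given was
    active in vitro), or "concordant" (at least one active agent given).
    Both "delayed" and "discordant" count as DDEAT.
    """
    if not antibiotics_given_day0:
        return "delayed"  # no antibiotic on culture sampling day
    active = any(isolate_susceptibilities.get(abx, False)
                 for abx in antibiotics_given_day0)
    return "concordant" if active else "discordant"

# Example: vancomycin monotherapy against an MRSA isolate reported
# susceptible to vancomycin but resistant to oxacillin
print(classify_ddeat({"vancomycin"},
                     {"vancomycin": True, "oxacillin": False}))  # concordant
```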
The emphasis on team science in clinical and translational research increases the importance of collaborative biostatisticians (CBs) in healthcare. Adequate training and development of CBs ensure appropriate conduct of robust and meaningful research and should therefore be considered a high-priority focus for biostatistics groups. Comprehensive training enhances clinical and translational research by facilitating more productive and efficient collaborations. While many graduate programs in Biostatistics and Epidemiology include training in research collaboration, it is often limited in scope and duration. Therefore, additional training is often required once a CB is hired into a full-time position. This article presents a comprehensive CB training strategy that can be adapted to any collaborative biostatistics group. The strategy follows a roadmap of the biostatistics collaboration process, which is also presented. A TIE approach (Teach the necessary skills, monitor the Implementation of these skills, and Evaluate the proficiency of these skills) was developed to support the adoption of key principles. The training strategy also incorporates a “train the trainer” approach to enable CBs who have successfully completed training to train new staff or faculty.
Evidence indicates that Antarctic minke whales (AMWs) in the Ross Sea affect the foraging behaviour, especially diet, of sympatric Adélie penguins (ADPEs) by, we hypothesize, influencing the availability of prey they have in common, mainly crystal krill. To further investigate this interaction, we undertook a study in McMurdo Sound during 2012–2013 and 2014–2015 using telemetry and biologging of whales and penguins, shore-based observations and quantification of the preyscape. The 3D distribution and density of prey were assessed using a remotely operated vehicle deployed along and to the interior of the fast-ice edge where AMWs and ADPEs focused their foraging. Acoustic surveys of prey and foraging behaviour of predators indicate that prey remained abundant under the fast ice, becoming successively available to air-breathing predators only as the fast ice retreated. Over both seasons, the ADPE diet included less krill and more Antarctic silverfish once AMWs became abundant, but the penguins' foraging behaviour (i.e. time spent foraging, dive depth, distance from colony) did not change. In addition, over time, krill abundance decreased in the upper water column near the ice edge, consistent with the hypothesis (and previously gathered information) that AMW and ADPE foraging contributed to an alteration of prey availability.
Classical stewardship efforts have targeted immunocompetent patients; however, appropriate use of antimicrobials in the immunocompromised host has become a target of interest. Cytomegalovirus (CMV) infection is one of the most common and significant complications after solid-organ transplant (SOT). The treatment of CMV requires a dual approach of antiviral drug therapy and reduction of immunosuppression for optimal outcomes. This dual approach to CMV management increases complexity and requires individualization of therapy to balance antiviral efficacy with the risk of allograft rejection. In this review, we focus on the development and implementation of CMV stewardship initiatives, as a component of antimicrobial stewardship in the immunocompromised host, to optimize the management of prevention and treatment of CMV in SOT recipients. These initiatives have the potential not only to improve judicious use of antivirals and prevent resistance but also to improve patient and graft survival given the interconnection between CMV infection and allograft function.
Previous research in clinical, community, and school settings has demonstrated positive outcomes for the Secret Agent Society (SAS) social skills training program, which is designed to help children on the autism spectrum become more aware of emotions in themselves and others and to ‘problem-solve’ complex social scenarios. Parents play a key role in the implementation of the SAS program, attending information and support sessions with other parents and providing supervision, rewards, and feedback as their children complete weekly ‘home mission’ assignments. Drawing on data from a school-based evaluation of the SAS program, we examined whether parents’ engagement with these elements of the intervention was linked to the quality of their children’s participation and performance. Sixty-eight 8–14-year-olds (M age = 10.7) with a diagnosis of autism participated in the program. The findings indicated that ratings of parental engagement were positively correlated with children’s competence in completing home missions and with the quality of their contributions during group teaching sessions. However, there was a less consistent relationship between parental engagement and measures of children’s social and emotional skill gains over the course of the program.
No evidence-based therapy for borderline personality disorder (BPD) exhibits clear superiority over the others. However, BPD is highly heterogeneous, and different patients may benefit specifically from the interventions of a particular treatment.
From a randomized trial comparing a year of dialectical behavior therapy (DBT) to general psychiatric management (GPM) for BPD, long-term (2 years post-treatment) outcome data and patient baseline variables (n = 156) were used to examine individual and combined patient-level moderators of differential treatment response. A two-step bootstrapped and partially cross-validated moderator identification process was employed for 20 baseline variables. For identified moderators, 10-fold bootstrapped cross-validated models estimated response to each therapy, and long-term outcomes were compared for patients randomized to their model-predicted optimal v. non-optimal treatment.
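As an illustration of the moderator-screening idea, the sketch below tests a single baseline variable by bootstrapping the treatment-by-variable interaction in a simple linear model; this is our generic rendering of such a screen, not the authors' exact two-step pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def screen_moderator(x, treat, y, n_boot=1000):
    """Bootstrap screen for a single candidate moderator.

    x     : baseline variable, shape (n,)
    treat : treatment indicator (0/1), shape (n,)
    y     : outcome, shape (n,)

    Fits y ~ 1 + treat + x + treat*x by least squares in each bootstrap
    resample and returns the fraction of resamples in which the interaction
    coefficient is positive; values near 0 or 1 indicate a consistent
    moderation effect.
    """
    n = len(y)
    coefs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)  # resample with replacement
        X = np.column_stack([np.ones(n), treat[idx], x[idx], treat[idx] * x[idx]])
        beta, *_ = np.linalg.lstsq(X, y[idx], rcond=None)
        coefs[b] = beta[3]           # treatment-by-variable interaction term
    return np.mean(coefs > 0)

# Example with synthetic data in which x truly moderates treatment response
n = 156
x = rng.normal(size=n)
treat = rng.integers(0, 2, n)
y = 0.5 * treat + 0.8 * treat * x + rng.normal(size=n)
print(screen_moderator(x, treat, y))  # close to 1.0 -> retain as moderator
```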
Significant moderators surviving the two-step process included psychiatric symptom severity, BPD impulsivity symptoms (both GPM > DBT), dependent personality traits, childhood emotional abuse, and social adjustment (all DBT > GPM). Patients randomized to their model-predicted optimal treatment had significantly better long-term outcomes (d = 0.36, p = 0.028), especially if the model had a relatively stronger (top 60%) prediction for that patient (d = 0.61, p = 0.004). Among patients with a stronger prediction, this advantage held even when applying a conservative statistical check (d = 0.46, p = 0.043).
Patient characteristics influence the degree to which they respond to two treatments for BPD. Combining information from multiple moderators may help inform providers and patients as to which treatment is the most likely to lead to long-term symptom relief. Further research on personalized medicine in BPD is needed.
Psychotherapies for depression are equally effective on average, but individual responses vary widely. Outcomes can be improved by optimizing treatment selection using multivariate prediction models. A promising approach is the Personalized Advantage Index (PAI) that predicts the optimal treatment for a given individual and the magnitude of the advantage. The current study aimed to extend the PAI to long-term depression outcomes after acute-phase psychotherapy.
Data come from a randomized trial comparing cognitive therapy (CT, n = 76) and interpersonal psychotherapy (IPT, n = 75) for major depressive disorder (MDD). Primary outcome was depression severity, as assessed by the BDI-II, during 17-month follow-up. First, predictors and moderators were selected from 38 pre-treatment variables using a two-step machine learning approach. Second, predictors and moderators were combined into a final model, from which PAI predictions were computed with cross-validation. Long-term PAI predictions were then compared to actual follow-up outcomes and post-treatment PAI predictions.
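The PAI itself is each individual's predicted outcome difference between the two treatments, computed out-of-fold. Below is a minimal cross-validated sketch assuming a linear model with treatment-by-covariate interactions; the treatment coding and model form are illustrative, not the study's exact specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

def personalized_advantage_index(X, treat, y, n_splits=10, seed=0):
    """Cross-validated Personalized Advantage Index (PAI).

    X     : (n, p) matrix of selected predictors/moderators
    treat : (n,) treatment indicator, 0 = CT, 1 = IPT (illustrative coding)
    y     : (n,) follow-up depression severity (lower is better)

    For each held-out individual, a model with treatment-by-covariate
    interactions is fit on the training folds and used to predict the
    outcome under both hypothetical assignments; the PAI is the predicted
    difference (treatment 1 minus treatment 0).
    """
    def design(Xm, t):
        # covariates, treatment, and covariate-by-treatment interactions
        return np.column_stack([Xm, t, Xm * t[:, None]])

    n = len(y)
    pai = np.empty(n)
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        model = LinearRegression().fit(design(X[train], treat[train]), y[train])
        pred0 = model.predict(design(X[test], np.zeros(len(test))))
        pred1 = model.predict(design(X[test], np.ones(len(test))))
        pai[test] = pred1 - pred0  # negative favors treatment 1 here
    return pai

# Each individual's indicated treatment minimizes predicted severity:
# indicated = (pai < 0).astype(int)
```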
One predictor (parental alcohol abuse) and two moderators (recent life events; childhood maltreatment) were identified. Individuals assigned to their PAI-indicated treatment had lower follow-up depression severity compared to those assigned to their PAI-non-indicated treatment. This difference was significant in two subsets of the overall sample: those whose PAI score was in the upper 60%, and those whose PAI indicated CT, irrespective of magnitude. Long-term predictions did not overlap substantially with predictions for acute benefit.
If replicated, long-term PAI predictions could enhance precision medicine by selecting the optimal treatment for a given depressed individual over the long term.
The diagnosis of anti-N-methyl-d-aspartate receptor (NMDAR) encephalitis relies on the detection of NMDAR IgG autoantibodies in the serum or cerebrospinal fluid (CSF) of symptomatic patients. Commercial kits are available that allow NMDAR IgG autoantibodies to be measured in local laboratories. However, the performance of these tests outside of reference laboratories is unknown.
To report an unexpectedly low rate of NMDAR autoantibody detection in serum from patients with anti-NMDAR encephalitis tested using a commercially available diagnostic kit in an exemplar clinical laboratory.
Paired CSF and serum samples from seven patients with definite anti-NMDAR encephalitis were tested for NMDAR IgG autoantibodies using commercially available cell-based assays run according to the manufacturer’s recommendations. Rates of autoantibody detection in serum tested at our center were compared with those derived from systematic review and meta-analyses incorporating studies published during or before March 2019.
NMDAR IgG autoantibodies were detected in the CSF of all patients tested at our clinical laboratory but not in paired serum samples. Rates of detection were lower than those previously reported. A similar association was recognized through meta-analyses, with lower odds of NMDAR IgG autoantibody detection associated with serum testing performed in nonreference laboratories.
Commercial kits may yield lower-than-expected rates of NMDAR IgG autoantibody detection in serum when run in exemplar clinical (nonreference) laboratories. Additional studies are needed to decipher the factors that contribute to lower-than-expected rates of serum positivity. CSF testing is recommended in patients with suspected anti-NMDAR encephalitis.
Little is known about the neural substrates of suicide risk in mood disorders. Improving the identification of biomarkers of suicide risk, as indicated by a history of suicide-related behavior (SB), could lead to more targeted treatments to reduce risk.
Participants were 18 young adults with a mood disorder with a history of SB (as indicated by endorsing a past suicide attempt), 60 with a mood disorder with a history of suicidal ideation (SI) but not SB, 52 with a mood disorder with no history of SI or SB (MD), and 82 healthy comparison participants (HC). Resting-state functional connectivity within and between intrinsic neural networks, including cognitive control network (CCN), salience and emotion network (SEN), and default mode network (DMN), was compared between groups.
Several fronto-parietal regions (k > 57, p < 0.005) were identified in which individuals with SB demonstrated distinct patterns of connectivity within (in the CCN) and across networks (CCN-SEN and CCN-DMN). Connectivity with some of these same regions also distinguished the SB group when participants were re-scanned after 1–4 months. Extracted data defined SB group membership with good accuracy, sensitivity, and specificity (79–88%).
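As an illustration of the final classification step, extracted connectivity values can be fed to a simple classifier and scored with the metrics the abstract reports. The sketch below uses logistic regression on synthetic stand-in data; it is not the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)

# Synthetic stand-in for extracted connectivity values:
# 18 SB vs 60 SI participants, 6 connectivity features
n_sb, n_si, p = 18, 60, 6
X = np.vstack([rng.normal(0.4, 1.0, (n_sb, p)),   # SB group, shifted mean
               rng.normal(0.0, 1.0, (n_si, p))])  # SI group
y = np.array([1] * n_sb + [0] * n_si)             # 1 = history of SB

# Out-of-fold predictions, then the three metrics reported in the abstract
pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"accuracy    {accuracy_score(y, pred):.2f}")
print(f"sensitivity {tp / (tp + fn):.2f}")
print(f"specificity {tn / (tn + fp):.2f}")
```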
These results suggest that individuals with a history of SB in the context of mood disorders may show reliably distinct patterns of intrinsic network connectivity, even when compared to those with mood disorders without SB. Resting-state fMRI is a promising tool for identifying subtypes of patients with mood disorders who may be at risk for suicidal behavior.
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer’s disease (AD). Given this consistent association, there is interest in whether E4 also influences the risk of other neurodegenerative diseases. Further, there is an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson’s disease, and (5) vascular cognitive impairment.
APOE allele and MAPT haplotype calls were defined for each participant, and logistic regression analyses were performed to identify associations with the presentation of each neurodegenerative disease.
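A hedged sketch of that analysis step: case status regressed on E4 allele dose (0, 1, or 2 copies) with statsmodels. The data here are synthetic and the model form is illustrative, not ONDRI's actual specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Synthetic stand-in: E4 allele dose (0/1/2 copies) and case/control status
n = 500
e4_dose = rng.choice([0, 1, 2], size=n, p=[0.7, 0.25, 0.05])
logit_p = -1.0 + 0.8 * e4_dose                 # dose-dependent effect
status = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

# Logistic regression of case status on allele dose
X = sm.add_constant(e4_dose.astype(float))
fit = sm.Logit(status.astype(float), X).fit(disp=0)
print(fit.params)                              # intercept, per-allele log-odds
print(np.exp(fit.params[1]))                   # odds ratio per E4 allele
```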
Our work confirmed the association of the E4 allele with a dose-dependent increased presentation of AD, as well as an association between the E4 allele alone and MCI; the other four diseases, however, were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts; however, following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
Angiostrongylus cantonensis is a pathogenic nematode and the cause of neuroangiostrongyliasis, an eosinophilic meningitis more commonly known as rat lungworm disease. Transmission is thought to be primarily due to ingestion of infective third stage larvae (L3) in gastropods, on produce, or in contaminated water. The gold standard to determine the effects of physical and chemical treatments on the infectivity of A. cantonensis L3 larvae is to infect rodents with treated L3 larvae and monitor for infection, but animal studies are laborious and expensive and also raise ethical concerns. This study demonstrates propidium iodide (PI) to be a reliable marker of parasite death and loss of infective potential without adversely affecting the development and future reproduction of live A. cantonensis larvae. PI staining allows evaluation of the efficacy of test substances in vitro, an improvement upon the use of lack of motility as an indicator of death. Some potential applications of this assay include determining the effectiveness of various anthelmintics, vegetable washes, electromagnetic radiation and other treatments intended to kill larvae in the prevention and treatment of neuroangiostrongyliasis.
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
Recent commercialization of auxin herbicide–based weed control systems has led to increased off-target exposure of susceptible cotton cultivars to auxin herbicides. Off-target deposition of dilute concentrations of auxin herbicides can occur on cotton at any stage of growth. Field experiments were conducted at two locations in Mississippi from 2014 to 2016 to assess the response of cotton at various growth stages after exposure to a sublethal 2,4-D concentration of 8.3 g ae ha−1. Herbicide applications occurred weekly from 0 to 14 weeks after emergence (WAE). Cotton exposure to 2,4-D at 2 to 9 WAE resulted in up to 64% visible injury, whereas 2,4-D exposure 5 to 6 WAE resulted in machine-harvested yield reductions of 18% to 21%. Cotton maturity was delayed after exposure 2 to 10 WAE, and height was increased from exposure 6 to 9 WAE due to decreased fruit set after exposure. Total hand-harvested yield was reduced from 2,4-D exposure 3, 5 to 8, and 13 WAE. Growth stage at time of exposure influenced the distribution of yield by node and position. Yield on lower and inner fruiting sites generally decreased from exposure, and yield partitioned to vegetative or aborted positions and upper fruiting sites increased. Reductions in gin turnout, micronaire, fiber length, fiber-length uniformity, and fiber elongation were observed after exposure at certain growth stages, but the overall effects on fiber properties were small. These results indicate that cotton is most sensitive to low concentrations of 2,4-D during late vegetative and squaring growth stages.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
Annually dated tree rings of 509 live and deadwood limber pine (Pinus flexilis) samples from the semi-arid Wassuk Range, Nevada, yielded a 3996-yr record extending from 1983 BC to AD 2013. Correlations of radial growth with climate were positive for water relations and negative for summer temperatures. Long-term trends in ring width corresponded to climate variability documented from other proxies, including low growth during the Late Holocene Dry Period and Medieval Climate Anomaly (MCA) and elevated growth during cool, wet periods of the Neoglacial and Little Ice Age. A spline fit of the data indicated that the growth decrease of the last 20 years was the second lowest on record, surpassed only by the lowest growth at 20 BC–AD 150. Demographics of limber pine by aspect and elevation were not strongly related to long-term climate dynamics, except in the case of extirpations on all but north aspects at the end of the MCA. Pines occurred persistently on north aspects, where a continuous record existed to the present. Elevation shifts were not obvious on any aspect, and no evidence existed for migration above the current treeline. Non-climatic factors appear to interact with climate to make north slopes refugial for upland pines in semi-arid regions across four millennia.
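The spline summary mentioned here can be sketched as a smoothing spline over a ring-width index series; the smoothing parameter and the synthetic data below are purely illustrative, not the authors' chronology.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)

# Synthetic stand-in for a ring-width index series
# (astronomical year -1982 corresponds to 1983 BC)
years = np.arange(-1982, 2014)            # 1983 BC .. AD 2013
rwi = 1.0 + 0.1 * np.sin(years / 300.0) + rng.normal(0.0, 0.2, years.size)

# Smoothing spline to emphasize multidecadal-to-centennial variability;
# s bounds the residual sum of squares (larger s = smoother fit)
spline = UnivariateSpline(years, rwi, s=years.size * 0.05)
trend = spline(years)
print(trend.min(), years[trend.argmin()])  # lowest-growth interval in the fit
```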
The introduction of auxin herbicide weed control systems has led to increased occurrence of crop injury in susceptible soybeans and cotton. Off-target exposure to sublethal concentrations of dicamba can occur at varying growth stages, which may affect crop response. Field experiments were conducted in Mississippi in 2014, 2015, and 2016 to characterize cotton response to a sublethal concentration of dicamba equivalent to 1/16X the labeled rate. Weekly applications of dicamba at 35 g ae ha−1 were made to separate sets of replicated plots immediately following planting until 14 wk after emergence (WAE). Exposure to dicamba from 1 to 9 WAE resulted in up to 32% visible injury, and exposure from 7 to 10 WAE delayed crop maturity. Exposure from 8 to 10 and 13 WAE led to increased cotton height, while an 18% reduction in machine-harvested yield resulted from exposure at 6 WAE. Cotton exposure at 3 to 9 WAE reduced the seed cotton weight partitioned to position 1 fruiting sites, while exposure at 3 to 6 WAE also reduced yield in position 2 fruiting sites. Exposure at 2, 3, and 5 to 7 WAE increased the percent of yield partitioned to vegetative branches. An increase in percent of yield partitioned to plants with aborted terminals occurred following exposure from 3 to 7 WAE and corresponded with reciprocal decreases in yield partitioned to positional fruiting sites. Minimal effects were observed on fiber quality, except for decreases in fiber length uniformity resulting from exposure at 9 and 10 WAE.
The Canadian Stroke Best Practice Recommendations suggest that patients suspected of transient ischemic attack (TIA)/minor stroke receive urgent brain imaging, preferably computed tomography angiography (CTA). Yet high requisition rates for non-cerebrovascular patients overburden limited radiological resources, putting patients at risk. We hypothesize that our clinical decision support tool (CDST), developed for risk stratification of TIA in the emergency department (ED) and incorporating Canadian guidelines, could improve CTA utilization.
This retrospective study used clinical information gathered from ED patient referrals to an outpatient TIA unit in Victoria, BC, from 2015 to 2016. Actual CTA orders by ED and TIA unit staff were compared with hypothetical CTA ordering had our CDST been used in the ED upon patient arrival.
For 1,679 referrals, clinicians ordered 954 CTAs. Our CDST would have ordered a total of 977 CTAs for these patients. Overall, this would have increased the number of imaged-TIA patients by 89 (10.1%) while imaging 98 (16.1%) fewer non-cerebrovascular patients over the 2-year period. Our CDST would have ordered CTA for 18 (78.3%) of the recurrent stroke patients in the sample.
Our CDST could enhance CTA utilization in the ED for suspected TIA patients, and facilitate guideline-based stroke care. Use of our CDST would increase the number of TIA patients receiving CTA before ED discharge (rather than later at TIA units) and reduce the burden of imaging stroke mimics in radiological departments.