This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes. These remain to be elucidated in more in-depth analyses of the rich and evolving AURORA database, with the aim of finding new targets for intervention and new tools for risk-based stratification following trauma exposure.
Access to cutting-edge technologies is essential for investigators to advance translational research. The Indiana Clinical and Translational Sciences Institute (CTSI) spans three preeminent universities and four large academic campuses across the state of Indiana, and is mandated to provide best practices to the entire state.
To address the need to facilitate the availability of innovative technologies to its investigators, the Indiana CTSI implemented the Access Technology Program (ATP). In connecting technologies and investigators across the multiple Indiana CTSI campuses, the activities of the ATP, like those of any Indiana CTSI program, are challenged by the geographical distances between campuses (1–4 hours of driving time).
Herein, we describe the initiatives developed by the ATP to increase the availability of state-of-the-art technologies to its investigators on all Indiana CTSI campuses, and the methods developed by the ATP to bridge the distance between campuses, technologies, and investigators for the advancement of clinical translational research.
The methods and practices described in this publication may inform other approaches to enhance translational research, dissemination, and usage of innovative technologies by translational investigators, especially when distance or multi-campus cultural differences are barriers to efficient application.
The emphasis on team science in clinical and translational research increases the importance of collaborative biostatisticians (CBs) in healthcare. Adequate training and development of CBs ensure appropriate conduct of robust and meaningful research and, therefore, should be considered as a high-priority focus for biostatistics groups. Comprehensive training enhances clinical and translational research by facilitating more productive and efficient collaborations. While many graduate programs in Biostatistics and Epidemiology include training in research collaboration, it is often limited in scope and duration. Therefore, additional training is often required once a CB is hired into a full-time position. This article presents a comprehensive CB training strategy that can be adapted to any collaborative biostatistics group. This strategy follows a roadmap of the biostatistics collaboration process, which is also presented. A TIE approach (Teach the necessary skills, monitor the Implementation of these skills, and Evaluate the proficiency of these skills) was developed to support the adoption of key principles. The training strategy also incorporates a “train the trainer” approach to enable CBs who have successfully completed training to train new staff or faculty.
Prescribing metrics, cost, and surrogate markers are often used to describe the value of antimicrobial stewardship (AMS) programs. However, process measures are only indirectly related to clinical outcomes and may not represent the total effect of an intervention. We determined the global impact of a multifaceted AMS initiative for hospitalized adults with common infections.
Single-center, quasi-experimental study.
Hospitalized adults with urinary, skin, and respiratory tract infections discharged from family medicine and internal medicine wards before (January 2017–June 2017) and after (January 2018–June 2018) an AMS initiative on a family medicine ward were included. A series of AMS-focused initiatives comprised the development and dissemination of: handheld prescribing tools, AMS positive feedback cases, and academic modules. We compared the effect on an ordinal end point consisting of clinical resolution, adverse drug events, and antimicrobial optimization between the preintervention and postintervention periods.
In total, 256 subjects were included before and after an AMS intervention. Excessive durations of therapy were reduced from 40.3% to 22% (P < .001). Patients without an optimized antimicrobial course were more likely to experience clinical failure (OR, 2.35; 95% CI, 1.17–4.72). The likelihood of a better global outcome was greater in the family medicine intervention arm (62.0%, 95% CI, 59.6–67.1) than in the preintervention family medicine arm.
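The unadjusted odds ratio reported above for clinical failure can be reproduced from a 2×2 table; a minimal sketch with a Woolf (log-scale) interval follows. The cell counts below are hypothetical illustrations, not the study's data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio (a*d)/(b*c) for a 2x2 table with a Woolf (log-scale) 95% CI.

    a: non-optimized patients with clinical failure
    b: non-optimized patients without clinical failure
    c: optimized patients with clinical failure
    d: optimized patients without clinical failure
    """
    est = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(est) - z * se)
    hi = math.exp(math.log(est) + z * se)
    return est, lo, hi

# Hypothetical counts for illustration only
est, lo, hi = odds_ratio(30, 70, 15, 85)  # est ≈ 2.43
```

The log-scale interval is used because the sampling distribution of the odds ratio is right-skewed, whereas log(OR) is approximately normal.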
Collaborative, targeted feedback with prescribing metrics, AMS cases, and education improved global outcomes for hospitalized adults on a family medicine ward.
In response to advancing clinical practice guidelines regarding concussion management, service members, like athletes, complete a baseline assessment prior to participating in high-risk activities. While several studies have established test stability in athletes, no investigation to date has examined the stability of baseline assessment scores in military cadets. The objective of this study was to assess the test–retest reliability of a baseline concussion test battery in cadets at U.S. Service Academies.
All cadets participating in the Concussion Assessment, Research, and Education (CARE) Consortium investigation completed a standard baseline battery that included memory, balance, symptom, and neurocognitive assessments. Annual baseline testing was completed during the first 3 years of the study. A two-way mixed-model analysis of variance (intraclass correlation coefficient [ICC(3,1)]) and Kappa statistics were used to assess the stability of the metrics at 1-year and 2-year time intervals.
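As a rough illustration of the ICC(3,1) statistic named above, here is a minimal sketch computing the two-way mixed-effects, consistency, single-measurement ICC from its ANOVA mean squares. The cadet scores are hypothetical, not CARE Consortium data:

```python
import numpy as np

def icc_3_1(scores):
    """ICC(3,1): two-way mixed-effects, consistency, single measurement.

    scores: (n_subjects, k_sessions) array of test scores.
    """
    X = np.asarray(scores, dtype=float)
    n, k = X.shape
    grand = X.mean()
    ss_rows = k * np.sum((X.mean(axis=1) - grand) ** 2)    # between subjects
    ss_cols = n * np.sum((X.mean(axis=0) - grand) ** 2)    # between sessions
    ss_err = np.sum((X - grand) ** 2) - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical baseline scores for 5 cadets tested in year 1 and year 2
baseline = [[25, 24], [30, 27], [22, 25], [28, 28], [26, 23]]
icc = icc_3_1(baseline)
```

ICC(3,1) treats the test sessions as fixed effects, so systematic year-to-year shifts (e.g., practice effects) do not penalize the estimate; only subject-by-session inconsistency does.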
ICC values for the 1-year test interval ranged from 0.28 to 0.67 and from 0.15 to 0.57 for the 2-year interval. Kappa values ranged from 0.16 to 0.21 for the 1-year interval and from 0.29 to 0.31 for the 2-year test interval. Across all measures, the observed effects were small, ranging from 0.01 to 0.44.
This investigation noted less than optimal reliability for the most common concussion baseline assessments. While none of the assessments met or exceeded the accepted clinical threshold, the effect sizes were relatively small suggesting an overlap in performance from year-to-year. As such, baseline assessments beyond the initial evaluation in cadets are not essential but could aid concussion diagnosis.
Commercialization of 2,4-D–tolerant crops is a major concern for sweetpotato producers because of potential 2,4-D drift that can cause severe crop injury and yield reduction. A field study was initiated in 2014 and repeated in 2015 to assess impacts of reduced rates of 2,4-D, glyphosate, or a combination of 2,4-D with glyphosate on sweetpotato. In one study, 2,4-D and glyphosate were applied alone and in combination at 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of anticipated field use rates (1.05 kg ha−1 for 2,4-D and 1.12 kg ha−1 for glyphosate) to ‘Beauregard’ sweetpotato at storage root formation (10 days after transplanting [DAP]). In a separate study, all these treatments were applied to ‘Beauregard’ sweetpotato at storage root development (30 DAP). Injury with 2,4-D alone or in combination with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent herbicide rates, indicating that injury is attributable mostly to 2,4-D in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increased rate of 2,4-D applied alone or in combination with glyphosate applied at storage root development. However, with a few exceptions, neither this relationship nor a significant effect of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at storage root formation. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of 2,4-D applied alone or in combination with glyphosate, although injury observed at lower rates would also be cause for concern upon initial observation by sweetpotato producers. However, in some cases, yield reduction of U.S. No. 1 and marketable grades was also observed after application of 1/250×, 1/100×, or 1/10× rates of 2,4-D alone or with glyphosate when applied at storage root development.
Motivated by the occurrence of a moderately nearby supernova near the beginning of the Pleistocene, possibly as part of a long-term series beginning in the Miocene, we investigated whether nitrate rainout resulting from the atmospheric ionization of enhanced cosmic ray flux could have, through its fertilizer effect, initiated carbon dioxide drawdown. Such a drawdown could possibly reduce the greenhouse effect and induce the climate change that led to the Pleistocene glaciations. We estimate that the nitrogen flux enhancement onto the surface from an event at 50 pc would be of order 10%, probably too small for dramatic changes. We estimate deposition of iron (another potential fertilizer) and find it is also too small to be significant. There are also competing effects of opposite sign, including muon irradiation and reduction in photosynthetic yield caused by UV increase from stratospheric ozone layer depletion, leading to an ambiguous result. However, if the atmospheric ionization induces a large increase in the frequency of lightning, as argued elsewhere, the amount of nitrate synthesis should be much larger, dominate over the other effects and induce the climate change. More work needs to be done to clarify the effects on lightning frequency.
A major concern of sweetpotato producers is the potential negative effects from herbicide drift or sprayer contamination events when dicamba is applied to nearby dicamba-resistant crops. A field study was initiated in 2014 and repeated in 2015 to assess the effects of reduced rates of the N,N-Bis-(3-aminopropyl)methylamine (BAPMA) or diglycolamine (DGA) salt of dicamba, glyphosate, or a combination of each salt with glyphosate on sweetpotato in separate trials. Reduced rates of 1/10, 1/100, 1/250, 1/500, 1/750, and 1/1,000 of the 1× use rate of each dicamba formulation at 0.56 kg ha−1, glyphosate at 1.12 kg ha−1, and a combination of the two at the aforementioned rates were applied to ‘Beauregard’ sweetpotato at storage root formation (10 d after transplanting) in one trial and at storage root development (30 d after transplanting) in a separate trial. Injury with each salt of dicamba (BAPMA or DGA) applied alone or with glyphosate was generally equal to or greater than with glyphosate applied alone at equivalent rates, indicating that injury is mostly attributable to the dicamba in the combination. There was a quadratic increase in crop injury and a quadratic decrease in crop yield (with respect to most yield grades) with increased herbicide rate of dicamba applied alone or in combination with glyphosate applied at storage root development. However, with a few exceptions, neither this relationship nor the significance of herbicide rate was observed for crop injury or sweetpotato yield when herbicide application occurred at the storage root formation stage. In general, crop injury and yield reduction were greatest at the highest rate (1/10×) of either salt of dicamba applied alone or in combination with glyphosate, although injury observed at lower rates would be cause for concern after initial observation by sweetpotato producers.
However, in some cases, yield reduction of No. 1 and marketable grades was observed following 1/250×, 1/100×, or 1/10× application rates of dicamba alone or with glyphosate when applied at storage root development.
Introduction: In 2018, Canadian postgraduate specialist Emergency Medicine (EM) programs began implementing a competency-based medical education (CBME) assessment system. To support improvement of this assessment program, we sought to evaluate its short-term educational outcomes nationally and within individual programs. Methods: Program-level data from the 2018 resident cohort were amalgamated and analyzed. The number of Entrustable Professional Activity (EPA) assessments (overall and for each EPA) and the timing of resident promotion through program stages was compared between programs and to the guidelines provided by the national EM specialty committee. Total EPA observations from each program were correlated with the number of EM and pediatric EM rotations. Results: Data from 15 of 17 (88.2%) EM programs containing 9,842 EPA observations from 68 of the 77 (88.3%) Canadian EM specialist residents in the 2018 cohort were analyzed. The average number of EPAs observed per resident in each program varied from 92.5 to 229.6 and correlated strongly with the number of blocks spent on EM and pediatric EM (r = 0.83, p < 0.001). Relative to the guidelines outlined by the specialty committee, residents were promoted later than expected and with fewer EPA observations than suggested. Conclusion: We present a new approach to the amalgamation of national and program-level assessment data. There was demonstrable variation in both EPA-based assessment numbers and promotion timelines between programs and with national guidelines. This evaluation data will inform the revision of local programs and national guidelines and serve as a starting point for further reaching outcome evaluation. This process could be replicated by other national assessment programs.
The utility of questionnaire-based self-report measures of non-clinical psychotic symptoms is unclear, and there are few reliable data about the nature and prevalence of these phenomena in children. This study aimed to investigate psychosis-like symptoms (PLIKS) in children using both self-report measures and semi-structured, observer-rated assessments.
This cross-sectional study was set in an assessment clinic for members of the ALSPAC birth cohort in Bristol, UK. In total, 6455 respondents (mean age 12.9 years) were assessed over 21 months. The main outcome measure comprised 12 self-report screening questions for psychotic symptoms, followed by semi-structured, observer-rated assessments conducted by trained psychology graduates. The assessment instrument used stem questions, glossary definitions, and rating rules adapted from DISC-IV and SCAN items.
The 6-month period prevalence of one or more PLIKS rated by self-report questions was 38.9% (95% CI 37.7–40.1). Prevalence using observer-rated assessments was 13.7% (95% CI 12.8–14.5). Positive predictive values (PPVs) for the screening questions versus observer-rated scores were low, except for auditory hallucinations (PPV = 70%; 95% CI 67.1–74.2). The most frequent observer-rated symptom was auditory hallucinations (7.3%); in 18.8% of these cases, symptoms occurred weekly or more often. The prevalence of DSM-IV ‘core’ schizophrenia symptoms was 3.62%. Rates were significantly higher in children with low socio-economic status.
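The positive predictive value compared above is simply TP/(TP + FP): the proportion of screen-positive children confirmed by the observer-rated assessment. A minimal sketch with a normal-approximation confidence interval follows; the counts are hypothetical, not ALSPAC figures:

```python
import math

def ppv_with_ci(true_pos, false_pos, z=1.96):
    """Positive predictive value TP/(TP+FP) with a normal-approximation 95% CI."""
    n = true_pos + false_pos          # all screen-positive cases
    p = true_pos / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical: of 200 screen-positive children, 140 confirmed at interview
p, lo, hi = ppv_with_ci(140, 60)  # p = 0.70
```

Note that PPV depends on the prevalence in the screened sample, which is one reason the self-report questions perform differently across symptoms.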
With the exception of auditory hallucinations, self-rated questionnaires are likely to substantially over-estimate the frequency of PLIKS in 12-year-old children. However, more reliable observer-rated assessments reveal that PLIKS occur in a significant proportion of children.
Little is known about who would benefit from Internet-based personalised nutrition (PN) interventions. This study aimed to evaluate the characteristics of participants who achieved greatest improvements (i.e. benefit) in diet, adiposity and biomarkers following an Internet-based PN intervention. Adults (n 1607) from seven European countries were recruited into a 6-month, randomised controlled trial (Food4Me) and randomised to receive conventional dietary advice (control) or PN advice. Information on dietary intake, adiposity, physical activity (PA), blood biomarkers and participant characteristics was collected at baseline and month 6. Benefit from the intervention was defined as ≥5 % change in the primary outcome (Healthy Eating Index; HEI) and secondary outcomes (waist circumference and BMI, PA, sedentary time and plasma concentrations of cholesterol, carotenoids and omega-3 index) at month 6. For our primary outcome, benefit from the intervention was greater in older participants, women and participants with lower HEI scores at baseline. Benefit was greater for individuals reporting greater self-efficacy for ‘sticking to healthful foods’ and who ‘felt weird if [they] didn’t eat healthily’. Participants benefited more if they reported wanting to improve their health and well-being. The characteristics of individuals benefiting did not differ by other demographic, health-related, anthropometric or genotypic characteristics. Findings were similar for secondary outcomes. These findings have implications for the design of more effective future PN intervention studies and for tailored nutritional advice in public health and clinical settings.
We describe and analyse an outbreak of measles that affected Belgium in early 2017. In total, 289 cases were reported, mostly (53%) in people aged 15 years or older. For 133 cases (46%), vaccination status was unknown, and a further 117 (41%) were not vaccinated. According to national guidelines, 83 of the unvaccinated cases (29% of total cases) should have received at least one dose of vaccine but did not. One in five cases (21%) did not present with the classical triad of fever, rash and any of coryza, conjunctivitis or cough. Rash was the most sensitive symptom, being absent in only six cases. A large proportion of cases (125/289, 43%) required hospitalisation. In hospitalised patients, the most commonly observed complications were hepatic disorders (present in 58/125 hospitalised patients, 46%). Thirty-six cases (12%) were in healthcare workers, and nosocomial spread contributed importantly to the outbreak. Older age at presentation, altered clinical presentations and the presence of complications such as hepatitis can delay the correct diagnosis of measles. Clinicians should maintain a high index of suspicion in any individual presenting with rash. If the elimination target is to be reached, catch-up vaccination campaigns should be intensified and should target young adults and healthcare workers.
Brominated flame retardants (BFRs) are primarily used as flame retardant additives in insulating materials. These lipophilic compounds can bioaccumulate in animal tissues, leading to human exposure via food ingestion. Although their concentration in food is not yet regulated, several of these products are recognised as persistent organic pollutants; they are thought to act as endocrine disruptors. The present study aimed to characterise the occurrence of two families of BFRs (hexabromocyclododecane (HBCDD) and polybrominated diphenyl ethers (PBDE)) in hen eggs and broiler or pig meat in relation to their rearing environments. Epidemiological studies were carried out on 60 hen egg farms (34 without an open-air range, 26 free-range), 57 broiler farms (27 without an open-air range, 30 free-range) and 42 pig farms without an open-air range in France from 2013 to 2015. For each farm, composite samples from either 12 eggs, five broiler pectoral muscles or three pig tenderloins were obtained. Eight PBDE congeners and three HBCDD stereoisomers were quantified in product fat using gas chromatography–high-resolution mass spectrometry or high-performance liquid chromatography–tandem mass spectrometry, respectively. The frequencies of PBDE detection were 28% for eggs (median concentration 0.278 ng/g fat), 72% for broiler muscle (0.392 ng/g fat) and 49% for pig muscle (0.403 ng/g fat). At least one HBCDD stereoisomer was detected in 17% of eggs (0.526 ng/g fat), 46% of broiler muscle (0.799 ng/g fat) and 36% of pig muscle (0.616 ng/g fat). Results were similar in concentration to those obtained in French surveillance surveys from 2012 to 2016. Nevertheless, the contamination of free-range eggs and broilers was found to be more frequent than that of conventional ones, suggesting that access to an open-air range could be an additional source of exposure to BFRs for animals. However, the concentration of BFRs in all products remained generally very low.
No direct relationship could be established between the occurrence of BFRs in eggs and meat and the characteristics of farm buildings (age, building materials). The potential presence of BFRs in insulating materials is not likely to constitute a significant source of animal exposure as long as the animals do not have direct access to these materials.
Iraq and Afghanistan Veterans with posttraumatic stress disorder (PTSD) and traumatic brain injury (TBI) history have high rates of performance validity test (PVT) failure. The study aimed to determine whether those with scores in the invalid versus valid range on PVTs show similar benefit from psychotherapy and if psychotherapy improves PVT performance.
Veterans (N = 100) with PTSD, mild-to-moderate TBI history, and cognitive complaints underwent neuropsychological testing at baseline, post-treatment, and 3-month post-treatment. Veterans were randomly assigned to cognitive processing therapy (CPT) or a novel hybrid intervention integrating CPT with TBI psychoeducation and cognitive rehabilitation strategies from Cognitive Symptom Management and Rehabilitation Therapy (CogSMART). Performance below standard cutoffs on any PVT trial across three different PVT measures was considered invalid (PVT-Fail), whereas performance above cutoffs on all measures was considered valid (PVT-Pass).
Although both PVT groups exhibited clinically significant improvement in PTSD symptoms, the PVT-Pass group demonstrated greater symptom reduction than the PVT-Fail group. Measures of post-concussive and depressive symptoms improved to a similar degree across groups. Treatment condition did not moderate these results. Rate of valid test performance increased from baseline to follow-up across conditions, with a stronger effect in the SMART-CPT compared to CPT condition.
Both PVT groups experienced improved psychological symptoms following treatment. Veterans who failed PVTs at baseline demonstrated better test engagement following treatment, resulting in higher rates of valid PVTs at follow-up. Veterans with invalid PVTs should be enrolled in trauma-focused treatment and may benefit from neuropsychological assessment after, rather than before, treatment.
Masses have been computed for the white dwarfs (WDs) in eclipsing, mass-exchange (symbiotic), WD–red giant (RG) binaries by using single-lined spectroscopic orbits, orbital inclinations, and the RG masses. Inclinations have been measured for 13 eclipsing symbiotic binaries. Using Gaia data, the mass of the RG can be found from evolutionary tracks. Since the WD evolved from the initially more massive star in the binary, the WD should be more massive than predicted from the mass of the current RG. Typically, however, the WD has a lower mass than expected, implying a previous mass-exchange stage for these systems.
In the present study, we aimed to compare anthropometric indicators as predictors of mortality in a community-based setting.
We conducted a population-based longitudinal study nested in a cluster-randomized trial. We assessed weight, height and mid-upper arm circumference (MUAC) in children 12 months after the trial began and used the trial’s annual census and monitoring visits to assess mortality over 2 years.
Children aged 6–60 months during the study.
Of 1023 children included in the study at baseline, height-for-age Z-score, weight-for-age Z-score, weight-for-height Z-score and MUAC classified 777 (76·0 %), 630 (61·6 %), 131 (12·9 %) and 80 (7·8 %) children as moderately to severely malnourished, respectively. Over the 2-year study period, 58 children (5·7 %) died. MUAC had the greatest AUC (0·68, 95 % CI 0·61, 0·75) and the strongest association with mortality in this sample (hazard ratio = 2·21, 95 % CI 1·26, 3·89, P = 0·006).
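The AUC above summarizes how well a marker ranks children who died above those who survived. A minimal rank-based (Mann–Whitney) sketch follows, using a hypothetical risk score (here, negated MUAC, since lower MUAC predicts death); the values are illustrative, not study data:

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive case (death)
    receives a higher risk score than a randomly chosen negative case,
    with ties counted as 0.5 (equivalent to the Mann-Whitney U statistic)."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    diffs = pos[:, None] - neg[None, :]            # all pairwise comparisons
    wins = np.sum(diffs > 0) + 0.5 * np.sum(diffs == 0)
    return wins / (pos.size * neg.size)

# Hypothetical risk scores (negated MUAC, cm) for illustration only
died = [-11.5, -12.0, -12.8]
survived = [-13.0, -12.5, -14.1, -13.6]
auc = auc_mann_whitney(died, survived)
```

An AUC of 0.5 means the marker ranks no better than chance; 1.0 means every child who died scored higher than every survivor.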
MUAC appears to be a better predictor of mortality than other anthropometric indicators in this community-based, high-malnutrition setting in Niger.
Dementia, a term that describes a variety of brain conditions marked by gradual, persistent and progressive cognitive decline, affects a significant proportion of older adults. Older adults with dementia are sometimes perceived less favourably than those without dementia. Furthermore, compared to persons without dementia, those with dementia are often perceived by others as having reduced personhood. This study investigated whether attitudes towards dementia and personhood perceptions vary as a function of age group, care-giver status, attitudes towards ageing, dementia knowledge, gender and education. In total, 196 younger, middle-aged and older adults were recruited. Findings revealed that being a care-giver, as well as having less ageist attitudes, was predictive of being more comfortable around persons with dementia, having more knowledge about dementia and ascribing greater personhood to people with dementia. Those with more dementia knowledge (prior to the study) were less comfortable around people with dementia. Finally, when controlling for this prior dementia knowledge, older adults were more comfortable around people with dementia compared with younger and middle-aged adults. Gender and education were not associated with any of the variables under study. Findings contribute to a better understanding of the role of age- and care-giver-related factors in the determination of attitudes towards dementia.
The aggregation of neurocognitive deficits among the non-psychotic first-degree relatives of adult- and childhood-onset schizophrenia patients suggests that there may be a common etiology for these deficits in childhood- and adult-onset illness. However, there is considerable heterogeneity in the presentation of neurobiological abnormalities, and whether there are differences in the extent of familial transmission for specific domains of cognitive function has not been systematically addressed.
We employed variance components analysis, as implemented in SOLAR-Eclipse, to evaluate the evidence of familial transmission for empirically derived composite scores representing attention, working memory, verbal learning, verbal retention, and memory for faces. We contrast estimates for adult- and childhood-onset schizophrenia families and matched community control pedigrees, and compare our findings to previous reports based on analogous neurocognitive assessments.
We observed varying degrees of familial transmission; attention and working memory yielded comparable, significant estimates for adult-onset and community control pedigrees; verbal learning was significant for childhood-onset and community control pedigrees; and facial memory demonstrated significant familial transmission only for childhood-onset schizophrenia. Model-fitting analyses indicated significant differences in familiality between adult- and childhood-onset schizophrenia for attention, working memory, and verbal learning.
By comprehensively assessing a wide range of neurocognitive domains in adult- and childhood-onset schizophrenia families, we provide additional support for specific neurocognitive domains as schizophrenia endophenotypes. Whereas comparable estimates of familial transmission for certain dimensions of cognitive functioning support a shared etiology of adult- and childhood-onset neurocognitive function, observed differences may be taken as preliminary evidence of partially divergent multifactorial architectures.