Social and environmental factors such as poverty or violence modulate the risk and course of schizophrenia. However, how they affect the brain in patients with psychosis remains unclear.
We studied how environmental factors are related to brain structure in patients with schizophrenia and controls in Latin America, where exposure to these factors is high and unequally distributed.
This is a multicentre study of magnetic resonance imaging in patients with schizophrenia and controls from six Latin American cities. Total and voxel-level grey matter volumes, and their relationship with neighbourhood characteristics such as average income and homicide rates, were analysed with a general linear model.
A total of 334 patients with schizophrenia and 262 controls were included. Income was differentially related to total grey matter volume in the two groups (group × income interaction, P = 0.006). Controls showed a positive correlation between total grey matter volume and income (R = 0.14, P = 0.02). Surprisingly, this relationship was not present in patients with schizophrenia (R = −0.076, P = 0.17). Voxel-level analysis confirmed that this interaction was widespread across the cortex. After adjusting for global brain changes, income was positively related to prefrontal cortex volumes only in controls. Conversely, the hippocampus in patients with schizophrenia, but not in controls, was relatively larger in affluent environments. There was no significant correlation between environmental violence and brain structure.
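The group × income interaction reported above is the standard test for a differential slope in a general linear model: grey matter volume is regressed on income, group, and their product term. A minimal sketch with simulated data (all variable names, effect sizes, and the noise level are hypothetical, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_interaction(income, group, gmv):
    """OLS fit of gmv ~ income + group + income:group.

    Returns the coefficient vector
    [intercept, income, group, income * group].
    """
    X = np.column_stack([np.ones_like(income), income, group, income * group])
    beta, *_ = np.linalg.lstsq(X, gmv, rcond=None)
    return beta

# Simulated example: controls (group = 0) show a positive income slope,
# patients (group = 1) a flat one -- mirroring the pattern in the abstract.
n = 300
income = rng.normal(0, 1, n)
group = rng.integers(0, 2, n)            # 0 = control, 1 = patient
slope_control, slope_patient = 0.5, 0.0
gmv = (700
       + np.where(group == 0, slope_control, slope_patient) * income
       + rng.normal(0, 1, n))

beta = fit_interaction(income, group, gmv)
# beta[3] (the interaction coefficient) estimates
# slope_patient - slope_control, i.e. about -0.5 here
print(beta)
```

A significant interaction coefficient is what licenses the abstract's claim that the income–volume relationship differs between groups, rather than comparing two separately fitted correlations.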
Our results highlight the interplay between environment, particularly poverty, and individual characteristics in psychosis. This is particularly important in harsh settings such as low- and middle-income countries, where less brain vulnerability (less grey matter loss) may be sufficient for a person to become unwell in adverse (poor) environments.
Many cognitive functions are under strong genetic control and twin studies have demonstrated genetic overlap between some aspects of cognition and schizophrenia. How the genetic relationship between specific cognitive functions and schizophrenia is influenced by IQ is currently unknown.
We applied selected tests from the Cambridge Neuropsychological Test Automated Battery (CANTAB) to examine the heritability of specific cognitive functions and associations with schizophrenia liability. Verbal and performance IQ were estimated using The Wechsler Adult Intelligence Scale-III and the Danish Adult Reading Test. In total, 214 twins including monozygotic (MZ = 32) and dizygotic (DZ = 22) pairs concordant or discordant for a schizophrenia spectrum disorder, and healthy control pairs (MZ = 29, DZ = 20) were recruited through the Danish national registers. Additionally, eight twins from affected pairs participated without their sibling.
Significant heritability was observed for planning/spatial span (h2 = 25%), self-ordered spatial working memory (h2 = 64%), sustained attention (h2 = 56%), and movement time (h2 = 47%), whereas only unique environmental factors contributed to set-shifting, reflection impulsivity, and thinking time. Schizophrenia liability was associated with planning/spatial span (rph = −0.34), self-ordered spatial working memory (rph = −0.24), sustained attention (rph = −0.23), and set-shifting (rph = −0.21). The association with planning/spatial span was not driven by either performance or verbal IQ. The remaining associations were shared with performance, but not verbal IQ.
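Heritability estimates of the kind quoted above (h2 = 25–64%) are normally obtained from structural-equation (ACE) modelling of MZ and DZ twin covariances; a rough back-of-envelope version of the same logic is Falconer's formula, h2 = 2(rMZ − rDZ). The sketch below applies it to simulated pair scores — the correlations are illustrative, not this study's data:

```python
import numpy as np

def falconer_h2(mz_pairs, dz_pairs):
    """Falconer's estimate: h2 = 2 * (r_MZ - r_DZ).

    mz_pairs, dz_pairs: arrays of shape (n_pairs, 2) holding
    each twin pair's test scores.
    """
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz)

rng = np.random.default_rng(1)

def simulate_pairs(n, r):
    """Draw n twin pairs whose scores correlate at r."""
    cov = [[1.0, r], [r, 1.0]]
    return rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Correlations typical of a moderately heritable trait:
# r_MZ ~ 0.60, r_DZ ~ 0.35 gives h2 ~ 0.50.
mz = simulate_pairs(5000, 0.60)
dz = simulate_pairs(5000, 0.35)
print(round(falconer_h2(mz, dz), 2))
```

With samples as small as those in twin studies like this one (dozens of pairs, not thousands), confidence intervals around such estimates are wide, which is why full ACE model fitting with likelihood-based intervals is preferred in practice.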
This study provides further evidence that some cognitive functions are heritable and associated with schizophrenia, suggesting a partially shared genetic etiology. These functions may constitute endophenotypes for the disorder and provide a basis to explore genes common to cognition and schizophrenia.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool but clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits. 
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
OBJECTIVES/GOALS: The overall goal of this study was to determine the effect of early life stress (ELS) on the intestinal CD4+ T cell immune compartment, at homeostasis and after induction of experimental inflammatory bowel disease (IBD). METHODS/STUDY POPULATION: We used a mouse model of ELS, maternal separation with early weaning (MSEW). We used IL-10 reporter mice to enable analysis of IL-10-producing cells. Mice were examined on postnatal day 28 to determine the impact of ELS on gut regulatory T cells. Plasma levels of corticosterone (the rodent stress-response hormone) were determined by ELISA. Colitis was induced in MSEW and normally reared (NR) mice via intraperitoneal injection of α-IL-10R every 5 days until day 15. Mice were euthanized on days 20 and 30. Colonic tissue sections were stained for histological analysis. Remaining tissue was further processed for flow cytometric analysis of CD4+ T cells and innate lymphoid cells. RESULTS/ANTICIPATED RESULTS: Plasma corticosterone was elevated in MSEW mice compared to their NR counterparts at 4 weeks of age. We observed that the MSEW stress protocol does not affect the baseline colonic CD4+ T cell or innate lymphoid cell populations. There was a reduction in intestinal CD4+ T cells and regulatory T cells on day 20 in α-IL-10R-treated MSEW mice compared to NR counterparts. This difference disappeared by day 30. Histological scoring showed no difference in disease severity between α-IL-10R-treated MSEW and NR mice on day 20. However, on day 30, when α-IL-10R-treated NR mice were recovering from colitis, MSEW mice showed persistent histological inflammation, mainly attributable to sustained epithelial damage. DISCUSSION/SIGNIFICANCE OF IMPACT: Our results suggest that ELS prolongs intestinal inflammation and impairs epithelial repair. Future studies will focus on elucidating the mechanisms responsible for ELS-dependent impairment of mucosal repair in experimental colitis.
Implementation of genome-scale sequencing in clinical care has significant challenges: the technology is highly multidimensional with many kinds of potential results, results interpretation and delivery require expertise and coordination across multiple medical specialties, clinical utility may be uncertain, and there may be broader familial or societal implications beyond the individual participant. Transdisciplinary consortia and collaborative team science are well poised to address these challenges. However, the complex web of organizational, institutional, physical, environmental, technologic, and other political and societal factors that influence the effectiveness of consortia remains understudied. We describe our experience working in the Clinical Sequencing Evidence-Generating Research (CSER) consortium, a multi-institutional translational genomics consortium.
A key aspect of the CSER consortium was the juxtaposition of site-specific measures with the need to identify consensus measures related to clinical utility and to create a core set of harmonized measures. During this harmonization process, we sought to minimize participant burden, accommodate project-specific choices, and use validated measures that allow data sharing.
Identifying platforms to ensure swift communication between teams and to manage materials and data was essential to our harmonization efforts. Funding agencies can help consortia by clarifying key study design elements across projects during the proposal preparation phase and by providing a framework for sharing data across participating projects.
In summary, time and resources must be devoted to developing and implementing collaborative practices as preparatory work at the beginning of project timelines to improve the effectiveness of research consortia.
Little is known about emotional quality-of-life in paediatric heart disease in low- and middle-income countries where the prevalence of uncorrected lesions is high. Research on emotional quality-of-life and its predictors in these settings is key to planning interventions.
A ten-year retrospective cross-sectional study of children aged 6–17 years with uncorrected congenital or acquired heart disease in 12 low- and middle-income countries was conducted. Emotional functioning scores from the PedsQL™ 4.0 generic core scale and data on patient-reported limitation in sports participation were collected via in-person interview and analysed using regression analyses.
Ninety-four children reported mean emotional functioning scores of 71.94 (SD 25.32) [95% CI 66.75–77.13] with lower scores independently associated with having a parent with a chronic illness or who had died (p = 0.005), having less than three siblings (p = 0.007), and reporting a subjective limitation in carrying an item equivalent to a 4 lb load (p = 0.021). Patient-reported limitation in sports participation at least “sometimes” was present in 69% and was independently associated with experiencing symptoms at least once a month (p < 0.001).
Some of the factors which were associated with better emotional quality-of-life were similar to those identified in previous studies in patients with corrected defects. Patient-reported limitation in sports participation is common. In addition to corrective surgery and exercise, numerous other interventions which are practicable during surgical missions might improve emotional quality-of-life.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean (Glycine max L. Merr.) sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.) and forage radish (Raphanus sativus L.) interseeded into corn grain (Z. mays L.) was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance compared to the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
Hyperbaric oxygen therapy (HBOT) shows promising results in treating radionecrosis (RN) but there is limited evidence for its use in brain RN. The purpose of this study is to report the outcomes of using HBOT for symptomatic brain RN at a single institution.
This was a retrospective review of patients with symptomatic brain RN who were treated with HBOT between 2008 and 2018. Demographic data, steroid use, clinical response, radiologic response and toxicities were collected. The index time for analysis was the first day of HBOT. The primary endpoint was clinical improvement of a presenting symptom, including steroid dose reduction.
Thirteen patients who received HBOT for symptomatic RN were included. The median time from last brain radiation therapy to presenting symptoms of brain RN was 6 months. Twelve patients (92%) had clinical improvement with median time to symptom improvement of 33 days (range 1–109 days). One patient had transient improvement after HBOT but had recurrent symptomatic RN at 12 months. Of the eight patients with evaluable follow-up MRI, four patients had radiological improvement while four had stable necrosis appearance. Two patients had subsequent deterioration in MRI appearances, one each in the background of initial radiologic improvement and stability. Median survival was 15 months with median follow-up of 10 months. Seven patients reported side effects attributable to HBOT (54%), four of which were otologic in origin.
HBOT is a safe and effective treatment for brain RN. HBOT showed clinical and radiologic improvement or stability in most patients. Prospective studies to further evaluate the effectiveness and side effects of HBOT are needed.
Identifying risk factors in individuals at clinical high risk (CHR) for psychosis is vital to prevention and early intervention efforts. Among prodromal abnormalities, cognitive functioning has shown intermediate levels of impairment in CHR individuals relative to those with first-episode psychosis and healthy controls, highlighting a potential role as a risk factor for transition to psychosis and other negative clinical outcomes. The current study used the AX-CPT, a brief 15-min computerized task, to determine whether cognitive control impairments in CHR individuals at baseline could predict clinical status at 12-month follow-up.
Baseline AX-CPT data were obtained from 117 CHR individuals participating in two studies, the Early Detection, Intervention, and Prevention of Psychosis Program (EDIPPP) and the Understanding Early Psychosis Programs (EP) and used to predict clinical status at 12-month follow-up. At 12 months, 19 individuals converted to a first episode of psychosis (CHR-C), 52 remitted (CHR-R), and 46 had persistent sub-threshold symptoms (CHR-P). Binary logistic regression and multinomial logistic regression were used to test prediction models.
Baseline AX-CPT performance (d-prime context) was less impaired in the CHR-R group than in the CHR-P and CHR-C groups. AX-CPT predictive validity was robust (0.723) for discriminating converters v. non-converters, and even greater (0.771) when predicting the three CHR subgroups.
These longitudinal outcome data indicate that cognitive control deficits, as measured by AX-CPT d-prime context, are a strong predictor of clinical outcome in CHR individuals. The AX-CPT is a brief, easily implemented and cost-effective measure that may be valuable for large-scale prediction efforts.
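The predictive validity figures quoted above (0.723 for converters v. non-converters) are presumably areas under the ROC curve. For a two-group discrimination, AUC can be computed directly from the Mann–Whitney relationship: the probability that a randomly chosen converter's risk score exceeds a non-converter's. A minimal sketch with made-up d-prime values (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney relationship: the fraction of
    (positive, negative) pairs where the positive case scores
    higher, with ties counted as half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical d-prime context scores: converters tend to score
# lower, so the *negated* score serves as the risk marker.
converters = [-1.2, -0.8, -0.5, -1.5]        # made-up values
non_converters = [-0.3, 0.1, -0.9, 0.4, 0.0]
print(auc([-s for s in converters],
          [-s for s in non_converters]))     # prints 0.9
```

An AUC of 0.5 is chance-level discrimination and 1.0 is perfect, so the reported 0.723 sits in the range usually described as acceptable-to-good for a single brief task.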
Antimicrobial stewardship programs typically use days of therapy to assess antimicrobial use. However, this metric does not account for the antimicrobial spectrum of activity. We applied an antibiotic spectrum index to a population of very-low-birth-weight infants to assess its utility to evaluate the impact of antimicrobial stewardship interventions.
Many school-based interventions for obesity prevention have been proposed with positive changes in behaviour, but with unsatisfactory results on weight change. The objective was to verify the effectiveness of a combined school- and home-based obesity prevention programme on excessive weight gain in adolescents. Teachers delivered the school-based primary prevention programme to fifth- and sixth-graders (nine schools, forty-eight control classes, forty-nine intervention classes), which included encouraging healthy eating habits and physical activity. A subgroup of overweight or obese adolescents also received a home-based secondary prevention programme delivered by community health professionals. Schools were randomised to intervention or control group. Intent-to-treat analysis used mixed models for repeated continuous measures and considered the cluster effect. The main outcomes were changes in BMI and percentage body fat (%body fat) after one school-year of intervention and follow-up. Against our hypothesis, BMI increased more in the intervention group than in the control group (Δ = 0·3 kg/m2; P = 0·05) with a greater decrease in %body fat among boys (Δ = –0·6 %; P = 0·03) in the control group. The intervention group increased physical activity by 12·5 min per week compared with the control group. Female adolescents in the intervention group ate healthy items more frequently than those in the control group. The subgroup that received both the school and home interventions had a greater increase in %body fat than the control group (Δ = 0·89 %; P = 0·01). In the present study, a behavioural change led to a small increase in physical activity and healthy eating habits but also to an overall increase in food intake.
In preparation for a multisite antibiotic stewardship intervention, we assessed knowledge and attitudes toward management of asymptomatic bacteriuria (ASB) plus teamwork and safety climate among providers, nurses, and clinical nurse assistants (CNAs).
Prospective surveys during January–June 2018.
All acute and long-term care units of 4 Veterans Affairs facilities.
The survey instrument included 2 previously tested subcomponents: the Kicking CAUTI survey (ASB knowledge and attitudes) and the Safety Attitudes Questionnaire (SAQ).
A total of 534 surveys were completed, with an overall response rate of 65%. Cognitive biases impacting management of ASB were identified. For example, providers presented with a case scenario of an asymptomatic patient with a positive urine culture were more likely to give antibiotics if the organism was resistant to antibiotics. Additionally, more than 80% of both nurses and CNAs indicated that foul smell is an appropriate indication for a urine culture. We found significant interprofessional differences in teamwork and safety climate (defined as attitudes about issues relevant to patient safety), with CNAs having highest scores and resident physicians having the lowest scores on self-reported perceptions of teamwork and safety climates (P < .001). Among providers, higher safety-climate scores were significantly associated with appropriate risk perceptions related to ASB, whereas social norms concerning ASB management were correlated with higher teamwork climate ratings.
Our survey revealed substantial misunderstanding regarding management of ASB among providers, nurses, and CNAs. Educating and empowering these professionals to discourage unnecessary urine culturing and inappropriate antibiotic use will be key components of antibiotic stewardship efforts.
Proximal environments could facilitate smoking cessation among low-income smokers by making cessation appealing to strive for and tenable.
We sought to examine how home smoking rules and proximal environmental factors such as other household members' and peers' smoking behaviors and attitudes related to low-income smokers' past quit attempts, readiness, and self-efficacy to quit.
This analysis used data from the baseline survey of the Offering Proactive Treatment Intervention (OPT-IN) study, a randomized controlled trial of proactive tobacco cessation outreach, completed by 2,406 participants in 2011–2012. We tested the associations between predictors (home smoking rules and proximal environmental factors) and outcomes (past-year quit attempts, readiness to quit, and quitting self-efficacy).
Smokers who lived in homes with more restrictive household smoking rules, and/or reported having ‘important others’ who would be supportive of their quitting, were more likely to report having made a quit attempt in the past year, had greater readiness to quit, and greater self-efficacy related to quitting.
Adjustments to proximal environments, including strengthening household smoking rules, might encourage cessation even if other household members are smokers.
In March 2017, the New Jersey Department of Health received reports of 3 patients who developed septic arthritis after receiving intra-articular injections for osteoarthritis knee pain at the same private outpatient facility in New Jersey. The risk of septic arthritis resulting from intra-articular injection is low. However, outbreaks of septic arthritis associated with unsafe injection practices in outpatient settings have been reported.
An infection prevention assessment of the implicated facility’s practices was conducted because of the ongoing risk to public health. The assessment included an environmental inspection of the facility, staff interviews, infection prevention practice observations, and a medical record and office document review. A call for cases was disseminated to healthcare providers in New Jersey to identify patients treated at the facility who developed septic arthritis after receiving intra-articular injections.
We identified 41 patients with septic arthritis associated with intra-articular injections. Cultures of synovial fluid or tissue from 15 of these 41 case patients (37%) recovered bacteria consistent with oral flora. The infection prevention assessment of facility practices identified multiple breaches of recommended infection prevention practices, including inadequate hand hygiene, unsafe injection practices, and poor cleaning and disinfection practices. No additional cases were identified after infection prevention recommendations were implemented by the facility.
Aseptic technique is imperative when handling, preparing, and administering injectable medications to prevent microbial contamination.
This investigation highlights the importance of adhering to infection prevention recommendations. All healthcare personnel who prepare, handle, and administer injectable medications should be trained in infection prevention and safe injection practices.
To assess the safety of, and subsequent allergy documentation associated with, an antimicrobial stewardship intervention consisting of test-dose challenge procedures prompted by an electronic guideline for hospitalized patients with reported β-lactam allergies.
Retrospective cohort study.
Large healthcare system consisting of 2 academic and 3 community acute-care hospitals between April 2016 and December 2017.
We evaluated β-lactam antibiotic test-dose outcomes, including adverse drug reactions (ADRs), hypersensitivity reactions (HSRs), and electronic health record (EHR) allergy record updates. HSR predictors were examined using a multivariable logistic regression model. Modification of the EHR allergy record after test doses was assessed in terms of relevant allergy entries added, deleted, and/or specified.
We identified 1,046 test-doses: 809 (77%) to cephalosporins, 148 (14%) to penicillins, and 89 (9%) to carbapenems. Overall, 78 patients (7.5%; 95% confidence interval [CI], 5.9%–9.2%) had signs or symptoms of an ADR, and 40 (3.8%; 95% CI, 2.8%–5.2%) had confirmed HSRs. Most HSRs occurred at the second (ie, full-dose) step (68%) and required no treatment beyond drug discontinuation (58%); 3 HSR patients were treated with intramuscular epinephrine. Reported cephalosporin allergy history was associated with an increased odds of HSR (odds ratio [OR], 2.96; 95% CI, 1.34–6.58). Allergies were updated for 474 patients (45%), with records specified (82%), deleted (16%), and added (8%).
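The interval quoted for confirmed HSRs (40 of 1,046 = 3.8%; 95% CI, 2.8%–5.2%) is consistent with a Wilson score interval for a binomial proportion. A minimal reimplementation, assuming that method (the abstract does not state which interval was used):

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion k/n
    at confidence level implied by z (1.96 -> ~95%)."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(40, 1046)
print(f"{lo:.1%} - {hi:.1%}")   # prints 2.8% - 5.2%
```

Unlike the simple normal-approximation ("Wald") interval, the Wilson interval stays inside [0, 1] and behaves well for small proportions such as this one, which is why it is commonly reported for rare-event rates.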
This antimicrobial stewardship intervention using β-lactam test-dose procedures was safe. Overall, 3.8% of patients with β-lactam allergy histories had an HSR; cephalosporin allergy histories conferred a 3-fold increased risk. Encouraging EHR documentation might improve this safe, effective, and practical acute-care antibiotic stewardship tool.
Childhood adversity is associated with poor mental and physical health outcomes across the life span. Alterations in the hypothalamic–pituitary–adrenal axis are considered a key mechanism underlying these associations, although findings have been mixed. These inconsistencies suggest that other aspects of stress processing may underlie variations in these associations, and that differences in adversity type, sex, and age may be relevant. The current study investigated the relationship between childhood adversity, stress perception, and morning cortisol, and examined whether differences in adversity type (generalized vs. threat and deprivation), sex, and age had distinct effects on these associations. Salivary cortisol samples, daily hassle stress ratings, and retrospective measures of childhood adversity were collected from a large sample of youth at risk for serious mental illness including psychosis (n = 605, mean age = 19.3). Results indicated that childhood adversity was associated with increased stress perception, which subsequently predicted higher morning cortisol levels; however, these associations were specific to threat exposures in females. These findings highlight the role of stress perception in stress vulnerability following childhood adversity and highlight potential sex differences in the impact of threat exposures.
Bipolar disorder is less prevalent in older people but accounts for 8–10% of psychiatric admissions. Treating and managing bipolar disorder in older people is challenging because of medical comorbidity. We review the cognitive problems observed in older people, explore why these are important and consider current treatment options. There are very few studies examining the cognitive profiles of older people with bipolar disorder and symptomatic depression and mania, and these show significant impairments in executive function. Most studies have focused on cognitive impairment in euthymic older people: as in euthymic adults of working age, significant impairments are observed in tests of attention, memory and executive function/processing speeds. Screening tests are not always helpful in euthymic older people as the impairment can be relatively subtle, and more in-depth neuropsychological testing may be needed to show impairments. Cognitive impairment may be more pronounced in older people with ‘late-onset’ bipolar disorder than in those with ‘early-onset’ disorder. Strategies to address symptomatic cognitive impairment in older people include assertive treatment of the mood disorder, minimising drugs that can adversely affect cognition, optimising physical healthcare and reducing relapse rates.
After reading this article you will be able to:
• understand that cognitive impairment in euthymic older people with bipolar disorder is similar to that in working-age adults with the disorder, affecting attention, memory and executive function/processing speeds
• recognise that cognitive impairment in older people is likely to be a major determinant of functional outcomes
• implement approaches to treat cognitive impairment in bipolar disorder.
DECLARATION OF INTEREST
B.J.S. consults for Cambridge Cognition, PEAK (www.peak.net) and Mundipharma.
We identified a pseudo-outbreak of Mycobacterium avium in an outpatient bronchoscopy clinic following an increase in clinic procedure volume. We terminated the pseudo-outbreak by increasing the frequency of automated endoscope reprocessors (AER) filter changes from quarterly to monthly. Filter changing schedules should depend on use rather than fixed time intervals.