This handbook focuses on the development and nurturance of creativity across the lifespan, from early childhood to adolescence, adulthood, and later life. It answers the question: how can we help individuals turn their creative potential into achievement? Each chapter examines various contexts in which creativity exists, including school, workplace, community spaces, and family life. It covers various modalities for fostering creativity such as play, storytelling, explicit training procedures, shifting of attitudes about creative capacity, and many others. The authors review research findings across disciplines, encompassing the work of psychologists, educators, neuroscientists, and creators themselves, to describe the best practices for fostering creativity at each stage of development.
Research on disaster behavioral health presents significant methodological challenges. These challenges are even more complex for research on mass violence events that involve military members, families, and communities, owing to the cultural and logistical considerations of working with this population. The current article aims to inform and educate researchers about this specialized area by presenting a case study on the experience of designing and conducting disaster behavioral health research after a mass violence event in a military setting: the 2013 mass shooting at the Washington Navy Yard in Washington, DC. Using the case example, the authors explore methodological challenges and lessons learned from conducting research in this context and provide guidance for future researchers.
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Materials and Methods
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All patients received combinatorial pharmacogenomic testing, and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 item), and symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both week 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
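The safety comparisons above rest on relative risks with 95% confidence intervals. As a hedged illustration only (the trial's actual analysis code is not reproduced here), a relative risk and its Katz log-interval can be computed from 2×2 counts as follows; the counts below are invented, not GUIDED data:

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR of an event in an exposed vs. an unexposed group.

    a/b: events/non-events among exposed (e.g., patients with a med change);
    c/d: events/non-events among unexposed. Returns (RR, lower, upper),
    with the 95% CI from the Katz log-interval.
    """
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    return rr, rr * math.exp(-z * se), rr * math.exp(z * se)

# Invented counts: 120 of 938 medication changers with an AE
# vs. 15 of 228 non-changers.
print(relative_risk(120, 818, 15, 213))
```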
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without risking symptom degradation among patients.
Sleep disruption is a common precursor to deterioration and relapse in people living with psychotic disorders. Understanding the temporal relationship between sleep and psychopathology is important for identifying and developing interventions which target key variables that contribute to relapse.
We used a purpose-built digital platform to sample self-reported sleep and psychopathology variables over 1 year, in 36 individuals with schizophrenia. Once-daily measures of sleep duration and sleep quality, and fluctuations in psychopathology (positive and negative affect, cognition and psychotic symptoms) were captured. We examined the temporal relationship between these variables using the Differential Time-Varying Effect (DTVEM) hybrid exploratory-confirmatory model.
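DTVEM itself is distributed as R code; as a loose, much-simplified stand-in (not the authors' model), lagged sleep-symptom relationships can be probed by regressing each day's symptom score on sleep quality shifted back by 1 to 14 days. Column and file names below are hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical once-daily diary data: one row per participant per day with
# columns "participant", "date", "sleep_quality", "psychosis_score".
df = pd.read_csv("daily_diary.csv", parse_dates=["date"])
df = df.sort_values(["participant", "date"])

# Regress today's symptom score on sleep quality lagged by 1-14 days,
# pooling participants; rows with missing lags are dropped automatically.
for lag in range(1, 15):
    col = f"sleep_lag{lag}"
    df[col] = df.groupby("participant")["sleep_quality"].shift(lag)
    fit = smf.ols(f"psychosis_score ~ {col}", data=df).fit()
    print(f"lag {lag:2d} d: beta={fit.params[col]:+.3f}, p={fit.pvalues[col]:.3g}")
```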
Poorer sleep quality and shorter sleep duration maximally predicted deterioration in psychosis symptoms over the subsequent 1–8 and 1–12 days, respectively. These relationships were also mediated by negative affect and cognitive symptoms. Psychopathology variables also predicted sleep quality, but not sleep duration, and the effect sizes were smaller and of shorter lag duration.
Reduced sleep duration and poorer sleep quality anticipate the exacerbation of psychotic symptoms by approximately 1–2 weeks, and negative affect and cognitive symptoms mediate this relationship. We also observed a reciprocal relationship that was of shorter duration and smaller magnitude. Sleep disturbance may play a causal role in symptom exacerbation and relapse, and represents an important and tractable target for intervention. It warrants greater attention as an early warning sign of deterioration, and low-burden, user-friendly digital tools may play a role in its early detection.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with approximately 15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination +41°, made over a 288-MHz band centred at 887.5 MHz.
Psychosocial stress in childhood and adolescence is linked to stress system dysregulation, although few studies have examined the relative impacts of parental harshness and parental disengagement. This study prospectively tested whether parental harshness and disengagement show differential associations with overall cortisol output in adolescence. Associations between overall cortisol output and adolescent mental health problems were tested concurrently. Adolescents from the Fragile Families and Child Wellbeing Study (FFCWS) provided hair samples for cortisol assay at 15 years (N = 171). Caregivers reported on parental harshness and disengagement at 1, 3, 5, 9, and 15 years, and adolescents reported at 15 years. Both parents and adolescents reported adolescents' depressive and anxiety symptoms and antisocial behaviors at 15 years. Greater parental harshness from 1 to 15 years, and harshness reported at 15 years in particular, was associated with higher overall cortisol output at 15. Greater parental disengagement from 1 to 15 years, and disengagement at 1 year specifically, was associated with lower cortisol output. There were no significant associations between cortisol output and depressive symptoms, anxiety symptoms, or antisocial behaviors. These results suggest that the unique variances of parental harshness and disengagement may have opposing associations with cortisol output at 15 years, with unclear implications for adolescent mental health.
To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
2013–2017 data from CMS Hospital Compare, Provider of Service files, and Medicare cost reports.
Difference-in-differences model with hospital fixed effects to compare California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
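For readers unfamiliar with the design, a difference-in-differences estimate of this kind can be sketched as follows; this is an illustrative simplification with hypothetical variable names, not the authors' specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical hospital-year panel: one row per hospital per year.
panel = pd.read_csv("hospital_panel.csv")
panel["treated_post"] = panel["california"] * panel["post_mandate"]

# Hospital fixed effects absorb time-invariant hospital differences and
# year dummies absorb national trends; the coefficient on treated_post is
# the difference-in-differences estimate of the mandate's effect.
fit = smf.ols(
    "mrsa_sir ~ treated_post + C(year) + C(hospital_id) + beds + avg_los",
    data=panel,
).fit(cov_type="cluster", cov_kwds={"groups": panel["hospital_id"]})
print(fit.params["treated_post"])
```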
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. California hospitals were associated with a 20% (P < .001) decrease in the CDI SIR only in 2017.
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
Herbicide resistance has been an increasing problem in agronomic crops such as corn and soybean for decades. Several weed species have evolved herbicide resistance in turfgrass systems such as golf courses, sports fields, and sod production, particularly biotypes of annual bluegrass and goosegrass. The consequences of herbicide resistance in agronomic cropping systems indicate what could happen in turfgrass if resistance becomes broader in terms of species, distribution, and mechanisms of action. The turfgrass industry must act to develop effective resistance management programs while this problem is still relatively small in scope. The Herbicide Resistance Education Committee of the Weed Science Society of America conducted a series of national listening sessions to better understand the human dimensions affecting herbicide resistance in crop production. We propose that the lessons learned from those sessions provide tremendous insight into the themes to address when developing effective resistance management programs for the turfgrass industry.
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of studying nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, which includes data from eighteen case–control studies and one nested case–cohort study, comprising 8320 BC cases among 31 551 participants. Dietary data on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status), were available. Five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; and meats and meat products were associated with BC risk. Since these food groups correspond with previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
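C5.0 is an R/commercial implementation; a roughly analogous (not identical) exercise can be run with scikit-learn's CART-style decision tree and 10-fold cross-validation. Feature and file names here are hypothetical placeholders for the Eurocode 2 food groups and covariates:

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-participant table; feature names stand in for the
# Eurocode 2 food-group intakes plus the non-diet covariates.
data = pd.read_csv("bladder_study.csv")
features = ["beverages_non_milk", "grains", "vegetables", "fats_oils",
            "meats", "sex", "age", "smoking_status"]
X, y = data[features], data["bc_case"]

tree = DecisionTreeClassifier(max_depth=5, random_state=0)
print("10-fold CV accuracy:", cross_val_score(tree, X, y, cv=10).mean())

# Feature importances give a rough ranking of the key food groups.
tree.fit(X, y)
for name, imp in sorted(zip(features, tree.feature_importances_),
                        key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```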
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
The Genomics Used to Improve DEpression Decisions (GUIDED) trial assessed outcomes associated with combinatorial pharmacogenomic (PGx) testing in patients with major depressive disorder (MDD). Analyses used the 17-item Hamilton Depression (HAM-D17) rating scale; however, studies demonstrate that the abbreviated HAM-D6 rating scale, which focuses on the core symptoms of depression, may have greater sensitivity for detecting differences between treatment and placebo. The sensitivity of the HAM-D6 has not yet been tested between two active treatment arms. Here, we evaluated the sensitivity of the HAM-D6 scale, relative to the HAM-D17 scale, when assessing outcomes for actively treated patients in the GUIDED trial.
Outpatients (N=1,298) diagnosed with MDD and an inadequate treatment response to ≥1 psychotropic medication were randomized into treatment as usual (TAU) or combinatorial PGx-guided (guided-care) arms. Combinatorial PGx testing was performed on all patients, though test reports were available only to the guided-care arm. All patients and raters were blinded to study arm until after week 8. Medications on the combinatorial PGx test report were categorized based on the level of predicted gene-drug interactions: ‘use as directed’, ‘moderate gene-drug interactions’, or ‘significant gene-drug interactions’. Patient outcomes were assessed by arm at week 8 using the HAM-D6 and HAM-D17 rating scales, including symptom improvement (percent change in scale score), response (≥50% decrease in scale score), and remission (HAM-D6 ≤4; HAM-D17 ≤7).
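The three outcome definitions in this paragraph are simple enough to state as code. A minimal sketch, assuming per-patient baseline and week-8 scale scores (the thresholds are taken directly from the abstract):

```python
def symptom_improvement(baseline, week8):
    """Percent decrease in scale score from baseline to week 8."""
    return 100.0 * (baseline - week8) / baseline

def response(baseline, week8):
    """Response: >=50% decrease in scale score."""
    return symptom_improvement(baseline, week8) >= 50.0

def remission(week8, scale="HAM-D17"):
    """Remission: HAM-D6 <= 4, or HAM-D17 <= 7."""
    return week8 <= (4 if scale == "HAM-D6" else 7)

# Example: a patient going from 20 to 9 on the HAM-D17.
print(symptom_improvement(20, 9), response(20, 9), remission(9))
```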
At week 8, the guided-care arm demonstrated statistically significant symptom improvement over TAU using the HAM-D6 scale (Δ=4.4%, p=0.023), but not using the HAM-D17 scale (Δ=3.2%, p=0.069). The response rate increased significantly for guided-care compared with TAU using both the HAM-D6 (Δ=7.0%, p=0.004) and HAM-D17 (Δ=6.3%, p=0.007) scales. Remission rates were also significantly greater for guided-care versus TAU using both scales (HAM-D6 Δ=4.6%, p=0.031; HAM-D17 Δ=5.5%, p=0.005). Patients taking medication(s) predicted to have gene-drug interactions at baseline showed further increased benefit over TAU at week 8 using the HAM-D6 scale for symptom improvement (Δ=7.3%, p=0.004), response (Δ=10.0%, p=0.001), and remission (Δ=7.9%, p=0.005). Comparatively, the magnitude of the differences in outcomes between arms at week 8 was lower using the HAM-D17 scale (symptom improvement Δ=5.0%, p=0.029; response Δ=8.0%, p=0.008; remission Δ=7.5%, p=0.003).
Combinatorial PGx-guided care achieved significantly better patient outcomes compared with TAU when assessed using the HAM-D6 scale. These findings suggest that the HAM-D6 scale is better suited than the HAM-D17 for evaluating outcome changes in randomized controlled trials comparing active treatment arms.
To measure the association between statewide adoption of the Centers for Disease Control and Prevention’s (CDC’s) Core Elements for Hospital Antimicrobial Stewardship Programs (Core Elements) and hospital-associated methicillin-resistant Staphylococcus aureus bacteremia (MRSA) and Clostridioides difficile infection (CDI) rates in the United States. We hypothesized that states with a higher percentage of reported compliance with the Core Elements have significantly lower MRSA and CDI rates.
All US states.
Observational longitudinal study.
We used 2014–2016 data from Hospital Compare, Provider of Service files, Medicare cost reports, and the CDC’s Patient Safety Atlas website. Outcomes were MRSA standardized infection ratio (SIR) and CDI SIR. The key explanatory variable was the percentage of hospitals that meet the Core Elements in each state. We estimated state and time fixed-effects models with time-variant controls, and we weighted our analyses for the number of hospitals in the state.
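A state-and-year fixed-effects model weighted by hospital counts, as described, might be sketched like this; the panel layout and variable names are assumptions, not the authors' code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical state-year panel: one row per state per year, with the SIR,
# the % of hospitals reporting Core Elements compliance, and hospital counts.
states = pd.read_csv("state_year.csv")

# State and year fixed effects; analytic weights equal to the number of
# hospitals per state, mirroring the weighting described in the abstract.
fit = smf.wls(
    "cdi_sir ~ pct_core_elements + C(state) + C(year)",
    data=states,
    weights=states["n_hospitals"],
).fit()
print(fit.params["pct_core_elements"])
```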
The percentage of hospitals reporting compliance with the Core Elements between 2014 and 2016 increased in all states. A 1% increase in reported ASP compliance was associated with a 0.3% decrease (P < .01) in CDIs in 2016 relative to 2014. We did not find an association for MRSA infections.
Increasing documentation of the Core Elements may be associated with decreases in the CDI SIR. We did not find evidence of such an association for the MRSA SIR, probably due to the short study period and the variety of stewardship strategies that ASPs may encompass.
The reported incidence of Clostridioides difficile infection (CDI) has increased in recent years, partly due to the broadening adoption of nucleic acid amplification tests (NAATs) replacing enzyme immunoassay (EIA) methods. Our aim was to quantify the impact of this switch on reported CDI rates using a large, multihospital, empirical dataset.
We analyzed 9 years of retrospective CDI data (2009–2017) from 47 hospitals in the southeastern United States; 37 hospitals switched to NAAT during this period, including 24 with sufficient pre- and post-switch data for statistical analyses. Poisson regression was used to quantify the NAAT-over-EIA incidence rate ratio (IRR) at hospital and network levels while controlling for longitudinal trends, the proportion of intensive care unit patient days, changes in surveillance methodology, and previously detected infection cluster periods. We additionally used change-point detection methods to identify shifts in the mean and/or slope of hospital-level CDI rates, and we compared results to recorded switch dates.
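A Poisson model of this shape, with patient-days entering as an exposure offset so that the exponentiated NAAT coefficient is the incidence rate ratio, can be sketched as follows; variable and file names are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical hospital-month data: CDI case counts, patient-days, a 0/1
# NAAT indicator, and simple controls for trend and case mix.
cdi = pd.read_csv("cdi_monthly.csv")

# Patient-days enter as a log offset, so exp(coef on "naat") is the
# NAAT-over-EIA incidence rate ratio.
fit = smf.glm(
    "cases ~ naat + month_index + icu_share + surveillance_change",
    data=cdi,
    family=sm.families.Poisson(),
    offset=np.log(cdi["patient_days"]),
).fit()
print("IRR:", np.exp(fit.params["naat"]))
```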
For hospitals that transitioned to NAAT, average unadjusted CDI rates increased substantially after the test switch from 10.9 to 23.9 per 10,000 patient days. Individual hospital IRRs ranged from 0.75 to 5.47, with a network-wide IRR of 1.75 (95% confidence interval, 1.62–1.89). Reported CDI rates significantly changed 1.6 months on average after switching to NAAT testing (standard deviation, 1.9 months).
Hospitals that switched from EIA to NAAT testing experienced an average postswitch increase of 75% in reported CDI rates after adjusting for other factors, and this increase was often gradual or delayed.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
Myocardial strain measurements are increasingly used to detect complications following heart transplantation. However, the temporal association of these changes with allograft rejection is not well defined. The aim of this study was to describe the evolution of strain measurements prior to the diagnosis of rejection in paediatric heart transplant recipients.
All paediatric heart transplant recipients (2004–2015) with at least one episode of acute rejection were identified. Longitudinal and circumferential strain measurements were assessed at the time of rejection and retrospectively on all echocardiograms until the most recent negative biopsy. A smoothing technique (LOESS) was used to visualise the changes in each variable over time and to estimate the time preceding rejection at which alterations first become detectable.
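LOESS smoothing of a strain trajectory against time before rejection is straightforward to reproduce; a minimal sketch with hypothetical echo data (statsmodels' lowess stands in for whatever software the authors used):

```python
import pandas as pd
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical echo measurements: days before the rejection-positive biopsy
# and global longitudinal strain (GLS, %).
echo = pd.read_csv("strain_history.csv")

# LOESS fit of GLS against time; an inflection in the smoothed curve hints
# at when strain first begins to deteriorate before rejection.
smoothed = lowess(echo["gls"], echo["days_before_rejection"], frac=0.5)
for x, y in smoothed[:5]:
    print(f"{x:6.0f} d: GLS {y:5.1f}")
```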
A total of 58 rejection episodes were included from 37 unique patients. In the presence of rejection, there were decrements from baseline in global longitudinal strain (−18.2 versus −14.1), global circumferential strain (−24.1 versus −19.6), longitudinal strain rate (−1 versus −0.8), circumferential strain rate (−1.3 versus −1.1), peak longitudinal early diastolic strain rate (1.3 versus 1), and peak circumferential early diastolic strain rate (1.5 versus 1.3) (p<0.01 for all). The earliest detectable changes occurred 45 days prior to rejection with simultaneous alterations in myocardial strain and ejection fraction.
Changes in graft function can be detected non-invasively prior to the diagnosis of rejection. However, changes in strain occur concurrently with a decline in ejection fraction. Strain measurements aid in the non-invasive detection of rejection, but may not facilitate earlier diagnosis compared to more traditional measures of ventricular function.
The second Singapore Mental Health Study (SMHS) – a nationwide, cross-sectional, epidemiological survey – was initiated in 2016 with the intent of tracking the state of mental health of the general population in Singapore. The study employed the same methodology as the first survey, initiated in 2010. The SMHS 2016 aimed to (i) establish the 12-month and lifetime prevalence and correlates of major depressive disorder (MDD), dysthymia, bipolar disorder, generalised anxiety disorder (GAD), obsessive compulsive disorder (OCD) and alcohol use disorder (AUD) (which included alcohol abuse and dependence) and (ii) compare the prevalence of these disorders with reference to data from the SMHS 2010.
Door-to-door household surveys were conducted with adult Singapore residents aged 18 years and above from 2016 to 2018 (n = 6126), yielding a response rate of 69.0%. Subjects were randomly selected using a disproportionate stratified sampling method and assessed using the World Health Organization Composite International Diagnostic Interview version 3.0 (WHO-CIDI 3.0). Diagnoses of lifetime and 12-month mental disorders, including MDD, dysthymia, bipolar disorder, GAD, OCD, and AUD (alcohol abuse and alcohol dependence), were based on Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) criteria.
The lifetime prevalence of at least one mood, anxiety or alcohol use disorder was 13.9% in the adult population. MDD had the highest lifetime prevalence (6.3%) followed by alcohol abuse (4.1%). The 12-month prevalence of any DSM-IV mental disorders was 6.5%. OCD had the highest 12-month prevalence (2.9%) followed by MDD (2.3%). Lifetime and 12-month prevalence of mental disorders assessed in SMHS 2016 (13.8% and 6.4%) was significantly higher than that in SMHS 2010 (12.0% and 4.4%). A significant increase was observed in the prevalence of lifetime GAD (0.9% to 1.6%) and alcohol abuse (3.1% to 4.1%). The 12-month prevalence of GAD (0.8% vs. 0.4%) and OCD (2.9% vs. 1.1%) was significantly higher in SMHS 2016 as compared to SMHS 2010.
The high prevalence of OCD and its increase across the two surveys need to be tackled at the population level, both by creating awareness of the disorder and by emphasising the need for early treatment. Youth emerge as a vulnerable group that is more likely to experience mental disorders; targeted interventions in this group, with a focus on youth-friendly and accessible care centres, may lead to earlier detection and treatment of mental disorders.
Common bean (Phaseolus vulgaris L.) is perhaps the most important grain legume for food security and household income in sub-Saharan Africa (SSA) smallholder systems. Although a wide choice of varieties is available, smallholder farmers in western Kenya realize yields that are low and variable because they operate in risky production environments. Significant seasonal variations exist in rainfall and in the severity of pests and diseases. This situation is worsened by low and declining soil fertility, land scarcity, and the limited capacity of farmers to purchase production inputs such as fertilizers, fungicides and insecticides. The objective of this study was to investigate whether growing multiple bean varieties instead of a single variety can enable farmers to enhance yield stability over seasons and ensure food security. Five common bean varieties were evaluated in multiple farms for 11 seasons at Kapkerer in Nandi County, western Kenya. Data were collected on grain yield, days to 50% flowering and major diseases. In addition, daily rainfall was recorded throughout the growing seasons. The five varieties were combined in all possible ways to create 31 single- and multiple-variety production strategies, as enumerated in the sketch below. The strategies were evaluated for grain yield performance and yield stability over seasons to determine the risk of not attaining a particular yield target. Results indicated that cropping multiple bean varieties can be an effective way to reduce production risks in heterogeneous smallholder systems. Yield stability can be greatly enhanced across diverse environments, leading to improved food security, especially for resource-poor smallholder farmers operating in risk-prone environments. Although some of the single-variety strategies were high yielding, their yield stability was generally lower than that of the multiple-variety strategies. Resource-poor, risk-averse farmers can greatly increase the probability of exceeding their yield targets by cropping multiple bean varieties with relatively low yields but high grain yield stability. Trading off high grain yield for yield stability might be an important strategy for minimizing bean production risks.
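The 31 strategies follow from taking every non-empty subset of the five varieties (2^5 − 1 = 31). A minimal sketch of that enumeration, using made-up yield series (not the trial data) and assuming equal land shares per variety within a strategy, with the coefficient of variation across seasons as the instability measure:

```python
from itertools import combinations
from statistics import mean, stdev

# Hypothetical grain yields (t/ha) for five varieties over 11 seasons;
# the real Kapkerer trial data are not reproduced here.
yields = {
    "V1": [1.2, 0.8, 1.5, 0.9, 1.1, 1.4, 0.7, 1.3, 1.0, 1.2, 0.9],
    "V2": [1.0, 1.1, 0.9, 1.2, 0.8, 1.0, 1.1, 0.9, 1.3, 0.7, 1.0],
    "V3": [1.6, 0.5, 1.8, 0.6, 1.7, 0.4, 1.9, 0.8, 1.5, 0.6, 1.4],
    "V4": [0.9, 1.0, 1.1, 1.0, 0.9, 1.1, 1.0, 0.9, 1.0, 1.1, 0.9],
    "V5": [1.3, 0.9, 1.2, 1.1, 1.0, 1.3, 0.8, 1.2, 1.1, 0.9, 1.2],
}

strategies = []
for r in range(1, 6):
    for combo in combinations(yields, r):
        # Equal land share per variety: season yield is the subset mean.
        per_season = [mean(yields[v][s] for v in combo) for s in range(11)]
        m = mean(per_season)
        cv = stdev(per_season) / m  # lower CV = more stable over seasons
        strategies.append((combo, m, cv))

assert len(strategies) == 31  # 2**5 - 1 non-empty subsets

# The five most stable strategies (lowest CV).
for combo, m, cv in sorted(strategies, key=lambda t: t[2])[:5]:
    print("+".join(combo), f"mean={m:.2f} t/ha", f"CV={cv:.2f}")
```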