The cornerstone of obesity treatment is behavioural weight management, resulting in significant improvements in cardio-metabolic and psychosocial health. However, there is ongoing concern that dietary interventions used for weight management may precipitate the development of eating disorders. Systematic reviews demonstrate that, while for most participants medically supervised obesity treatment improves risk scores related to eating disorders, a subset of people who undergo obesity treatment may have poor outcomes for eating disorders. This review summarises the background and rationale for the formation of the Eating Disorders In weight-related Therapy (EDIT) Collaboration. The EDIT Collaboration will explore the complex risk factor interactions that precede changes to eating disorder risk following weight management. In this review, we also outline the programme of work and design of studies for the EDIT Collaboration, including expected knowledge gains. The EDIT studies explore risk factors and the interactions between them using individual-level data from international weight management trials. Combining all available data on eating disorder risk from weight management trials will allow sufficient sample size to interrogate our hypothesis: that individuals undertaking weight management interventions will vary in their eating disorder risk profile, on the basis of personal characteristics and intervention strategies available to them. The collaboration includes the integration of health consumers in project development and translation. An important knowledge gain from this project is a comprehensive understanding of the impact of weight management interventions on eating disorder risk.
We detected no correlation between standardized antimicrobial administration ratios (SAARs) and healthcare facility-onset Clostridioides difficile infection (HO-CDI) rates in 102 acute-care Veterans Affairs medical centers over 16 months. SAARs may be useful for investigating trends in local antimicrobial use, but no ratio threshold demarcated HO-CDI risk.
To conduct a contemporary detailed assessment of outpatient antibiotic prescribing and outcomes for positive urine cultures in a mixed-sex cohort.
Design:
Multicenter retrospective cohort review.
Setting:
The study was conducted using data from 31 Veterans Affairs medical centers.
Patients:
Outpatient adults with positive urine cultures.
Methods:
From 2016 to 2019, data were extracted through a nationwide database and manual chart review. Positive urine cultures were reviewed at the chart, clinician, and aggregate levels. Cases were classified as cystitis, pyelonephritis, or asymptomatic bacteriuria (ASB) based upon documented signs and symptoms. Preferred therapy definitions were applied for subdiagnoses: ASB (no antibiotics), cystitis (trimethoprim-sulfamethoxazole, nitrofurantoin, β-lactams), and pyelonephritis (trimethoprim-sulfamethoxazole, fluoroquinolone). Outcomes included 30-day clinical failure or hospitalization. Odds ratios for outcomes between treatments were estimated using logistic regression.
Results:
Of 3,255 cases reviewed, ASB was identified in 1,628 cases (50%), cystitis in 1,156 cases (36%), and pyelonephritis in 471 cases (15%). Of the 2,831 cases for which preferred therapy could be defined, 1,298 (46%) received the preferred agent and duration. The most common antibiotic class prescribed was fluoroquinolones (34%). Patients prescribed preferred therapy had lower odds of clinical failure: preferred (8%) versus nonpreferred (10%) (unadjusted OR, 0.74; 95% confidence interval [CI], 0.58–0.95; P = .018). They also had lower odds of 30-day hospitalization: preferred therapy (3%) versus nonpreferred therapy (5%) (unadjusted OR, 0.55; 95% CI, 0.37–0.81; P = .002). Odds of clinical treatment failure or hospitalization were higher for β-lactams relative to ciprofloxacin (unadjusted OR, 1.89; 95% CI, 1.23–2.90; P = .002).
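As a rough illustration of how the unadjusted odds ratios above are derived, the odds of an event under each therapy can be compared directly from a 2×2 table. A minimal sketch in Python, using purely hypothetical counts (not the study's data):

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio of group A relative to group B,
    computed from event counts and group totals of a 2x2 table."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Hypothetical counts for illustration only: 100/1,298 failures on
# preferred therapy versus 190/1,533 on nonpreferred therapy.
or_unadjusted = odds_ratio(100, 1298, 190, 1533)
print(round(or_unadjusted, 2))  # → 0.59 (odds ratio below 1 favours group A)
```

An OR below 1, as reported in the study, indicates lower odds of the outcome under preferred therapy; the adjusted analyses in the paper would additionally condition on covariates via logistic regression.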
Conclusions:
Clinicians prescribed preferred therapy 46% of the time. Those prescribed preferred therapy had lower odds of clinical failure and of being hospitalized.
Deficits in visuospatial attention, known as neglect, are common following brain injury, but underdiagnosed and poorly treated, resulting in long-term cognitive disability. In clinical settings, neglect is often assessed using simple pen-and-paper tests. While convenient, these cannot characterise the full spectrum of neglect. This protocol reports a research programme that compares traditional neglect assessments with a novel virtual reality attention assessment platform: The Attention Atlas (AA).
Methods/design:
The AA was codesigned by researchers and clinicians to meet the clinical need for improved neglect assessment. The AA uses a visual search paradigm to map the attended space in three dimensions and seeks to identify the optimal parameters that best distinguish neglect from non-neglect, and the spectrum of neglect, by providing near-time feedback to clinicians on system-level behavioural performance. A series of experiments will address procedural, scientific, patient, and clinical feasibility domains.
Results:
Analyses focus on descriptive measures of reaction time, accuracy of target localisation, and histogram-based raycast attentional mapping analysis, which measures the individual’s orientation in space and the inter- and intra-individual variation of visuospatial attention. We will compare neglect and control data using parametric between-subjects analyses. We present example individual-level results produced in near-time during visual search.
Conclusions:
The development and validation of the AA is part of a new generation of translational neuroscience that exploits the latest advances in technology and brain science, including technology repurposed from the consumer gaming market. This approach to rehabilitation has the potential for highly accurate, highly engaging, personalised care.
Although printing music from movable type was the predominant method employed for nearly two hundred years after Ottaviano Petrucci's pioneering work at the opening of the sixteenth century, occasional use of engraving did occur at intervals during this period, beginning in the 1530s with Francesco da Milano's Intabolatura da leuto (c.1535). The technique of engraving copper plates had been in use for reproducing works of art from the mid-fifteenth century and was soon extended to maps and the like, so there was nothing problematic in printing music by this method. The most immediate advantage was that any type of music could be accommodated, so it is not surprising that lute music and keyboard music were the two genres that first made use of it; although it was possible to print such music typographically, the result was rarely elegant. Parthenia (1612/13), described on the title-page as ‘the first musicke that ever was printed for the virginalls’ and engraved by William Hole, marked the appearance of engraved music in England. It was, however, only in the closing decades of the seventeenth century that engraved music came to the fore. One reason for its slow adoption was that copper plates were comparatively expensive, and the hardness of the metal also made it slow to engrave. While this was of minor consequence for single-plate works of art or maps, a music publication with several dozen pages was a different matter. According to Sir John Hawkins, Estienne Roger in Amsterdam around the end of the seventeenth century found that a softer amalgam of copper could make life easier for the engraver, but the real breakthrough came with the discovery about the same time that pewter offered a satisfactory alternative to copper: cheaper and more easily engraved, yet resilient enough to withstand the stresses of a rolling press. 
Thomas Cross and John Walsh the elder were among the first to adopt it in England, and while the former remained faithful to engraving everything by hand with a graver, the latter embraced an important second innovation, the use of a series of punches to provide the note heads, clefs, dynamic signs, and text, which eased much of the work. This gradually became general practice in the eighteenth century, and it then remained an essentially unchanged method at least of producing the master copy of a page of music until well into the second half of the twentieth century.
Early reporting of atypical symptoms following a mild traumatic brain injury (mTBI) may be an early indicator of poor prognosis. This study aimed to determine the percentage of people reporting atypical symptoms 1-month post-mTBI and explore links to recovery 12 months later in a community-dwelling mTBI sample.
Methods:
Adult participants (>16 years) who had experienced a mTBI were identified from a longitudinal incidence study (BIONIC). At 1-month post-injury, 260 participants completed the Rivermead Post-Concussion Symptoms Questionnaire (typical symptoms) plus four atypical symptom items (hemiplegia, difficulty swallowing, digestion problems and difficulties with fine motor tasks). At 12 months post-injury, 73.9% (n = 193) rated their overall recovery on a 100-point scale. An ordinal regression explored the association between atypical symptoms at 1 month and recovery at 12 months post-injury (low = 0–80, moderate = 81–99 and complete recovery = 100), whilst controlling for age, sex, rehabilitation received, ethnicity, mental and physical comorbidities and additional injuries sustained at the time of injury.
Results:
At 1 month post-injury, <1% of participants reported hemiplegia, 5.4% difficulty swallowing, 10% digestion problems and 15.4% difficulties with fine motor tasks. The ordinal regression model revealed that atypical symptoms were not significant predictors of self-rated recovery at 12 months. Older age at injury and higher typical symptom scores at 1 month were independently associated with poorer recovery at 12 months (p < 0.01).
Conclusion:
Atypical symptoms on initial presentation were not linked to global self-reported recovery at 12 months. Age at injury and typical symptoms are stronger early indicators of longer-term prognosis. Further research is needed to determine if atypical symptoms predict other outcomes following mTBI.
Irritability is a transdiagnostic symptom dimension in developmental psychopathology, closely related to the Research Domain Criteria (RDoC) construct of frustrative nonreward. Consistent with the RDoC framework and calls for transdiagnostic, developmentally sensitive assessment methods, we report data from a smartphone-based, naturalistic ecological momentary assessment (EMA) study of irritability. We assessed 109 children and adolescents (mean age = 12.55 years; 75.20% male) encompassing several diagnostic groups – disruptive mood dysregulation disorder (DMDD), attention-deficit/hyperactivity disorder (ADHD), anxiety disorders (ANX) – and healthy volunteers (HV). The participants rated symptoms three times per day for 1 week. Compliance with the EMA protocol was high. As tested using multilevel modeling, EMA ratings of irritability were strongly and consistently associated with in-clinic, gold-standard measures of irritability. Further, EMA ratings of irritability were significantly related to subjective frustration during a laboratory task eliciting frustrative nonreward. Irritability levels exhibited an expected graduated pattern across diagnostic groups, and the different EMA items measuring irritability were significantly associated with one another within all groups, supporting the transdiagnostic phenomenology of irritability. Additional analyses utilized EMA ratings of anxiety as a comparison with respect to convergent validity and transdiagnostic phenomenology. The results support new measurement tools that can be used in future studies of irritability and frustrative nonreward.
This research adds to scarce literature regarding adolescent experiences of traumatic brain injury (TBI). Retrospective accounts of young adults who had sustained a TBI in adolescence were analysed to explore the perceived impact this had on their lives and forming identities during this important developmental stage.
Methods:
Thirteen adults (aged 20–25 years; mean 23 years) who sustained a mild or moderate TBI during adolescence (i.e. aged 13–17 years at injury), approximately 7.7 years (range = 6.7–8.0 years) prior, participated in the research. Semi-structured individual interviews, analysed using thematic analysis, explored participants’ experiences following their TBIs.
Results:
Thematic analysis of interview data produced two categories of themes: (1) Impacts on Important Areas of Life, which included: schoolwork suffered, career opportunities became limited, struggling with work and missing out socially; and (2) Impacts on Identity: with themes including feeling ‘stupid’, feeling self-conscious, loss of social identity and being dependent.
Conclusions:
TBI sustained during adolescence can have broad impacts on important areas of life and on developing identity.
This chapter synthesises insights from the Deep Decarbonisation Pathways Project (DDPP), which provided detailed analysis of how 16 countries representing three-quarters of global emissions can transition to very low-carbon economies. The four ‘pillars’ of decarbonisation are identified as: achieving low- or zero-carbon electricity supply; electrification and fuel switching in transport, industry and housing; ambitious energy efficiency improvements; and reducing non-energy emissions. The chapter focuses on decarbonisation scenarios for Australia. It shows that electricity supply can be readily decarbonised and greatly expanded to cater for electrification of transport, industry and buildings. There would be remaining emissions, principally from industry and agriculture; these could be fully compensated through land-based carbon sequestration. The analysis shows that such decarbonisation would be consistent with continued growth in GDP and trade, and would require very little change in the structure of Australia’s economy. Australia is rich in renewable energy potential, which could enable new industries such as energy-intensive manufacturing for export.
Testosterone (T) and cortisol (C) are the end products of neuroendocrine axes that interact with the process of shaping brain structure and function. Relative levels of T to C (the TC ratio) may alter prefrontal–amygdala functional connectivity in adulthood. What remains unclear is whether TC-related effects are rooted in childhood and adolescence. We used a healthy cohort of 4–22-year-olds to test for associations between TC ratios, brain structure (amygdala volume, cortical thickness (CTh), and their coordinated growth), and cognitive and behavioral development. We found greater TC ratios to be associated with the growth of specific brain structures: (1) parietal CTh; and (2) covariance of the amygdala with CTh in visual and somatosensory areas. These brain parameters were in turn associated with lower verbal/executive function and higher spatial working memory. In sum, individual TC profiles may confer a particular brain phenotype and set of cognitive strengths and vulnerabilities prior to adulthood.
The objective of this scoping review was to examine the research question: in adults with or without cardiometabolic risk, what is the availability of literature examining interventions to improve or maintain nutrition and physical activity-related outcomes? Sub-topics included: (1) behaviour counseling or coaching from a dietitian/nutritionist or exercise practitioner, (2) mobile applications to improve nutrition and physical activity and (3) nutritional ergogenic aids.
Design:
The current study is a scoping review. A literature search of the Medline Complete, CINAHL Complete, Cochrane Database of Systematic Reviews and other databases was conducted to identify articles published in the English language from January 2005 until May 2020. Data were synthesised using bubble charts and heat maps.
Setting:
Out-patient, community and workplace.
Participants:
Adults with or without cardiometabolic risk factors living in economically developed countries.
Results:
Searches resulted in 19 474 unique articles and 170 articles were included in this scoping review, including one guideline, thirty systematic reviews (SR), 134 randomised controlled trials and five non-randomised trials. Mobile applications (n 37) as well as ergogenic aids (n 87) have been addressed in several recent studies, including SR. While primary research has examined the effect of individual-level nutrition and physical activity counseling or coaching from a dietitian/nutritionist and/or exercise practitioner (n 48), interventions provided by these practitioners have not been recently synthesised in SR.
Conclusion:
SR of behaviour counseling or coaching provided by a dietitian/nutritionist and/or exercise practitioner are needed and can inform practice for practitioners working with individuals who are healthy or have cardiometabolic risk.
Background: Implementation of antimicrobial stewardship programs (ASPs) in acute-care facilities may optimize antibiotic use and decrease antibiotic resistance. To explore the relationship between ASPs and clinical outcomes, we reviewed bivariate relationships between VA Medical Center (VAMC) complexity level and presence of an ASP, presence of an ASP and inpatient antibiotic use, and antibiotic use and antibiotic resistance or Clostridioides difficile infection (CDI). Methods: We conducted a cross-sectional study of national data using the following elements: a detailed survey of antimicrobial stewardship practices at VAMCs in 2012 which included facility complexity designations; data from the VA national Electronic Health Record (EHR) for inpatient antibiotic use (2009–2012 in days of therapy per 1,000 bed days of care); EHR laboratory data in 2013 for antibiotic resistance in E. coli isolates; and 2013 CDI rate data from the VA Inpatient Evaluation Center. These data were reviewed for assessment of the presence of ASPs and for antibiotic use and resistance. We assessed 4 groups of antibiotics for use and resistance: total antibiotics, fluoroquinolones, cephalosporins, and carbapenems. Categorical, t test, or nonparametric analyses were performed, as appropriate. Results: 120 VAMCs were evaluated; 71% had ASPs. Proportions of VAMCs with ASPs were not significantly different by facility complexity level. Differences were observed between presence or absence of ASP and some antibiotic use groups (Table). Presence or absence of an ASP was not statistically associated with a difference in E. coli resistance (any antibiotic group examined) or CDI rates. In addition, antibiotic use (any group) did not statistically associate with E. coli resistance rates, and this result remained unchanged when stratified by presence or absence of an ASP. 
Conclusions: Total antibiotic use and fluoroquinolone use were lower among facilities with ASPs than among those without, a finding consistent with ASP implementation reducing the amount of antibiotics prescribed. Although we did not find an association between the presence of an ASP and antibiotic resistance or CDI rates, this preliminary review sets the stage for future multivariate analyses. Furthermore, given the years of antibiotic use needed for development of resistance, the limited years evaluated may not have been sufficient to determine an impact, highlighting the need for further research into understanding clinical outcomes.
Background: Inappropriate use of MRSA-spectrum antibiotics is an important antimicrobial stewardship target. Contributors to inappropriate use include empiric treatment of patients who are determined to not be infected or who are infected but lack MRSA risk factors, and by excessive treatment duration when suspected MRSA infection is disproven. To characterize opportunities for improvement, we conducted a medical use evaluation (MUE) in 27 VA medical centers. The primary objectives were to assess the following proportions: (1) courses of unjustified empiric vancomycin therapy (patients in whom all antibacterials were halted within 2 days or without a principal or secondary discharge infection diagnosis); (2) courses of unjustified continuation of anti-MRSA therapy beyond day 4 (no MRSA risk factors or proven MRSA infection); and (3) excess anti-MRSA days of therapy (DOT), that is, DOT in unjustified empiric courses plus DOT after day 4 in unjustified continued courses. Methods: Clinical pharmacists performed retrospective, structured, manual record reviews of patients started on intravenous vancomycin on day 1 or 2 of hospitalization from June 2017 to May 2018. Exclusion criteria included surgical prophylaxis, recent MRSA infection, β-lactam allergy, renal insufficiency, severe immunosuppression, or infection that warranted anti-MRSA therapy other than vancomycin. Results: Of 2,493 evaluated patients, 1,320 met the inclusion criteria. Among them, 44% of courses were initiated in the emergency department, 37% of patients had ≥1 risk factor for healthcare-associated infections, and 50% of patients had ≥2 SIRS criteria or required vasopressor support. The most common admission diagnoses were skin and soft-tissue infection (SSTI, 40%; 68% nonpurulent) and pneumonia (27%; 46% without healthcare risk factors). Clinical cultures recovered MRSA from 8% of patients. Empiric therapy was not justified in 342 patients (26%; 57% were clinically stable). 
Continued therapy was unjustified in 46% of the 320 patients who received >4 days of anti-MRSA therapy. Of all days of anti-MRSA therapy, 23% were unjustified; 65% of these were due to unjustified empiric therapy. Site-specific variations in unjustified empiric therapy better correlated with the proportion of unjustified DOT than did unjustified continuation of therapy (Pearson correlation coefficients [PCC], 0.75 and 0.54, respectively) (Fig. 1). Facility-specific proportions of unjustified DOT modestly correlated with anti-MRSA DOT (PCC, 0.45; n = 27) (Fig. 2) but not the anti-MRSA standardized antimicrobial administration ratio (PCC, 0.15; n = 21). Conclusions: In this multicenter MUE, 26% of all days of anti-MRSA therapy lacked justification; this rate correlated with total facility-specific anti-MRSA DOT. Unnecessary empiric therapy, largely in the ED and for nonpurulent SSTIs and pneumonia without risk factors, was the principal contributor to unjustified DOT.
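The facility-level associations above are summarised with Pearson correlation coefficients. As a sketch of that computation, the coefficient can be obtained from paired per-facility values; the data below are invented purely for illustration (not the study's 27-facility data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-facility pairs: proportion of unjustified DOT
# versus total anti-MRSA days of therapy per 1,000 bed days.
unjustified_prop = [0.15, 0.20, 0.25, 0.30, 0.35]
anti_mrsa_dot = [80, 95, 90, 120, 130]
r = pearson(unjustified_prop, anti_mrsa_dot)
```

A coefficient near 0.45, as reported, would indicate a modest positive relationship; values near 0.75 (as for unjustified empiric therapy) indicate a substantially tighter one.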
Idiopathic Parkinson’s disease (PD) is the second most common neurodegenerative disorder after Alzheimer’s disease (AD), and motorically it is characterized by tremor, rigidity, bradykinesia, and postural instability. Whilst it was historically considered to be a movement disorder, there are multiple non-motor symptoms, which often precede the motor symptoms by years or even decades. These include dysautonomia, sleep disturbances, neuropsychiatric disturbances, pain, and sensory problems. These have a negative effect on quality of life and are associated with higher overall carer burden and, potentially, higher care costs, whilst frequently going undeclared by patients.
Despite its seeming defeat at the hands of the new domestic melodrama and, later, the innovations of dramatic realism, the Gothic, beyond its heyday during the period 1790–1830, continued to stalk the English stage well into the nineteenth century. Shape-shifting and refusing to die outright, the Gothic mode would inform melodrama, domestic drama, sensation drama and even the emerging realist dramas to the end of the century. Moreover, while according to received narratives of theatre history, the new modes of realism would claim a victorious precedence over the archaic drama of immured heroines and haunted castles, this chapter argues that as the fin de siècle loomed, attempts to repress the theatrical Gothic were met with an increasingly Gothic representation of the theatre itself within the wider popular and literary imagination.
The diagnosis of an advanced cancer in young adulthood can bring one's life to an abrupt halt, calling attention to the present moment and creating anguish about an uncertain future. There is seldom time or physical stamina to focus on forward-thinking, social roles, relationships, or dreams. As a result, young adults (YAs) with advanced cancer frequently encounter existential distress and despair, and question the purpose of their life. We sought to investigate the meaning and function of hope throughout YAs’ disease trajectory; to discern the psychosocial processes YAs employ to engage hope; and to develop a substantive theory of hope of YAs diagnosed with advanced cancer.
Method
Thirteen YAs (ages 23–38) diagnosed with a stage III or IV cancer were recruited throughout the eastern and southeastern United States. Participants completed one semi-structured interview, conducted in person, by phone, or via Skype, that incorporated an original timeline instrument assessing fluctuations in hope, along with an online socio-demographic survey. Glaser's grounded theory methodology informed constant comparative methods of data collection, analysis, and interpretation.
Results
Findings from this study informed the development of the novel contingent hope theoretical framework, which describes the pattern of psychosocial behaviors YAs with advanced cancer employ to reconcile identities and strive for a life of meaning. The ability to cultivate the necessary agency and pathways to reconcile identities became contingent on the YAs’ participation in each of the psychosocial processes of the contingent hope theoretical framework: navigating uncertainty, feeling broken, disorienting grief, finding bearings, and identity reconciliation.
Significance of Results
Study findings portray the influential role of hope in motivating YAs with advanced cancer through disorienting grief toward an integrated sense of self that marries cherished aspects of multiple identities. The contingent hope theoretical framework details psychosocial behaviors to inform assessments and interventions fostering hope and identity reconciliation.
Historically, Parkinson's disease was viewed as a motor disorder and it is only in recent years that the spectrum of non-motor disorders associated with the condition has been fully recognised. There is a broad scope of neuropsychiatric manifestations, including depression, anxiety, apathy, psychosis and cognitive impairment. Patients are more predisposed to delirium, and Parkinson's disease treatments give rise to specific syndromes, including impulse control disorders, dopamine agonist withdrawal syndrome and dopamine dysregulation syndrome. This article gives a broad overview of the spectrum of these conditions, describes the association with severity of Parkinson's disease and the degree to which dopaminergic degeneration and/or treatment influence symptoms. We highlight useful assessment scales that inform diagnosis and current treatment strategies to ameliorate these troublesome symptoms, which frequently negatively affect quality of life.
Emerging data suggest that recovery from mild traumatic brain injury (mTBI) takes longer than previously thought. This paper examines trajectories for cognitive recovery up to 48 months post-mTBI, presenting these visually using a Sankey diagram and growth curve analysis.
Methods:
This sample (n = 301) represents adults (≥16 years) from the population-based Brain Injury Outcomes New Zealand in the Community (BIONIC) study, followed over 4 years using the CNS Vital Signs neuropsychological test. Data were collected within 2 weeks of injury, and then at 1, 6, 12 and 48 months post-injury.
Results:
Significant improvement in cognitive functioning was seen up to 6 months post-injury. Using growth curve modelling, we found significant improvements in overall neurocognition from baseline to 6 months; on average, participants improved approximately one point per month (0.9; 95% CI 0.42–1.39; p < 0.001). No change in neurocognition was found within the 6–12 month or 12–48 month periods. The Sankey diagram highlighted that at each time point a small proportion of participants remained unchanged or declined. Proportionally, few showed any improvement after the first 6 months.
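The "one point per month" figure is the slope term of the growth model. A simplified ordinary-least-squares sketch of how such a slope is estimated, using hypothetical scores for a single participant and ignoring the multilevel structure of the actual analysis:

```python
def ols_slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

months = [0, 1, 6]           # assessments at baseline, 1 and 6 months post-injury
scores = [90.0, 91.0, 95.4]  # hypothetical overall neurocognition scores
slope = ols_slope(months, scores)  # roughly 0.9 points of improvement per month
```

The study's growth curve model extends this idea by pooling all participants and allowing individual-level variation in intercepts and slopes, which is what makes the divergent trajectories visible in the Sankey diagram possible to mask in the averages.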
Conclusion:
Most individuals remained stable or improved over time to 6 months post-injury. Summary statistics are informative regarding overall trends but can mask differing recovery trajectories. The Sankey diagram indicates that not all individuals improve, and shows the potential impact of participants moving in and out of the study. It also indicated the level of functioning of those most likely to withdraw, allowing retention strategies to be targeted.
In this article we engage in a critical examination of how local authority Housing Solutions staff, newly placed centre stage in preventing homelessness amongst prison leavers in Wales, understand and go about their work. Drawing on Carlen’s concept of ‘imaginary penalities’ and Ugelvik’s notion of ‘legitimation work’, we suggest practice with this group can be ritualistic and underpinned by a focus on prison leavers’ responsibilities over their rights, and on public protection over promoting resettlement. In response, we advocate for less punitive justice and housing policies, underpinned by the right to permanent housing for all prison leavers and wherein stable accommodation is understood as the starting point for resettlement. The analysis presented in this article provides insights into how homelessness policies could play out in jurisdictions where more joint working between housing and criminal justice agencies is being pursued and/or preventative approaches to managing homelessness are being considered.