Difficulties with decision making and risk taking in individuals with bipolar disorder (BD) have been associated with mood episodes. However, there is limited information about these experiences during euthymia, the mood state in which people with BD spend the majority of their time.
Aims:
To examine how individuals with BD consider risk in everyday decisions during their euthymic phase.
Method:
We conducted a qualitative study that used semi-structured audio recorded interviews. Eight euthymic participants with confirmed BD were interviewed, and we used interpretative phenomenological analysis to analyse the data.
Results:
We identified four themes. The first theme, ‘Who I really am’, involves the relationship between individual identity and risks taken. The second theme, ‘Taking back control of my life’, explores the relationship between risk taking and participants’ efforts to keep control of their lives. The third theme, ‘Fear of the “what ifs”’, represents how fear of the negative consequences of taking risks shapes risk decisions. Finally, the fourth theme, ‘The role of family and friends’, highlights the important role that a supportive network can play in participants’ lives in the context of taking risks.
Conclusions:
The study highlights factors that can influence how an individual with BD considers risk during euthymia. Identity, control, fear and support all play a role when a person weighs risk in their decision-making process; these factors should be taken into consideration when exploring risk with individuals with BD in clinical settings, and should inform the design of future interventions.
This is the first report on the association between trauma exposure and depression from the Advancing Understanding of RecOvery afteR traumA (AURORA) multisite longitudinal study of adverse post-traumatic neuropsychiatric sequelae (APNS) among participants seeking emergency department (ED) treatment in the aftermath of a traumatic life experience.
Methods
We focus on participants presenting at EDs after a motor vehicle collision (MVC), which characterizes most AURORA participants, and examine associations of participant socio-demographics and MVC characteristics with 8-week depression as mediated through peritraumatic symptoms and 2-week depression.
Results
Eight-week depression prevalence was relatively high (27.8%) and associated with several MVC characteristics (being a passenger v. driver; injuries to other people). Peritraumatic distress was associated with 2-week but not 8-week depression. Most of these associations held when controlling for peritraumatic symptoms and, to a lesser degree, depressive symptoms at 2 weeks post-trauma.
Conclusions
These observations, coupled with substantial variation in the relative strength of the mediating pathways across predictors, raise the possibility of diverse and potentially complex underlying biological and psychological processes that remain to be elucidated. More in-depth analyses of the rich and evolving AURORA database may identify new targets for intervention and new tools for risk-based stratification following trauma exposure.
Antibiotic prescribing practices across the Veterans’ Health Administration (VA) experienced significant shifts during the coronavirus disease 2019 (COVID-19) pandemic. From 2015 to 2019, antibiotic use between January and May decreased from 638 to 602 days of therapy (DOT) per 1,000 days present (DP), while the corresponding months in 2020 saw antibiotic utilization rise to 628 DOT per 1,000 DP.
Background: Accurate identification of Clostridioides difficile infections (CDIs) from electronic data sources is important for surveillance. We evaluated how frequently laboratory findings were supported by diagnostic coding and treatment data in the electronic health record.
Methods: We analyzed a retrospective cohort of patients in the Veterans’ Affairs Health System from 2006 through 2016. A CDI event was defined as a positive laboratory test for C. difficile toxin or toxin genes in the inpatient, outpatient, or long-term care setting with no prior positive test in the preceding 14 days. Events were classified as incident (no CDI in the prior 56 days) or recurrent (CDI in the prior 56 days) and were evaluated for evidence of clinical diagnosis based on International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) and ICD-10-CM codes and at least 1 dose of an anti-CDI agent (intravenous or oral metronidazole, fidaxomicin, or oral vancomycin). We further assessed the possibility of treatment without testing by quantifying positive laboratory tests and diagnostic codes among inpatients receiving an anti-CDI agent. A course of anti-CDI therapy was defined as continuous treatment with the same drug.
Results: Among 119,063 incident and recurrent CDI events, 70,114 (58.9%) had a diagnosis code and 15,850 (13.3%) had no accompanying treatment. The proportion of patients with ICD codes was highest among patients treated with fidaxomicin (82.6% of 906) or oral vancomycin (74.3% of 30,777) and was lower among patients receiving metronidazole (63.3% of 103,231) and those without treatment (29.9% of 15,850). The proportion of events with ICD codes and treatment was similar between incident and recurrent episodes. During the study period, there were ~470,000 inpatient courses of metronidazole, fidaxomicin, and oral vancomycin. Table 1 shows the presence of ICD codes and positive laboratory tests by anti-CDI agent. Among 51,100 courses of oral vancomycin, 51% had an ICD code and 44% had a positive test for C. difficile within 7 days of treatment initiation. Among 1,013 courses of fidaxomicin, 79% had an ICD code and 56% had a positive laboratory test.
Conclusions: In this large cohort, there was evidence of substantial CDI treatment without confirmatory C. difficile testing and, to a lesser extent, some positive tests without accompanying treatment or coding. A combination of data sources may be needed to more accurately identify CDI from electronic health records for surveillance purposes.
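To make the event definition above concrete, here is a minimal Python sketch that groups positive C. difficile tests into events using the 14-day deduplication window and labels each event incident or recurrent via the 56-day look-back. Anchoring the 56-day window to the date of the prior event is our assumption; the abstract does not specify that detail.

```python
from datetime import date, timedelta

def classify_cdi_events(test_dates):
    """Group positive C. difficile tests into CDI events and label each one.

    A positive test starts a new event only if there is no prior positive in
    the preceding 14 days; an event is 'recurrent' if another event occurred
    in the prior 56 days, otherwise 'incident'.
    """
    events = []
    last_positive = None  # most recent positive test
    last_event = None     # most recent event start
    for d in sorted(test_dates):
        # Positives within 14 days of the previous positive are the same event.
        if last_positive is not None and d - last_positive <= timedelta(days=14):
            last_positive = d
            continue
        label = ("recurrent"
                 if last_event is not None and d - last_event <= timedelta(days=56)
                 else "incident")
        events.append((d, label))
        last_event = d
        last_positive = d
    return events

# Two positives 9 days apart collapse into one incident event; a new positive
# 50 days after that event counts as a recurrence.
print(classify_cdi_events([date(2016, 1, 1), date(2016, 1, 10), date(2016, 2, 20)]))
```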
Background: Assessing antimicrobial use (AU) appropriateness is a cornerstone of antimicrobial stewardship, largely accomplished through time-intensive manual chart review of specific agents or diagnoses. Efforts to evaluate appropriateness have focused on assessing entire treatment courses. We developed an electronic measure to assess the appropriateness of each day of inpatient AU, leveraging electronic health record data.
Methods: We extracted contextual data, including risk factors for resistant organisms, allergies, constitutional signs and symptoms from diagnostic and procedural codes, and microbiological findings, from the electronic health records of patients in Veterans’ Health Administration inpatient wards reporting data to the National Healthcare Safety Network (NHSN) AU option from 2017–2018. Only the antibacterial categories shown in Figure 1 were included. Respiratory, urinary tract, skin and soft-tissue, and other infection categories were defined and applied to each hospital day. Algorithm rules were constructed to evaluate AU based on the clinical context (eg, in the ICU, during empiric therapy, drug–pathogen match, recommended drugs, and duration). Rules were drawn from the available literature, discussed with experts, and then refined empirically. Generally, the rules allowed for use of first-line agents unless risk factors or contraindications were identified. AU was categorized as appropriate, inappropriate, or indeterminate for each day, then aggregated into an overall measure of facility-level AU appropriateness. A validation set of 20 charts was randomly selected for manual review.
Results: The facility distribution of appropriate, inappropriate, and indeterminate AU for 4 of the adult 2017 baseline NHSN Standardized Antimicrobial Administration Ratio (SAAR) categories is shown in Figure 1. The median facility-level inappropriateness across all SAAR categories was 37.2% (IQR, 29.4%–52.5%). The median facility-level indeterminate AU across all SAAR categories was 14.4% (IQR, 9.1%–21.2%). Chart review of 20 admissions showed agreement with algorithm appropriateness and inappropriateness in 95.4% of 240 antibacterial days.
Conclusions: We developed a comprehensive, flexible electronic tool to evaluate AU appropriateness for combinations of setting, antibacterial agent, syndrome, or time frame of interest (eg, empiric, definitive, or excess duration). Application of our algorithm to 2 years of VA acute-care data suggests substantial interfacility variability; the highest rates of inappropriateness were for anti-MRSA therapy. Our preliminary chart review demonstrated agreement between electronic and manual review in >95% of antimicrobial days. This approach may be useful for identifying potential stewardship targets, in the development of decision support systems, and in conjunction with other metrics to track AU over time.
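As a rough illustration of the day-level algorithm described in the Methods, the Python sketch below classifies one patient-day of antibacterial therapy using a small set of ordered rules. The field names, rule order, and aggregation are illustrative assumptions, not the actual VA/NHSN implementation.

```python
def classify_day(day):
    """Label one patient-day of antibacterial therapy.

    `day` is a dict of boolean flags derived from EHR context; returns
    'appropriate', 'inappropriate', or 'indeterminate'.
    """
    if day.get("context_missing"):
        return "indeterminate"          # not enough EHR data to judge the day
    if day.get("drug_pathogen_mismatch"):
        return "inappropriate"          # culture shows the agent will not cover
    if day.get("beyond_recommended_duration"):
        return "inappropriate"          # excess duration for the syndrome
    if day.get("first_line_agent"):
        return "appropriate"
    # Second-line or broad agents need documented justification.
    if day.get("resistance_risk_factors") or day.get("first_line_contraindicated"):
        return "appropriate"
    return "inappropriate"

def facility_inappropriate_rate(days):
    """Aggregate day-level labels into a facility-level measure."""
    labels = [classify_day(d) for d in days]
    return labels.count("inappropriate") / max(1, len(labels))
```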
Few studies have derived data-driven dietary patterns in youth in the USA. This study examined data-driven dietary patterns and their associations with BMI measures in predominantly low-income, racial/ethnic minority US youth. Data were from baseline assessments of the four Childhood Obesity Prevention and Treatment Research (COPTR) Consortium trials: NET-Works (534 2–4-year-olds), GROW (610 3–5-year-olds), GOALS (241 7–11-year-olds) and IMPACT (360 10–13-year-olds). Weight and height were measured. Children/adult proxies completed three 24-h dietary recalls. Dietary patterns were derived for each site from twenty-four food/beverage groups using k-means cluster analysis. Multivariable linear regression models examined associations of dietary patterns with BMI and percentage of the 95th BMI percentile. Healthy (produce and whole grains) and Unhealthy (fried food, savoury snacks and desserts) patterns were found in NET-Works and GROW. GROW additionally had a dairy- and sugar-sweetened beverage-based pattern. GOALS had a similar Healthy pattern and a pattern resembling a traditional Mexican diet. Associations between dietary patterns and BMI were only observed in IMPACT. In IMPACT, youth in the Sandwich (cold cuts, refined grains, cheese and miscellaneous) compared with Mixed (whole grains and desserts) cluster had significantly higher BMI (β = 0·99 (95 % CI 0·01, 1·97)) and percentage of the 95th BMI percentile (β = 4·17 (95 % CI 0·11, 8·24)). Healthy and Unhealthy patterns were the most common dietary patterns in COPTR youth, but diets may differ according to age, race/ethnicity or geographic location. Public health messages focused on healthy dietary substitutions may help youth mimic a dietary pattern associated with lower BMI.
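For readers unfamiliar with the derivation step, the Python sketch below shows how k-means clustering can assign participants to dietary patterns from intakes across twenty-four food/beverage groups. The standardisation step, the choice of k and the simulated data are assumptions; the COPTR analyses may have differed in these details.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
intakes = rng.gamma(2.0, 1.0, size=(500, 24))  # participants x 24 food groups

# Standardise so that high-volume food groups do not dominate the distances.
X = StandardScaler().fit_transform(intakes)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = km.labels_  # each participant's dietary-pattern assignment

# Cluster centres (on the z-score scale) show which food groups characterise
# each pattern, e.g. high produce/whole-grain loadings suggest a 'Healthy' label.
print(km.cluster_centers_.shape)  # (3, 24)
```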
The COVID-19 pandemic has had a major impact on clinical practice. Safe standards of practice are essential to protect health care workers while still allowing them to provide good care. The Canadian Society of Clinical Neurophysiologists, the Canadian Association of Electroneurophysiology Technologists, the Association of Electromyography Technologists of Canada, the Board of Registration of Electromyography Technologists of Canada, and the Canadian Board of Registration of Electroencephalograph Technologists have combined to review current published literature about safe practices for neurophysiology laboratories. Herein, we present the results of our review and provide our expert opinion regarding the safe practice of neurophysiology during the COVID-19 pandemic in Canada.
Head impact exposure (HIE) in youth football is a public health concern. The objective of this study was to determine if one season of HIE in youth football was related to cognitive changes.
Method:
Over 200 participants (ages 9–13) wore instrumented helmets for practices and games to measure the amount of HIE sustained over one season. Pre- and post-season neuropsychological tests were completed. Test score changes were calculated adjusting for practice effects and regression to the mean and used as the dependent variables. Regression models were calculated with HIE variables predicting neuropsychological test score changes.
Results:
For the full sample, a small effect was found, with season-average rotational values predicting changes in list learning such that greater HIE was related to negative score change: standardized beta (β) = -.147, t(205) = -2.12, p = .035. When analyses were split by age cluster (9–10, 11–13) and participant weight was added to the models, the R2 values increased. Splitting groups by weight (median split) showed that heavier members of the 9–10 cohort had significantly greater change than lighter members. Additionally, significantly more of these participants had clinically meaningful negative changes: χ2 = 10.343, p = .001.
Conclusion:
These findings suggest that, in the 9–10 age cluster, the average seasonal level of HIE had an inverse relationship with cognitive change over one season, a relationship not found in the older group. The mediation effects of age and weight have not been explored previously and appear to contribute to the effects of HIE on cognition in youth football players.
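A minimal sketch of the core analytic step, regressing adjusted change scores on a season-average head-impact metric, is given below in Python with simulated data. Centring change scores on the group mean is only a crude stand-in for the practice-effect and regression-to-the-mean corrections the study used.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 206
rotation = rng.gamma(3.0, 400.0, n)              # season-average rotational HIE
pre = rng.normal(50, 10, n)                      # pre-season test score
post = pre + 2.0 - 0.0005 * rotation + rng.normal(0, 5, n)  # +2 ~ practice effect

# Crude adjustment: remove the group-average gain from each change score.
change = (post - pre) - (post - pre).mean()

def z(v):
    return (v - v.mean()) / v.std()

# Standardized beta = slope of z-scored change regressed on z-scored exposure.
beta = np.polyfit(z(rotation), z(change), 1)[0]
print(f"standardized beta = {beta:.3f}")         # negative, as in the abstract
```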
OBJECTIVES/GOALS: The objective of this study is to define the molecular mechanisms that control survival of malignant stem cells in acute myeloid leukemia (AML). Leukemia stem cells (LSCs) are not effectively eradicated by standard treatment and lead to resistance and relapse, which contribute to poor survival rates.
METHODS/STUDY POPULATION: The recently FDA-approved combination of venetoclax, a BCL2 inhibitor, with azacitidine, a hypomethylating agent, leads to a 70% response rate in AML patients. Analysis of patients treated with this regimen showed direct targeting of LSCs. BCL2 has a non-canonical function in the regulation of intracellular calcium. To determine how BCL2-mediated calcium signaling plays a role in LSC biology, we used LSCs isolated from venetoclax/azacitidine (ven/aza)-sensitive and -resistant patient samples to measure expression of calcium channels via RNA-seq. BIO-ID, siRNA, flow cytometry, Seahorse assays, calcium measurements and colony assays were used to determine the effects of calcium channel perturbation on LSC biology.
RESULTS/ANTICIPATED RESULTS: BCL2 inhibition leads to decreased OXPHOS activity in primary AML specimens. BIO-ID studies revealed cation/metal ion transporters, ER membrane proteins and ER membrane organization as the top enriched pathways interacting with BCL2. RNA-seq data showed increased expression of genes involved in calcium influx into the ER in ven/aza-sensitive LSCs and increased expression of genes involved in calcium efflux from the ER in ven/aza-resistant samples. Ven/aza-resistant LSCs have increased mitochondrial calcium content, consistent with their increased OXPHOS activity, as calcium is required for OXPHOS. Perturbation of these channels leads to decreased OXPHOS activity and decreased viability in LSCs.
DISCUSSION/SIGNIFICANCE OF IMPACT: We postulate that a deeper understanding of the mechanisms behind ven/aza targeting of LSCs will lead to the development of novel therapies for patients who do not respond to ven/aza. Our data show that targeting intracellular calcium signaling could be a viable therapeutic strategy for AML patients.
Alcohol misuse is common in bipolar disorder and is associated with worse outcomes. A recent study evaluated integrated motivational interviewing and cognitive behavioural therapy for bipolar disorder and alcohol misuse with promising results in terms of the feasibility of delivering the therapy and the acceptability to participants.
Aims:
Here we present the experiences of the therapists and supervisors from the trial to identify the key challenges in working with this client group and how these might be overcome.
Method:
Four therapists and two supervisors participated in a focus group. Topic guides for the group were informed by a summary of challenges and obstacles that each therapist had completed at the end of therapy for each individual client. The audio recording of the focus group was transcribed and data were analysed using thematic analysis.
Results:
We identified five themes: addressing alcohol use versus other problems; impact of bipolar disorder on therapy; importance of avoidance and overcoming it; fine balance in relation to shame and normalising use; and ‘talking the talk’ versus ‘walking the walk’.
Conclusions:
Findings suggest that clients may be willing to explore motivations for using alcohol even if they are not ready to change their drinking, and they may want help with a range of mental health problems. Emotional and behavioural avoidance may be a key factor in maintaining alcohol use in this client group and therapists should be aware of a possible discrepancy between clients’ intentions to reduce misuse and their actual behaviour.
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predict poor outcomes.
Design:
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
Methods:
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or an SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
Results:
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
Conclusions:
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
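For concreteness, here is a minimal Python rendering of the two severity definitions compared above. The leukocytosis cutoff of 15,000 cells/µL follows the published SHEA/IDSA guidelines rather than this abstract, and treating any missing laboratory value as unclassifiable is a simplification.

```python
WBC_CUTOFF = 15_000  # cells/uL; SHEA/IDSA leukocytosis threshold (assumption here)

def severe_2010(wbc, scr, scr_baseline):
    """2010 criteria: leukocytosis or SCr >= 1.5 x the patient's baseline."""
    if None in (wbc, scr, scr_baseline):
        return None  # unclassifiable, as in many subacute-care episodes
    return wbc >= WBC_CUTOFF or scr >= 1.5 * scr_baseline

def severe_2018(wbc, scr):
    """2018 criteria: leukocytosis or an absolute SCr >= 1.5 mg/dL."""
    if None in (wbc, scr):
        return None
    return wbc >= WBC_CUTOFF or scr >= 1.5

# The relative vs absolute creatinine thresholds can disagree on the same labs:
print(severe_2010(9_000, 1.6, 1.2), severe_2018(9_000, 1.6))  # False True
```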
Monolayer (ML) molybdenum disulfide (MoS₂) is a novel 2-dimensional (2D) semiconductor whose properties have many applications in devices. Despite its potential, ML MoS₂ is limited in its use due to its degradation under exposure to ambient air. Therefore, studies of possible degradation-prevention methods are important. It is well established that air humidity plays a major role in the degradation. In this paper, we investigate the effects of substrate hydrophobicity on the degradation of chemical vapor deposition (CVD)-grown ML MoS₂. We use optical microscopy, atomic force microscopy (AFM), and Raman mapping to investigate the degradation of ML MoS₂ grown on SiO₂ and Si₃N₄, which are hydrophilic and hydrophobic substrates, respectively. Our results show that the degradation of ML MoS₂ on Si₃N₄ is significantly less than the degradation on SiO₂. These results show that using hydrophobic substrates to grow 2D transition metal dichalcogenide ML materials may diminish ambient degradation and enable improved protocols for device manufacturing.
Childhood abuse is a risk factor for poorer illness course in bipolar disorder, but the reasons why are unclear. Trait-like features such as affective instability and impulsivity could be part of the explanation. We aimed to examine whether childhood abuse was associated with clinical features of bipolar disorder, and whether associations were mediated by affective instability or impulsivity.
Methods
We analysed data from 923 people with bipolar I disorder recruited by the Bipolar Disorder Research Network. Adjusted associations between childhood abuse, affective instability and impulsivity and eight clinical variables were analysed. A path analysis examined the direct and indirect links between childhood abuse and clinical features with affective instability and impulsivity as mediators.
Results
Affective instability significantly mediated the association between childhood abuse and earlier age of onset [effect estimate (θ)/standard error (SE): 2.49], number of depressive (θ/SE: 2.08) and manic episodes/illness year (θ/SE: 1.32), anxiety disorders (θ/SE: 1.98) and rapid cycling (θ/SE: 2.25). Impulsivity significantly mediated the association between childhood abuse and manic episodes/illness year (θ/SE: 1.79), anxiety disorders (θ/SE: 1.59), rapid cycling (θ/SE: 1.809), suicidal behaviour (θ/SE: 2.12) and substance misuse (θ/SE: 3.09). Measures of path analysis fit indicated an excellent fit to the data.
Conclusions
Affective instability and impulsivity are likely part of the mechanism of why childhood abuse increases risk of poorer clinical course in bipolar disorder, with each showing some selectivity in pathways. They are potential novel targets for intervention to improve outcome in bipolar disorder.
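To illustrate the mediation logic behind the θ/SE ratios above, here is a toy product-of-coefficients sketch in Python on simulated data. The single-mediator setup, variable names and bootstrap standard error are illustrative assumptions; the study fit a multi-mediator path model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 923  # matches the BDRN sample size
abuse = rng.binomial(1, 0.3, n).astype(float)               # exposure
instability = 0.5 * abuse + rng.normal(size=n)              # mediator
onset_age = 30 - 2.0 * instability - 0.5 * abuse + rng.normal(size=n)  # outcome

def indirect_effect(x, m, y):
    """Product-of-coefficients estimate of the mediated (indirect) effect."""
    a = np.polyfit(x, m, 1)[0]                              # x -> m path
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]             # m -> y path given x
    return a * b

est = indirect_effect(abuse, instability, onset_age)
# Bootstrap the standard error to form a theta/SE ratio like those reported above.
boot = [indirect_effect(abuse[i], instability[i], onset_age[i])
        for i in (rng.integers(0, n, n) for _ in range(200))]
print(f"theta = {est:.2f}, theta/SE = {est / np.std(boot):.2f}")
```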
Clinical Enterobacteriaceae isolates with a colistin minimum inhibitory concentration (MIC) ≥4 mg/L from a United States hospital were screened for the mcr-1 gene using real-time polymerase chain reaction (RT-PCR) and confirmed by whole-genome sequencing. Four colistin-resistant Escherichia coli isolates contained mcr-1. Two isolates belonged to the same sequence type (ST-632). All subjects had prior international travel and antimicrobial exposure.
In the USA, western Washington (WWA) and the Alaska (AK) Interior are two regions where maritime and continental climates, high latitude and cropping systems necessitate early maturing spring wheat (Triticum aestivum L.). Both regions aim to increase the production of hard spring bread wheat for human consumption to support regional agriculture and food systems. The Nordic region of Europe has a history of breeding for early maturing spring wheat and also experiences long daylengths with mixed maritime and continental climates. Nordic wheat also carries wildtype (wt) NAM-B1, an allele associated with accelerated senescence and increased grain protein and micronutrient content, at a higher frequency than global germplasm. Time to senescence, yield, protein and mineral content were evaluated on 42 accessions of Nordic hard red spring wheat containing wt NAM-B1 over 2 years on experimental stations in WWA and the AK Interior. Significant variation was found by location and accession for time to senescence, suggesting potential parental lines for breeding programmes targeting early maturity. Additionally, multiple regression analysis showed that decreased time to senescence correlated negatively with grain yield and positively with grain protein, iron and zinc content. Breeding for early maturity in these regions will need to account for this potential trade-off in yield. Nordic wt NAM-B1 accessions with early senescence yet with yields similar to regional checks are reported. Collaboration among alternative wheat regions can aid in germplasm exchange and varietal development as shown here for the early maturing trait.
One year of antipsychotic treatment from symptom remission is recommended following a first episode of psychosis (FEP).
Aims
To investigate the effectiveness of commonly used antipsychotic medications in FEP.
Method
A retrospective cohort study of naturalistic treatment of patients (N = 460) accepted by FEP services across seven UK sites. Time from treatment initiation to all-cause discontinuation was determined from case files.
Results
Risk of treatment discontinuation was greatest within 3 months of treatment initiation. Risperidone had the longest median survival time. No significant differences in time to discontinuation were observed between commonly used antipsychotics on multivariable Cox regression analysis. Poor adherence and efficacy failure were the most common reasons for discontinuation.
Conclusions
Effectiveness differences appear not to be a current reason for antipsychotic choice in FEP. Adherence strategies and weighing up likely adverse effects should be the clinical focus.
To determine the features associated with better perceived quality of training for psychiatrists on advance decision-making under the Mental Capacity Act 2005 (MCA), and whether the quality or amount of training was associated with positive attitudes towards, or use of, advance decisions to refuse treatment (ADRTs) by psychiatrists with people with bipolar disorder. An anonymised national survey of 650 trainee and consultant psychiatrists in England and Wales was performed.
Results
Good or better quality of training was associated with use of case summaries, role-play, ADRTs, assessment of mental capacity and its fluctuation. Good or better quality and two or more sessions of MCA training were associated with more positive attitudes and reported use of ADRTs, although many psychiatrists would never discuss them clinically with people with bipolar disorder.
Clinical implications
Consistent delivery of better-quality training is required for all psychiatrists to increase use of ADRTs in people with bipolar disorder.
The purpose of this study was to quantify the effect of multidrug-resistant (MDR) gram-negative bacteria and methicillin-resistant Staphylococcus aureus (MRSA) healthcare-associated infections (HAIs) on mortality following infection, regardless of patient location.
METHODS
We conducted a retrospective cohort study of patients with an inpatient admission in the US Department of Veterans Affairs (VA) system between October 1, 2007, and November 30, 2010. We constructed multivariate log-binomial regressions to assess the impact of a positive culture on mortality in the 30- and 90-day periods following the first positive culture, using a propensity-score–matched subsample.
RESULTS
Patients identified with positive cultures due to MDR Acinetobacter (n=218), MDR Pseudomonas aeruginosa (n=1,026), and MDR Enterobacteriaceae (n=3,498) were propensity-score matched to 14,591 patients without positive cultures due to these organisms. In addition, 3,471 patients with positive cultures due to MRSA were propensity-score matched to 12,499 patients without positive MRSA cultures. Multidrug-resistant gram-negative bacteria were associated with a significantly elevated risk of mortality both for invasive (RR, 2.32; 95% CI, 1.85–2.92) and noninvasive cultures (RR, 1.33; 95% CI, 1.22–1.44) during the 30-day period. Similarly, patients with MRSA HAIs (RR, 2.77; 95% CI, 2.39–3.21) and colonizations (RR, 1.32; 95% CI, 1.22–1.50) had an increased risk of death at 30 days.
CONCLUSIONS
We found that HAIs due to multidrug-resistant gram-negative bacteria and MRSA conferred significantly elevated 30- and 90-day risks of mortality. This finding held true both for invasive cultures, which likely represent true infections, and for noninvasive cultures, which may represent colonization.
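As a worked example of the effect measure reported above, the Python sketch below computes a risk ratio and Wald 95% CI from 2×2 counts. The study itself fit multivariate log-binomial models on propensity-score-matched samples; the counts here are invented purely to illustrate the arithmetic.

```python
import math

def risk_ratio(d_exp, n_exp, d_unexp, n_unexp):
    """Risk ratio and Wald 95% CI from deaths (d) and totals (n) per group."""
    rr = (d_exp / n_exp) / (d_unexp / n_unexp)
    se_log = math.sqrt(1 / d_exp - 1 / n_exp + 1 / d_unexp - 1 / n_unexp)
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, lo, hi

# Invented counts: 30-day mortality among matched exposed vs unexposed patients.
rr, lo, hi = risk_ratio(120, 1000, 60, 1160)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```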