To evaluate the construct validity of the NIH Toolbox Cognitive Battery (NIH TB-CB) in the healthy oldest-old (85+ years old).
Our sample from the McKnight Brain Aging Registry consists of 179 individuals, 85 to 99 years of age, screened for memory, neurological, and psychiatric disorders. Applying methods from previous research to this sample of adults aged 85 and older, we conducted confirmatory factor analyses on models of the NIH TB-CB and same-domain standard neuropsychological measures. We hypothesized that the five-factor model (Reading, Vocabulary, Memory, Working Memory, and Executive/Speed) would have the best fit, consistent with findings in younger populations. We assessed convergent and discriminant validity. We also evaluated demographic and computer-use predictors of NIH TB-CB composite scores.
Findings suggest the six-factor model (Vocabulary, Reading, Memory, Working Memory, Executive, and Speed) had a better fit than alternative models. NIH TB-CB tests had good convergent and discriminant validity, though tests in the executive functioning domain had high inter-correlations with other cognitive domains. Computer use was strongly associated with higher NIH TB-CB overall and fluid cognition composite scores.
The NIH TB-CB is a valid assessment for the oldest-old samples, with relatively weak validity in the domain of executive functioning. Computer use’s impact on composite scores could be due to the executive demands of learning to use a tablet. Strong relationships of executive function with other cognitive domains could be due to cognitive dedifferentiation. Overall, the NIH TB-CB could be useful for testing cognition in the oldest-old and the impact of aging on cognition in older populations.
Cognitive therapy and behavioural activation are both widely applied and effective psychotherapies for depression, but it is unclear which works best for whom. Individual participant data (IPD) meta-analysis allows for examining moderators at the participant level and can provide more precise effect estimates than conventional meta-analysis, which is based on study-level data.
This article describes the protocol for a systematic review and IPD meta-analysis that aims to compare the efficacy of cognitive therapy and behavioural activation for adults with depression, and to explore moderators of treatment effect. (PROSPERO: CRD42022341602)
Systematic literature searches will be conducted in PubMed, PsycINFO, EMBASE and the Cochrane Library, to identify randomised clinical trials comparing cognitive therapy and behavioural activation for adult acute-phase depression. Investigators of these trials will be invited to share their participant-level data. One-stage IPD meta-analyses will be conducted with mixed-effects models to assess treatment effects and to examine various available demographic, clinical and psychological participant characteristics as potential moderators. The primary outcome measure will be depressive symptom level at treatment completion. Secondary outcomes will include post-treatment anxiety, interpersonal functioning and quality of life, as well as follow-up outcomes.
To the best of our knowledge, this will be the first IPD meta-analysis concerning cognitive therapy versus behavioural activation for adult depression. This study has the potential to enhance our knowledge of depression treatment by using state-of-the-art statistical techniques to compare the efficacy of two widely used psychotherapies, and by shedding more light on which of these treatments might work best for whom.
Background: The estimated economic cost of Clostridioides difficile infection (CDI) is $5.4 billion annually, primarily attributed to acute-care costs. We previously reported data from ECOSPOR III that SER-109, an investigational oral microbiome therapeutic, was superior to placebo in reducing recurrent CDI (rCDI) in adults at 8 weeks after treatment, with a 68% relative risk reduction. Adults with rCDI have more hospitalizations and emergency room (ER) visits (defined herein as healthcare resource utilization, HRU) compared to those without recurrence. Thus, we evaluated incidence of HRU. Methods: Adults with rCDI (≥3 episodes in 12 months) were screened at 56 US and Canadian sites and were randomized 1:1 to SER-109 (4 capsules × 3 days) or placebo following resolution of CDI with standard-of-care CDI antibiotics. The primary end point was rCDI at 8 weeks. Exploratory end points included cumulative incidence of hospitalizations through 24 weeks after treatment. Here, we report cumulative incidence of all-cause HRU through 8 weeks after treatment. Results: In total, 281 patients were screened and 182 were randomized (59.9% female; mean age 65.5 years; 98.9% outpatient). Overall, 31 patients (17%) had 38 hospitalizations or ER visits through week 8 (11 events in 10 SER-109 patients and 27 events in 21 placebo patients) (Table 1). The cumulative incidence of HRU was lower in SER-109–treated patients compared to placebo at both weeks 4 and 8 with most events (65.8%) recorded within 4 weeks after treatment. The adjusted HRU incidence rate (by person time, age, sex, and antibiotic use) was also lower in SER-109–treated patients compared to placebo at weeks 4 and 8 (0.256 [95% CI, 0.096–0.683] versus 0.417 [95% CI, 0.199–0.873], respectively). Conclusions: SER-109–treated patients had less HRU compared to placebo patients through 8 weeks after treatment in this mostly outpatient population. 
These data suggest a potential benefit of SER-109 in reducing HRU, thus lowering the healthcare burden of rCDI.
Autism and autistic traits are risk factors for suicidal behaviour.
To explore the prevalence of autism (diagnosed and undiagnosed) in those who died by suicide, and identify risk factors for suicide in this group.
Stage 1: 372 coroners’ inquest records, covering the period 1 January 2014 to 31 December 2017 from two regions of England, were analysed for evidence that the person who died had diagnosed autism or undiagnosed possible autism (elevated autistic traits), and identified risk markers. Stage 2: 29 follow-up interviews with the next of kin of those who died gathered further evidence of autism and autistic traits using validated autism screening and diagnostic tools.
Stage 1: evidence of autism (10.8%) was significantly higher in those who died by suicide than the 1.1% prevalence expected in the UK general living population (odds ratio (OR) = 11.08, 95% CI 3.92–31.31). Stage 2: 5 (17.2%) of the follow-up sample had evidence of autism identified from the coroners’ records in stage 1. We identified evidence of undiagnosed possible autism in an additional 7 (24.1%) individuals, giving a total of 12 (41.4%); significantly higher than expected in the general living population (1.1%) (OR = 19.76, 95% CI 2.36–165.84). Characteristics of those who died were largely similar regardless of evidence of autism, with both groups experiencing a comparably high number of risk markers before they died.
Elevated autistic traits are significantly over-represented in those who die by suicide.
Food insecurity is associated with numerous adverse health outcomes. The US Veterans Health Administration (VHA) began universal food insecurity screening in 2017. This study examined prevalence and correlates of food insecurity among Veterans screened.
Retrospective cross-sectional study using VHA administrative data. Multivariable logistic regression models were estimated to identify sociodemographic and medical characteristics associated with a positive food insecurity screen.
All US Veterans Administration (VA) medical centres (n 161).
All Veterans screened for food insecurity since screening initiation (July 2017–December 2018).
Of 3 304 702 Veterans screened for food insecurity, 44 298 were positive on their initial screen (1·3 % of men; 2·0 % of women). Food insecurity was associated with identifying as non-Hispanic Black or Hispanic. Veterans who were non-married/partnered, low-income Veterans without VA disability-related compensation and those with housing instability had higher odds of food insecurity, as did Veterans with a BMI < 18·5, diabetes, depression and post-traumatic stress disorder. Prior military sexual trauma (MST) was associated with food insecurity among both men and women. Women screening positive, however, were eight times more likely than men to have experienced MST (48·9 % v. 5·9 %).
Food insecurity was associated with medical and trauma-related comorbidities as well as unmet social needs including housing instability. Additionally, Veterans of colour and women were at higher risk for food insecurity. Findings can inform development of tailored interventions to address food insecurity such as more frequent screening among high-risk populations, onsite support applying for federal food assistance programs and formal partnerships with community-based resources.
Agitation is a common complication of Alzheimer’s dementia (Agit-AD) associated with substantial morbidity, high healthcare service utilization, and adverse emotional and physical impact on care partners. There are currently no FDA-approved pharmacological treatments for Agit-AD. We present the study design and baseline data for an ongoing multisite, three-week, double-blind, placebo-controlled, randomized clinical trial of dronabinol (synthetic tetrahydrocannabinol [THC]), titrated to a dose of 10 mg daily, in 80 participants to examine the safety and efficacy of dronabinol as an adjunctive treatment for Agit-AD. Preliminary findings for 44 participants enrolled thus far show a predominantly female, white sample with advanced cognitive impairment (Mini-Mental State Examination mean 7.8) and agitation (Neuropsychiatric Inventory-Clinician Agitation subscale mean 14.1). Adjustments to the study design in light of the COVID-19 pandemic are described. Findings from this study will provide guidance on the clinical utility of dronabinol for Agit-AD. ClinicalTrials.gov Identifier: NCT02792257.
The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic of 2020–2021 created unprecedented challenges for clinicians in critical care transport (CCT). These CCT services had to rapidly adjust their clinical approaches to evolving patient demographics, a preponderance of respiratory failure, and changing transport utilization strategies. Organizations had to develop and implement new protocols and guidelines in rapid succession, often without the education and training that would have been involved pre-coronavirus disease 2019 (COVID-19). These changes were complicated by the need to protect crew members as well as to optimize patient care. Clinical initiatives included developing an awake-proning transport protocol and a protocol to transport intubated proned patients. One service developed a protocol for helmet ventilation to minimize aerosolization risks for patients on noninvasive positive-pressure ventilation (NIPPV). While these clinical protocols were developed specifically for COVID-19, the growth in practice will enhance the care of patients with other causes of respiratory failure. Additionally, these processes will apply to future respiratory epidemics and pandemics.
Vast disparities between and within American states’ responses to the COVID-19 pandemic have evoked renewed attention to whether greater centralization might enhance investments in subnational capacity and remedy subnational inequalities or instead erode subnational organizational capacity. Developments in American public education (1997–2015) offer perspective on this puzzle, which we examine by applying interrupted time series analysis to a novel dataset to assess the implications of centralization on subnational investments in administrative and technical capacity, two dimensions of organizational capacity. We find simultaneous subnational erosion in administrative capacity and growth in technical capacity following centralization, both of which appear concentrated in low-poverty areas despite centralization’s explicit antipoverty purposes. Public education reforms highlight both the challenge of dismantling subnational inequality through centralization and the need for future research on policy designs that enable centralization to yield subnational capacity that is able to remedy inequality.
Coronavirus disease 2019 (COVID-19) vaccination effectiveness in healthcare personnel (HCP) has been established. However, questions remain regarding its performance in high-risk healthcare occupations and work locations. We describe the effect of a COVID-19 HCP vaccination campaign on SARS-CoV-2 infection by timing of vaccination, job type, and work location.
We conducted a retrospective review of COVID-19 vaccination acceptance, incidence of postvaccination COVID-19, hospitalization, and mortality among 16,156 faculty, students, and staff at a large academic medical center. Data were collected 8 weeks prior to the start of phase 1a vaccination of frontline employees and ended 11 weeks after campaign onset.
The COVID-19 incidence rate among HCP at our institution decreased from 3.2% during the 8 weeks prior to the start of vaccinations to 0.38% by 4 weeks after campaign initiation. COVID-19 risk was reduced among individuals who received a single vaccination (hazard ratio [HR], 0.52; 95% confidence interval [CI], 0.40–0.68; P < .0001) and was further reduced with 2 doses of vaccine (HR, 0.17; 95% CI, 0.09–0.32; P < .0001). By 2 weeks after the second dose, the observed case positivity rate was 0.04%. Among phase 1a HCP, we observed a lower risk of COVID-19 among physicians and a trend toward higher risk for respiratory therapists independent of vaccination status. Rates of infection were similar in a subgroup of nurses when examined by work location.
Our findings show the real-world effectiveness of COVID-19 vaccination in HCP. Despite these encouraging results, unvaccinated HCP remain at an elevated risk of infection, highlighting the need for targeted outreach to combat vaccine hesitancy.
Bleeding in the perioperative period of congenital heart surgery with cardiopulmonary bypass is associated with increased morbidity and mortality both from the direct effects of haemorrhage as well as the therapies deployed to restore haemostasis. Perioperative bleeding is complex and multifactorial with both patient and procedural contributions. Moreover, neonates and infants are especially at risk. The objective of this review is to summarise the evidence regarding bleeding management in paediatric surgical patients and identify strategies that might facilitate appropriate bleeding management while minimising the risk of thrombosis. We will address the use of standard and point-of-care tests, and the role of contemporary coagulation factors and other novel drugs.
The objectives of this study were (1) to develop and validate a simulation model to estimate daily probabilities of healthcare-associated infections (HAIs), length of stay (LOS), and mortality using time varying patient- and unit-level factors including staffing adequacy and (2) to examine whether HAI incidence varies with staffing adequacy.
The study was conducted at 2 tertiary- and quaternary-care hospitals, a pediatric acute care hospital, and a community hospital within a single New York City healthcare network.
All patients discharged from 2012 through 2016 (N = 562,435).
We developed a non-Markovian simulation to estimate daily conditional probabilities of bloodstream, urinary tract, and surgical-site infections; Clostridioides difficile infection; pneumonia; length of stay; and mortality. Staffing adequacy was modeled based on total nurse staffing (care supply) and the Nursing Intensity of Care Index (care demand). We compared model performance with logistic regression, and we generated case studies to illustrate daily changes in infection risk. We also described infection incidence by unit-level staffing and patient care demand on the day of infection.
Most model estimates fell within 95% confidence intervals of actual outcomes. The predictive power of the simulation model exceeded that of logistic regression (area under the curve [AUC], 0.852 and 0.816, respectively). HAI incidence was greatest when staffing was lowest and nursing care intensity was highest.
This model has potential clinical utility for identifying modifiable conditions in real time, such as low staffing coupled with high care demand.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for randomised controlled trials (RCTs) of adults with depression in primary care that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95% CI 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors (duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment) were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Total cost estimates for crime in the USA are both out-of-date and incomplete. We estimated incidence and costs of personal crimes (both violent and non-violent) and property crimes in 2017. Incidence came from national arrest data, multi-state estimates of police-reported crimes per arrest, national victimization and road crash surveys, and police underreporting studies. We updated and expanded upon published unit costs. Estimated crime costs totaled $2.6 trillion ($620 billion in monetary costs plus quality of life losses valued at $1.95 trillion; 95 % uncertainty interval $2.2–$3.0 trillion). Violent crime accounted for 85 % of costs. Principal contributors to the 10.9 million quality-adjusted life years lost were sexual violence, physical assault/robbery, and child maltreatment. Monetary expenditures caused by criminal victimization represent 3 % of Gross Domestic Product – equivalent to the amount spent on national defense. These estimates exclude the additional costs of preventing and avoiding crime such as enhanced lighting and burglar alarms. They also exclude crimes against businesses and most white-collar and corporate offenses.
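The headline figures above can be reproduced with simple arithmetic. A minimal sketch, using only the rounded values reported in the abstract:

```python
# Rounded cost components for US crime in 2017, as reported above (trillions of USD).
monetary = 0.620        # monetary costs ($620 billion)
quality_of_life = 1.95  # quality-of-life losses ($1.95 trillion)

total = monetary + quality_of_life
print(round(total, 1))  # 2.6, matching the reported $2.6 trillion total

# Violent crime's reported share of total costs.
violent_share = 0.85
print(f"{violent_share:.0%} of ${total:.2f} trillion attributed to violent crime")
```

The uncertainty interval ($2.2–$3.0 trillion) applies to the total, not to each component separately.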
Amongst patients with congenital heart disease (CHD), the time of transition to adulthood is associated with lapses in care leading to significant morbidity. The purpose of this study was to identify differences in perceptions between parents and teens regarding transition readiness.
Responses were collected from 175 teen–parent pairs via the validated CHD Transition Readiness survey and an information request checklist. The survey was distributed via an electronic tablet at a routine clinic visit.
Parents reported a perceived knowledge gap of 29.2% (the percentage of survey items for which a parent believes their teen lacks knowledge), compared to teens self-reporting an average deficit of 25.9% of survey items (p = 0.01). Agreement was lowest for long-term medical needs, physical activities allowed, insurance, and education. Regarding self-management behaviours, agreement between parent and teen was slight to moderate (weighted κ statistic = 0.18 to 0.51). For self-efficacy, agreement ranged from slight to fair (weighted κ = 0.16 to 0.28). Teens were more likely than their parents to request information (79% versus 65% requesting at least one item), particularly regarding pregnancy/contraception and insurance.
Parents and teens differ in several key perceptions regarding knowledge, behaviours, and feelings related to the management of heart disease. Specifically, parents perceive a higher knowledge deficit, teens perceive higher self-efficacy, and parents and teens agree that self-management is low.
A case–case–control investigation (216 patients) examined the risk factors and outcomes of carbapenem-resistant Enterobacter (CR-En) acquisition. Recent exposure to fluoroquinolones, intensive care unit (ICU) stay, and rapidly fatal McCabe condition were independent predictors for acquisition. Acquiring CR-En was independently associated with discharge to a long-term care facility after being admitted from home.
Background: Pseudomonas aeruginosa is an important nosocomial pathogen associated with intrinsic and acquired resistance mechanisms to major classes of antibiotics. To better understand clinical risk factors for drug-resistant P. aeruginosa infection, decision-tree models for the prediction of fluoroquinolone- and carbapenem-resistant P. aeruginosa were constructed and compared to multivariable logistic regression models using performance characteristics. Methods: In total, 5,636 patients admitted to 4 hospitals within a New York City healthcare system from 2010 to 2016 with blood, respiratory, wound, or urine cultures growing P. aeruginosa were included in the analysis. Presence or absence of drug resistance was defined using the first culture of any source positive for P. aeruginosa during each hospitalization. To train and validate the prediction models, cases were randomly split (60:40) into training and validation datasets. Clinical decision-tree models for both fluoroquinolone and carbapenem resistance were built from the training dataset using 21 clinical variables of interest, and multivariable logistic regression models were built using the 16 clinical variables associated with resistance in bivariate analyses. Decision-tree models were optimized using K-fold cross-validation, and performance characteristics of the 4 models were compared. Results: From 2010 through 2016, the prevalence of fluoroquinolone and carbapenem resistance was 32% and 18%, respectively. For fluoroquinolone resistance, the logistic regression algorithm attained a positive predictive value (PPV) of 0.57 and a negative predictive value (NPV) of 0.73 (sensitivity, 0.27; specificity, 0.90), and the decision-tree algorithm attained a PPV of 0.65 and an NPV of 0.72 (sensitivity, 0.21; specificity, 0.95).
For carbapenem resistance, the logistic regression algorithm attained a PPV of 0.53 and an NPV of 0.85 (sensitivity, 0.20; specificity, 0.96), and the decision-tree algorithm attained a PPV of 0.59 and an NPV of 0.84 (sensitivity, 0.22; specificity, 0.96). The decision-tree partitioning algorithm identified prior fluoroquinolone resistance, skilled nursing facility (SNF) stay, sex, and length of stay as the variables of greatest importance for fluoroquinolone resistance, compared to prior carbapenem resistance, age, and length of stay for carbapenem resistance. The highest-performing decision tree for fluoroquinolone resistance is illustrated in Fig. 1. Conclusions: Supervised machine-learning techniques may facilitate prediction of P. aeruginosa resistance and identification of risk factors driving resistance patterns in hospitalized patients. Such techniques may be applied to readily available clinical information from hospital electronic health records to aid clinical decision making.
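The PPV, NPV, sensitivity, and specificity values reported above are all standard functions of a model's confusion-matrix counts. A minimal sketch of those definitions, using illustrative counts (not the study's data):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    return {
        "ppv": tp / (tp + fp),          # positive predictive value (precision)
        "npv": tn / (tn + fn),          # negative predictive value
        "sensitivity": tp / (tp + fn),  # true-positive rate (recall)
        "specificity": tn / (tn + fp),  # true-negative rate
    }

# Illustrative counts only, chosen for a class balance resembling the 18-32%
# resistance prevalence described above; not taken from the study.
m = classification_metrics(tp=20, fp=15, fn=80, tn=285)
print({k: round(v, 2) for k, v in m.items()})
# {'ppv': 0.57, 'npv': 0.78, 'sensitivity': 0.2, 'specificity': 0.95}
```

Note that PPV and NPV depend on prevalence, which is why the decision-tree and logistic models can trade sensitivity against PPV as seen in the results.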
Several studies suggest significant relationships between migration and autism spectrum disorder (ASD), but results are discrepant. Given that no studies to date have included a pathological control group, the specificity of these results to ASD can be questioned.
To compare the migration experience (premigration, migratory trip, postmigration) in ASD and non-ASD pathological control groups, and study the relationships between migration and autism severity.
Parents’ and grandparents’ migrant status was compared in 30 prepubertal boys with ASD and 30 prepubertal boys without ASD but with language disorders, using a questionnaire including Human Development Index (HDI)/Inequality-adjusted Human Development Index (IHDI) of native countries. Autism severity was assessed using the Child Autism Rating Scale, Autism Diagnostic Observation Schedule and Autism Diagnostic Interview-Revised scales.
The parents’ and grandparents’ migrant status frequency did not differ between ASD and control groups and was not associated with autism severity. The HDI/IHDI values of native countries were significantly lower for parents and grandparents of children with ASD compared with the controls, especially for paternal grandparents. Furthermore, HDI/IDHI levels from the paternal line (father and especially paternal grandparents) were significantly negatively correlated with autism severity, particularly for social interaction impairments.
In this study, parents’ and/or grandparents’ migrant status did not discriminate between the ASD and pathological control groups and was not associated with autism severity. However, the HDI/IHDI results suggest that social adversity-related stress experienced in native countries, especially by paternal grandparents, is potentially a traumatic experience that may play a role in ASD development. A ‘premigration theory of autism’ is therefore proposed.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality, and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
Quality-adjusted life-years (QALYs) and disability-adjusted life-years (DALYs) are commonly used in cost-effectiveness analysis (CEA) to measure health benefits. We sought to quantify and explain differences between QALY- and DALY-based cost-effectiveness ratios, and explore whether using one versus the other would materially affect conclusions about an intervention's cost-effectiveness.
We identified CEAs using both QALYs and DALYs from the Tufts Medical Center CEA Registry and Global Health CEA Registry, with a supplemental search to ensure comprehensive literature coverage. We calculated absolute and relative differences between the QALY- and DALY-based ratios, and compared ratios to common benchmarks (e.g., 1× gross domestic product per capita). We converted reported costs into US dollars.
Among eleven published CEAs reporting both QALYs and DALYs, seven focused on pharmaceuticals and infectious disease, and five were conducted in high-income countries. Four studies concluded that the intervention was “dominant” (cost-saving). Among the QALY- and DALY-based ratios reported from the remaining seven studies, absolute differences ranged from approximately $2 to $15,000 per unit of benefit, and relative differences from 6–120 percent, but most differences were modest in comparison with the ratio value itself. The values assigned to utility and disability weights explained most observed differences. In comparison with cost-effectiveness thresholds, conclusions were consistent regardless of the ratio type in ten of eleven cases.
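The absolute and relative differences between ratio pairs described above can be computed as in the sketch below. The ratio values are hypothetical, not figures from the reviewed studies, and the relative difference is expressed against the smaller of the two ratios (one common convention; others use the larger ratio or the mean):

```python
def ratio_differences(qaly_ratio, daly_ratio):
    """Absolute and relative difference between a cost-per-QALY and a
    cost-per-DALY ratio for the same intervention (both in $/unit of benefit)."""
    absolute = abs(qaly_ratio - daly_ratio)
    # Relative difference against the smaller ratio; conventions vary.
    relative = absolute / min(qaly_ratio, daly_ratio)
    return absolute, relative

# Hypothetical ratios in US dollars per unit of health benefit.
abs_diff, rel_diff = ratio_differences(qaly_ratio=12_000, daly_ratio=10_000)
print(abs_diff, f"{rel_diff:.0%}")  # 2000 20%
```

Whether such a difference matters in practice depends on where both ratios fall relative to the decision threshold (e.g., 1× GDP per capita), which is why most of the eleven studies reached the same conclusion with either metric.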
Our results suggest that although QALY- and DALY-based ratios for the same intervention can differ, differences tend to be modest and do not materially affect comparisons to common cost-effectiveness thresholds.