The use of binomial coefficients in place of factorials to shorten the calculation of exact probabilities for 2 × 2 and 2 × r contingency tables is discussed. A useful set of inequalities for estimating the cumulative probabilities in the tail of the distribution from the probability of a single table is given. A table of binomial coefficients, given to four significant figures for n up to 60, is provided.
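As a sketch of the technique the abstract describes, the point probability of a 2 × 2 table with fixed margins can be written entirely in binomial coefficients, avoiding large factorials, and tail probabilities follow by summing over more extreme tables with the same margins. The function names and the direction of the tail sum below are illustrative choices, not taken from the paper.

```python
from math import comb

def exact_prob_2x2(a, b, c, d):
    """Point probability of the 2x2 table [[a, b], [c, d]] with fixed
    margins, written with binomial coefficients instead of factorials:
    P = C(a+b, a) * C(c+d, c) / C(n, a+c), where n = a+b+c+d."""
    n = a + b + c + d
    return comb(a + b, a) * comb(c + d, c) / comb(n, a + c)

def tail_prob_2x2(a, b, c, d):
    """One-sided tail probability: sum the point probabilities over
    tables at least as extreme (here, smaller a) with the same margins.
    The direction of 'extreme' depends on the hypothesis being tested."""
    total = 0.0
    while a >= 0 and d >= 0:
        total += exact_prob_2x2(a, b, c, d)
        a -= 1; b += 1; c += 1; d -= 1
    return total
```

For the classic table [[3, 1], [1, 3]], the point probability is C(4,3)·C(4,1)/C(8,4) = 16/70, involving no factorial larger than 8!.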
Individuals with major depressive disorder (MDD) can experience reduced motivation and cognitive function, leading to challenges with goal-directed behavior. When selecting goals, people maximize ‘expected value’ by selecting actions that maximize potential reward while minimizing associated costs, including effort ‘costs’ and the opportunity cost of time. In MDD, differential weighing of costs and benefits is a theorized mechanism underlying changes in goal-directed cognition and may contribute to symptom heterogeneity.
Methods
We used the Effort Foraging Task to quantify cognitive and physical effort costs, and patch leaving thresholds in low effort conditions (reflecting perceived opportunity cost of time) and investigated their shared versus distinct relationships to clinical features in participants with MDD (N = 52, 43 in-episode) and comparisons (N = 27).
Results
Contrary to our predictions, none of the decision-making measures differed with MDD diagnosis. However, each of the measures was related to symptom severity, over and above effects of ability (i.e. performance). Greater anxiety symptoms were selectively associated with lower cognitive effort cost (i.e. greater willingness to exert effort). Anhedonia and behavioral apathy were associated with increased physical effort costs. Finally, greater overall depression was related to decreased patch leaving thresholds.
Conclusions
Markers of effort-based decision-making may inform understanding of MDD heterogeneity. Increased willingness to exert cognitive effort may contribute to anxiety symptoms such as worry. Decreased leaving threshold associations with symptom severity are consistent with reward rate-based accounts of reduced vigor in MDD. Future research should address subtypes of depression with or without anxiety, which may relate differentially to cognitive effort decisions.
Interpersonal psychotherapy (IPT) and antidepressant medications are both first-line interventions for adult depression, but their relative efficacy in the long term and on outcome measures other than depressive symptomatology is unknown. Individual participant data (IPD) meta-analyses can provide more precise effect estimates than conventional meta-analyses. This IPD meta-analysis compared the efficacy of IPT and antidepressants on various outcomes at post-treatment and follow-up (PROSPERO: CRD42020219891). A systematic literature search conducted May 1st, 2023 identified randomized trials comparing IPT and antidepressants in acute-phase treatment of adults with depression. Anonymized IPD were requested and analyzed using mixed-effects models. The prespecified primary outcome was post-treatment depression symptom severity. Secondary outcomes were all post-treatment and follow-up measures assessed in at least two studies. IPD were obtained from 9 of 15 studies identified (N = 1536/1948, 78.9%). No significant comparative treatment effects were found on post-treatment measures of depression (d = 0.088, p = 0.103, N = 1530) and social functioning (d = 0.026, p = 0.624, N = 1213). In smaller samples, antidepressants performed slightly better than IPT on post-treatment measures of general psychopathology (d = 0.276, p = 0.023, N = 307) and dysfunctional attitudes (d = 0.249, p = 0.029, N = 231), but not on any other secondary outcomes, nor at follow-up. This IPD meta-analysis is the first to examine the acute and longer-term efficacy of IPT v. antidepressants on a broad range of outcomes. Depression treatment trials should routinely include multiple outcome measures and follow-up assessments.
The aim of this project is to study the extent to which salience alterations influence the severity of psychotic symptoms. Rather than studying them individually, we decided to focus on their interplay with two additional variables: their effect during a vulnerability phase (adolescence) and in combination with another well-recognized risk factor (cannabis use).
This design reflects our view that it is fundamental to observe the trajectory of psychotic symptoms over a continuum; however, rather than adopting a longitudinal approach, we structured the study as a cross-sectional comparison of patients from two age brackets: adolescence and adulthood.
Objectives
The primary purpose of this study was to assess differences between THC-abusing and non-abusing patients in adolescent and adult cohorts, using the Italian version of the psychometric scale “Aberrant Salience Inventory” (ASI), and to explore possible correlations with more severe psychotic symptoms. The use of several different psychometric scales and the inclusion of a varied cohort allowed us to pursue multiple secondary objectives.
Methods
We recruited 192 patients, subsequently divided into six subgroups based on age and department of recruitment (whether adolescent or adult psychiatric or neurologic units - the latter serving as controls). Each individual was administered a set of questionnaires and a socio-demographic survey; the set included: Aberrant Salience Inventory (ASI), Community Assessment of Psychic Experiences (CAPE), Positive and Negative Syndrome Scale (PANSS), Montgomery-Asberg Depression Rating Scale (MADRS), Mania Rating Scale (MRS), Hamilton Anxiety Scale (HAM-A), Association for Methodology and Documentation in Psychiatry (AMDP) and Cannabis Experience Questionnaire (CEQ).
Results
The data analysis showed statistically significant (p<0.05) differences between adolescents and adults with psychotic symptoms on all three PANSS scales and on the MADRS. These two groups were homogeneous for both cannabis use and ASI score. The intra-group comparison (within either adolescents or adults) showed a hierarchical pattern in psychometric scale scores according to diagnostic subgroup: patients with psychotic symptoms showed a higher level of psychopathology on all measures than patients from the psychiatric unit without psychotic symptoms, who in turn scored higher than patients from the neurologic unit.
Conclusions
The results of the present study suggest that when salience alterations occur in adolescents with cannabis exposure, positive and negative psychotic symptoms may worsen; this influence may also extend to other domains, particularly the depressive and anxiety spectrums.
Background: Meningiomas are the most common intracranial tumor, with surgery, dural margin treatment, and radiotherapy as the cornerstones of therapy. Response to treatment continues to be highly heterogeneous even across tumors of the same grade. Methods: Using a cohort of 2490 meningiomas in addition to 100 cases from the prospective RTOG-0539 phase II clinical trial, we define molecular biomarkers of response across multiple different, recently defined molecular classifications and use propensity score matching to mimic a randomized controlled trial to evaluate the role of extent of resection, dural margin resection, and adjuvant radiotherapy on clinical outcome. Results: Gross tumor resection led to improved progression-free survival (PFS) across all molecular groups (MG) and improved overall survival in proliferative meningiomas (HR 0.52, 95%CI 0.30-0.93). Dural margin treatment (Simpson grade 1/2) improved PFS versus complete tumor removal alone (Simpson 3). MG reliably predicted response to radiotherapy, including in the RTOG-0539 cohort. A molecular model developed using clinical trial cases discriminated response to radiotherapy better than standard of care grading in multiple cohorts (ΔAUC 0.12, 95%CI 0.10-0.14). Conclusions: We elucidate biological and molecular classifications of meningioma that influence response to surgery and radiotherapy, in addition to introducing a novel molecular-based prediction model of response to radiation to guide treatment decisions.
The timing of tracheostomy for intensive care unit patients is controversial, with conflicting findings on early versus late tracheostomy.
Methods
Patients undergoing tracheostomy from 2001 through 2012 were identified from the Medical Information Mart for Intensive Care-III database. Early tracheostomy was defined as a time from intensive care unit admission to tracheostomy (time to tracheostomy) below the 25th percentile. Statistical analyses of tracheostomy timing in relation to intensive care unit length of stay and mortality were conducted.
Results
Of the 1,566 patients who were included, those with early tracheostomy had a shorter intensive care unit length of stay (27.32 vs 12.55 days, p < 0.001) and lower mortality (12.9 per cent vs 9.0 per cent, p = 0.039). Multivariate logistic regression analysis found an association between increasing time to tracheostomy and mortality (odds ratio: 1.029, 95 per cent confidence interval 1.007–1.051, p = 0.009).
Conclusion
Our analysis revealed that patients with early tracheostomy were more likely to have shorter intensive care unit lengths of stay and lower mortality. Our data suggest that early tracheostomy should be given strong consideration in appropriately selected patients.
The COVID-19 pandemic has had a globally devastating psychosocial impact. A detailed understanding of the mental health implications of this worldwide crisis is critical for successful mitigation of and preparation for future pandemics. Using a large international sample, the present study investigated the relationship between multiple COVID-19 parameters (both disease characteristics and government responses) and the incidence of the suicide crisis syndrome (SCS), an acute negative affect state associated with near-term suicidal behavior.
Methods:
Data were collected from 5528 adults across 10 different countries in an anonymous web-based survey between June 2020 and January 2021.
Results:
Individuals scoring above the SCS cut-off lived in countries with higher peak daily cases and deaths during the first wave of the pandemic. Additionally, the longer participants had been exposed to markers of pandemic severity (eg, lockdowns), the more likely they were to screen positive for the SCS. Findings reflected both country-to-country comparisons and individual variation within the pooled sample.
Conclusion:
Both the pandemic itself and the government interventions utilized to contain the spread appear to be associated with suicide risk. Public policy should include efforts to mitigate the mental health impact of current and future global disasters.
The tegument of adult Saccocoelioides godoyi Kohn & Froes, 1986 (Digenea: Haploporidae), specimens of which were collected from the intestine of the freshwater fish, Leporinus friderici (Bloch, 1794) (Anostomidae) from the reservoir of Itaipu Hydroelectric Power Station, Parana State, Brazil, was studied by transmission electron microscopy. The tegument comprises an external anucleate layer, covered by a surface plasma membrane and associated glycocalyx. The surface layer is bound by the basal plasma membrane and contains spines, two types of inclusion bodies and mitochondria. Tegumental cell bodies are located beneath the surface musculature and contain a single nucleus, cytoplasm with rough endoplasmic reticulum, mitochondria, ribosomes, and inclusion bodies similar to those found in the external layer. Cytoplasmic strands connect the cell bodies to the external surface layer, suggesting that the inclusion bodies are produced in these cells and pass up into the syncytium, as is known for other digeneans from experimental evidence.
OBJECTIVES/GOALS: Many economic evaluations rely on clinical trial data that may not represent real world populations and intervention effectiveness. We compare risk and cost-effectiveness for the Diabetes Prevention Program (DPP) clinical trial cohort and a real world population eligible for the national DPP to assess the impact of using real world data. METHODS/STUDY POPULATION: To produce real world (US population) representative results, we identified National Health and Nutrition Examination Survey (NHANES) subjects eligible for the national DPP and adjusted projections using survey weights. We used clinical predictive models to estimate individual diabetes risk, and microsimulation to estimate lifetime costs, benefits, and net monetary benefits (NMB) for lifestyle intervention and metformin. We compared results across the DPP clinical trial and NHANES populations. RESULTS/ANTICIPATED RESULTS: Three-year risk of diabetes onset for the DPP trial population (mean of 19.7%, median of 10.3%) exceeded corresponding risk for the NHANES population (mean of 14.6%, median of 4.8%). The proportion of individuals with a three-year diabetes risk < 10% for the DPP trial population (49%) was less than the corresponding proportion for NHANES (67%). Mean NMB for metformin for the DPP trial population ($9,749) exceeded the corresponding value for NHANES ($5,391). The proportion of subjects with negative NMB was 49% for the DPP trial population and 67% for NHANES. Lifestyle intervention had a mean NMB of $34,889 for the DPP trial population and $28,652 for NHANES. Only 20% of the NHANES population eligible for national DPP met inclusion/exclusion criteria for the DPP trial. DISCUSSION/SIGNIFICANCE: Real world populations eligible for the national DPP include a greater proportion of low-risk individuals, and for these people, prevention programs may confer smaller benefits. 
Technology assessments based on clinical trial data should be revised using real world population and treatment effect data.
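The net monetary benefit (NMB) figures above combine lifetime health gains and costs at a willingness-to-pay threshold. A minimal sketch of that calculation follows; the $100,000-per-QALY threshold, the function name, and the example numbers are illustrative assumptions, not values taken from the study.

```python
def net_monetary_benefit(qalys_gained, incremental_cost, wtp_per_qaly=100_000):
    """NMB = (willingness to pay per QALY) * (QALYs gained) - incremental cost.
    A positive NMB means the intervention's health gains, valued at the
    threshold, exceed its added cost. The default threshold is an assumption."""
    return wtp_per_qaly * qalys_gained - incremental_cost

# Hypothetical example: 0.5 QALYs gained at $15,000 of added lifetime cost
nmb = net_monetary_benefit(0.5, 15_000)
```

In a microsimulation like the one described, this quantity would be computed per simulated individual and then averaged (with survey weights, for the NHANES population) to produce the mean NMB values reported.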
The aim of this chapter is to discuss the foundational knowledge of the occurrences, events, and disease manifestations that was developed during the early stages of the COVID-19 pandemic, including the responses and measures that were undertaken to contain the disease. It emphasizes the importance of early intervention and the impact that timely action – or, in many cases, inaction – had on the development of this pandemic crisis. This chapter explores the role of data collection and the analysis mechanisms utilized in this pandemic to monitor disease spread in different geographies. The necessity of information derived from early disease vigilance and subsequent surveillance programs is stressed. The participation of the different stakeholders in the control and management of the pandemic is discussed as a function of synchronized intervention and effectiveness. This chronological account is intended to create a roadmap for future undertakings, programs, and decision-making processes by health and governmental authorities to be conducted at the earliest phases of future pandemics.
The link between cannabis use and psychotic symptoms or disorders is well known. However, the relation between cannabis withdrawal and psychotic symptoms is less studied.
Methods:
To our knowledge, this is the first systematic observational report of cannabis-induced psychotic disorder with onset during withdrawal. Here, we review patients who presented to a major emergency room in Montreal between January 2020 and September 2023 with psychotic symptoms following cannabis cessation.
Results:
In total, seven male patients and one female patient presented at the peak of cannabis withdrawal with acute psychotic symptoms, representing less than 1% of all emergency service admissions.
Conclusions:
We discuss current knowledge regarding the endocannabinoid system and dopamine homeostasis to formulate hypotheses regarding these observations.
Cognitive training is a non-pharmacological intervention aimed at improving cognitive function across a single or multiple domains. Although the underlying mechanisms of cognitive training and transfer effects are not well-characterized, cognitive training has been thought to facilitate neural plasticity to enhance cognitive performance. Indeed, the Scaffolding Theory of Aging and Cognition (STAC) proposes that cognitive training may enhance the ability to engage in compensatory scaffolding to meet task demands and maintain cognitive performance. We therefore evaluated the effects of cognitive training on working memory performance in older adults without dementia. This study will help begin to elucidate non-pharmacological intervention effects on compensatory scaffolding in older adults.
Participants and Methods:
48 participants were recruited for a Phase III randomized clinical trial (Augmenting Cognitive Training in Older Adults [ACT]; NIH R01AG054077) conducted at the University of Florida and University of Arizona. Participants across sites were randomly assigned to complete cognitive training (n=25) or an education training control condition (n=23). Cognitive training and the education training control condition were each completed during 60 sessions over 12 weeks for 40 hours total. The education training control condition involved viewing educational videos produced by the National Geographic Channel. Cognitive training was completed using the Posit Science Brain HQ training program, which included 8 cognitive training paradigms targeting attention/processing speed and working memory. All participants also completed demographic questionnaires, cognitive testing, and an fMRI 2-back task at baseline and at 12-weeks following cognitive training.
Results:
Repeated measures analysis of covariance (ANCOVA), adjusted for training adherence, transcranial direct current stimulation (tDCS) condition, age, sex, years of education, and Wechsler Test of Adult Reading (WTAR) raw score, revealed a significant 2-back by training group interaction (F[1,40]=6.201, p=.017, η2=.134). Examination of simple main effects revealed no significant baseline differences in 2-back performance (F[1,40]=0.568, p=.455, η2=.014). After controlling for baseline performance, training group differences in 2-back performance were no longer statistically significant (F[1,40]=1.382, p=.247, η2=.034).
Conclusions:
After adjusting for baseline performance differences, there were no significant training group differences in 2-back performance, suggesting that the randomization was not sufficient to ensure adequate distribution of participants across groups. Results may indicate that cognitive training alone is not sufficient for significant improvement in working memory performance on a near-transfer task. Additional improvement may occur with the next phase of this clinical trial, such that tDCS augments the effects of cognitive training and results in enhanced compensatory scaffolding even within this high-performing cohort. Limitations of the study include a highly educated sample with higher literacy levels and a small sample size not powered for transfer effects analysis. Future analyses will evaluate the combined effects of cognitive training and tDCS on 2-back performance in a larger sample of older adults without dementia.
Understanding the factors contributing to optimal cognitive function throughout the aging process is essential to better understand successful cognitive aging. Processing speed is an age-sensitive cognitive domain that usually declines early in the aging process; however, this cognitive skill is essential for other cognitive tasks and everyday functioning. Evaluating brain network interactions in cognitively healthy older adults can help us understand how variations in brain characteristics affect cognitive functioning. Functional connections among groups of brain areas give insight into the brain’s organization, and the cognitive effects of aging may relate to this large-scale organization. To follow up on our prior work, we sought to replicate our findings regarding the relationship between network segregation and processing speed. To address possible influences of node location or network membership, we replicated the analysis across 4 different node sets.
Participants and Methods:
Data were acquired as part of a multi-center study of cognitively normal individuals aged 85 and older, the McKnight Brain Aging Registry (MBAR). For this analysis, we included 146 community-dwelling, cognitively unimpaired older adults, ages 85-99, who had undergone structural and BOLD resting-state MRI scans and a battery of neuropsychological tests. Exploratory factor analysis identified the processing speed factor of interest. We preprocessed BOLD scans using the fmriprep, Ciftify, and XCPEngine pipelines. We used 4 different connectivity-based parcellations: 1) MBAR data used to define nodes, with the Power (2011) atlas used to determine node network membership; 2) younger adult data (Chan 2014) used to define nodes, with the Power (2011) atlas used to determine node network membership; 3) older adult data from a different study (Han 2018) used to define nodes, with the Power (2011) atlas used to determine node network membership; and 4) MBAR data used to define nodes, with MBAR-based community detection used to determine node network membership.
Segregation (the balance of within-network and between-network connections) was measured within the association system and three well-characterized networks: Default Mode Network (DMN), Cingulo-Opercular Network (CON), and Fronto-Parietal Network (FPN). Correlations between processing speed and the association system and networks were computed for all 4 node sets.
Results:
We replicated prior work: segregation of the cortical association system and segregation of the FPN and DMN each had a consistent relationship with processing speed across all node sets (association system range of correlations: r=.294 to .342; FPN: r=.254 to .272; DMN: r=.263 to .273). Additionally, compared with parcellations created from older adults, the parcellation based on younger individuals showed attenuated and less robust findings (association system r=.263, FPN r=.255, DMN r=.263).
Conclusions:
This study shows that network segregation in the oldest-old brain is closely linked with processing speed and that this relationship is replicable across different node sets created from varied datasets. This work adds to the growing body of knowledge about age-related dedifferentiation by demonstrating the replicability and consistency of the finding that an essential cognitive skill, processing speed, is associated with differentiated functional networks even in very old individuals experiencing successful cognitive aging.
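The segregation measure used here (the balance of within-network and between-network connections) is commonly computed as in Chan et al. (2014): the difference between mean within-network and mean between-network connectivity, normalized by the within-network mean. The sketch below is a generic illustration of that formula applied to a connectivity matrix, not the study's actual pipeline; the variable and function names are assumptions.

```python
import numpy as np

def network_segregation(conn, labels, network):
    """System segregation (Chan et al., 2014): (W - B) / W, where W is the
    mean connectivity among nodes within `network` and B is the mean
    connectivity between those nodes and all other nodes.

    conn:   symmetric node-by-node connectivity matrix
    labels: network membership label for each node
    """
    labels = np.asarray(labels)
    inside = labels == network
    sub = conn[np.ix_(inside, inside)]
    # within-network mean: off-diagonal entries among member nodes
    w = sub[~np.eye(sub.shape[0], dtype=bool)].mean()
    # between-network mean: edges from member nodes to all non-member nodes
    b = conn[np.ix_(inside, ~inside)].mean()
    return (w - b) / w
```

Higher values indicate a network whose internal connections dominate its external ones, i.e. a more differentiated (segregated) network.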
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH), visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI, are one common age-related brain change. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized controlled trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through commercially-available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
Results:
RM-ANCOVA revealed two-way group*time interactions such that those assigned cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed higher baseline TLV associated with lower pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not on other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post change in processing speed training appears to be particularly sensitive to white matter hyperintensity load, compared with change in working memory training. These data suggest that TLV may serve as an important factor for consideration when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Interventions using a cognitive training paradigm called the Useful Field of View (UFOV) task have been shown to be efficacious in slowing cognitive decline. However, no studies have examined the engagement of functional networks during UFOV task completion. The current study aimed to (a) assess whether regions activated during the UFOV fMRI task were functionally connected and related to task performance (henceforth called the UFOV network), (b) compare connectivity of the UFOV network to 7 resting-state functional connectivity networks in predicting proximal (UFOV) and near-transfer (Double Decision) performance, and (c) explore the impact of network segregation between higher-order networks on UFOV performance.
Participants and Methods:
336 healthy older adults (mean age=71.6) completed the UFOV fMRI task in a Siemens 3T scanner. UFOV fMRI accuracy was calculated as the number of correct responses divided by 56 total trials. Double Decision performance was calculated as the average presentation time of correct responses in log ms, with lower scores equating to better processing speed. Structural and functional MRI images were processed using the default pre-processing pipeline within the CONN toolbox. The Artifact Rejection Toolbox was set at a motion threshold of 0.9mm, and participants were excluded if more than 50% of volumes were flagged as outliers. To assess connectivity of regions associated with the UFOV task, we created 10 spherical regions of interest (ROIs) a priori using the WFU PickAtlas in SPM12. These included the bilateral pars triangularis, supplementary motor area, and inferior temporal gyri, as well as the left pars opercularis, left middle occipital gyrus, right precentral gyrus, and right superior parietal lobule. We used a weighted ROI-to-ROI connectivity analysis to model task-based within-network functional connectivity of the UFOV network and its relationship to UFOV accuracy. We then used weighted ROI-to-ROI connectivity analysis to compare the efficacy of the UFOV network versus 7 resting-state networks in predicting UFOV fMRI task performance and Double Decision performance. Finally, we calculated network segregation among higher-order resting-state networks to assess its relationship with UFOV accuracy. All functional connectivity analyses were corrected using a false discovery rate (FDR) threshold of p<0.05.
Results:
ROI-to-ROI analysis showed significant within-network functional connectivity among the 10 a priori ROIs (UFOV network) during task completion (all pFDR<.05). After controlling for covariates, greater within-network connectivity of the UFOV network was associated with better UFOV fMRI performance (pFDR=.008). Regarding the 7 resting-state networks, greater within-network connectivity of the CON (pFDR<.001) and FPCN (pFDR=.014) was associated with higher accuracy on the UFOV fMRI task. Furthermore, greater within-network connectivity of only the UFOV network was associated with performance on the Double Decision task (pFDR=.034). Finally, we assessed the relationship between higher-order network segregation and UFOV accuracy. After controlling for covariates, no significant relationships between network segregation and UFOV performance remained (all p-uncorrected>0.05).
Conclusions:
To date, this is the first study to assess task-based functional connectivity during completion of the UFOV task. We observed that coherence within 10 a priori ROIs significantly predicted UFOV performance. Additionally, enhanced within-network connectivity of the UFOV network predicted better performance on the Double Decision task, while conventional resting-state networks did not. These findings provide potential targets to optimize efficacy of UFOV interventions.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 × 7 cm² sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2mA of current for 20 minutes. The sham group received 2mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. CONN toolbox was used to preprocess imaging data and conduct region-of-interest (ROI-to-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-to-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus post-intervention connectivity) for the DMN, FPCN, and CON, controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-to-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-to-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with better delayed verbal and visuospatial memory recall. This study offers insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
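The learning ratio described above can be computed directly from trial scores: items gained from trial 1 to trial 3, divided by the items still available to learn after trial 1. A minimal sketch, assuming the 12-point maximum shared by the HVLT-R and BVMT-R (the function name and default are illustrative):

```python
def learning_ratio(trial1, trial3, max_score=12):
    """Learning ratio: stimuli learned between trials 1 and 3 (trial3 - trial1)
    divided by stimuli not recalled after trial 1 (max_score - trial1).
    max_score=12 matches the HVLT-R and BVMT-R; adjust for other measures."""
    available = max_score - trial1
    if available == 0:
        return float('nan')  # perfect trial 1 leaves nothing to learn
    return (trial3 - trial1) / available
```

For example, a participant recalling 5 items on trial 1 and 9 on trial 3 has learned 4 of the 7 items still available, a ratio of roughly 0.57.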
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, FPCN evidenced a relationship only with visuospatial learning, and DMN was not significantly associated with any memory measure. These data suggest that CON may be a valuable target in longitudinal studies of age-related memory changes, as well as a possible target in future non-invasive interventions to attenuate memory decline in older adults.
Early detection of ST-segment elevation myocardial infarction (STEMI) on the prehospital electrocardiogram (ECG) improves patient outcomes. Current software algorithms optimize sensitivity but have a high false-positive rate. The authors propose an algorithm to improve the specificity of STEMI diagnosis in the prehospital setting.
Methods:
A dataset of prehospital ECGs with verified outcomes was used to validate an algorithm to identify true- and false-positive software interpretations of STEMI. Four criteria identified in prior research as differentiating true-positive STEMI interpretations were applied: heart rate <130, QRS duration <100, verification of ST-segment elevation, and absence of artifact. Test characteristics were calculated, and regression analysis was used to examine the association between the number of criteria applied and test characteristics.
Results:
There were 44,611 cases available. Of these, 1,193 were identified as STEMI by the software interpretation. Applying all four criteria yielded the highest positive likelihood ratio of 353 (95% CI, 201-595) and specificity of 99.96% (95% CI, 99.93-99.98), but the lowest sensitivity (14%; 95% CI, 11-17) and worst negative likelihood ratio (0.86; 95% CI, 0.84-0.89). Positive likelihood ratio (r² = 0.90) and specificity (r² = 0.85) increased strongly with the number of criteria applied.
Conclusions:
Prehospital ECGs with a high probability of true STEMI can be accurately identified using these four criteria: heart rate <130, QRS duration <100, verification of ST-segment elevation, and absence of artifact. Applying these criteria to prehospital ECGs with software interpretations of STEMI could decrease false-positive field activations while also reducing the need to rely on transmission for physician over-read. This can have significant clinical and quality implications for Emergency Medical Services (EMS) systems.
The COVID-19 pandemic accelerated the development of decentralized clinical trials (DCTs). DCTs are an important and pragmatic method for assessing health outcomes yet comprise only a minority of clinical trials, and few published methodologies exist. In this report, we detail the operational components of COVID-OUT, a decentralized, multicenter, quadruple-blinded, randomized trial that rapidly delivered study drugs nationwide. The trial examined three medications (metformin, ivermectin, and fluvoxamine) as outpatient treatment of SARS-CoV-2 for their effectiveness in preventing severe or long COVID-19. Decentralized strategies included HIPAA-compliant electronic screening and consenting, prepacking investigational product to accelerate delivery after randomization, and remotely confirming participant-reported outcomes. Of the 1417 individuals in the intention-to-treat sample, the remote nature of the study led to an additional 94 participants never taking any doses of study drug. Therefore, 1323 participants were in the modified intention-to-treat sample, which was the a priori primary study sample. Only 1.4% of participants were lost to follow-up. Decentralized strategies facilitated the successful completion of the COVID-OUT trial without any in-person contact by expediting intervention delivery, expanding trial access geographically, limiting contagion exposure, and making it easy for participants to complete follow-up visits. Remotely completed consent and follow-up facilitated enrollment.
Eating disorders (EDs) are serious psychiatric disorders, taking a life every 52 minutes, and have high relapse rates. There are currently no effective supports or intervention therapeutics for individuals with an ED in their everyday lives. The aim of this study was to build idiographic machine learning (ML) models to evaluate the performance of physiological recordings in detecting individual ED behaviors in naturalistic settings.
Methods
From an ongoing study (final N = 120), we piloted the ability of ML to detect an individual's ED behavioral episodes (e.g. purging) from physiological data in six individuals diagnosed with an ED, all of whom endorsed purging. Participants wore an ambulatory monitor for 30 days and tapped a button to denote ED behavioral episodes. We trained idiographic (N = 1) logistic regression classifier (LRC) ML models to distinguish episode-onset windows (~600) from baseline windows (~571) using physiological features (heart rate, electrodermal activity, and temperature).
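The idiographic LRC approach amounts to fitting one logistic regression per participant over window-level physiological features. A minimal sketch (the feature layout, training loop, and parameters here are simplified assumptions for illustration, not the study's actual pipeline):

```python
import math

def train_lrc(X, y, lr=0.5, epochs=2000):
    """Fit a logistic regression classifier by stochastic gradient descent.
    X: one standardized feature vector per window, e.g. [heart_rate, eda, temp];
    y: 1 for episode-onset windows, 0 for baseline windows."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid: predicted P(episode)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    """Label a window as episode-onset (1) or baseline (0) at threshold 0.5."""
    z = b + sum(wj * xj for wj, xj in zip(w, xi))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0
```

Because each model is fit to a single participant's ~1,200 windows, the weights capture that individual's physiological signature rather than a group average, which is the core idea of the idiographic design.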
Results
Using physiological data, the LRC models accurately classified an average of 91% of cases, with 92% specificity and 90% sensitivity.
Conclusions
This evidence suggests the ability to build idiographic ML models that detect ED behaviors from physiological indices within everyday life with a high level of accuracy. The novel use of ML with wearable sensors to detect physiological patterns of ED behavior pre-onset can lead to just-in-time clinical interventions to disrupt problematic behaviors and promote ED recovery.