The recent ONS survey reported that 92% of students had been affected by the cost-of-living crisis, with 46% revealing that their overall mental health and well-being had worsened(1). London Metropolitan University has a uniquely diverse student population: in 2020-21, 82% of students were mature, 64% identified as female, 55% were from a minoritised background and 13% had a known disability(2). Furthermore, at least 50% of our students, many of whom have caring responsibilities, reside in the most deprived wards of Islington or other impoverished London boroughs. It has been documented that students with families, those on low incomes and those from minority backgrounds are more vulnerable(3), more likely to be disproportionately affected by the cost-of-living crisis and at greater risk of food insecurity. We sought to ease the burden of the cost-of-living crisis with a recipe box scheme, BRITE Box(4), and to evaluate its acceptability.
BRITE Box provides a complete set of pre-weighed ingredients for a healthy, nutritious meal with an easy-to-follow recipe guide. Each box typically contains two servings of vegetables, meat, bread and dairy, as well as spices, dried and tinned goods, to feed a family of five. We distributed 300 boxes over a period of five months to students, primarily those with families, who had accessed the university hardship fund. Student volunteers and academic staff pre-weighed the ingredients and prepared and distributed the boxes. The scheme was advertised through Student Services, who administer the hardship fund, the Student Union and the intranet. Recipients of the boxes were provided with a QR code linking to a 20-item online survey on demographic characteristics, number of children, acceptability of the box, and perceived advantages and disadvantages of the scheme. Ethical approval was granted by London Metropolitan University.
Thirty-three participants completed the survey; 42% of respondents identified as female, 55% were from a minority background and 30% had children. Responses showed that students agreed or strongly agreed that the recipe box introduced them to new flavours (52%) and new foods (42%). The majority followed the recipe and would use it again (67%). Most importantly, students agreed or strongly agreed that the box helped with their food budget (73%) and fostered a sense of belonging to the university (85%).
The scheme has proved to be popular among the students: “a really cool concept”, “it helped me cook”, “the box provided food for 3 days” and created a buzz around campus on distribution days. It has enhanced the feeling of community and belonging within the university, whilst also alleviating food insecurity and tackling the cost-of-living crisis.
Resistive tearing instabilities are common in fluids that are highly electrically conductive and carry strong currents. We determine the effect of stable stratification on the tearing instability under the Boussinesq approximation. Our results generalise previous work that considered only specific parameter regimes, and we show that the length scale of the fastest-growing mode depends non-monotonically on the stratification strength. We confirm our analytical results by solving the linearised equations numerically, and we discuss whether the instability could operate in the solar tachocline.
Psychiatric drugs, including antipsychotics and antidepressants, are widely prescribed, even in young and adolescent populations at early or subthreshold disease stages. However, their impact on brain structure remains elusive. Elucidating the relationship between psychotropic medication and structural brain changes could enhance the understanding of the potential benefits and risks associated with such treatment.
Objectives
Investigation of the associations between psychiatric drug intake and longitudinal grey matter volume (GMV) changes in a transdiagnostic sample of young individuals at early stages of psychosis or depression using an unbiased data-driven approach.
Methods
The study sample comprised 247 participants (mean [SD] age = 25.06 [6.13] years, 50.61% male), consisting of young, minimally medicated individuals at clinical high-risk states for psychosis, individuals with recent-onset depression or psychosis, and healthy control individuals. Structural magnetic resonance imaging was used to obtain whole-brain voxel-wise GMV for all participants at two timepoints (mean [SD] time between scans = 11.15 [4.93] months). The multivariate sparse partial least squares (SPLS) algorithm (Monteiro et al. JNMEDT 2016; 271:182-194) was embedded in a nested cross-validation framework to identify parsimonious associations between the cumulative intake of psychiatric drugs, including commonly prescribed antipsychotics and antidepressants, and change in GMV between both timepoints, while additionally factoring in age, sex, and diagnosis. Furthermore, we correlated the retrieved SPLS results to personality domains (NEO-FFI) and childhood trauma (CTQ).
Results
SPLS analysis revealed significant associations between the antipsychotic classes of benzamides, butyrophenones and thioxanthenes and longitudinal GMV decreases in cortical regions including the insula and posterior superior temporal sulcus, as well as the cingulate, postcentral, precentral, orbital and frontal gyri (Figure 1A-C). These brain regions corresponded most closely to the dorsal and ventral attention, somatomotor, salience and default networks (Figure 1D). Furthermore, the medication signature was negatively associated with the personality domains extraversion, agreeableness and conscientiousness, and positively associated with the CTQ domains emotional and physical neglect.
Conclusions
Psychiatric drug intake over a period of one year was linked to distinct GMV reductions in key cortical hubs. These patterns were already visible in young individuals at early or subthreshold stages of mental illness and were further linked to childhood neglect and personality traits. Hence, a better and more in-depth understanding of the structural brain implications of medicating young and adolescent individuals might lead to more cautious, sustainable and targeted treatment strategies.
The clinical high-risk state for psychosis (CHR) is associated with alterations in grey matter volume (GMV) in various regions such as the hippocampus (Vissink et al. BP:GOS 2022; 2(2) 147-152). Within the scope of the North American Prodrome Longitudinal Study (NAPLS-2; Cannon et al. Am J Psychiatry 2016; 173(10), 980-988), a publicly available risk calculator based on clinical variables was developed to assess the likelihood of individuals transitioning to psychosis within a 2-year period.
Objectives
In the current study, we aim to examine the association between GMV and NAPLS-2 risk scores calculated for individuals with CHR and recent-onset depression (ROD), taking a transdiagnostic approach on the transition to psychosis.
Methods
The sample consisted of 315 CHR (mean age = 23.85, SD = 5.64; female: 164) and 295 ROD (mean age = 25.11, SD = 6.21; female: 144) patients from the multi-site Personalised Prognostic Tools for Early Psychosis Management (PRONIA) Study (Koutsouleris et al. JAMA Psychiatry 2018; 57(11), 1156-1172). Risk scores were calculated using the six clinical and neurocognitive variables included in the NAPLS-2 risk calculator that were significant for predicting psychosis. Further, we derived smoothed GMV maps from T1-weighted structural magnetic resonance imaging using a full width at half maximum kernel size of 8 mm. We employed a multiple regression design in SPM12 to examine associations between risk scores and GMV. On the whole-brain level, we calculated permutation-based threshold-free cluster enhancement (TFCE) contrasts using the TFCE toolbox. Additionally, we calculated t-contrasts within a region-of-interest (ROI) analysis encompassing the hippocampus. All results were thresholded at p < 0.05 with family-wise error correction to address multiple comparisons.
Results
Our analysis revealed that GMV in the right middle and superior frontal gyrus (kE = 2726 voxels) was positively and linearly associated with higher risk for psychosis transition within two years (see Figure 1, highlighted in blue). In the ROI analysis, we found a significant negative linear association between GMV in the left hippocampus (kE = 353 voxels) and risk for psychosis transition, i.e. lower hippocampal GMV corresponded to higher risk (see Figure 1, highlighted in red).
Conclusions
GMV reductions in the hippocampus have frequently been observed in CHR and psychosis patients (Vissink et al. BP:GOS 2022; 2(2) 147-152); therefore, our results further highlight the crucial role of this region in the progression of the disease. There is limited evidence on GMV increases in CHR patients; however, the GMV increase we found in the frontal pole may reflect compensatory mechanisms of the brain in the development of psychosis. In addition, we were able to provide biological validation of the NAPLS-2 risk calculator and its assessment of risk for transition to psychosis.
Few studies have examined the genetic population structure of vector-borne microparasites in wildlife, making it unclear how much these systems can reveal about the movement of their associated hosts. This study examined the complex host–vector–microbe interactions in a system of bats, wingless ectoparasitic bat flies (Nycteribiidae), vector-borne microparasitic bacteria (Bartonella) and bacterial endosymbionts of flies (Enterobacterales) across an island chain in the Gulf of Guinea, West Africa. Limited population structure was found in bat flies and Enterobacterales symbionts compared to that of their hosts. Significant isolation by distance was observed in the dissimilarity of Bartonella communities detected in flies from sampled populations of Eidolon helvum bats. These patterns indicate that, while genetic dispersal of bats between islands is limited, some non-reproductive movements may lead to the dispersal of ectoparasites and associated microbes. This study deepens our knowledge of the phylogeography of African fruit bats, their ectoparasites and associated bacteria. The results presented could inform models of pathogen transmission in these bat populations and increase our theoretical understanding of community ecology in host–microbe systems.
Providing access to food in schools can serve as a platform for food system transformation, while simultaneously improving educational outcomes and livelihoods. Locally grown and procured food is a nutritious, healthy, and efficient way to provide schoolchildren with a daily meal while, at the same time, improving opportunities for smallholder farmers(1). While there is significant potential for school food provision activities to support healthy dietary behaviours in the Pacific Islands region, there is limited evidence of these types of activities(2), including their scope and links to local food production in the region. Therefore, the aim of this scoping study was to understand the current state of school food activities (school feeding, gardening and other food provision activities) and any current and potential links to local agriculture in the Pacific Islands. A regional mapping activity was undertaken, initially covering 22 Pacific Island countries. The mapping included two steps: 1) a desk-based scoping review including peer-reviewed and grey literature (2007-2022) and 2) one-hour semi-structured online Zoom interviews with key country stakeholders. Twelve sources were identified, predominantly grey literature (n = 9). Thirty interviews were completed with at least 1 key stakeholder from 15 countries. A variety of school food provision activities were identified, including school feeding programs (n = 16, of varying scale), programs covering both school feeding and school gardens (n = 2), school garden programs (n = 12), and other school food provision activities (n = 4, including taste/sensory education, food waste reduction, increasing canteen capacity for local foods, and supply chain distribution between local agriculture and schools). Existing links to local agriculture varied across programs. Of the 16 school feeding programs, 8 had a requirement to use local produce (policy requirement n = 6, traditional requirement from leaders n = 2).
Of the 12 school garden programs, 6 used local or traditional produce in the garden and 5 involved local farmers in varying capacities. Challenges to linking local agriculture into school food provision programs were reported for 17 activities and were context dependent. Common challenges included limited funding, inflation, Covid-19, inadequate produce supply for the scale of the program, limited farmer capacity, limited institutional support for local produce, low produce storage life, climatic conditions and disasters, water security, delayed procurement processes, and limited professional development and upskilling opportunities. Modernisation and colonisation of food systems, resulting in a preference for hyperpalatable foods, and challenges in incorporating local produce in a way that is accepted by students were also identified as challenges. This evidence can be used to develop a pathway to piloting and implementing models of school food provision programs and promoting opportunities for shared learning and collaboration with key stakeholders across the Pacific Islands region.
The personalised oncology paradigm remains challenging to deliver despite technological advances in genomics-based identification of actionable variants, combined with the increasing focus of drug development on these specific targets. To ensure we continue to build concerted momentum to improve outcomes across all cancer types, financial, technological and operational barriers need to be addressed. For example, complete integration and certification of the ‘molecular tumour board’ into ‘standard of care’ ensures a unified clinical decision pathway that both counteracts fragmentation and serves as the cornerstone of evidence-based delivery inside and outside of a research setting. To date, integrated delivery has generally been restricted to specific (common) cancer types, either within major cancer centres or small regional networks. Here, we focus on solutions for real-world integration of genomics, pathology, surgery, oncological treatments, data from clinical source systems and analysis of whole-body imaging as digital data that can facilitate cost-effectiveness analysis, clinical trial recruitment, and outcome assessment. This urgent imperative for cancer also extends across early diagnosis and adjuvant treatment interventions, individualised cancer vaccines, immune cell therapies, personalised synthetic lethal therapeutics, and cancer screening and prevention. Oncology care systems worldwide require proactive step-changes in solutions, including interoperable digital working, that can solve patient-centred challenges and ensure inclusive, quality, sustainable, fair and cost-effective adoption and efficient delivery. Here we highlight the workforce, technical, clinical, regulatory and economic challenges that prevent the implementation of precision oncology at scale, and offer a systematic roadmap of integrated solutions for standard of care based on minimal essential digital tools.
These include unified decision support tools, quality control, data flows within an ethical and legal data framework, training and certification, and monitoring and feedback. Bridging the technical, operational, regulatory and economic gaps demands joint action from public and industry stakeholders across national and global boundaries.
Understanding the factors contributing to optimal cognitive function throughout the aging process is essential to better understand successful cognitive aging. Processing speed is an age-sensitive cognitive domain that usually declines early in the aging process; however, this cognitive skill is essential for other cognitive tasks and everyday functioning. Evaluating brain network interactions in cognitively healthy older adults can help us understand how variations in brain characteristics affect cognitive functioning. Functional connections among groups of brain areas give insight into the brain’s organization, and the cognitive effects of aging may relate to this large-scale organization. To follow up on our prior work, we sought to replicate our findings regarding network segregation’s relationship with processing speed. In order to address possible influences of node location or network membership, we replicated the analysis across 4 different node sets.
Participants and Methods:
Data were acquired as part of a multi-center study of cognitively normal individuals aged 85+, the McKnight Brain Aging Registry (MBAR). For this analysis, we included 146 community-dwelling, cognitively unimpaired older adults, ages 85-99, who had undergone structural and BOLD resting-state MRI scans and a battery of neuropsychological tests. Exploratory factor analysis identified the processing speed factor of interest. We preprocessed BOLD scans using fmriprep, Ciftify, and XCPEngine algorithms. We used 4 different sets of connectivity-based parcellation: 1) MBAR data used to define nodes, with the Power (2011) atlas determining node network membership; 2) younger adults’ data used to define nodes (Chan 2014), with the Power (2011) atlas determining node network membership; 3) older adults’ data from a different study (Han 2018) used to define nodes, with the Power (2011) atlas determining node network membership; and 4) MBAR data used to define nodes, with MBAR-based community detection determining node network membership.
Segregation (the balance of within-network and between-network connections) was measured within the association system and three well-characterized networks: Default Mode Network (DMN), Cingulo-Opercular Network (CON), and Fronto-Parietal Network (FPN). Correlations between processing speed and the association system and each network were computed for all 4 node sets.
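The segregation measure described above is commonly defined as the difference between mean within-network and mean between-network connectivity, scaled by the within-network mean; a minimal sketch under that assumed definition (the abstract does not spell out the formula):

```python
import numpy as np

def network_segregation(conn, labels, network):
    """Segregation of one network: (within - between) / within,
    computed from mean connectivity. Definition assumed here,
    not taken verbatim from the study."""
    conn = np.asarray(conn, dtype=float)
    labels = np.asarray(labels)
    in_net = labels == network
    # off-diagonal mask to exclude self-connections
    off_diag = ~np.eye(len(labels), dtype=bool)
    sub = conn[np.ix_(in_net, in_net)]
    within = sub[off_diag[np.ix_(in_net, in_net)]].mean()
    between = conn[np.ix_(in_net, ~in_net)].mean()
    return (within - between) / within
```

For a network whose mean within-network connectivity is 0.8 and mean between-network connectivity is 0.2, segregation is (0.8 - 0.2) / 0.8 = 0.75; higher values indicate a more differentiated network.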
Results:
We replicated prior work: the segregation of the cortical association system, as well as that of the FPN and DMN, had a consistent relationship with processing speed across all node sets (association system range of correlations: r=.294 to .342; FPN: r=.254 to .272; DMN: r=.263 to .273). Additionally, compared to parcellations created with older adults’ data, the parcellation based on younger individuals showed attenuated and less robust findings (association system r=.263, FPN r=.255, DMN r=.263).
Conclusions:
This study shows that network segregation in the oldest-old brain is closely linked with processing speed, and that this relationship is replicable across different node sets created from varied datasets. This work adds to the growing body of knowledge about age-related dedifferentiation by demonstrating the replicability and consistency of the finding that an essential cognitive skill, processing speed, is associated with differentiated functional networks even in very old individuals experiencing successful cognitive aging.
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and endure a decline in adolescence and adulthood. Yet, tremendous heterogeneity exists during the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (known as CHR-P) are at heightened risk for developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P (as a singular group) appear to be relatively stable or ameliorate over time; however, a sizeable proportion has been described as declining on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P based on cognitive trajectories. This will yield a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodromal Longitudinal Study. Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline, 12-, and 24-months. Tested MCCB domains include verbal learning, speed of processing, working memory, and reasoning & problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest on the Wide Range Achievement Test (WRAT4) indexed pre-morbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across two years. One- to 5-class solutions were compared to decide the best solution. This determination depended on goodness-of-fit metrics, interpretability of latent trajectories, and proportion of subgroup membership (>5%).
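The model-selection step described above (comparing 1- to 5-class solutions on fit metrics plus a >5% membership threshold) can be sketched with the standard BIC formula; the candidate numbers in the test below are purely hypothetical, not study results:

```python
import numpy as np

def bic(log_lik, n_params, n_obs):
    """Bayesian information criterion: -2*lnL + p*ln(n). Lower is better."""
    return -2.0 * log_lik + n_params * np.log(n_obs)

def pick_solution(candidates, n_obs, min_prop=0.05):
    """candidates: list of (k, log_lik, n_params, class_proportions).
    Discard solutions whose smallest class holds < min_prop of the
    sample, then return the k with the lowest BIC."""
    admissible = [(bic(ll, p, n_obs), k)
                  for k, ll, p, props in candidates
                  if min(props) >= min_prop]
    return min(admissible)[1]
```

A 3-class fit with a marginally better likelihood is rejected here if one class captures under 5% of participants, mirroring the selection rule stated in the abstract.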
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. For individual domains, one-class solutions also best fit the trajectories for speed of processing, verbal learning, and working memory domains. Two distinct subgroups were identified on one of the executive functioning domains, reasoning and problem-solving (NAB Mazes). The sample divided into unimpaired performance with mild improvement over time (Class I, 74%) and persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found for biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories across time in terms of general cognition and most individual domains. In contrast, two distinct subgroups were observed on a higher-order cognitive skill involving planning and foresight, and they notably exist independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P using a different cognitive battery (Allott et al., 2022). Findings inform which individuals at CHR-P may be most likely to benefit from cognitive remediation and can shed light on the substrates of deficits by establishing meaningful subtypes.
Cognitive training has shown promise for improving cognition in older adults. Aging involves a variety of neuroanatomical changes that may affect response to cognitive training. White matter hyperintensities (WMH), visible on T2-weighted and Fluid Attenuated Inversion Recovery (FLAIR) MRI, are one common age-related brain change. WMH are associated with older age, are suggestive of cerebral small vessel disease, and reflect decreased white matter integrity. Higher WMH load is associated with a reduced threshold for the clinical expression of cognitive impairment and dementia. The effects of WMH on response to cognitive training interventions are relatively unknown. The current study assessed (a) proximal cognitive training performance following a 3-month randomized control trial and (b) the contribution of baseline whole-brain WMH load, defined as total lesion volume (TLV), to pre-post proximal training change.
Participants and Methods:
Sixty-two healthy older adults ages 65-84 completed either adaptive cognitive training (CT; n=31) or educational training control (ET; n=31) interventions. Participants assigned to CT completed 20 hours of attention/processing speed training and 20 hours of working memory training delivered through commercially available Posit Science BrainHQ. ET participants completed 40 hours of educational videos. All participants also underwent sham or active transcranial direct current stimulation (tDCS) as an adjunctive intervention, although not a variable of interest in the current study. Multimodal MRI scans were acquired during the baseline visit. T1- and T2-weighted FLAIR images were processed using the Lesion Segmentation Tool (LST) for SPM12. The Lesion Prediction Algorithm of LST automatically segmented brain tissue and calculated lesion maps. A lesion threshold of 0.30 was applied to calculate TLV. A log transformation was applied to TLV to normalize the distribution of WMH. Repeated-measures analysis of covariance (RM-ANCOVA) assessed pre/post change in proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures in the CT group compared to their ET counterparts, controlling for age, sex, years of education and tDCS group. Linear regression assessed the effect of TLV on post-intervention proximal composite and sub-composite performance, controlling for baseline performance, intervention assignment, age, sex, years of education, multisite scanner differences, estimated total intracranial volume, and binarized cardiovascular disease risk.
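A minimal sketch of the TLV step described above, assuming the 0.30 threshold is applied to a voxel-wise lesion probability map and voxel counts are converted to a volume before the log transform (log1p is used here for illustration; the study's exact transform and units are not specified):

```python
import numpy as np

def total_lesion_volume(prob_map, voxel_vol_ml, threshold=0.30):
    """Binarize a lesion probability map at `threshold`, sum the
    surviving voxels into a total lesion volume (TLV), and return
    both TLV and a log-transformed TLV for normalizing skewed
    WMH distributions. Parameter names are illustrative."""
    lesion_voxels = np.count_nonzero(np.asarray(prob_map) >= threshold)
    tlv = lesion_voxels * voxel_vol_ml
    return tlv, np.log1p(tlv)  # log1p keeps zero-lesion maps defined
```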
Results:
RM-ANCOVA revealed two-way group × time interactions such that those assigned to cognitive training demonstrated greater improvement on proximal composite (Total Training Composite) and sub-composite (Processing Speed Training Composite, Working Memory Training Composite) measures compared to their ET counterparts. Multiple linear regression showed that higher baseline TLV was associated with lower pre-post change on the Processing Speed Training sub-composite (β = -0.19, p = 0.04) but not on other composite measures.
Conclusions:
These findings demonstrate the utility of cognitive training for improving post-intervention proximal performance in older adults. Additionally, pre-post proximal change appears particularly sensitive to white matter hyperintensity load for processing speed training, but not working memory training. These data suggest that TLV may serve as an important factor for consideration when planning processing speed-based cognitive training interventions for remediation of cognitive decline in older adults.
Aging is associated with disruptions in functional connectivity within the default mode (DMN), frontoparietal control (FPCN), and cingulo-opercular (CON) resting-state networks. Greater within-network connectivity predicts better cognitive performance in older adults. Therefore, strengthening network connectivity, through targeted intervention strategies, may help prevent age-related cognitive decline or progression to dementia. Small studies have demonstrated synergistic effects of combining transcranial direct current stimulation (tDCS) and cognitive training (CT) on strengthening network connectivity; however, this association has yet to be rigorously tested on a large scale. The current study leverages longitudinal data from the first-ever Phase III clinical trial for tDCS to examine the efficacy of an adjunctive tDCS and CT intervention on modulating network connectivity in older adults.
Participants and Methods:
This sample included 209 older adults (mean age = 71.6) from the Augmenting Cognitive Training in Older Adults multisite trial. Participants completed 40 hours of CT over 12 weeks, which included 8 attention, processing speed, and working memory tasks. Participants were randomized into active or sham stimulation groups, and tDCS was administered during CT daily for two weeks, then weekly for 10 weeks. For both stimulation groups, two electrodes in saline-soaked 5 x 7 cm sponges were placed at F3 (cathode) and F4 (anode) using the 10-20 measurement system. The active group received 2 mA of current for 20 minutes. The sham group received 2 mA for 30 seconds, then no current for the remaining 20 minutes.
Participants underwent resting-state fMRI at baseline and post-intervention. The CONN toolbox was used to preprocess imaging data and conduct region-of-interest (ROI-ROI) connectivity analyses. The Artifact Detection Toolbox, using intermediate settings, identified outlier volumes. Two participants were excluded for having greater than 50% of volumes flagged as outliers. ROI-ROI analyses modeled the interaction between tDCS group (active versus sham) and occasion (baseline connectivity versus post-intervention connectivity) for the DMN, FPCN, and CON, controlling for age, sex, education, site, and adherence.
Results:
Compared to sham, the active group demonstrated ROI-ROI increases in functional connectivity within the DMN following intervention (left temporal to right temporal [T(202) = 2.78, pFDR < 0.05] and left temporal to right dorsal medial prefrontal cortex [T(202) = 2.74, pFDR < 0.05]). In contrast, compared to sham, the active group demonstrated ROI-ROI decreases in functional connectivity within the FPCN following intervention (left dorsal prefrontal cortex to left temporal [T(202) = -2.96, pFDR < 0.05] and left dorsal prefrontal cortex to left lateral prefrontal cortex [T(202) = -2.77, pFDR < 0.05]). There were no significant interactions detected for CON regions.
Conclusions:
These findings (a) demonstrate the feasibility of modulating network connectivity using tDCS and CT and (b) provide important information regarding the pattern of connectivity changes occurring at these intervention parameters in older adults. Importantly, the active stimulation group showed increases in connectivity within the DMN (a network particularly vulnerable to aging and implicated in Alzheimer’s disease) but decreases in connectivity between left frontal and temporal FPCN regions. Future analyses from this trial will evaluate the association between these changes in connectivity and cognitive performance post-intervention and at a one-year timepoint.
Individuals living with HIV may experience cognitive difficulties or marked declines known as HIV-Associated Neurocognitive Disorder (HAND). Cognitive difficulties have been associated with worse outcomes for people living with HIV; therefore, accurate cognitive screening and identification is critical. One potentially sensitive but underutilized marker of cognitive impairment is intra-individual variability (IIV). Cognitive IIV is the dispersion of scores across tasks in neuropsychological assessment. In individuals living with HIV, greater cognitive IIV has been associated with cortical atrophy, poorer cognitive functioning with more rapid declines, and greater difficulties in daily functioning. Studies examining the use of IIV in clinical neuropsychological testing are limited, and few have examined IIV in the context of a single neuropsychological battery designed for culturally diverse or at-risk populations. To address these gaps, this study aimed to examine IIV profiles of individuals living with HIV and individuals who inject drugs, utilizing the Neuropsi, a standardized neuropsychological instrument for Spanish-speaking populations.
Participants and Methods:
Spanish-speaking adults residing in Puerto Rico (n=90) who are HIV positive and who inject drugs (HIV+I), HIV negative and who inject drugs (HIV-I), HIV positive who do not inject drugs (HIV+), or healthy controls (HC) completed the Neuropsi battery as part of a larger research protocol. The Neuropsi produces 3 index scores representing cognitive domains of memory, attention/memory, and attention/executive functioning. Total battery and within-index IIV were calculated by dividing the standard deviation of T-scores by mean performance, resulting in a coefficient of variance (CoV). Group differences on overall test battery mean CoV (OTBMCoV) were investigated. To examine unique profiles of index-specific IIV, a cluster analysis was performed for each group.
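The CoV computation described above is straightforward; a minimal sketch, using hypothetical T-scores rather than study data:

```python
import statistics

def cov(t_scores):
    """Coefficient of variance: SD of subtest T-scores divided by their
    mean, quantifying dispersion (IIV) across a battery or within an index."""
    return statistics.stdev(t_scores) / statistics.mean(t_scores)

# Hypothetical T-scores for one participant across four subtests
participant_scores = [45, 55, 40, 60]
participant_cov = cov(participant_scores)
```

Per-participant values computed this way would then be averaged within each group for comparisons such as the OTBMCoV ANOVA.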
Results:
Results of a one-way ANOVA indicated significant between group differences on OTBMCoV (F[3,86]=6.54, p<.001). Post-hoc analyses revealed that HIV+I (M=.55, SE=.07, p=.003), HIV-I (M=.50, SE=.03, p=.001), and HIV+ (M=.48, SE=.02, p=.002) had greater OTBMCoV than the HC group (M=.30, SE=.02). To better understand sources of IIV within each group, cluster analysis of index specific IIV was conducted. For the HIV+ group, 3 distinct clusters were extracted: 1. High IIV in attention/memory and attention/executive functioning (n=3, 8%); 2. Elevated memory IIV (n=21, 52%); 3. Low IIV across all indices (n=16, 40%). For the HIV-I group, 2 distinct clusters were extracted: 1. High IIV across all 3 indices (n=7, 24%) and 2. Low IIV across all 3 indices (n=22, 76%). For the HC group, 3 distinct clusters were extracted: 1. Very low IIV across all 3 indices (n=5, 36%); 2. Elevated memory IIV (n=6, 43%); 3. Elevated attention/executive functioning IIV with very low attention/memory and memory IIV (n=3, 21%). Sample size of the HIV+I group was insufficient to extract clusters.
Conclusions:
Current findings support IIV in the Neuropsi test battery as a clinically sensitive marker of cognitive impairment in Spanish-speaking individuals living with HIV or who inject drugs. Furthermore, the distinct IIV cluster types identified between groups can help to better understand specific sources of variability. Implications for clinical assessment in prognosis and etiological considerations are discussed.
Nonpathological aging has been linked to decline in both verbal and visuospatial memory abilities in older adults. Disruptions in resting-state functional connectivity within well-characterized, higher-order cognitive brain networks have also been coupled with poorer memory functioning in healthy older adults and in older adults with dementia. However, there is a paucity of research on the association between higher-order functional connectivity and verbal and visuospatial memory performance in the older adult population. The current study examines the association between resting-state functional connectivity within the cingulo-opercular network (CON), frontoparietal control network (FPCN), and default mode network (DMN) and verbal and visuospatial learning and memory in a large sample of healthy older adults. We hypothesized that greater within-network CON and FPCN functional connectivity would be associated with better immediate verbal and visuospatial memory recall. Additionally, we predicted that within-network DMN functional connectivity would be associated with better delayed verbal and visuospatial memory recall. This study provides insight into whether within-network CON, FPCN, or DMN functional connectivity is associated with verbal and visuospatial memory abilities in later life.
Participants and Methods:
330 healthy older adults between 65 and 89 years old (mean age = 71.6 ± 5.2) were recruited at the University of Florida (n = 222) and the University of Arizona (n = 108). Participants underwent resting-state fMRI and completed verbal memory (Hopkins Verbal Learning Test - Revised [HVLT-R]) and visuospatial memory (Brief Visuospatial Memory Test - Revised [BVMT-R]) measures. Immediate (total) and delayed recall scores on the HVLT-R and BVMT-R were calculated using each test manual’s scoring criteria. Learning ratios on the HVLT-R and BVMT-R were quantified by dividing the number of stimuli (verbal or visuospatial) learned between the first and third trials by the number of stimuli not recalled after the first learning trial. CONN Toolbox was used to extract average within-network connectivity values for CON, FPCN, and DMN. Hierarchical regressions were conducted, controlling for sex, race, ethnicity, years of education, number of invalid scans, and scanner site.
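The learning-ratio calculation described above can be sketched as follows, with hypothetical recall scores (the HVLT-R presents 12 words and the BVMT-R 6 figures):

```python
def learning_ratio(trial1_recall, trial3_recall, total_stimuli):
    """Stimuli gained between trials 1 and 3, divided by the stimuli
    not recalled after trial 1 (i.e., the room left to learn)."""
    return (trial3_recall - trial1_recall) / (total_stimuli - trial1_recall)

# Hypothetical HVLT-R performance: 5 of 12 words on trial 1, 10 on trial 3
hvlt_lr = learning_ratio(trial1_recall=5, trial3_recall=10, total_stimuli=12)
```

This normalization credits gains relative to how much remained unlearned, so a participant starting at 5/12 is not penalized against one starting at 10/12.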
Results:
Greater CON connectivity was significantly associated with better HVLT-R immediate (total) recall (β = 0.16, p = 0.01), HVLT-R learning ratio (β = 0.16, p = 0.01), BVMT-R immediate (total) recall (β = 0.14, p = 0.02), and BVMT-R delayed recall performance (β = 0.15, p = 0.01). Greater FPCN connectivity was associated with better BVMT-R learning ratio (β = 0.13, p = 0.04). HVLT-R delayed recall performance was not associated with connectivity in any network, and DMN connectivity was not significantly related to any measure.
Conclusions:
Connectivity within CON demonstrated a robust relationship with multiple components of memory function across both verbal and visuospatial domains. In contrast, FPCN evidenced a relationship only with visuospatial learning, and DMN was not significantly associated with any memory measure. These data suggest that CON may be a valuable target in longitudinal studies of age-related memory change, as well as a possible target for future non-invasive interventions to attenuate memory decline in older adults.
Injection drug use is a significant public health crisis with adverse health outcomes, including increased risk of human immunodeficiency virus (HIV) infection. Comorbidity of HIV and injection drug use is highly prevalent in the United States and disproportionately elevated in surrounding territories such as Puerto Rico. While both HIV status and injection drug use are independently known to be associated with cognitive deficits, the interaction of these effects remains largely unknown. The aim of this study was to determine how HIV status and injection drug use are related to cognitive functioning in a group of Puerto Rican participants. Additionally, we investigated the degree to which type and frequency of substance use predict cognitive abilities.
Participants and Methods:
96 Puerto Rican adults completed the Neuropsi Attention and Memory-3rd Edition battery for Spanish-speaking participants. Injection substance use over the previous 12 months was also obtained via clinical interview. Participants were categorized into four groups based on HIV status and injection substance use in the last 30 days (HIV+/injector, HIV+/non-injector, HIV-/injector, HIV-/non-injector). One-way analysis of variance (ANOVA) was conducted to determine differences between groups on each index of the Neuropsi battery (Attention and Executive Function; Memory; Attention and Memory). Multiple linear regression was used to determine whether type and frequency of substance use predicted performance on these indices while considering HIV status.
Results:
The one-way ANOVAs revealed significant differences (ps < 0.01) between the healthy control group and all other groups across all indices. No significant differences were observed among the other groups. More frequent injection drug use, regardless of the substance, was associated with lower combined attention and memory performance relative to injecting less than monthly (monthly: p = 0.04; 2-3x daily: p < 0.01; 4-7x daily: p = 0.02; 8+ times daily: p < 0.01). Both minimal and heavy daily use predicted poorer memory performance (p = 0.02 and p = 0.01, respectively). Heavy heroin use predicted poorer attention and executive functioning (p = 0.04). Heroin use also predicted lower performance on tests of memory when used monthly (p = 0.049) and daily or almost daily (2-6x weekly: p = 0.04; 4-7x daily: p = 0.04). Finally, moderate injection of heroin predicted lower scores on attention and memory (weekly: p = 0.04; 2-6x weekly: p = 0.048). Heavy combined heroin and cocaine use predicted worse memory performance (p = 0.03) and combined attention and memory performance (p = 0.046). HIV status was not a moderating factor in any circumstance.
Conclusions:
As predicted, residents of Puerto Rico who do not inject substances and are HIV-negative performed better in domains of memory, attention, and executive function than those living with HIV and/or injecting substances. There was no significant difference among the affected groups in cognitive ability. As expected, daily injection of substances predicted worse performance on tasks of memory. Heavy heroin use predicted worse performance on executive function and memory tasks, while heroin-only and combined heroin and cocaine use predicted worse memory performance. Overall, the type and frequency of substance use are more predictive of cognitive functioning than HIV status.
This study examined physicians’ reasoning about obtaining transesophageal echocardiography (TEE) in cases of Staphylococcus aureus bacteremia (SAB). In 221 cases of SAB over 5 years, the most common reasons for not performing TEE were clinical response to antibiotics, negative transthoracic echocardiography (TTE) results, and the expectation that TEE would not change management.
Clinical implementation of risk calculator models in the clinical high-risk for psychosis (CHR-P) population has been hindered by heterogeneous risk distributions across study cohorts, which may be attributable to pre-ascertainment illness progression. To examine this, we tested whether the duration of attenuated psychotic symptom (APS) worsening prior to baseline moderated performance of the North American Prodrome Longitudinal Study 2 (NAPLS2) risk calculator. We also examined whether rates of cortical thinning, another marker of illness progression, bolstered clinical prediction models.
Methods
Participants from both the NAPLS2 and NAPLS3 samples were classified as either ‘long’ or ‘short’ symptom duration based on time since APS increase prior to baseline. The NAPLS2 risk calculator model was applied to each of these groups. In a subset of NAPLS3 participants who completed follow-up magnetic resonance imaging scans, change in cortical thickness was combined with the individual risk score to predict conversion to psychosis.
Results
The risk calculator models achieved similar performance across the combined NAPLS2/NAPLS3 sample [area under the curve (AUC) = 0.69], the long duration group (AUC = 0.71), and the short duration group (AUC = 0.71). The shorter duration group was younger and had higher baseline APS than the longer duration group. The addition of cortical thinning improved the prediction of conversion significantly for the short duration group (AUC = 0.84), with a moderate improvement in prediction for the longer duration group (AUC = 0.78).
Conclusions
These results suggest that early illness progression differs among CHR-P patients, is detectable with both clinical and neuroimaging measures, and could play an essential role in the prediction of clinical outcomes.
The social defeat hypothesis (SDH) suggests that a chronic experience of social defeat increases the likelihood of the development of psychosis. The SDH indicates that a negative experience of exclusion leads to an increase in the baseline activity of the mesolimbic dopamine system (MDS), which in turn leads to the onset of psychosis. Social defeat models have previously been produced using animal models and preclinical literature; however, these theories have not fully been tested in human clinical samples. There have been studies implying changes in brain structure due to social defeat interactions; however, research evidence is varied.
Objectives
This study aims to determine whether exposure to social defeat (SoDe) has an impact on brain structure. Furthermore, we hope to understand whether these changes are relevant to other mental health disorders.
Methods
698 participants (506 no SoDe, 191 SoDe) between the ages of 15 and 41 were recruited from the PRONIA-FP7 study. SoDe was measured with the self-report questionnaires ‘Bullying Scale’ and ‘The Everyday Discrimination Scale’. T1-weighted structural MRI data were processed; five two-sample t-test analyses were carried out to compare grey matter volume (GMV) differences in the entire sample and between the four groups.
Results
The voxel-based morphometry (VBM) analysis showed significant group interactions in the right thalamus proper when comparing participants who had experienced SoDe to participants who had not, across all four groups, along with differences in left cerebral white matter. In the recent-onset psychosis (ROP) subgroup, significant group interactions were found in the left cerebellum white matter, right cerebral white matter, left cerebral white matter, and right thalamus proper.
Conclusions
The findings suggest significant group interactions in the thalamus and cerebral white matter. This is in keeping with previous research suggesting volumetric changes in the thalamus due to stress and psychosis. Similarly, for white matter there is some evidence suggesting differences due to SoDe and psychosis. However, research in this area is scarce and findings are mixed, so the evidence remains inconclusive. In the ROP group analysis, significant group interactions were present in the cerebellum due to SoDe experience. The cerebellum has been implicated in social interaction, higher-order cognition, working memory, cognitive flexibility, and psychotic symptoms; given these varied findings, its role in SoDe within the ROP population remains an open question. Nonetheless, this large-scale study presents interesting novel findings and opens the way to a new area of research. Further analyses will explore the relationship between groups on markers of stress (CRP) and neuroinflammation as potential mediators of the environmental effects of SoDe.
To characterize residential social vulnerability among healthcare personnel (HCP) and evaluate its association with severe acute respiratory coronavirus virus 2 (SARS-CoV-2) infection.
Design:
Case–control study.
Setting:
This study analyzed data collected in May–December 2020 through sentinel and population-based surveillance in healthcare facilities in Colorado, Minnesota, New Mexico, New York, and Oregon.
Participants:
Data from 2,168 HCP (1,571 cases and 597 controls from the same facilities) were analyzed.
Methods:
HCP residential addresses were linked to the social vulnerability index (SVI) at the census tract level, which represents a ranking of community vulnerability to emergencies based on 15 US Census variables. The primary outcome was SARS-CoV-2 infection, confirmed by positive antigen or real-time reverse-transcriptase–polymerase chain reaction (RT-PCR) test on nasopharyngeal swab. Significant differences by SVI in participant characteristics were assessed using the Fisher exact test. Adjusted odds ratios (aOR) with 95% confidence intervals (CIs) for associations between case status and SVI, controlling for HCP role and patient care activities, were estimated using logistic regression.
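An adjusted odds ratio and its Wald confidence interval are obtained by exponentiating the fitted logistic-regression coefficient and its interval endpoints; a minimal sketch, where the coefficient and standard error are hypothetical values used only to illustrate the transformation:

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Exponentiate a logistic-regression coefficient and its Wald
    interval endpoints to obtain an odds ratio with a 95% CI."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical coefficient for residence in a high-SVI census tract
or_est, ci_lo, ci_hi = odds_ratio_with_ci(beta=0.565, se=0.128)
```

A CI that excludes 1.0 on the odds-ratio scale corresponds to a coefficient whose interval excludes 0 on the log-odds scale.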
Results:
Significantly higher proportions of certified nursing assistants (48.0%) and medical assistants (44.1%) resided in high SVI census tracts, compared to registered nurses (15.9%) and physicians (11.6%). HCP cases were more likely than controls to live in high SVI census tracts (aOR, 1.76; 95% CI, 1.37–2.26).
Conclusions:
These findings suggest that residing in more socially vulnerable census tracts may be associated with SARS-CoV-2 infection risk among HCP and that residential vulnerability differs by HCP role. Efforts to safeguard the US healthcare workforce and advance health equity should address the social determinants that drive racial, ethnic, and socioeconomic health disparities.