Attentional impairments are common in dementia with Lewy bodies and its prodromal stage, mild cognitive impairment (MCI) with Lewy bodies (MCI-LB). People with MCI may be capable of compensating for subtle attentional deficits in most circumstances, so these may present as occasional lapses of attention. We aimed to assess the utility of a continuous performance task (CPT), which requires sustained attention for several minutes, for measuring attentional performance in MCI-LB in comparison with MCI due to Alzheimer’s disease (MCI-AD), and to identify any performance deficits that emerged with sustained effort.
We included longitudinal data on a CPT sustained attention task for 89 participants with MCI-LB or MCI-AD and 31 healthy controls, estimating ex-Gaussian response time parameters and rates of omission and commission errors. Performance trajectories were estimated both cross-sectionally (intra-task progress from start to end) and longitudinally (change in performance over years).
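The ex-Gaussian decomposition of response times mentioned above (a Gaussian component with mean mu and spread sigma, plus an exponential tail tau often linked to attentional lapses) can be sketched as follows. This is an illustrative simulation, not the study's data or code; all parameter values are invented.

```python
# Illustrative sketch (not the study's code): fitting ex-Gaussian
# parameters (mu, sigma, tau) to simulated response times, as is
# commonly done for CPT data. All values here are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated response times (s): Gaussian component plus exponential tail
mu_true, sigma_true, tau_true = 0.40, 0.05, 0.15
rts = rng.normal(mu_true, sigma_true, 2000) + rng.exponential(tau_true, 2000)

# scipy parameterises the ex-Gaussian as exponnorm(K, loc, scale),
# where K = tau / sigma, loc = mu, scale = sigma
K, loc, scale = stats.exponnorm.fit(rts)
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
```

A larger fitted tau would indicate a heavier slow tail of responses, the pattern expected from occasional attentional lapses rather than uniform slowing.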
While response times in successful trials were broadly similar, with slight slowing associated with clinical parkinsonism, those with MCI-LB made considerably more errors. Omission errors were more common throughout the task in MCI-LB than MCI-AD (OR 2.3, 95% CI: 1.1–4.7), while commission errors became more common after several minutes of sustained attention. Within MCI-LB, omission errors were more common in those with clinical parkinsonism (OR 1.9, 95% CI: 1.3–2.9) or cognitive fluctuations (OR 4.3, 95% CI: 2.2–8.8).
Sustained attention deficits in MCI-LB may emerge in the form of attentional lapses leading to omissions, and a breakdown in inhibitory control leading to commission errors.
Subtle changes in memory, attention, and spatial navigation abilities have been associated with preclinical Alzheimer disease (AD). The current study examined whether baseline AD biomarkers are associated with self- and informant-reported decline in memory, attention, and spatial navigation.
Clinically normal (Clinical Dementia Rating Scale (CDR®) = 0) adults aged 56–93 (N = 320) and their informants completed the memory, divided attention, and visuospatial abilities (which assesses spatial navigation) subsections of the Everyday Cognition Scale (ECog) annually for an average of 4 years. Biomarker data were collected within (±) 2 years of baseline (i.e., cerebrospinal fluid (CSF) p-tau181/Aβ42 ratio and hippocampal volume). Clinical progression was defined as CDR > 0 at the time of the final available ECog.
Self- and informant-reported memory, attention, and spatial navigation significantly declined over time (ps < .001). Baseline AD biomarkers were significantly associated with self- and informant-reported decline in cognitive ability (ps < .030), with the exception of p-tau181/Aβ42 ratio and self-reported attention (p = .364). Clinical progression did not significantly moderate the relationship between AD biomarkers and decline in self- or informant-reported cognitive ability (ps > .062). Post-hoc analyses indicated that biomarker burden was also associated with self- and informant-reported decline in total ECog (ps < .002), and again clinical progression did not significantly moderate these relationships (ps > .299).
AD biomarkers at baseline may indicate risk of decline in self- and informant-reported change in memory, attention, and spatial navigation ability. As such, subjectively reported decline in these domains may have clinical utility in tracking the subtle cognitive changes associated with the earliest stages of AD.
Our institution sought to evaluate our antimicrobial stewardship empiric treatment recommendations for Salmonella. Results from 36 isolates demonstrated reduced susceptibilities to fluoroquinolones with 1 isolate susceptible only to ceftriaxone. Analysis supports the current recommendation of empiric ceftriaxone therapy for severe infection and updated recommendation for sulfamethoxazole-trimethoprim in non-severe infections.
Microbial processing of soil organic matter is a significant driver of C cycling, yet we lack an understanding of what shapes the turnover of this large terrestrial pool. In part, this is due to limited options for accurately identifying the source of C assimilated by microbial communities. Laboratory incubations are the most common method for this; however, they can introduce artifacts due to sample disruption and processing and can take months to produce sufficient CO2 for analysis. We present a biomass extraction method which allows for the direct 14C analysis of microbial biomolecules and compare the results to laboratory incubations. In the upper 50 cm of the soil profile, the Δ14C from incubations was indistinguishable from that of extracted microbial biomass. Below 50 cm, the Δ14C of the biomass was more depleted than that of the incubations, either due to the stimulation of labile C decomposition in the incubations, the inclusion of biomolecules from non-living cells in the biomass extractions, or differences in C used for assimilation versus respiration. Our results suggest that measurement of Δ14C of microbial biomass extracts can be a useful alternative to soil incubations.
Data from a national survey of 348 U.S. sports field managers were used to examine the effects of participation in Cooperative Extension events on the adoption of turfgrass weed management practices. Of the respondents, 94% attended at least one event in the previous three years. Of this 94%, 97% reported adopting at least one practice as a result of knowledge gained at an Extension turfgrass event. Half of the respondents adopted four or more practices; a third adopted five or more practices. Non-chemical, cultural practices were the most-adopted practices (65% of respondents). Multiple regression analysis was used to examine factors explaining practice adoption and Extension event attendance. Compared to attending one event, attending three events increased total adoption by an average of one practice. Attending four or more events increased total adoption by two practices. Attending four or more events (compared to one event) increased the odds of adopting six individual practices by 3- to 6-fold, depending on the practice. This suggests practice adoption could be enhanced by encouraging repeat attendance among past Extension event attendees. Manager experience was a statistically significant predictor of the number of Extension events attended, but a poor direct predictor of practice adoption. Experience does not appear to increase adoption directly, but indirectly, via its impact on Extension event attendance. In addition to questions about weed management generally, the survey asked questions about annual bluegrass management, specifically. Respondents were asked to rank seven sources of information for their helpfulness in managing annual bluegrass. There was no single dominant information source, but Extension was ranked as the most helpful more than any other source (by 22% of the respondents) and was ranked among the top three by 53%, closely behind field representative/local distributor sources at 54%.
The view advanced by Madole & Harden falls back on the dogma of a gene as a DNA sequence that codes for a fixed product with an invariant function regardless of temporal and spatial contexts. This outdated perspective entrenches the metaphor of genes as static units of information and glosses over developmental complexities.
Preclinical Alzheimer disease (AD) has been associated with subtle changes in memory, attention, and spatial navigation abilities. The current study examined whether self- and informant-reported domain-specific cognitive changes are sensitive to AD-associated biomarkers.
Clinically normal adults aged 56–93 and their informants completed the memory, divided attention, and visuospatial abilities (which assesses spatial navigation) subsections of the Everyday Cognition Scale (ECog). Reliability and validity of these subsections were examined using Cronbach’s alpha and confirmatory factor analysis. Logistic regression was used to examine the ability of ECog subsections to predict AD-related biomarkers (cerebrospinal fluid (CSF) p-tau181/Aβ42 ratio (N = 371) or hippocampal volume (N = 313)). Hierarchical logistic regression was used to examine whether the self-reported subsections continued to predict biomarkers when controlling for depressive symptomatology if available (N = 197). Additionally, logistic regression was used to examine the ability of neuropsychological composites assessing the same or similar cognitive domains as the subsections (memory, executive function, and visuospatial abilities) to predict biomarkers to allow for comparison of the predictive ability of subjective and objective measures.
All subsections demonstrated appropriate reliability and validity. Self-reported memory (with outliers removed) was the only significant predictor of AD biomarker positivity (i.e., CSF p-tau181/Aβ42 ratio; p = .018) but was not significant when examined in the subsample with depressive symptomatology available (p = .517). Self-reported memory (with outliers removed) was a significant predictor of CSF p-tau181/Aβ42 ratio biomarker positivity when the objective memory composite was included in the model.
ECog subsections were not robust predictors of AD biomarker positivity.
Over the past decade, transdiagnostic indicators in relation to neurobiological processes have provided extensive insight into youth’s risk for psychopathology. During development, exposure to childhood trauma and dysregulation (i.e., so-called AAA symptomatology: anxiety, aggression, and attention problems) puts individuals at a disproportionate risk for developing psychopathology and altered network-level neural functioning. Evidence for the latter has emerged from resting-state fMRI studies linking mental health symptoms and aberrations in functional networks (e.g., cognitive control (CCN), default mode networks (DMN)) in youth, although few of these investigations have used longitudinal designs. Herein, we leveraged a three-year longitudinal study to identify whether traumatic exposures and concomitant dysregulation trigger changes in the developmental trajectories of resting-state functional networks involved in cognitive control (N = 190; 91 females; time 1 Mage = 11.81). Findings from latent growth curve analyses revealed that greater trauma exposure predicted increasing connectivity between the CCN and DMN across time. Greater levels of dysregulation predicted reductions in within-network connectivity in the CCN. These findings presented in typically developing youth corroborate connectivity patterns reported in clinical populations, suggesting there is predictive utility in using transdiagnostic indicators to forecast alterations in resting-state networks implicated in psychopathology.
In recent years, there has been significant momentum in applying deep learning (DL) to machine health monitoring (MHM). It has been widely claimed that DL methodologies are superior to more traditional techniques in this area. This paper investigates this claim by analysing a real-world dataset of helicopter sensor faults provided by Airbus. Specifically, we address the problem of unsupervised classification of machine sensor health. In a 2019 worldwide competition hosted by Airbus, Fujitsu Systems Europe (FSE) won first prize by achieving an F1-score of 93% using a DL model based on generative adversarial networks (GAN). In another comprehensive study, various modified and existing image encoding methods were compared for the convolutional auto-encoder (CAE) model; the best classification result was achieved using the scalogram as the image encoding, with an F1-score of 91%. In this paper, we use these two studies as benchmarks and compare them with basic statistical analysis methods and the one-class support vector machine (SVM). Our comparative study demonstrates that while DL-based techniques have great potential, they are not always superior to traditional methods. We therefore recommend that all future published studies applying DL methods to MHM include appropriately selected traditional reference methods wherever possible.
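A traditional baseline of the kind the paper advocates can be sketched in a few lines: a one-class SVM fitted only to healthy sensor signals, flagging degraded ones as outliers. The signals and summary features below are synthetic stand-ins, not the Airbus helicopter dataset or the paper's feature set.

```python
# Sketch of a traditional baseline of the kind compared in the paper:
# a one-class SVM fitted to healthy sensor signals flags degraded ones
# as outliers. Signals and features here are synthetic stand-ins,
# not the Airbus helicopter dataset.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)

def summarise(signals):
    # Per-signal statistical features: mean, standard deviation, peak-to-peak
    return np.column_stack([signals.mean(axis=1),
                            signals.std(axis=1),
                            np.ptp(signals, axis=1)])

healthy = rng.normal(0.0, 1.0, size=(200, 500))  # nominal sensor traces
faulty = rng.normal(0.0, 3.0, size=(50, 500))    # noisier, degraded sensors

scaler = StandardScaler().fit(summarise(healthy))
clf = OneClassSVM(nu=0.05, gamma="scale")
clf.fit(scaler.transform(summarise(healthy)))

pred = clf.predict(scaler.transform(summarise(faulty)))  # -1 marks outliers
detection_rate = float((pred == -1).mean())
```

Because the model sees only healthy data at training time, no fault labels are needed, which matches the unsupervised setting of the benchmark studies.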
This paper considers the structure and priorities of the Carthaginian state in its imperial endeavours in both North Africa and across the Mediterranean, focusing especially on the well-documented period of the Punic Wars (264–146 BC). It suggests that Carthaginian constitutional structures, in particular the split between civil shofetim (‘judges’) and military rabbim (‘generals’), impacted the strategic outlook and marginal bellicosity of the city, making it less competitive against its primary peer-rival in the Western Mediterranean, Rome.
Key theoretical frameworks have proposed that examining the impact of exposure to specific dimensions of stress at specific developmental periods is likely to yield important insight into processes of risk and resilience. Utilizing a sample of N = 549 young adults who provided a detailed retrospective history of their lifetime exposure to numerous dimensions of traumatic stress and ratings of their current trauma-related symptomatology via completion of an online survey, here we test whether an individual’s perception of their lifetime stress as either controllable or predictable buffered the impact of exposure on trauma-related symptomatology assessed in adulthood. Further, we tested whether this moderation effect differed when evaluated in the context of early childhood, middle childhood, adolescence, and young adulthood stress. Consistent with hypotheses, results highlight both stressor controllability and stressor predictability as buffering the impact of traumatic stress exposure on trauma-related symptomatology and suggest that the potency of this buffering effect varies across unique developmental periods. Leveraging dimensional ratings of lifetime stress exposure to probe heterogeneity in outcomes following stress – and, critically, considering interactions between dimensions of exposure and the developmental period when stress occurred – is likely to yield increased understanding of risk and resilience following traumatic stress.
The worldwide spread of the COVID-19 pandemic affected all major sectors, including higher education. The measures to contain this deadly disease led to the closure of universities across the globe, introducing several changes in students’ academic and social experience. Over the last two years, self-isolation, together with the difficulties linked to online teaching and learning, has amplified the psychological burden and mental health vulnerability of students.
We aimed to explore in depth students’ feelings and perspectives regarding the impact of COVID-19 on their mental health, and to compare these findings between students in Italy and the UK.
Data were drawn from the qualitative arm of “the CAMPUS study”, a large ongoing project to longitudinally assess the mental health of university students enrolled at the University of Milano-Bicocca (Unimib, Italy) and the University of Surrey (UoS, Guildford, UK). We conducted in-depth interviews through the Microsoft Teams online platform between September 2021 and April 2022, and thematically analysed the transcripts.
A total of 33 students (15 from Unimib and 18 from UoS), with a wide range of sociodemographic characteristics, were interviewed. Four themes were identified: i) impact of COVID-19 on students’ mental health; ii) causes of poor mental health; iii) most vulnerable subgroups; iv) coping strategies.
Anxiety symptoms, social anxiety, and stress were frequently reported as negative effects of the pandemic, while the main sources of poor mental health were loneliness, excessive time spent online, unhealthy management of space and time, poor organisation of and communication with the university, low motivation, and uncertainty about the future. Freshers, international and off-campus students, as well as both extremely extroverted and extremely introverted individuals, represented the most vulnerable populations because of their extensive exposure to loneliness. Common coping strategies in the sample included taking time for oneself, family support, and mental health support.
Some differences were found when comparing students from Italy and the UK. While at Unimib the impact of COVID-19 on mental health was mainly described in relation to academic worries and the inadequate organisation of the university system, UoS students, accustomed to the conviviality of campus life, explained these effects as a result of the drastic loss of social connectedness.
The current study highlights the key role of mental health support for university students, mainly during crisis times, and calls for measures to improve communication between students and the educational institution, as well as to encourage social connectedness.
Traumatic brain injury (TBI) is highly prevalent in prison populations, with an estimated prevalence of 51–82% according to a 2018 review. TBI has been linked to higher rates of interpersonal violence, recidivism, suicide, higher drop-out rates in rehabilitation programmes, and a lower age of first conviction. Attention deficit hyperactivity disorder (ADHD) has been shown to be associated with an increased risk of interpersonal violence and with previous TBI. Little is known about the prevalence of TBI or ADHD amongst inpatients in secure psychiatric settings in the UK. We aimed to estimate the prevalence of TBI and ADHD in inpatients admitted to a psychiatric intensive care unit (PICU) and to low and medium secure units across three London mental health NHS trusts.
60 male participants were identified through prospective purposive sampling. Three questionnaires were administered: the Brain Injury Screening Index (BISI), the Adult ADHD Self-Report Scale v1.1 (ASRS), and the Brief Barkley Adult ADHD Rating Scale (B-BAARS). We also reviewed participants’ medical records for age, psychiatric diagnoses, level of education, convictions for violent and/or non-violent offences, number of admissions, and length of current admission. Ethical approval was granted by the local research ethics committee.
67.8% of participants screened positive for a history of head injury, and 68.3% and 32.2% screened positive on the ASRS and B-BAARS respectively. 38.33% recorded more than one head injury on the BISI. The most commonly recorded psychiatric diagnoses were schizophrenia (43.33%), schizoaffective disorder (23.33%), bipolar affective disorder (11.67%), and unspecified non-organic psychosis (10.00%). Screening positive on the ASRS was associated with screening positive for previous head injury on the BISI (χ2 test, p = 0.01). No other statistical associations were identified.
A relatively high proportion of participants in this population screened positive for head injury and ADHD. A history of head injury was associated with positive screening on the ASRS, consistent with previously reported associations between these conditions in other populations. However, a similar relationship was not seen with the B-BAARS, and it is notable that fewer participants in the sample screened positive on the B-BAARS than on the ASRS. Few (n = 5) patients were able to provide detailed descriptions of head injuries using the BISI, suggesting that the BISI may not be a suitable screening tool in this specific population.
The ability to accurately identify human gait intent is a challenge relevant to the success of many applications in robotics, including, but not limited to, assistive devices. Most existing intent identification approaches, however, are either sensor-specific or use a pattern-recognition approach that requires large amounts of training data. This paper introduces a real-time walking speed intent identification algorithm based on the Mahalanobis distance that requires minimal training data. This data efficiency is enabled by making the simplifying assumption that each time step of walking data is independent of all other time steps. The accuracy of the algorithm was analyzed through human-subject experiments that were conducted using controlled walking speed changes on a treadmill. Experimental results confirm that the model used for intent identification converges quickly (within 5 min of training data). On average, the algorithm successfully detected the change in desired walking speed within one gait cycle, and achieved up to 87% accuracy in responding with the correct intent category of speed up, slow down, or no change. The findings also show that the accuracy of the algorithm improves with the magnitude of the speed change, and that speed increases were more easily detected than speed decreases.
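The core of a Mahalanobis-distance detector of this kind can be sketched briefly: fit a mean and covariance to nominal-speed data, then score each new time step by its distance from that distribution. The two gait features, the numbers, and the decision rule below are hypothetical illustrations, not the paper's algorithm.

```python
# Minimal sketch of Mahalanobis-distance-based detection under the
# paper's simplifying assumption that time steps are independent.
# The two gait features and all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

# Training data: per-time-step feature vectors from nominal-speed walking
train = rng.multivariate_normal([1.2, 0.9],
                                [[0.01, 0.0], [0.0, 0.01]], size=500)
mean = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

def mahalanobis(x):
    # Distance of a new feature vector from the nominal distribution
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# A sample far from the nominal distribution suggests a change in desired
# walking speed; a tuned threshold would trigger the intent decision.
nominal_sample = np.array([1.21, 0.91])
speed_up_sample = np.array([1.60, 1.20])
```

Because only a mean and covariance must be estimated, very little training data is needed, which is the data-efficiency argument made in the abstract.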
The antipsychotic aripiprazole is often used in the treatment of first-episode psychosis. Measuring aripiprazole blood levels provides an objective measure of treatment adherence, but this currently involves taking a venous blood sample and sending to a laboratory for analysis.
To detail the development, validation and utility of a new point of care (POC) test for finger-stick capillary blood concentrations of aripiprazole.
Analytical performance (sensitivity, precision, recovery and linearity) of the assay were established using spiked whole blood and control samples of varying aripiprazole concentration. Assay validation was performed over a 14-month period starting in July 2021. Eligible patients were asked to provide a finger-stick capillary sample in addition to their usual venous blood sample. Capillary blood samples were tested by the MyCare™ Insite POC analyser, which provided measurement of aripiprazole concentration in 6 min, and the venous blood sample was tested by the standard laboratory method.
A total of 101 patients agreed to measurements by the two methods. Venous blood aripiprazole concentrations as assessed by the laboratory method ranged from 17 to 909 ng/mL, and from 1 to 791 ng/mL using POC testing. The correlation coefficient between the two methods (r) was 0.96 and there was minimal bias (slope 0.91, intercept 4 ng/mL).
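The agreement statistics above (Pearson r, and the slope and intercept of a regression of POC against laboratory values) can be reproduced in outline as follows. The paired concentrations are simulated with an assumed bias and noise level, not the study's data.

```python
# Sketch of the method-comparison statistics reported above: Pearson r,
# plus the slope/intercept of a fit of POC against laboratory values.
# The paired concentrations are simulated, not study data.
import numpy as np

rng = np.random.default_rng(7)

lab = rng.uniform(20, 900, size=100)                 # laboratory values, ng/mL
poc = 0.91 * lab + 4 + rng.normal(0, 25, size=100)   # POC with slight bias

r = float(np.corrcoef(lab, poc)[0, 1])               # correlation coefficient
slope, intercept = np.polyfit(lab, poc, 1)           # proportional/constant bias
```

A slope near 1 and an intercept near 0 indicate minimal proportional and constant bias between the two methods, which is the interpretation given in the abstract.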
The MyCare Insite POC analyser is sufficiently accurate and reliable for clinical use. The availability of this technology will improve the assessment of adherence to aripiprazole and the optimisation of aripiprazole dosing.
We introduce new data resources to enable spatial and nonspatial research on Canadian elections, electoral history and political geography. These include a comprehensive set of distinct identification codes for every federal electoral district in Canada from 1867 to the present, a complete set of digital boundary files for these electoral districts, historical census data aggregated to federal electoral districts, and tools to connect our district identification codes to federal election results. After describing the construction and content of these new resources, we provide an example of their use in a comparative-historical analysis of district compactness in Canada and the United States. We find that, in contrast to the United States, postwar institutional changes to district boundary-drawing processes had little effect on district compactness in Canada.
The incompressible motion of viscoplastic fluid between two semi-infinite rigid plates, hinged at their ends and rotating towards one another at constant angular velocity, generates self-similar flow fields because there is no externally imposed length scale in the absence of inertia. The magnitude of the strain rate scales with the angular velocity of the plates and the dimensionless deviatoric stresses are functions only of the polar angle and a dimensionless measure of the yield stress; they are independent of the radial distance from the corner. These flows feature unyielded regions adjacent to the boundaries for sufficiently large angles between the plates. Moreover, when the dimensionless yield stress is large, there are viscoplastic boundary layers that are attached to the boundary or the plug, the asymptotic structures of which are constructed.
Mild cognitive impairment (MCI) is an etiologically nonspecific diagnosis including a broad spectrum of cognitive decline between normal aging and dementia. Several large-scale cohort studies have found sex effects on neuropsychological test performance in MCI. The primary aim of the current project was to examine sex differences in neuropsychological profiles in a clinically diagnosed MCI sample using clinical and research diagnostic criteria.
The current study includes archival data from 349 patients (age M = 74.7; SD = 7.7) who underwent an outpatient neuropsychological evaluation and were diagnosed with MCI. Raw scores were converted to z-scores using normative datasets. Sex differences in neurocognitive profiles including severity, domain-specific composites (memory, executive functioning/information processing speed, and language), and modality-specific learning curves (verbal, visual) were examined using Analysis of Variance, Chi-square analyses, and linear mixed models. Post hoc analyses examined whether sex effects were uniform across age and education brackets.
Females exhibited worse non-memory domain and test-specific cognitive performances compared to males with otherwise comparable categorical MCI criteria and global cognition as measured via screening and composite scores. Analysis of learning curves showed additional sex-specific advantages (visual: males > females; verbal: females > males) not captured by MCI subtypes.
Our results highlight sex differences in a clinical sample with MCI. The emphasis of verbal memory in the diagnosis of MCI may result in diagnosis at more advanced stages for females. Additional investigation is needed to determine whether these profiles confer greater risk for progressing to dementia or are confounded by other factors (e.g., delayed referral, medical comorbidities).
Mental health and functional difficulties are highly comorbid across neurological disorders, but supportive care options are limited. This randomised controlled trial assessed the efficacy of a novel transdiagnostic internet-delivered psychological intervention for adults with neurological disorders.
221 participants with a confirmed diagnosis of epilepsy, multiple sclerosis, Parkinson's disease, or an acquired brain injury were allocated to either an immediate treatment group (n = 115) or treatment-as-usual waitlist control (n = 106). The intervention, the Wellbeing Neuro Course, was delivered online via the eCentreClinic website. The Course includes six lessons, based on cognitive behavioural therapy, delivered over 10 weeks with support from a psychologist via email and telephone. Primary outcomes were symptoms of depression (PHQ-9), anxiety (GAD-7) and disability (WHODAS 2.0).
215 participants commenced the trial (treatment n = 111; control n = 104) and were included in intention-to-treat analysis. At post-treatment, we observed significant between-group differences in depression (PHQ-9; difference = 3.07 [95% CI 2.04–4.11], g = 0.62), anxiety (GAD-7; difference = 1.87 [0.92–2.81], g = 0.41) and disability (WHODAS 2.0; difference = 3.08 [1.09–5.06], g = 0.31), all favouring treatment (ps < 0.001). Treatment-related effects were maintained at 3-month follow-up. Findings were achieved with minimal clinician time (average of 95.7 min [s.d. = 59.3] per participant), highlighting the public health potential of this approach to care. No adverse treatment events were reported.
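The between-group effect sizes (g) reported above are standardized mean differences with a small-sample correction (Hedges' g). A minimal illustration of that computation follows; the group means, SDs, and outcome values are invented, not trial data.

```python
# Illustrative computation of a between-group effect size (Hedges' g)
# of the kind reported above; group means, SDs and sizes are invented,
# not trial data.
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    # Pooled standard deviation, then small-sample correction factor J
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical post-treatment scores: control mean 12 (SD 5, n = 104)
# versus treatment mean 9 (SD 5, n = 111)
g = hedges_g(12.0, 5.0, 104, 9.0, 5.0, 111)
```

By convention, g around 0.2, 0.5, and 0.8 are read as small, medium, and large effects, which frames the depression result (g = 0.62) as a medium-to-large effect.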
Internet-delivered psychological interventions could be a suitable model of accessible supportive care for patients with neurological disorders.
The attempt to provide a firm scientific basis for understanding consciousness is now in full swing, with special contributions from two areas. One is experimental: brain imaging is providing ever increasing detail of the brain structures used by humans (and other animals) as they solve a variety of tasks, including those of higher cognition. The other is theoretical: the discipline of neural networks is allowing models of these cognitive processes to be constructed and tested against the available data. In particular, a control framework can be created to give a global view of the brain. The highest cognitive process, that of consciousness, is naturally a target for such experimentation and modelling. This paper reviews available data and related models leading to the central representation, which involves particular brain regions and functional processing. Principles of consciousness, which have great relevance to the question in the title, are thereby deduced. The requisite neuronal systems needed to provide animal experience, and the problem of assessing the quality and quantity of such experience, will then be considered. In conclusion, animal consciousness is seen to exist broadly across those species with the requisite control structures; the level of pain and other sensations depends in an increasingly well-defined manner on the complexity of the cerebral apparatus.