From early on, infants show a preference for infant-directed speech (IDS) over adult-directed speech (ADS), and exposure to IDS has been correlated with language outcome measures such as vocabulary. The present multi-laboratory study explores this issue by investigating whether there is a link between early preference for IDS and later vocabulary size. Infants’ preference for IDS was tested as part of the ManyBabies 1 project, and follow-up CDI data were collected from a subsample of this dataset at 18 and 24 months. A total of 341 (18 months) and 327 (24 months) infants were tested across 21 laboratories. In neither preregistered analyses with North American and UK English, nor exploratory analyses with a larger sample did we find evidence for a relation between IDS preference and later vocabulary. We discuss implications of this finding in light of recent work suggesting that IDS preference measured in the laboratory has low test-retest reliability.
Mixing describes the process by which solutes evolve from an initial heterogeneous state to uniformity under the stirring action of a fluid flow. Fluid stretching forms thin scalar lamellae that coalesce due to molecular diffusion. Owing to the linearity of the advection–diffusion equation, coalescence can be envisioned as an aggregation process. Here, we demonstrate that in smooth two-dimensional chaotic flows, mixing obeys a correlated aggregation process, where the spatial distribution of the number of lamellae in aggregates is highly correlated with their elongation, and is set by the fractal properties of the advected material lines. We show that the presence of correlations makes mixing less efficient than a completely random aggregation process because lamellae with similar elongations and scalar levels tend to remain isolated from each other. We show that correlated aggregation is uniquely determined by a single exponent that quantifies the effective number of random aggregation events. These findings expand aggregation theories to a larger class of systems, which have relevance to various fundamental and applied mixing problems.
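The linearity argument at the heart of this picture is compact enough to state explicitly. Below is the standard advection–diffusion equation for a passive scalar that the abstract refers to; the aggregation view follows because solutions superpose.

```latex
% Advection-diffusion equation for a passive scalar concentration
% c(x, t) stirred by a velocity field v(x, t) with diffusivity D:
\frac{\partial c}{\partial t} + \mathbf{v}\cdot\nabla c = D\,\nabla^{2} c
% Linearity in c means the field produced by many overlapping lamellae
% is the sum of the individual lamellar solutions, which is why
% coalescence can be treated as an aggregation process.
```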
Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search, predetermined inclusion criteria, and two independent coding teams, who evaluated studies and included all task-based, whole-brain fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxelwise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p<0.05-0.0001) and multiple comparisons correction.
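As a rough illustration of the MKDA idea only (not the published implementation, which adds study-level thresholding, ensemble thresholds and FWE correction via Monte Carlo simulation), the core statistic is the weighted proportion of studies reporting a peak near each voxel. All coordinates and weights below are invented.

```python
import numpy as np

def mkda_density(peak_lists, shape, radius=3, weights=None):
    """Toy version of the MKDA statistic: the weighted proportion of
    studies reporting a peak within `radius` voxels of each voxel.
    Thresholding and multiple-comparisons correction are omitted."""
    if weights is None:
        weights = np.ones(len(peak_lists))
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()

    grid = np.indices(shape).reshape(3, -1).T     # voxel coordinates
    density = np.zeros(shape)
    for peaks, w in zip(peak_lists, weights):
        indicator = np.zeros(shape, dtype=bool)   # one map per study
        for peak in peaks:
            dist = np.linalg.norm(grid - np.asarray(peak), axis=1)
            indicator.ravel()[dist <= radius] = True   # spherical kernel
        density += w * indicator                  # each study counts once
    return density

# Three invented "studies", each a list of (x, y, z) peak coordinates.
studies = [[(5, 5, 5)], [(5, 6, 5), (2, 2, 2)], [(5, 5, 6)]]
print(mkda_density(studies, shape=(10, 10, 10)).max())  # 1.0 = full agreement
```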
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05-0.0001; FWE-corrected) patterns of abnormal activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
Different fertilization strategies can be adopted to optimize the productive components of integrated crop–livestock systems. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM/ha per year compared to conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in the SF–N than the CF–N treatment, although the average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, under both conventional and system fertilization, resulted in higher crude protein content, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
To test the hypothesis that exposure to peer self-harm induces adolescents’ urges to self-harm and that this is influenced by individual suggestibility.
Methods:
We recruited 97 UK-based adults aged 18–25 years with a recent history of self-harm, measuring baseline suggestibility (Resistance to Peer Influence; RPI) and perceived ability to control urges to self-harm (using an adapted item from the Self-Efficacy to Resist Suicidal Action scale; SEASA) before and after two self-harm vignettes featuring named peers from the participant’s social network (to simulate exposure to peer non-suicidal self-harm) and after a wash-out exposure. We used paired t-tests to compare mean SEASA scores pre- and post-exposure, and linear regression to test for an association between RPI and change in SEASA scores pre- and post-exposure.
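For concreteness, the two analyses described here map onto standard statistical routines. The sketch below uses simulated scores; all numbers are illustrative, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 97  # sample size reported in the abstract

# Simulated SEASA-style scores before and after vignette exposure and
# a simulated RPI suggestibility score; all values are illustrative.
pre = rng.normal(7.0, 1.5, n)
post = pre - rng.normal(0.6, 1.2, n)   # scores drop after exposure
rpi = rng.normal(0.0, 1.0, n)

# Paired t-test on pre- vs post-exposure scores, as described above.
t, p = stats.ttest_rel(pre, post)

# Linear regression of the pre-post change on suggestibility (RPI).
change = pre - post
result = stats.linregress(rpi, change)
print(f"paired t({n - 1}) = {t:.2f}, p = {p:.4f}; "
      f"RPI slope = {result.slope:.3f}, p = {result.pvalue:.3f}")
```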
Results:
Perceived ability to control urges to self-harm was significantly reduced following exposure to peer self-harm (t(96) = 4.02, p < 0.001, mean difference = 0.61; 95% CI = 0.31, 0.91), but was not significantly different from baseline after exposure to a wash-out. We found no association between suggestibility and change in urges to self-harm after exposure to peer self-harm.
Conclusion:
Our findings support social influences on self-harm in a sample of young adults, regardless of their individual degree of suggestibility.
Area-based conservation is a widely used approach for maintaining biodiversity, and there are ongoing discussions over what is an appropriate global conservation area coverage target. To inform such debates, it is necessary to know the extent and ecological representativeness of the current conservation area network, but this is hampered by gaps in existing global datasets. In particular, although data on privately and community-governed protected areas and other effective area-based conservation measures are often available at the national level, it can take many years to incorporate these into official datasets. This suggests a complementary approach is needed based on selecting a sample of countries and using their national-scale datasets to produce more accurate metrics. However, every country added to the sample increases the costs of data collection, collation and analysis. To address this, here we present a data collection framework underpinned by a spatial prioritization algorithm, which identifies a minimum set of countries that are also representative of 10 factors that influence conservation area establishment and biodiversity patterns. We then illustrate this approach by identifying a representative set of sampling units that cover 10% of the terrestrial realm, which included areas in only 25 countries. In contrast, selecting 10% of the terrestrial realm at random included areas across a mean of 162 countries. These sampling units could be the focus of future data collation on different types of conservation area. Analysing these data could produce more rapid and accurate estimates of global conservation area coverage and ecological representativeness, complementing existing international reporting systems.
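The minimum-set selection problem described here is, at its core, a set-cover problem. The paper's spatial prioritization algorithm is more sophisticated, but a greedy sketch conveys the idea; the planning units and factor levels below are invented for illustration.

```python
# Greedy set-cover sketch: pick planning units until every level of
# every factor is represented. Unit names and factor levels are toy.
units = {
    "unit_A": {"biome:forest", "governance:state"},
    "unit_B": {"biome:desert", "governance:private"},
    "unit_C": {"biome:forest", "governance:community"},
    "unit_D": {"biome:desert", "governance:state", "governance:community"},
}
targets = set().union(*units.values())

selected, covered = [], set()
while covered != targets:
    # Greedily take the unit adding the most uncovered factor levels.
    best = max(units, key=lambda u: len(units[u] - covered))
    selected.append(best)
    covered |= units.pop(best)
print(selected)   # a small set of units covering all factor levels
```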
In this work, we present a methodology and a corresponding code-base for constructing mock integral field spectrograph (IFS) observations of simulated galaxies in a consistent and reproducible way. Such methods are necessary to improve collaboration and comparison between observational and theoretical results, and to accelerate our understanding of how the kinematics of galaxies evolve over time. This code, SimSpin, is an open-source package written in R, with an API that allows the code to be used from any programming language. Documentation and individual examples can be found at the open-source website connected to the online repository. SimSpin is already being utilised by international IFS collaborations, including SAMI and MAGPI, for generating comparable data sets from a diverse suite of cosmological hydrodynamical simulations.
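To give a flavour of what building a mock IFS kinematic map involves (this is a generic conceptual sketch, not SimSpin's actual API), the essential step is binning particle data onto a spaxel grid and computing flux-weighted line-of-sight velocities:

```python
import numpy as np

# Star "particles" with sky positions (x, y), line-of-sight velocities
# v and luminosities lum; a toy rotating disc, not simulation output.
rng = np.random.default_rng(1)
n = 50_000
x, y = rng.normal(0, 1.0, n), rng.normal(0, 0.5, n)
v = 100.0 * np.tanh(x) + rng.normal(0, 20.0, n)
lum = rng.lognormal(0.0, 0.5, n)

# Bin particles onto a 32x32 spaxel grid spanning +/- 3 units.
nbins, extent = 32, 3.0
ix = np.clip(((x + extent) / (2 * extent) * nbins).astype(int), 0, nbins - 1)
iy = np.clip(((y + extent) / (2 * extent) * nbins).astype(int), 0, nbins - 1)

flux = np.zeros((nbins, nbins))
mom1 = np.zeros((nbins, nbins))
np.add.at(flux, (iy, ix), lum)        # total flux per spaxel
np.add.at(mom1, (iy, ix), lum * v)    # flux-weighted velocity sum
vel_map = np.where(flux > 0, mom1 / np.maximum(flux, 1e-12), np.nan)
# vel_map is the kind of line-of-sight velocity map a mock IFS
# pipeline would compare against observed galaxy kinematics.
```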
Delirium is characterised by an acute, fluctuating change in cognition, attention and awareness (Wilson et al. Nature Reviews 2020; 6). This presentation can make the diagnosis of delirium extremely challenging for clinicians (Gofton, Canadian Journal of Neurological Sciences 2011; 38: 673–680). It is commonly reported in hospitalised patients, particularly in those over the age of sixty-five (NICE. Delirium: prevention, diagnosis and management. 2010).
Objectives
Our aim is to identify which investigations and cognitive assessments are completed prior to a referral to the liaison psychiatry services in patients with symptoms of delirium.
Methods
Referrals (N = 6012) to the liaison psychiatry team at Croydon University Hospital made between April and September 2022 were screened. Search parameters used to identify referrals related to a potential diagnosis of delirium were selected by the authors; the terms used were confusion, delirium, agitation, aggression, cognitive decline or impairment, disorientation and challenging behaviour. Data were collected on the completion rates of investigations for delirium as advised by the NICE clinical knowledge summaries. Further data were gathered on neuroimaging (CT or MRI), cognitive assessment tools (MoCA/MMSE) and delirium screening tools (4AT/AMTS).
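The screening step amounts to flagging free-text referrals containing any of the chosen terms. A minimal sketch follows; the referral texts are invented, and the term list is adapted from the abstract.

```python
# Toy illustration of keyword-based referral screening.
TERMS = ["confusion", "delirium", "agitation", "aggression",
         "cognitive decline", "cognitive impairment",
         "disorientation", "challenging behaviour"]

referrals = [
    "Acute confusion overnight, fluctuating attention",
    "Low mood, requesting counselling",
    "Agitation and disorientation on the ward",
]
flagged = [r for r in referrals if any(t in r.lower() for t in TERMS)]
print(len(flagged))   # 2 of the 3 toy referrals match
```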
Results
The study sample identified 114 referrals (61 males and 53 females), with 82% over 65 years at the time of referral. In 96% of referrals, U&E and CRP were performed. Sputum culture (1%), urine toxin screen (4%) and free T3/4 (8%) were the tests utilised the least. Neuroimaging was completed in 41% of referrals (see Graph 1 for a full breakdown of results).
A formal cognitive assessment or delirium screening tool was completed in 32% of referrals; the AMTS and 4AT tools were documented for 65% and 24%, respectively. A total of 19 referrals explicitly stated the patient was suspected to have dementia. A delirium screening tool was documented in 47% of these cases; however, a formal cognitive assessment was documented in only 5% of these patients.
Following psychiatric assessment, 47% of referrals were confirmed as delirium.
Conclusions
Our data highlight the low rate of completion of the NICE-recommended delirium screen prior to referral to liaison psychiatry. Effective implementation of a delirium screen and cognitive assessment is paramount to reducing the number of inappropriate psychiatric referrals in hospital and to identifying reversible organic causes of delirium, in turn ensuring their timely treatment and reducing the length of hospital admission.
Transitions into an assisted living home (ALH) are difficult and may impact the well-being of older adults. A thematic analysis guided by grounded theory was employed to better understand how a transition into an ALH influenced older adults’ overall well-being. Individual, face-to-face interviews were conducted with a convenience sample of 14 participants at an ALH in the rural, southeastern U.S. Two central findings that influenced well-being during the transition process were revealed: loss of independence (sub-themes include loss of physical and mental health and loss of driving) and downsizing in space and possessions. The themes support and broaden the Hierarchical Leisure Constraints Theory, a Modified Constraints to Wellbeing model is proposed, and implications for older adult health care practitioners in ALHs are recommended. Further research is needed on the Modified Constraints to Wellbeing model and how to better describe these constraints to older adults’ well-being when relocating into ALHs.
When assessing the evolution of the early Roman Republic, scholars typically designate a break between the fifth/fourth centuries and the end of the fourth century BCE/beginning of the third, based on political, legal, and military milestones. Archaeologists detect a similar break, as members of the new nobilitas turned to architecture as a vehicle for self-representation. Where most scholarship characterizes buildings and the broader cityscape as a reflection of political change, this chapter deploys theories of object agency and object-scapes to argue for their agency in effecting such change. Questioning whether Romans were conscious, at the time, of a new era dawning, I suggest that circumstantial evidence supports a hypothesis that, at least in the later Republic, they were.
Lumateperone (LUMA) is an FDA-approved antipsychotic to treat schizophrenia and depressive episodes associated with bipolar I or bipolar II disorder. An open-label study (Study 303) evaluated the safety and tolerability of LUMA in outpatients with stable schizophrenia who switched from previous antipsychotic (AP) treatment. This post hoc analysis of Study 303 investigated the safety and tolerability of LUMA stratified by previous AP in patients who switched to LUMA treatment for 6 weeks.
Methods
Adult outpatients (≥18 years) with stable schizophrenia were switched from their previous AP to LUMA 42 mg once daily for 6 weeks, followed by a switch to another approved AP for 2 weeks of follow-up. Post hoc analyses were stratified by most common previous AP: risperidone or paliperidone (RIS/PAL); quetiapine (QET); aripiprazole or brexpiprazole (ARI/BRE); olanzapine (OLA). Safety analyses included adverse events (AE), vital signs, and laboratory tests. Efficacy was assessed using the Positive and Negative Syndrome Scale (PANSS) and the Clinical Global Impressions-Severity (CGI-S) scale.
Results
The safety population comprised 301 patients, of whom 235 (78.1%) were previously treated with RIS/PAL (n=95), QET (n=60), ARI/BRE (n=43), or OLA (n=37). Rates of treatment-emergent AEs (TEAEs) while on LUMA were similar between previous AP groups (44.2%–55.8%). TEAEs with incidences of ≥5% in any AP group were dry mouth, somnolence, sedation, headache, diarrhea, cough, and insomnia. Most TEAEs were mild or moderate in severity for all groups. Rates of serious TEAEs were low and similar between groups (0%–7.0%).
In the OLA group, statistically significant (P<.05) decreases from baseline in total cholesterol and low-density lipoprotein cholesterol were observed after switching to LUMA, with significant decreases maintained thereafter on LUMA. Statistically significant decreases in prolactin levels were observed in both the RIS/PAL (P<.0001) and OLA (P<.05) groups. Patients switched from RIS/PAL to LUMA showed significant (P<.05) decreases in body mass index, waist circumference, and weight. At follow-up, 2 weeks after patients switched back from LUMA to another AP, none of the decreases in laboratory parameters or body morphology observed while on LUMA maintained significance.
Those switching from QET had significant improvements from baseline at Day 42 in PANSS Total score (mean change from baseline −3.47; 95% confidence interval [CI] −5.27, −1.68; P<.001) and CGI-S Total score (mean change from baseline −0.24; 95% CI, −0.38, −0.10; P<.01).
Conclusion
In outpatients with stable schizophrenia, LUMA 42 mg treatment was well tolerated in patients switching from a variety of previous APs. Patients switching from RIS/PAL or OLA to LUMA had significant improvements in cardiometabolic and prolactin parameters. These data further support the favorable safety, tolerability, and efficacy of LUMA in patients with schizophrenia.
There is substantial variation in patient symptoms following psychological therapy for depression and anxiety. However, reliance on endpoint outcomes ignores additional interindividual variation during therapy. Knowing a patient's likely symptom trajectories could guide clinical decisions. We aimed to identify latent classes of patients with similar symptom trajectories over the course of psychological therapy and explore associations between baseline variables and trajectory class.
Methods
Patients received high-intensity psychological treatment for common mental health problems at National Health Service Improving Access to Psychological Therapies services in South London (N = 16 258). To identify trajectories, we performed growth mixture modelling of depression and anxiety symptoms over 11 sessions. We then ran multinomial regressions to identify baseline variables associated with trajectory class membership.
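Growth mixture modelling is usually fitted with specialised software that estimates latent classes and growth curves jointly. As a loose stand-in for the idea, the two-stage sketch below fits each patient's growth parameters and then clusters them with a Gaussian mixture; all data are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
n_patients, n_sessions = 500, 11   # 11 sessions, as in the abstract

# Simulate symptom scores from three hypothetical trajectory types:
# no change, gradual improvement, fast improvement.
t = np.arange(n_sessions)
slopes = rng.choice([0.0, -0.6, -1.5], size=n_patients)
scores = 18 + slopes[:, None] * t + rng.normal(0, 2, (n_patients, n_sessions))

# Stage 1: least-squares intercept and slope per patient.
X = np.vander(t, 2)                                      # columns: [t, 1]
coefs = np.linalg.lstsq(X, scores.T, rcond=None)[0].T    # (n_patients, 2)

# Stage 2: cluster the growth parameters. True growth mixture models
# estimate classes and curves jointly; this is only an approximation.
gmm = GaussianMixture(n_components=3, random_state=0).fit(coefs)
classes = gmm.predict(coefs)
print(np.bincount(classes))   # patients per trajectory class
```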
Results
Trajectories of depression and anxiety symptoms were highly similar and best modelled by four classes. Three classes started with moderate-severe symptoms and showed (1) no change, (2) gradual improvement, and (3) fast improvement. A final class (4) showed initially mild symptoms and minimal improvement. Within the moderate-severe baseline symptom classes, patients in the two improving classes, as opposed to the no-change class, tended not to be prescribed psychotropic medication or report a disability, and tended to be in employment. Patients showing fast improvement additionally reported lower baseline functional impairment on average.
Conclusions
Multiple trajectory classes of depression and anxiety symptoms were associated with baseline characteristics. Identifying the most likely trajectory for a patient at the start of treatment could inform decisions about the suitability and continuation of therapy, ultimately improving patient outcomes.
While studies from the start of the COVID-19 pandemic have described initial negative effects on mental health and an exacerbation of mental health inequalities, longer-term studies are only now emerging.
Method
In total, 34 465 individuals in the UK completed online questionnaires and were re-contacted over the first 12 months of the pandemic. We used growth mixture modelling to identify trajectories of depression, anxiety and anhedonia symptoms using the 12-month data. We identified sociodemographic predictors of trajectory class membership using multinomial regression models.
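The second analysis step, multinomial regression of trajectory-class membership on sociodemographic predictors, has a direct analogue in standard libraries. The sketch below uses invented predictors and class labels purely for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 2000

# Invented baseline predictors: standardised age, previous mental
# health diagnosis (0/1), student status (0/1).
X = np.column_stack([
    rng.normal(0, 1, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
])
trajectory_class = rng.integers(0, 4, n)   # e.g. four latent classes

# Multinomial regression of class membership on predictors; with
# multiclass labels, the default lbfgs solver fits a multinomial model.
model = LogisticRegression(max_iter=1000).fit(X, trajectory_class)
print(model.coef_.shape)   # one coefficient vector per class: (4, 3)
```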
Results
Most participants had consistently low symptoms of depression or anxiety over the year of assessments (60% and 69%, respectively), and a minority had consistently high symptoms (10% and 15%). We also identified participants who appeared to show improvements in symptoms as the pandemic progressed, and others who showed the opposite pattern, with marked symptom worsening until the second national lockdown. Unexpectedly, most participants showed stable low positive affect, indicating anhedonia, throughout the 12-month period. From regression analyses, younger age, a previous mental health diagnosis, non-binary or self-defined gender, and unemployed or student status were significantly associated with membership of the stable high symptom groups for depression and anxiety.
Conclusions
While most participants showed little change in their depression and anxiety symptoms across the first year of the pandemic, we highlight the divergent responses of subgroups of participants, who fared both better and worse around national lockdowns. We confirm that previously identified predictors of negative outcomes in the first months of the pandemic also predict negative outcomes over a 12-month period.
As part of surveillance of snail-borne trematodiasis in Knowsley Safari (KS), Prescot, United Kingdom, a collection was made in July 2021 of various planorbid (n = 173) and lymnaeid (n = 218) snails. These were taken from 15 purposely selected freshwater habitats. In the laboratory, emergent trematode cercariae, often from single snails, were identified by morphology, with a sub-set of those most accessible later characterized by cytochrome oxidase subunit 1 (cox1) DNA barcoding. Two schistosomatid cercariae were of special note in the context of human cercarial dermatitis (HCD): Bilharziella polonica, emergent from Planorbarius corneus, and Trichobilharzia spp., emergent from Ampullaceana balthica. The former schistosomatid was last reported in the United Kingdom over 50 years ago. From cox1 analyses, the latter likely consisted of two taxa: Trichobilharzia anseri, a first report in the United Kingdom, and a hitherto unnamed genetic lineage with some affiliation to Trichobilharzia longicauda. The chronobiology of emergent cercariae from P. corneus was assessed, and the vertical swimming rate of B. polonica measured. We provide a brief risk appraisal of HCD for public activities typically undertaken within KS educational and recreational programmes.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify key risk factors and examine their effects.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety (Generalised Anxiety Disorder scale, 7 items; GAD-7: −0.33 points) and small increases in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated large and significant increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
Since the advent of direct-acting antiviral therapy, the elimination of hepatitis C virus (HCV) as a public health concern is now possible. However, identification of those who remain undiagnosed, and re-engagement of those who are diagnosed but remain untreated, will be essential to achieve this. We examined the extent of HCV infection among individuals undergoing liver function tests (LFT) in primary care. Residual biochemistry samples for 6007 patients, who had venous blood collected in primary care for LFT between July 2016 and January 2017, were tested for HCV antibody. Through data linkage to national and sentinel HCV surveillance databases, we also examined the extent of diagnosed infection, attendance at specialist services and HCV treatment for those found to be HCV-positive. Overall HCV antibody prevalence was 4.0% and was highest for males (5.0%), those aged 37–50 years (6.2%), and those with an ALT result of 70 or greater (7.1%). Of those testing positive, 68.9% had been diagnosed with HCV in the past, 84.9% of them before the study period. Most (92.5%) of those diagnosed with chronic infection had attended specialist liver services, and while 67.7% had ever been treated, only 38% had successfully cleared infection. More than half of HCV-positive people required assessment, and potentially treatment, for their HCV infection but were not engaged with services during the study period. LFT in primary care are a key opportunity to diagnose, re-diagnose and re-engage patients with HCV infection and highlight the importance of GPs in efforts to eliminate HCV as a public health concern.
Research among non-industrial societies suggests that body kinematics adopted during running vary between groups according to the cultural importance of running. Among groups in which running is common and an important part of cultural identity, runners tend to adopt what exercise scientists and coaches consider to be good technique for avoiding injury and maximising performance. In contrast, among groups in which running is not particularly culturally important, people tend to adopt suboptimal technique. This paper begins by describing key elements of good running technique, including landing with a forefoot or midfoot strike pattern and leg oriented roughly vertically. Next, we review evidence from non-industrial societies that cultural attitudes about running associate with variation in running techniques. Then, we present new data from Tsimane forager–horticulturalists in Bolivia. Our findings suggest that running is neither a common activity among the Tsimane nor is it considered an important part of cultural identity. We also demonstrate that when Tsimane do run, they tend to use suboptimal technique, specifically landing with a rearfoot strike pattern and leg protracted ahead of the knee (called overstriding). Finally, we discuss processes by which culture might influence variation in running techniques among non-industrial societies, including self-optimisation and social learning.
Herbicides that inhibit very-long-chain fatty acids (VLCFAs) have been widely used for preemergence control of annual monocot and small-seeded dicot weed species, such as waterhemp, since their discovery in the 1950s. VLCFA-inhibiting herbicides are often applied in combination with active ingredients that possess residual activity on small-seeded broadleaf weeds, which can make their contribution to preemergence waterhemp control difficult to quantify. Bare-ground field experiments were designed to investigate the efficacy of eight VLCFA-inhibiting herbicides applied at their minimum and maximum labeled rates for control of Illinois waterhemp populations. Four locations were selected, two of which contained previously characterized VLCFA inhibitor–resistant waterhemp populations, in Champaign County (CHR) and McLean County (MCR). The two locations with VLCFA inhibitor–sensitive waterhemp populations were the University of Illinois South Farm in Urbana, IL, and the Orr Research Center in Perry, IL. Soils at the CHR, MCR, and Urbana locations contained greater than 3% organic matter, whereas soil at Perry contained less than 3% organic matter. Non-encapsulated acetochlor and alachlor controlled the CHR and MCR waterhemp populations 28 d after treatment (DAT), whereas the other VLCFA-inhibiting herbicides resulted in 61% and 76% control of the CHR and MCR populations, respectively. In contrast, all VLCFA-inhibiting herbicides resulted in 81% and 88% control of the Perry and Urbana waterhemp populations, respectively, 28 DAT. Waterhemp control decreased by 42 DAT, especially for the VLCFA inhibitor–resistant CHR and MCR populations. Overall, VLCFA-inhibiting herbicides remain effective for controlling sensitive waterhemp populations, but most are not effective for controlling VLCFA inhibitor–resistant populations. Proper herbicide stewardship and integrated weed management practices should be implemented to maintain VLCFA-inhibiting herbicide efficacy for waterhemp management in the future.