Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
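The abstract above rests on polygenic risk scores. As a minimal sketch of the underlying computation (not the authors' pipeline), a PRS is a weighted sum of risk-allele dosages, with weights taken from GWAS effect sizes; the variant IDs, effect sizes and dosages below are entirely illustrative:

```python
# Illustrative sketch: a polygenic risk score (PRS) is the sum over variants
# of (GWAS effect size) x (number of risk alleles carried: 0, 1 or 2).
# All variant IDs and weights here are invented for illustration.
gwas_effects = {
    "rs0001": 0.12,
    "rs0002": -0.05,
    "rs0003": 0.30,
}

def polygenic_risk_score(dosages, effects):
    """Weighted sum of allele dosages over variants present in both maps."""
    return sum(effects[v] * d for v, d in dosages.items() if v in effects)

# One individual's allele counts at each variant (0, 1 or 2 copies).
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(person, gwas_effects)
print(round(score, 2))  # 0.12*2 - 0.05*1 + 0.30*0 = 0.19
```

Real pipelines additionally handle strand flips, linkage-disequilibrium clumping and p-value thresholding before summing.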
Nearly 25% of people with intellectual disability (PwID) have epilepsy, compared with 1% of the UK general population. PwID are commonly excluded from research, which ultimately affects their care. Understanding seizures in PwID is particularly challenging because of reliance on subjective external observation and poor objective validation. Remote electroencephalography (EEG) monitoring could capture objective data, but the particular challenges and implementation strategies for this population need to be understood.
Aim
This co-production study aimed to explore the accessibility and potential impact of a remote, long-term EEG tool (UnEEG 24/7 SubQ) for PwID and epilepsy.
Method
We conducted six 2-hour workshops: three with people with mild intellectual disability and three with families/carers of people with moderate–profound intellectual disability. Brief presentations, easy-read information and model demonstrations were used to explain the problem and the device. A semi-structured guide developed by a communication specialist, together with art-based techniques, facilitated discussion with PwID. For families/carers, active listening was employed. All conversations were recorded and transcribed. Artificial intelligence-based coding and thematic analysis (ATLAS.ti and ChatGPT) were synthesised with manual theming to generate insights.
Results
Co-production included four PwID, five family members and seven care professionals. Three main themes were identified: (1) perceived benefits for improving seizure understanding, informing care and reducing family and carer responsibility to accurately identify seizures; (2) the device was feasible for some PwID but not all; and (3) appropriate person-centred communication is essential for all stakeholders to reduce concerns.
Conclusions
The workshops identified key benefits of, and barriers to, implementing SubQ monitoring for PwID.
Persistent brain fog is common in adults with Post-Acute Sequelae of SARS-CoV-2 infection (PASC), in whom it causes distress and in many cases interferes with performance of instrumental activities of daily living (IADL) and return-to-work. There are no interventions with rigorous evidence of efficacy for this new, often disabling condition. The purpose of this pilot is to evaluate the efficacy, on a preliminary basis, of a new intervention for this condition termed Constraint-Induced Cognitive therapy (CICT). CICT combines features of two established therapeutic approaches: cognitive speed of processing training (SOPT) developed by the laboratory of K. Ball and the Transfer Package and task-oriented training components of Constraint-Induced Movement therapy developed by the laboratory of E. Taub and G. Uswatte.
Participants and Methods:
Participants were more than 3 months past recovery from acute COVID-19 symptoms and had substantial brain fog and impairment in IADL. Participants were randomized to receive CICT immediately or after a 3-month delay. CICT involved 36 hours of outpatient therapy distributed over 4-6 weeks. Sessions had three components: (a) videogame-like training designed to improve how quickly participants process sensory input (SOPT); (b) training on IADLs following shaping principles; and (c) a set of behavioral techniques designed to transfer gains from the treatment setting to daily life, i.e., the Transfer Package. The Transfer Package included (a) negotiating a behavioral contract with participants and one or more family members about the responsibilities of the participants, family members, and treatment team; (b) assigning homework during and after the treatment period; (c) monitoring participants’ out-of-session behavior; (d) supporting problem-solving by participants and family members about barriers to performance of IADL; and (e) making follow-up phone calls. IADL performance, brain fog severity, and cognitive impairment were assessed using validated, trans-diagnostic measures before and after treatment and three months afterwards in the immediate-CICT group, and on parallel occasions in the delayed-CICT group (i.e., waitlist controls).
Results:
To date, five participants have been enrolled in the immediate-CICT group and four in the wait-list group. All had mild cognitive impairment, except for one with moderate impairment in the immediate-CICT group. Immediate-CICT participants, on average, had large reductions in brain fog severity on the Mental Clutter Scale (MCS, range = 0 to 10 points; mean change = -3.7, SD = 2.0); wait-list participants had small increases (mean change = 1.0, SD = 1.4). Notably, all five in the immediate-CICT group had clinically meaningful improvements (i.e., changes > 2 points) in performance of IADL outside the treatment setting, as measured by the Canadian Occupational Performance Measure (COPM) Performance scale; only one in the wait-list group did. The advantage for the immediate-CICT group was very large on both the MCS and COPM (d's = 1.7, p's < .05). At follow-up, the immediate-CICT group's gains were retained or built upon.
Conclusions:
To date, CICT shows high promise as an efficacious therapy for brain fog due to PASC: participants had large, meaningful improvements in IADL performance outside the treatment setting, in addition to large reductions in brain fog severity. These preliminary findings warrant confirmation by a large-scale randomized controlled trial.
CI Cognitive Therapy (CICT) combines behavioral techniques derived from CI Movement Therapy (CIMT), modified to apply to the cognitive domain, with Speed of (Cognitive) Processing Training (SOPT). SOPT is effective in improving cognitive function in the treatment setting and driving ability in everyday situations, but the data concerning the effect of SOPT on other cognition-based instrumental activities of daily living (IADL) in everyday situations are incomplete. The strengths of CIMT, based on its Transfer Package (TP), are that it facilitates 1) transfer of improved function from the treatment setting to IADL in everyday settings, and 2) long-term retention of the improved performance of IADL. This study sought to determine, in a preliminary case series, whether the TP of CI Movement Therapy combined with SOPT would have the same effect on a wide range of impaired cognition-based ADL.
Participants and Methods:
Participants were 6 adults with chronic stroke: mean chronicity = 36.2 months (range = 16-56 months); mean age = 59.7 years (range = 47-55); 1 female; 3 African American and 3 European American. Five had mild cognitive impairment and one had moderate impairment. Participants received 35 hours of outpatient treatment in 10-15 sessions distributed over 2-6 weeks, depending on the participants’ availability. Sessions began with 1 hour of SOPT training, followed by training of cognition-based ADL by the process of shaping, a common method in the behavior-analysis field. Other behavior-analysis methods employed in the TP of CI Movement Therapy were also used, including: 1) behavior contracting; 2) daily assignment of homework; 3) participation of a family member in the training and monitoring process; 4) daily administration of a structured interview assessing the amount and quality of performance of 30 IADL; and 5) problem solving to overcome perceived (or real) barriers to performance of IADL. Participants were given daily homework assignments during follow-up and were contacted in periodic, pre-arranged phone calls to determine status and compliance and to problem-solve.
Results:
All six participants showed marked improvement on the SOPT test, similar to that in the Ball et al. studies. Here, however, transfer to IADL outside the treatment setting was substantial. On the main real-world outcome, the Canadian Occupational Performance Measure (COPM), there were increases of 2.7±1.3 and 2.1±1.6 on the two scales (d's = 1.9 and 1.3, respectively). (Changes on the COPM > 2 points are considered clinically meaningful, and d > 0.8 is considered a large effect.) On two other real-world measures, the Cognitive Task Activity Log (CTAL) and the Inventory of Improved and New Cognitive Activities (INCA), there was a marked increase during the acquisition phase of training. There was no loss in retention over the 6-16 months (mean = 12.2) of follow-up to date. Instead, the INCA showed strong further improvement after the end of treatment-setting training, especially in the New Activities Not Performed Since Before Stroke Onset category, going from a mean of 8.2 after training to 14.6 at the end of follow-up.
Conclusions:
These very preliminary results suggest that CICT may be an efficacious therapy for mild to moderate cognitive impairment in chronic stroke and possibly other disorders.
Load balancing of constrained healthcare resources has become a critical aspect of assuring access to care during periods of pandemic-related surge. These pressures include patient surges, staffing shortages, and limited access to specialty resources. This research focuses on the creation and work of a novel statewide coordination center, the Washington Medical Coordination Center (WMCC), whose primary goal is the load balancing of patients across the healthcare continuum of Washington State.
Methods:
This article discusses the origins, development, and operations of the WMCC including key partners, cooperative agreements, and structure necessary to create a patient load balancing system on a statewide level.
Results:
As of April 21, 2022, the WMCC received 3821 requests from Washington State hospitals. Nearly 90% were received during the pandemic surge. Nearly 75% originated from rural hospitals that are most often limited in their ability to transfer patients when referral centers are also overwhelmed.
Conclusions:
The WMCC served as an effective tool to carry out patient load-balancing activities during the COVID-19 pandemic surge in Washington State. It has proven to be an equity-enhancing, cost-effective means of managing healthcare surge events across a broad geographic region.
Obesity is one of the major contributors to the excess mortality seen in people with severe mental illness (SMI) and in low- and middle-income countries people with SMI may be at an even greater risk. In this study, we aimed to determine the prevalence of obesity and overweight in people with SMI and investigate the association of obesity and overweight with sociodemographic variables, other physical comorbidities, and health-risk behaviours. This was a multi-country cross-sectional survey study where data were collected from 3989 adults with SMI from three specialist mental health institutions in Bangladesh, India, and Pakistan. The prevalence of overweight and obesity was estimated using Asian BMI thresholds. Multinomial regression models were then used to explore associations between overweight and obesity with various potential determinants. There was a high prevalence of overweight (17·3 %) and obesity (46·2 %). The relative risk of having obesity (compared to normal weight) was double in women (RRR = 2·04) compared with men. Participants who met the WHO recommendations for fruit and vegetable intake had 2·53 (95 % CI: 1·65–3·88) times greater risk of having obesity compared to those not meeting them. Also, the relative risk of having obesity in people with hypertension is 69 % higher than in people without hypertension (RRR = 1·69). In conclusion, obesity is highly prevalent in SMI and associated with chronic disease. The complex relationship between diet and risk of obesity was also highlighted. People with SMI and obesity could benefit from screening for non-communicable diseases, better nutritional education, and context-appropriate lifestyle interventions.
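The relative risk ratios (RRRs) reported above come from multinomial regression, but the quantity itself has a simple crude form: the odds of being obese rather than normal weight in an exposed group, divided by the same odds in an unexposed group. A minimal sketch with invented counts (not the study's data):

```python
# Illustrative sketch of a crude relative risk ratio (RRR), the quantity a
# multinomial model reports for a binary exposure. Counts are invented and
# do not come from the study; real RRRs are covariate-adjusted.
def relative_risk_ratio(obese_exp, normal_exp, obese_unexp, normal_unexp):
    """(obese/normal odds in exposed) / (obese/normal odds in unexposed)."""
    return (obese_exp / normal_exp) / (obese_unexp / normal_unexp)

# Hypothetical counts for an exposure such as hypertension:
rrr = relative_risk_ratio(obese_exp=169, normal_exp=100,
                          obese_unexp=100, normal_unexp=100)
print(round(rrr, 2))  # 1.69, i.e. 69% higher relative risk of obesity
```

An RRR of 1.69 reads exactly as in the abstract: a 69% higher relative risk of obesity (versus normal weight) among the exposed.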
It is acknowledged that health technology assessment (HTA) is an inherently value-based activity that makes use of normative reasoning alongside empirical evidence. But the language used to conceptualise and articulate HTA's normative aspects is demonstrably unnuanced, imprecise, and inconsistently employed, undermining transparency and preventing proper scrutiny of the rationales on which decisions are based. This paper – developed through a cross-disciplinary collaboration of 24 researchers with expertise in healthcare priority-setting – seeks to address this problem by offering a clear definition of key terms and distinguishing between the types of normative commitment invoked during HTA, thus providing a novel conceptual framework for the articulation of reasoning. Through application to a hypothetical case, it is illustrated how this framework can operate as a practical tool through which HTA practitioners and policymakers can enhance the transparency and coherence of their decision-making, while enabling others to hold them more easily to account. The framework is offered as a starting point for further discussion amongst those with a desire to enhance the legitimacy and fairness of HTA by facilitating practical public reasoning, in which decisions are made on behalf of the public, in public view, through a chain of reasoning that withstands ethical scrutiny.
Many governments across the world provide extensive funding to national sports teams and individual athletes in pursuit of success at international competitions such as the Olympic Games. One factor that motivates governments to fund national sports teams is the potential to exploit the elevation in nationalistic pride that attends international sporting success. Drawing on research in the psychology of sport, this article contends that politicians can access the ‘reflective glow’ of successful athletes for their political benefit. The statistical correlation between government funding and Olympic success is explored using the basic prisoners’ dilemma to represent the decisions of two governments competing for sports success. While the analysis is simple, we argue that it sheds some light on recent examples and represents a first step in understanding this complex issue.
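The prisoners' dilemma structure described above can be made concrete with a toy payoff matrix (the payoffs are invented for illustration, not taken from the article): each government's best reply is to fund regardless of what the other does, so both end up funding even though mutual restraint would leave both better off.

```python
# Illustrative prisoners' dilemma for two governments deciding whether to
# fund elite sport. Payoffs (row, column) are invented for illustration.
payoffs = {
    ("fund", "fund"):       (1, 1),   # funding race: both pay, no relative edge
    ("fund", "abstain"):    (3, 0),   # sole funder captures the prestige
    ("abstain", "fund"):    (0, 3),
    ("abstain", "abstain"): (2, 2),   # mutual restraint: cheaper for both
}

def best_response(opponent_action):
    """Row player's payoff-maximising reply to a fixed opponent action."""
    return max(["fund", "abstain"],
               key=lambda a: payoffs[(a, opponent_action)][0])

# Funding strictly dominates, so (fund, fund) is the unique Nash equilibrium
# even though both players prefer (abstain, abstain) to it.
print(best_response("fund"), best_response("abstain"))  # fund fund
```

This is the sense in which governments competing for sporting prestige can be locked into escalating funding.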
In 2017 the Scottish Government passed the Child Poverty (Scotland) Act with the commitment to significantly reduce the relative child poverty rate from the current prevailing level of around 25% to 10% by 2030/31. In response, the government introduced the Scottish Child Payment (SCP), which provides a direct transfer to households at a fixed rate per eligible child – currently £25 per week. In this paper we explore, using a micro-to-macro modelling approach, the effectiveness of using the SCP to achieve the Scottish child poverty targets. While we find that the ambitious child poverty targets can technically be met solely using the SCP, the necessary payment of £165 per week, amounting to a total government cost of £3 billion per year, makes the political and economy-wide barriers significant. A key issue with relying only on the SCP is the non-linearity of the response to the payment: as the payment increases, the marginal gain in the reduction of child poverty decreases – this is particularly evident after payments of £80 per week. A ‘policy-mix’ option combining the SCP, targeted cash transfers and other policy levers (such as childcare provision) seems the most promising approach to reaching the child poverty targets.
Suicide is the second leading cause of death in all youth and among adults with bipolar disorder (BD). The risk of suicide in BD is among the highest of all psychiatric conditions. Self-harm, including suicide attempts and non-suicidal self-injury, is a leading risk factor for suicide. Neuroimaging studies suggest reward circuits are implicated in both BD and self-harm; however, studies have yet to examine self-harm related resting-state functional connectivity (rsFC) phenotypes within adolescent BD.
Methods
Resting-state fMRI data were analyzed for 141 adolescents, ages 13–20 years, including 38 with BD and lifetime self-harm (BDSH+), 33 with BD and no self-harm (BDSH−), and 70 healthy controls (HC). The dorsolateral prefrontal cortex (dlPFC), orbitofrontal cortex (OFC) and amygdala were examined as regions of interest in seed-to-voxel analyses. A general linear model was used to explore the bivariate correlations for each seed.
Results
BDSH− had increased positive rsFC between the left amygdala and left lateral occipital cortex, and between the right dlPFC and right frontal pole, and increased negative rsFC between the left amygdala and left superior frontal gyrus compared to BDSH+ and HC. BDSH+ had increased positive rsFC of the right OFC with the precuneus and left paracingulate gyrus compared to BDSH− and HC.
Conclusions
This study provides preliminary evidence of altered reward-related rsFC in relation to self-harm in adolescents with BD. Between-group differences conveyed a combination of putative risk and resilience connectivity patterns. Future studies are warranted to evaluate changes in rsFC in response to treatment and related changes in self-harm.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. Combining these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
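Leave-site-out cross-validation, as used above, holds out every patient from one study site per fold, so performance is always measured on a site the model never saw during training. A minimal sketch of the splitting logic (site labels invented; the authors' actual pipeline is not shown here):

```python
# Illustrative sketch of leave-site-out cross-validation: one fold per study
# site, with that site's patients held out as the test set. Site labels are
# invented for illustration.
def leave_site_out_splits(site_labels):
    """Yield (held_out_site, train_indices, test_indices) per unique site."""
    for held_out in sorted(set(site_labels)):
        test = [i for i, s in enumerate(site_labels) if s == held_out]
        train = [i for i, s in enumerate(site_labels) if s != held_out]
        yield held_out, train, test

labels = ["A", "A", "B", "C", "B", "C"]   # six patients from three sites
for site, train, test in leave_site_out_splits(labels):
    print(site, train, test)
# A [2, 3, 4, 5] [0, 1]
# B [0, 1, 3, 5] [2, 4]
# C [0, 1, 2, 4] [3, 5]
```

In scikit-learn this splitting scheme corresponds to `LeaveOneGroupOut` with site as the group label; the point of the scheme is that per-site batch effects cannot leak from training into evaluation.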
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Conflicts between people with different worldviews, values and perspectives over nature and its conservation can be damaging for both people and nature. Managing such conflicts is therefore a priority and key to effective conservation. In this chapter we outline some current approaches to managing conflicts in conservation. We focus on bringing about fundamental shifts in how the people involved reflect on the real point of conflict and on the paradigms and approaches used to mitigate it, leading to the transformation of institutions and discourses, as well as of the relationships within and between the conflicting parties. We conclude that it is necessary to focus on worldviews, as they can and do shape evidence, institutional arrangements and approaches to conservation, including the way in which conflicts are managed.
The evidence of funerary archaeology, historical sources and poetry has been used to define a ‘heroic warrior ethos’ across Northern Europe during the first millennium AD. In northern Britain, burials of later prehistoric to early medieval date are limited, as are historical and literary sources. There is, however, a rich sculptural corpus, to which a newly discovered monolith with an image of a warrior can now be added. Comparative analysis reveals a materialisation of a martial ideology on carved stone monuments, probably associated with elite cemeteries, highlighting a regional expression of the warrior ethos in late Roman and post-Roman Europe.
Little is known about the types of intestinal parasites that infected people living in prehistoric Britain. The Late Bronze Age archaeological site of Must Farm was a pile-dwelling settlement located in a wetland, consisting of stilted timber structures constructed over a slow-moving freshwater channel. At excavation, sediment samples were collected from occupation deposits around the timber structures. Fifteen coprolites were also hand-recovered from the occupation deposits; using fecal lipid biomarkers, four were identified as human and seven as canine. Digital light microscopy was used to identify preserved helminth eggs in the sediment and coprolites. Eggs of fish tapeworm (Diphyllobothrium latum and Diphyllobothrium dendriticum), Echinostoma sp., giant kidney worm (Dioctophyma renale), probable pig whipworm (Trichuris suis) and Capillaria sp. were found. This is the earliest evidence for fish tapeworm, Echinostoma worm, Capillaria worm and the giant kidney worm so far identified in Britain. It appears that the wetland environment of the settlement contributed to parasite diversity and put the inhabitants at risk of infection by helminth species spread by eating raw fish, frogs or molluscs that flourish in freshwater aquatic environments. Conversely, the wetland may also have protected them from infection by certain geohelminths.