Experience of crisis care may vary across different care models.
To explore the experience of care, 3 months after a crisis, under standard care and under ‘open dialogue’, a peer-supported community service for adults with a recent mental health crisis that centres on open dialogue and involves patients’ social networks.
We conducted semi-structured interviews with 11 participants (6 received open dialogue; 5 received treatment as usual (TAU)) in a feasibility study of open dialogue and analysed the data using a three-step inductive thematic analysis to identify themes that (a) were frequently endorsed and (b) represented the experiences of all participants.
Four themes emerged: (a) feeling able to rely on and access mental health services; (b) supportive and understanding family and friends; (c) having a choice and a voice; and (d) confusion and making sense of experiences. Generally, there was a divergence in experience across the two care models. Open dialogue participants often felt able to rely on and access services and involve their family and friends in their care. TAU participants described a need to rely on services and difficulty when it was not met, needing family and friends for support and wanting them to be more involved in their care. Some participants across both care models experienced confusion after a crisis and described benefits of sense-making.
Understanding crisis care experiences across different care models can inform service development in crisis and continuing mental healthcare services.
Adverse drug reactions (ADRs) are associated with increased morbidity, mortality, and resource utilization. Drug-drug interactions (DDIs) are among the most common causes of ADRs, and estimates suggest that up to 22% of patients take interacting medications. DDIs often arise from the propensity of agents to induce or inhibit the enzymes responsible for metabolising concomitantly administered drugs, a phenomenon further complicated by genetic variants of those enzymes. The aim of this study was to quantify and describe potential drug-drug, drug-gene, and drug-drug-gene interactions in a community-based patient population.
A regional pharmacy with retail outlets in Arkansas provided deidentified prescription data from March 2020 for 4761 individuals. Drug-drug and drug-drug-gene interactions were assessed using the logic incorporated into GenMedPro, a commercially available digital gene-drug interaction software program that incorporates variants of 9 pharmacokinetic (PK) and 2 pharmacodynamic (PD) genes to evaluate DDIs and drug-gene interactions. The data were first assessed for composite drug-drug interaction risk, and each individual was stratified to a risk category with the same logic. To estimate the frequency of potential drug-gene interactions, genotypes were imputed and allocated to the cohort according to each gene’s frequency in the general population; potential genotypes were randomly allocated to the population 100 times in a Monte Carlo simulation. Each potential drug-drug, drug-gene, or drug-drug-gene interaction was characterized as minor, moderate, or major.
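The genotype-allocation step can be sketched as a small Monte Carlo loop. This is a minimal illustration only: the phenotype frequencies and the one-line "interaction" rule below are made-up placeholders, not the GenMedPro logic or real population frequencies.

```python
import random

# Hypothetical phenotype frequencies for a single PK gene
# (illustrative values only, not real population data).
PHENOTYPE_FREQS = {"poor": 0.07, "intermediate": 0.30, "normal": 0.63}

def draw_phenotype(rng):
    """Sample one metaboliser phenotype according to its population frequency."""
    r = rng.random()
    cum = 0.0
    for pheno, freq in PHENOTYPE_FREQS.items():
        cum += freq
        if r < cum:
            return pheno
    return "normal"

def simulate(cohort_drugs, n_sims=100, seed=0):
    """Allocate genotypes to the cohort n_sims times and estimate the
    probability that a patient has a drug-gene interaction. A patient
    'interacts' here if they take a gene-sensitive drug AND are a
    non-normal metaboliser (a toy rule standing in for the real logic)."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_sims):
        hits = sum(
            1
            for takes_sensitive_drug in cohort_drugs
            if takes_sensitive_drug and draw_phenotype(rng) != "normal"
        )
        rates.append(hits / len(cohort_drugs))
    return sum(rates) / n_sims
```

Repeating the random allocation many times, as the study does, gives a stable estimate of the population-level interaction probability rather than a single lucky or unlucky draw.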
Based on prescription data only, the probability of a DDI of any impact (minor, moderate, or major) was 26% (95% CI, 24.8%–27.2%) in the population. This probability increased to 49.6% (95% CI, 48.4%–50.7%) when simulated genetic polymorphisms were also assessed. Considering only major-impact interactions, there was a 7.8% (95% CI, 7.0%–8.5%) probability of a drug-drug interaction, rising to 10.1% (95% CI, 9.5%–10.8%) with the addition of genetic contributions. The probability of a drug-drug-gene interaction of any impact increased with the number of prescribed medications, with approximate probabilities of 77%, 85%, and 94% in patients prescribed 5, 6, or 7+ medications, respectively. When stratified by drug class, antidepressants (19.5%), antiemetics (21.4%), analgesics (16%), antipsychotics (15.6%), and antiparasitics (49.7%) had the highest probabilities of a major drug-drug-gene interaction.
In a community-based population of outpatients, the risk of drug interactions increases when genetic polymorphisms are attributed to the population. These data suggest that pharmacogenetic testing may be useful for predicting drug-drug interactions, drug-gene interactions, and their severity when proactively evaluating patient medication profiles.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Emergency department of a university teaching hospital.
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Among 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
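The reported negative predictive value follows directly from the counts given (661 true negatives out of 690 FebriDx-negative results); a minimal check:

```python
def negative_predictive_value(true_negatives, total_test_negatives):
    """NPV = true negatives / all test-negative results."""
    return true_negatives / total_test_negatives

# Counts from the study: 661 of 690 FebriDx-negative patients
# were also SARS-CoV-2 PCR-negative.
npv = negative_predictive_value(661, 690)  # ≈ 0.958, i.e. 96%
```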
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
Infectious disease outbreaks on the scale of the current coronavirus disease 2019 (COVID-19) pandemic are a new phenomenon in many parts of the world. Many isolation unit designs with corresponding workflow dynamics and personal protective equipment postures have been proposed for each emerging disease at the health facility level, depending on the mode of transmission. However, personnel and resource management at the isolation units for a resilient response will vary by human resource capacity, reporting requirements, and practice setting. This study describes an approach to isolation unit management at a rural Ugandan hospital and shares lessons from the Uganda experience for isolation unit managers in low- and middle-income settings.
This story from the Korean War goes to the heart of the unique bond between Australian and New Zealand soldiers, one cemented in mutual respect, expressed by a fierce rivalry and a steadfastness to stand shoulder-to-shoulder against any foe, perceived or real. The old coat of arms for New Zealand carried the motto ‘Onward’ (also the motto of the 1 New Zealand Expeditionary Force during the First World War and of the 1 Royal New Zealand Infantry Regiment today). It is a motto of modest intent somewhat in keeping with the retiring, nocturnal and flightless kiwi emblazoned on the sleeves of members of the New Zealand Army.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12 January 2020) for RCTs of treatments for depression in primary care that included the Revised Clinical Interview Schedule (CIS-R), the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses in such trials. Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95%CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors: the duration of anxiety; duration of depression; comorbid panic disorder; and a history of antidepressant treatment were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression, clinicians should routinely assess the duration of anxiety, duration of depression, comorbid panic disorder, and history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Social anxiety disorder (SAD) is common. It usually starts in adolescence, and without treatment can disrupt key developmental milestones. Existing generic treatments are less effective for young people with SAD than with other anxiety disorders, but an adaptation of an effective adult therapy (CT-SAD-A) has shown promising results for adolescents.
The aim of this study was to conduct a qualitative exploration to contribute towards the evaluation of CT-SAD-A for adoption into Child and Adolescent Mental Health Services (CAMHS).
We used interpretative phenomenological analysis (IPA) to analyse the transcripts of interviews with a sample of six young people, six parents and seven clinicians who were learning the treatment.
Three cross-cutting themes were identified: (i) endorsing the treatment; (ii) finding therapy to be collaborative and active, challenging but helpful; and (iii) navigating change in a complex setting. Young people and parents found the treatment useful and acceptable, although simultaneously challenging. This was echoed by the clinicians, with particular reference to integrating CT-SAD-A within community CAMHS settings.
The acceptability of the treatment with young people, their parents and clinicians suggests further work is warranted in order to support its development and implementation within CAMHS settings.
Phillips and colleagues claim that the capacity to ascribe knowledge is a “basic” capacity, but most studies reporting linguistic data reviewed by Phillips et al. were conducted in English with American participants – one of more than 6,500 languages currently spoken. We highlight the importance of cross-cultural and cross-linguistic research when one is theorizing about fundamental human representational capacities.
The southern pine beetle, Dendroctonus frontalis Zimmermann (Coleoptera: Curculionidae: Scolytinae), is among the most destructive bark beetle pests of pines (Pinaceae) of the southeast and mid-Atlantic United States of America, Mexico, and Central America. Numerous volatile compounds can stimulate or reduce attraction of the beetle, but efforts to incorporate these into effective, practical technologies for pest management have yielded mixed results. Attractants have been incorporated into lures used in monitoring traps that are employed operationally to forecast outbreaks and detect emerging populations. The attraction inhibitor, verbenone, shows efficacy for suppressing southern pine beetle infestations but has not yet been adopted operationally. No effective semiochemical tree protectant has been developed for the beetle. We discuss complexities in the chemical ecology of the beetle that likely have impeded research and development of semiochemical management tools, and we describe basic science gaps that may hinder further progress if not addressed. We also report some supporting, original experimental data indicating (1) that a verbenone device can inhibit the beetle’s response to sources of attractant in a radius of at least several metres, (2) similar olfactory responses by the beetle to both enantiomers of verbenone, and (3) that pheromone background can cause conflicting results in semiochemical field tests.
Horace Walpole is pivotal to the early Gothic Revival as the author of what has long been hailed as the first Gothic novel, and as the creator of the most influential of all early Gothic Revival houses. This essay explores his intuitively imaginative response to Gothic, and how his love of the decorative profusion and allusive richness that it could offer was played out in his novel The Castle of Otranto (1765) and his play The Mysterious Mother (1768) – as well as in his ‘castle’ at Strawberry Hill. That house, with its subtle management of scale, colour and light, and in the suggestive riches of the collection it contained, created a heady mixture of fantasy and atmosphere, displaying an historically informed but archaeologically unrestrained imagination. These are qualities that it shared with Walpole’s Gothic fictions. There is hardly a feature of Gothic romance that does not appear in Otranto, and its gloomy castle, predatory patriarch and pursued virgin, along with the guilt-tormented Countess and evil friars of The Mysterious Mother, like the Gothic battlements and evocative interiors of Strawberry Hill, engendered a lasting and pervasive progeny.
Although there is some evidence that duration of untreated psychosis (DUP) is geographically stable, few have examined whether the phenomenon is temporally stable. We examined DUP in two cohorts within two discrete time periods (1995–1999 and 2003–2005) spanning a decade in the same geographically defined community psychiatric service with no early intervention programme. Patients were diagnosed by Structured Clinical Interview for DSM (SCID) and we determined the DUP using the Beiser Scale. The DUP of the 240 participants did not differ significantly between study periods.
Traditional dietary assessment methods in research can be challenging, with participant burden to complete an interview, diary, 24 h recall or questionnaire and researcher burden to code the food record to obtain a nutrient breakdown. Self-reported assessment methods are subject to recall and social desirability biases, in addition to selection bias from the nature of volunteering to take part in a research study. Supermarket loyalty card transaction records, linked to back of pack nutrient information, present a novel opportunity to use objective records of food purchases to assess diet at a household level. With a large sample size and multiple transactions, it is possible to review variation in food purchases over time and across different geographical areas.
Materials and methods
This study uses supermarket loyalty card transactions for one retailer's customers in Leeds over 12 months during 2016. Fruit and vegetable purchases were calculated for customers who appear to shop regularly for a ‘complete’ shop, buying from at least 7 of the 11 Living Costs and Food Survey categories. Using the total weight of fruits and vegetables purchased over one year, average portions (80 g) per day, per household were generated. Descriptive statistics of fruit and vegetable purchases by age, gender and Index of Multiple Deprivation of the loyalty card holder were generated. Using Geographical Information Systems, maps of neighbourhood purchases per month of the year were created to visualise variations.
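The portions measure described above reduces to a unit conversion: a year's purchased weight, divided into 80 g portions, divided over 365 days. A minimal sketch (the weight in the example is a hypothetical household total, not study data):

```python
def average_daily_portions(total_weight_g, days=365, portion_g=80):
    """Convert a period's purchased fruit-and-veg weight into average
    80 g portions per day for the household."""
    return total_weight_g / portion_g / days

# Hypothetical example: 102,200 g bought over a year
# works out to 3.5 portions per day.
portions = average_daily_portions(102200)
```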
The loyalty card holder transaction records represent 6.4% of the total Leeds population. On average, households in Leeds purchase 3.5 portions of fruit and vegetables per day. Affluent and rural areas purchase more fruit and vegetables than average, with 22% of households purchasing more than 5 portions/day; conversely, poor urban areas purchase less, with 18% purchasing less than 1 portion/day. Purchases are highest in the winter months and lowest in the summer holidays. Loyalty cards registered to females were associated with 0.4 more portions per day than those registered to males. Those aged over 65 years purchased 1.5 portions per day more than 17–24-year-olds. A clear deprivation gradient is observed, with the most deprived purchasing 1.5 fewer portions per day than the least deprived.
Loyalty card transaction data offer an exciting opportunity for measuring variation in fruit and vegetable purchases. Variation is observed by age, gender, deprivation, geographically across a city and throughout the seasons. These insights can inform both policymakers and retailers regarding areas for fruit and vegetable promotion.
Supermarket transaction data, generated from loyalty cards, offer a novel source of food purchase information. Data are available for large sample sizes over sustained periods of time, allowing habitual purchasing patterns to be generated. In the UK, the recommended dietary pattern for a healthy diet is pictorially illustrated by the Eatwell Guide. Its food groups comprise fruit and vegetables; starchy products including potatoes, bread, pasta and rice; dairy or dairy alternatives; proteins such as beans, pulses, fish, eggs and meat; and oils and spreads, alongside advice to limit foods high in salt, fat and sugar. By mapping purchased foods to the categories of the Eatwell Guide it is possible to review population performance against these national recommendations.
Materials and methods
All loyalty card transaction records for purchases made in a UK supermarket chain by residents of Yorkshire and the Humber during 2016 were included in this research. Customers who purchased foods from 7 of the 11 Living Costs and Food Survey (LCFS) categories on ten or more occasions throughout the year were included in the sample, as these customers were considered to be purchasing the majority of their foods from the supermarket. All foods purchased were mapped to the Eatwell Guide food groups via the LCFS categories.
Households spent 25% of their total spend on fruit and vegetables, compared with the recommended 39%; 13% on starchy products (recommended: 37%); 23% on protein-rich foods (recommended: 12%); 12% on dairy and alternatives (recommended: 8%); 2% on oils and spreads (recommended: 1%); and 25% on foods that should be limited (recommended: 3%, though not pictorially illustrated on the plate).
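The observed-versus-recommended comparison above amounts to a percentage-point gap per Eatwell group; a small sketch using the study's reported figures (the dictionary keys are shorthand labels of my own):

```python
# Share of household spend (%) observed in the study vs. the
# Eatwell-derived recommendation; 'limited' foods have no plate segment.
OBSERVED = {"fruit_veg": 25, "starchy": 13, "protein": 23,
            "dairy": 12, "oils_spreads": 2, "limited": 25}
RECOMMENDED = {"fruit_veg": 39, "starchy": 37, "protein": 12,
               "dairy": 8, "oils_spreads": 1, "limited": 3}

def spend_gap(observed, recommended):
    """Percentage-point gap (observed minus recommended) per food group.
    Negative = under-represented, positive = over-represented."""
    return {group: observed[group] - recommended[group] for group in observed}
```

For example, fruit and vegetables come out 14 percentage points under the recommendation, while foods that should be limited come out 22 points over.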
Supermarket transaction data are a novel source of food purchase information that can be used to illustrate dietary behaviours in the UK population. However, the data represent foods purchased, not consumed, and are at a household rather than individual level; food purchases outside the home are not included. That said, they arguably offer an objective measure for dietary assessment. From this study it is clear that food purchases do not match the recommendations: purchases of high-sugar, high-fat and high-salt snacks constitute a significant proportion of spending when they should in fact be limited; protein-rich products are also over-represented; and fruit and vegetables and starchy products are under-represented. This insight can benefit both retailers and policymakers in understanding the food purchase behaviours of our society.
Substantial clinical heterogeneity of major depressive disorder (MDD) suggests it may group together individuals with diverse aetiologies. Identifying distinct subtypes should lead to more effective diagnosis and treatment, while providing more useful targets for further research. Genetic and clinical overlap between MDD and schizophrenia (SCZ) suggests an MDD subtype may share underlying mechanisms with SCZ.
The present study investigated whether a neurobiologically distinct subtype of MDD could be identified by SCZ polygenic risk score (PRS). We explored interactive effects between SCZ PRS and MDD case/control status on a range of cortical, subcortical and white matter metrics among 2370 male and 2574 female UK Biobank participants.
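The interaction analysis at the heart of this design can be sketched as an ordinary least-squares fit with a PRS × case-status product term. This is a toy version: the study's actual models included covariates and multiple-comparison correction not shown here, and the data below are synthetic.

```python
import numpy as np

def fit_interaction(prs, mdd, thickness):
    """OLS fit of thickness ~ PRS + MDD + PRS:MDD.
    prs: polygenic risk scores; mdd: 0/1 case status; thickness: outcome.
    Returns [intercept, PRS slope, MDD offset, interaction]."""
    X = np.column_stack([np.ones_like(prs), prs, mdd, prs * mdd])
    beta, *_ = np.linalg.lstsq(X, thickness, rcond=None)
    return beta
```

In this parameterisation the PRS slope among controls is the second coefficient, and among cases it is the second plus the fourth, which is exactly the contrast the abstract reports (a negative association in controls versus a positive one in cases).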
There was a significant SCZ PRS by MDD interaction for rostral anterior cingulate cortex (RACC) thickness (β = 0.191, q = 0.043). This was driven by a positive association between SCZ PRS and RACC thickness among MDD cases (β = 0.098, p = 0.026), compared to a negative association among controls (β = −0.087, p = 0.002). MDD cases with low SCZ PRS showed thinner RACC, although the opposite difference for high-SCZ-PRS cases was not significant. There were nominal interactions for other brain metrics, but none remained significant after correcting for multiple comparisons.
Our significant results indicate that MDD case-control differences in RACC thickness vary as a function of SCZ PRS. Although this was not the case for most other brain measures assessed, our specific findings still provide some further evidence that MDD in the presence of high genetic risk for SCZ is subtly neurobiologically distinct from MDD in general.
We present two studies on neural network architectures that learn to represent sentences by composing their words according to automatically induced binary trees, without ever being shown a correct parse tree. We use Tree Long Short-Term Memory networks (Tree-LSTMs) as our composition function, applied along a tree structure found by a differentiable natural language chart parser. The models simultaneously optimise both the composition function and the parser, thus eliminating the need for externally provided parse trees, which are normally required for Tree-LSTMs. They can therefore be seen as tree-based recurrent neural networks that are unsupervised with respect to the parse trees. Because they are fully differentiable, the models are easily trained with an off-the-shelf gradient descent method and backpropagation.
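The differentiable chart idea can be illustrated numerically: each span's representation is a softmax-weighted mixture over all of its binary split points, so no discrete tree is ever selected and gradients flow through every candidate parse. The sketch below uses a single tanh layer as the composition function rather than the paper's Tree-LSTM cell, with random placeholder weights.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_cky_encode(word_vecs, score_w, compose_w):
    """Toy differentiable chart: the vector for span (i, j) is a
    softmax-weighted mixture over its binary split points, so the
    'tree choice' stays soft. word_vecs: (n, d); score_w: (d,)
    split-scoring weights; compose_w: (2d, d) composition weights."""
    n, d = word_vecs.shape
    chart = {(i, i + 1): word_vecs[i] for i in range(n)}  # leaves
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            candidates, scores = [], []
            for k in range(i + 1, j):          # every binary split of (i, j)
                left, right = chart[(i, k)], chart[(k, j)]
                h = np.tanh(np.concatenate([left, right]) @ compose_w)
                candidates.append(h)
                scores.append(h @ score_w)     # energy for this split
            weights = softmax(np.array(scores))
            chart[(i, j)] = (weights[:, None] * np.stack(candidates)).sum(0)
    return chart[(0, n)]                       # sentence representation
```

Because the whole chart is built from differentiable operations, a real model can backpropagate a downstream task loss into both the composition weights and the split-scoring weights, which is the joint optimisation the abstract describes.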
In the first part of this paper, we introduce a model based on the CKY chart parser, and evaluate its downstream performance on a natural language inference task and a reverse dictionary task. Further, we show how its performance can be improved with an attention mechanism which fully exploits the parse chart, by attending over all possible subspans of the sentence. We find that our approach is competitive against similar models of comparable size and outperforms Tree-LSTMs that use trees produced by a parser.
Finally, we present an alternative architecture based on a shift-reduce parser. We perform an analysis of the trees induced by both our models, to investigate whether they are consistent with each other and across re-runs, and whether they resemble the trees produced by a standard parser.