This national pre-pandemic survey compared demand and capacity of adult community eating disorder services (ACEDS) with NHS England (NHSE) commissioning guidance.
Thirteen services in England and Scotland responded (covering 10.7 million population). Between 2016–2017 and 2019–2020, mean referral rates increased by 18.8%, from 378 to 449/million population. Only 3.7% of referrals came from child and adolescent eating disorder services (CEDS-CYP), yet 46% of patients were aged 18–25 and 54% were aged >25. Most ACEDS had waiting lists and rationed access. Many could not provide full medical monitoring, adapt treatment for comorbidities, offer assertive outreach or provide seamless transitions. Relative to patient volume, the ACEDS workforce budget was 15% of that recommended by the NHSE workforce calculator for CEDS-CYP. Achieving parity would require investment of £7 million/million population in ACEDS.
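As a quick sanity check (not part of the study), the reported 18.8% rise follows directly from the two referral rates given above:

```python
def percent_increase(before, after):
    """Relative change between two rates, expressed as a percentage."""
    return (after - before) / before * 100

# Mean referral rates per million population, 2016-2017 vs 2019-2020.
rise = percent_increase(378, 449)  # ~18.8, matching the reported 18.8%
```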
This study highlights the severe pressure in ACEDS, which has increased since the COVID-19 pandemic. Substantial investment is required to ensure NHS ACEDS meet national guidance, offer evidence-based treatment, reduce risk and preventable deaths, and achieve parity with CEDS-CYP.
Over the last 20 years disasters have increasingly involved children, and pediatric disaster medicine research is growing. However, this research is largely reactive, has not been categorized in terms of the disaster cycle, and is of variable quality. To understand the gaps in the current literature and highlight areas for future research, we conducted a scoping review of the pediatric disaster medicine literature. This work will help create recommendations for future pediatric disaster medicine research.
Using a published framework for scoping reviews, we worked with a medical librarian and a multi-institutional team to define the research question, develop eligibility criteria, and identify a search strategy. We conducted a comprehensive Medline search from 2001 to 2022, which was distributed to nine reviewers. Each article was independently screened for inclusion by two reviewers. Discrepancies were resolved by a third reviewer.
Inclusion criteria were: articles published in English; articles related to any stage of the disaster cycle or to disaster education; articles focused on or including pediatric populations; and articles published in academic, peer-reviewed journals, including policies from professional societies.
A total of 967 pediatric disaster medicine articles were imported for screening, and 35 duplicates were removed. The remaining 932 articles were screened for relevance, and 109 were excluded. In 2000, three articles met inclusion criteria, compared with 66 in 2021. We noticed reactive spikes in the number of articles after major disasters. Most articles focused on preparedness and response, with only a few on recovery, mitigation, and prevention. Most studies used qualitative or retrospective methodology, most were single-site studies, and fewer than 10 meta-analyses were published over the 20 years.
This scoping review describes the trends in and quality of existing pediatric disaster medicine literature. By identifying the gaps in this body of literature, we can better prioritize future research.
Until recently, the influence of basal liquid water on the evolution of buried glaciers in Mars' mid latitudes was assumed to be negligible, because the latter stages of Mars' Amazonian period (3 Ga to present) have long been thought to have been as cold and dry as today. Recent identifications of several landforms interpreted as eskers associated with these young (hundreds of Ma) glaciers call this assumption into doubt. They indicate basal melting (at least locally and transiently) of their parent glaciers. Although rare, they demonstrate a more complex mid-to-late Amazonian environment than was previously understood. Here, we discuss several open questions posed by the existence of glacier-linked eskers on Mars, including their global-scale abundance and distribution, the drivers and dynamics of melting and drainage, and the fate of meltwater upon reaching the ice margin. Such questions provide rich opportunities for collaboration between the Mars and Earth cryosphere research communities.
We use a mathematical model to investigate the effect of basal topography and ice surface slope on transport and deposition of sediment within a water-filled subglacial channel. In our model, three zones of different behaviour occur. In the zone furthest upstream, variations in basal topography lead to sediment deposition under a wide range of conditions. In this first zone, even very small and gradually varying basal undulations (~5 m amplitude) can lead to the deposition of sediment within a modelled channel. Deposition is concentrated on the downstream gradient of subglacial ridges, and on the upstream gradient of subglacial troughs. The thickness and steepness of the ice sheet has a substantial impact on deposition rates, with shallow ice profiles strongly promoting both the magnitude and extent of sediment deposition. In a second zone, all sediment is transported downstream. Finally, a third zone close to the ice margin is characterised by high rates of sediment deposition. The existence of these zones has implications for esker formation and the dynamics of the subglacial environment.
Experience of crisis care may vary across different care models.
To explore the experience of care in standard care and ‘open dialogue’ (a peer-supported community service focused on open dialogue and involving social networks for adults with a recent mental health crisis) 3 months after a crisis.
We conducted semi-structured interviews with 11 participants (6 received open dialogue; 5 received treatment as usual (TAU)) in a feasibility study of open dialogue and analysed the data using a three-step inductive thematic analysis to identify themes that (a) were frequently endorsed and (b) represented the experiences of all participants.
Four themes emerged: (a) feeling able to rely on and access mental health services; (b) supportive and understanding family and friends; (c) having a choice and a voice; and (d) confusion and making sense of experiences. Generally, there was a divergence in experience across the two care models. Open dialogue participants often felt able to rely on and access services and involve their family and friends in their care. TAU participants described a need to rely on services and difficulty when it was not met, needing family and friends for support and wanting them to be more involved in their care. Some participants across both care models experienced confusion after a crisis and described benefits of sense-making.
Understanding crisis care experiences across different care models can inform service development in crisis and continuing mental healthcare services.
According to Aristotle, the goal of anyone who is not simply stupid or slavish is to live a worthwhile life.1 There are, no doubt, people who have no goal at all beyond the moment’s pleasure or release from pain. There may be people incapable of reaching any reasoned decision about what to do, and acting on it.2 But anyone who asks how she should live implicitly agrees that her goal is to live well, to live a life that she can think worth living. That goal, eudaimonia, is something that is sought for its own sake, and for nothing else. Anyone who asks herself how she should live can answer that she should live well. The answer, admittedly, needs further comment. Aristotle went on to suggest that ‘living well’ amounted to living in accordance with virtue, or if there is more than one virtue, in accordance with the best and most complete. Eudaimonia, happiness, is virtuous activity over a whole life. To live a worthwhile life we must acquire and practice habits of doing the right thing, for the right reason. Equivalently, we must do what a virtuous person would, and in the way she would, for the sake of to kalon, or beauty.
Adverse drug reactions (ADRs) are associated with increased morbidity, mortality, and resource utilization. Drug-drug interactions (DDIs) are among the most common causes of ADRs, and estimates suggest that up to 22% of patients take interacting medications. DDIs are often due to the propensity for agents to induce or inhibit enzymes responsible for the metabolism of concomitantly administered drugs. However, this phenomenon is further complicated by genetic variants of such enzymes. The aim of this study was to quantify and describe potential drug-drug, drug-gene, and drug-drug-gene interactions in a community-based patient population.
A regional pharmacy with retail outlets in Arkansas provided deidentified prescription data from March 2020 for 4761 individuals. Drug-drug and drug-drug-gene interactions were assessed utilizing the logic incorporated into GenMedPro, a commercially available digital gene-drug interaction software program that incorporates variants of 9 pharmacokinetic (PK) and 2 pharmacodynamic (PD) genes to evaluate DDIs and drug-gene interactions. The data were first assessed for composite drug-drug interaction risk, and each individual was stratified to a risk category using the logic incorporated in GenMedPro. To calculate the frequency of potential drug-gene interactions, genotypes were imputed and allocated to the cohort according to each gene’s frequency in the general population. Potential genotypes were randomly allocated to the population 100 times in a Monte Carlo simulation. Potential drug-drug, gene-drug, or gene-drug-drug interaction risk was characterized as minor, moderate, or major.
Based on prescription data only, the probability of a DDI of any impact (mild, moderate, or major) was 26% (95% CI, 24.8%–27.2%) in the population. This probability increased to 49.6% (95% CI, 48.4%–50.7%) when simulated genetic polymorphisms were additionally assessed. When assessing only major-impact interactions, there was a 7.8% (95% CI, 7.0%–8.5%) probability of drug-drug interactions and a 10.1% (95% CI, 9.5%–10.8%) probability with the addition of genetic contributions. The probability of drug-drug-gene interactions of any impact was correlated with the number of prescribed medications, with approximate probabilities of 77%, 85%, and 94% in patients prescribed 5, 6, or 7+ medications, respectively. When stratified by drug class, antidepressants (19.5%), antiemetics (21.4%), analgesics (16%), antipsychotics (15.6%), and antiparasitics (49.7%) had the highest probabilities of a major drug-drug-gene interaction.
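The genotype-allocation step described in the methods can be sketched as a simple Monte Carlo loop. The sketch below is illustrative only: GenMedPro's interaction logic is proprietary, and the gene names and variant frequencies here are hypothetical placeholders, not values from the study.

```python
import random

# Hypothetical poor-metabolizer variant frequencies for two illustrative
# genes; the study allocated genotypes according to each gene's published
# frequency in the general population.
VARIANT_FREQ = {"CYP2D6": 0.07, "CYP2C19": 0.15}

def simulate_interaction_probability(patients, n_sims=100, seed=0):
    """Randomly allocate genotypes to the cohort n_sims times and estimate
    the fraction of patients with a potential drug-gene interaction, i.e. a
    prescribed drug metabolized by a gene carrying a variant."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        for genes in patients:  # each patient = genes metabolizing their drugs
            hits += any(rng.random() < VARIANT_FREQ[g] for g in genes)
    return hits / (n_sims * len(patients))

# Toy cohort of four patients, each represented by the metabolizing genes
# relevant to their prescriptions.
cohort = [{"CYP2D6"}, {"CYP2C19"}, {"CYP2D6", "CYP2C19"}, set()]
p = simulate_interaction_probability(cohort, n_sims=1000)
```

Repeating the random allocation (the study used 100 iterations) stabilizes the estimated probability against the chance assignment of genotypes in any single draw.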
In a community-based population of outpatients, the probability of drug-drug interaction risk increases when genetic polymorphisms are attributed to the population. These data suggest that pharmacogenetic testing may be useful in predicting drug interactions, drug-gene interactions, and severity of interactions when proactively evaluating patient medication profiles.
Patients presenting to hospital with suspected coronavirus disease 2019 (COVID-19), based on clinical symptoms, are routinely placed in a cohort together until polymerase chain reaction (PCR) test results are available. This procedure leads to delays in transfers to definitive areas and high nosocomial transmission rates. FebriDx is a finger-prick point-of-care test (PoCT) that detects an antiviral host response and has a high negative predictive value for COVID-19. We sought to determine the clinical impact of using FebriDx for COVID-19 triage in the emergency department (ED).
We undertook a retrospective observational study evaluating the real-world clinical impact of FebriDx as part of an ED COVID-19 triage algorithm.
Emergency department of a university teaching hospital.
Patients presenting with symptoms suggestive of COVID-19, placed in a cohort in a ‘high-risk’ area, were tested using FebriDx. Patients without a detectable antiviral host response were then moved to a lower-risk area.
Between September 22, 2020, and January 7, 2021, 1,321 patients were tested using FebriDx, and 1,104 (84%) did not have a detectable antiviral host response. Among 1,104 patients, 865 (78%) were moved to a lower-risk area within the ED. The median times spent in a high-risk area were 52 minutes (interquartile range [IQR], 34–92) for FebriDx-negative patients and 203 minutes (IQR, 142–255) for FebriDx-positive patients (difference of −134 minutes; 95% CI, −144 to −122; P < .0001). The negative predictive value of FebriDx for the identification of COVID-19 was 96% (661 of 690; 95% CI, 94%–97%).
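The 96% negative predictive value reported above can be reproduced from the counts given (661 of 690 FebriDx-negative patients were truly COVID-19 negative); a minimal check:

```python
def negative_predictive_value(true_negatives, false_negatives):
    """NPV = TN / (TN + FN): the share of negative test results that are
    truly negative."""
    return true_negatives / (true_negatives + false_negatives)

# From the abstract: 661 of 690 FebriDx-negative patients were PCR-negative.
npv = negative_predictive_value(661, 690 - 661)  # ~0.958, reported as 96%
```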
FebriDx improved the triage of patients with suspected COVID-19 and reduced the time that severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) PCR-negative patients spent in a high-risk area alongside SARS-CoV-2–positive patients.
Infectious disease outbreaks on the scale of the current coronavirus disease 2019 (COVID-19) pandemic are a new phenomenon in many parts of the world. Many isolation unit designs with corresponding workflow dynamics and personal protective equipment postures have been proposed for each emerging disease at the health facility level, depending on the mode of transmission. However, personnel and resource management at the isolation units for a resilient response will vary by human resource capacity, reporting requirements, and practice setting. This study describes an approach to isolation unit management at a rural Uganda Hospital and shares lessons from the Uganda experience for isolation unit managers in low- and middle-income settings.
Phillips and colleagues claim that the capacity to ascribe knowledge is a “basic” capacity, but most studies reporting linguistic data reviewed by Phillips et al. were conducted in English with American participants – one of more than 6,500 languages currently spoken. We highlight the importance of cross-cultural and cross-linguistic research when one is theorizing about fundamental human representational capacities.
This story from the Korean War goes to the heart of the unique bond between Australian and New Zealand soldiers, one cemented in mutual respect, expressed by a fierce rivalry and a steadfastness to stand shoulder-to-shoulder against any foe, perceived or real. The old coat of arms for New Zealand carried the motto ‘Onward’ (also the motto of the 1 New Zealand Expeditionary Force during the First World War and of the 1 Royal New Zealand Infantry Regiment today). It is a motto of modest intent somewhat in keeping with the retiring, nocturnal and flightless kiwi emblazoned on the sleeves of members of the New Zealand Army.
This study aimed to investigate general factors associated with prognosis regardless of the type of treatment received, for adults with depression in primary care.
We searched Medline, Embase, PsycINFO and Cochrane Central (inception to 12/01/2020) for randomised controlled trials (RCTs) of depression treatment in primary care that included the most commonly used comprehensive measure of depressive and anxiety disorder symptoms and diagnoses (the Revised Clinical Interview Schedule: CIS-R). Two-stage random-effects meta-analyses were conducted.
Twelve (n = 6024) of thirteen eligible studies (n = 6175) provided individual patient data. There was a 31% (95%CI: 25 to 37) difference in depressive symptoms at 3–4 months per standard deviation increase in baseline depressive symptoms. Four additional factors: the duration of anxiety; duration of depression; comorbid panic disorder; and a history of antidepressant treatment were also independently associated with poorer prognosis. There was evidence that the difference in prognosis when these factors were combined could be of clinical importance. Adding these variables improved the amount of variance explained in 3–4 month depressive symptoms from 16% using depressive symptom severity alone to 27%. Risk of bias (assessed with QUIPS) was low in all studies and quality (assessed with GRADE) was high. Sensitivity analyses did not alter our conclusions.
When adults seek treatment for depression clinicians should routinely assess for the duration of anxiety, duration of depression, comorbid panic disorder, and a history of antidepressant treatment alongside depressive symptom severity. This could provide clinicians and patients with useful and desired information to elucidate prognosis and aid the clinical management of depression.
Social anxiety disorder (SAD) is common. It usually starts in adolescence, and without treatment can disrupt key developmental milestones. Existing generic treatments are less effective for young people with SAD than with other anxiety disorders, but an adaptation of an effective adult therapy (CT-SAD-A) has shown promising results for adolescents.
The aim of this study was to conduct a qualitative exploration to contribute towards the evaluation of CT-SAD-A for adoption into Child and Adolescent Mental Health Services (CAMHS).
We used interpretative phenomenological analysis (IPA) to analyse the transcripts of interviews with a sample of six young people, six parents and seven clinicians who were learning the treatment.
Three cross-cutting themes were identified: (i) endorsing the treatment; (ii) finding therapy to be collaborative and active, challenging but helpful; and (iii) navigating change in a complex setting. Young people and parents found the treatment to be useful and acceptable, although simultaneously challenging. This was echoed by the clinicians, with particular reference to integrating CT-SAD-A within community CAMHS settings.
The acceptability of the treatment with young people, their parents and clinicians suggests further work is warranted in order to support its development and implementation within CAMHS settings.
The southern pine beetle, Dendroctonus frontalis Zimmermann (Coleoptera: Curculionidae: Scolytinae), is among the most destructive bark beetle pests of pines (Pinaceae) of the southeast and mid-Atlantic United States of America, Mexico, and Central America. Numerous volatile compounds can stimulate or reduce attraction of the beetle, but efforts to incorporate these into effective, practical technologies for pest management have yielded mixed results. Attractants have been incorporated into lures used in monitoring traps that are employed operationally to forecast outbreaks and detect emerging populations. The attraction inhibitor, verbenone, shows efficacy for suppressing southern pine beetle infestations but has not yet been adopted operationally. No effective semiochemical tree protectant has been developed for the beetle. We discuss complexities in the chemical ecology of the beetle that likely have impeded research and development of semiochemical management tools, and we describe basic science gaps that may hinder further progress if not addressed. We also report some supporting, original experimental data indicating (1) that a verbenone device can inhibit the beetle's response to sources of attractant within a radius of at least several metres, (2) that the beetle shows similar olfactory responses to both enantiomers of verbenone, and (3) that pheromone background can cause conflicting results in semiochemical field tests.
Horace Walpole is pivotal to the early Gothic Revival as the author of what has long been hailed as the first Gothic novel, and as the creator of the most influential of all early Gothic Revival houses. This essay explores his intuitively imaginative response to Gothic, and how his love of the decorative profusion and allusive richness that it could offer was played out in his novel The Castle of Otranto (1765) and his play The Mysterious Mother (1768) – as well as in his ‘castle’ at Strawberry Hill. That house, with its subtle management of scale, colour and light, and in the suggestive riches of the collection it contained, created a heady mixture of fantasy and atmosphere, displaying an historically informed but archaeologically unrestrained imagination. These are qualities that it shared with Walpole’s Gothic fictions. There is hardly a feature of Gothic romance that does not appear in Otranto, and its gloomy castle, predatory patriarch and pursued virgin, along with the guilt-tormented Countess and evil friars of The Mysterious Mother, like the Gothic battlements and evocative interiors of Strawberry Hill, engendered a lasting and pervasive progeny.
Traditional dietary assessment methods in research can be challenging, with participant burden to complete an interview, diary, 24 h recall or questionnaire and researcher burden to code the food record to obtain a nutrient breakdown. Self-reported assessment methods are subject to recall and social desirability biases, in addition to selection bias from the nature of volunteering to take part in a research study. Supermarket loyalty card transaction records, linked to back of pack nutrient information, present a novel opportunity to use objective records of food purchases to assess diet at a household level. With a large sample size and multiple transactions, it is possible to review variation in food purchases over time and across different geographical areas.
Materials and methods:
This study uses supermarket loyalty card transactions for one retailer's customers in Leeds, for 12 months during 2016. Fruit and vegetable purchases were calculated for customers who appear to shop regularly for a ‘complete’ shop, buying from at least 7 of 11 Living Costs and Food Survey categories. Using the total weight of fruits and vegetables purchased over one year, average portions (80 g) per day, per household were generated. Descriptive statistics of fruit and vegetable purchases by age, gender and Index of Multiple Deprivation of the loyalty card holder were generated. Using Geographical Information Systems, maps of neighbourhood purchases per month of the year were created to visualise variations.
The loyalty card holder transaction records represent 6.4% of the total Leeds population. On average, households in Leeds purchase 3.5 portions of fruit and vegetables per day, per household. Affluent and rural areas purchase more fruit and vegetables than average, with 22% purchasing more than 5 portions/day. Conversely, deprived urban areas purchase less, with 18% purchasing fewer than 1 portion/day. Purchases are highest in the winter months and lowest in the summer holidays. Loyalty cards registered to females purchased 0.4 portions per day more than those registered to males. Those aged over 65 years purchased 1.5 portions per day more than 17–24 year olds. A clear deprivation gradient is observed, with the most deprived purchasing 1.5 portions fewer per day than the least deprived.
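The portions-per-day measure described in the methods reduces to a simple conversion from annual purchase weight. A minimal sketch, using a made-up household weight chosen to land on the 3.5-portion Leeds mean reported above:

```python
PORTION_G = 80  # standard fruit-and-vegetable portion size used in the study

def portions_per_day(total_weight_g, days=365):
    """Convert a household's total fruit-and-vegetable purchase weight over
    a period into average 80 g portions per day."""
    return total_weight_g / PORTION_G / days

# Illustrative: a household buying 102.2 kg over a year averages 3.5
# portions/day, the Leeds household mean reported in the results.
p = portions_per_day(102_200)
```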
Loyalty card transaction data offer an exciting opportunity for measuring variation in fruit and vegetable purchases. Variation is observed by age, gender, deprivation, geographically across a city and throughout the seasons. These insights can inform both policymakers and retailers regarding areas for fruit and vegetable promotion.