Emergency neurosurgery encompasses serious and high-risk cranial and spinal conditions across all ages. The authors provide an overview of the changes occurring within emergency surgery to meet the challenges posed by unscheduled care. Considering the wider landscape of emergency surgery provides a context for the changes occurring within emergency neurosurgery. The delivery of emergency neurosurgery within the UK, the Republic of Ireland, the Netherlands, and the United States of America (USA) is then described to provide an overview of different models of care.
The Tiwanaku civilization (around AD 500–1100) originated in the Bolivian altiplano of the south-central Andes and established agrarian colonies (AD 600–1100) in the Peruvian coastal valleys. Current dietary investigations at Tiwanaku colonial sites focus on maize, a coastal valley cultivar with ritual and political significance. Here, we examine Tiwanaku provincial foodways and ask to what degree the Tiwanaku settlers maintained their culinary and agrarian traditions as they migrated into the lower-altitude coastal valleys to farm the land. We analyze archaeobotanical remains from the Tiwanaku site of Cerro San Antonio (600 m asl) in the Locumba Valley and compare them to data from the Tiwanaku site in the altiplano and the Rio Muerto site in the Moquegua Valley during the period of state expansion. Our findings show high proportions of wild, weedy, and domesticated Amaranthaceae cultivars, suggesting that Tiwanaku colonists grew traditional high-valley (2,000–3,000 m asl) and altiplano (3,000–4,000 m asl) foods on the lowland frontier because of their established cultural dietary preferences and Amaranthaceae's ability to adapt to various agroclimatic and edaphic conditions.
Déjà vu and involuntary autobiographical memories (IAMs) are differentiated by a number of factors including metacognition. In contrast to IAMs, déjà vu activates regions associated with self-awareness including the right dorsolateral prefrontal cortex.
People who inject drugs are at risk of acute bacterial and fungal injecting-related infections. There is evidence that the incidence of hospitalizations for injecting-related infections is increasing in several countries, but little is known at an individual level. We aimed to examine injecting-related infections in a linked longitudinal cohort of people who inject drugs in Melbourne, Australia. A retrospective descriptive analysis was conducted to estimate the prevalence and incidence of injecting-related infections using administrative emergency department and hospital separation datasets linked to the SuperMIX cohort, from 2008 to 2018. Over the study period, 33% (95% CI: 31–36%) of participants presented to an emergency department with an injecting-related infection and 27% (95% CI: 25–30%) were admitted to hospital. Of 1,044 emergency department presentations and 740 hospital separations, skin and soft tissue infections were most common (88% and 76%, respectively). From 2008 to 2018, there was a substantial increase in emergency department presentations and hospital separations with any injecting-related infection, from 48 to 135 per 1,000 person-years and from 18 to 102 per 1,000 person-years, respectively. The results emphasize that injecting-related infections are increasing and that new models of care are needed to prevent infections and to detect superficial infections early, before they progress to potentially life-threatening severe infections.
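The presentation and admission rates above are crude person-time incidence rates; a minimal sketch of the calculation (the helper function and the person-year figures plugged in are illustrative, not taken from the linked datasets):

```python
def incidence_per_1000_py(events: int, person_years: float) -> float:
    """Crude incidence rate per 1,000 person-years of observation."""
    if person_years <= 0:
        raise ValueError("person_years must be positive")
    return 1000 * events / person_years

# e.g. 135 presentations observed over 1,000 person-years of follow-up
rate = incidence_per_1000_py(135, 1000)
```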
Background: Antiviral chemoprophylaxis for influenza is recommended in nursing homes to prevent transmission and severe disease among residents with higher risk of severe influenza complications. Interim CDC guidance recommends that long-term care facilities initiate antiviral chemoprophylaxis with oral oseltamivir for all non-ill residents living in the same unit following the start of an outbreak in a facility (ie, ≥2 residents ill within 72 hours, at least 1 of whom has laboratory-confirmed influenza). Prophylaxis continues for a minimum of 2 weeks and for at least 7 days after the last laboratory-confirmed case. However, facilities may not strictly adhere to this guidance, with 1 study showing up to 68% of facilities were nonadherent to national guidance (Silva et al 2020). Here, we model the potential impacts of different antiviral prophylaxis strategies. Methods: We developed a susceptible–exposed–asymptomatic–infected–recovered (SEAIR) compartmental model of an average-sized nursing home comprising short-stay residents, long-stay residents, and healthcare personnel (HCP). Persons treated with antiviral chemoprophylaxis were less susceptible to infection, had a lower probability of symptoms if infected, a reduced viral load, and a shortened duration of infectiousness. We included influenza vaccination for residents and HCP through reduced probability of symptomatic infection. Coverage rates were estimated from CDC FluVaxView and CMS COVID-19 nursing home data. As a base case, we modeled a scenario with prophylaxis implemented according to guidance. We varied uptake by residents and HCP (from 10% to 90%), case thresholds for prophylaxis initiation (1–5 cases identified), and timing of prophylaxis cessation: either time-dependent (ie, 10–14 days of prophylaxis) or case-dependent (ie, continuing prophylaxis for 1–7 days with no cases).
Results: In the scenario based on current guidance, prophylaxis reduced resident cases by 16% and resident hospitalizations by 45%, compared to no prophylaxis (Fig. 1A). Scenarios that differed from the guidance altered case burden and timing: Time-dependent prophylaxis cessation increased resident cases and hospitalizations (Fig. 1A). Timing of prophylaxis initiation had slight effects on the timing of the epidemic and minimal effects on resident cases and hospitalizations (Fig. 1B). High resident uptake was important for reducing resident cases and hospitalizations (Fig. 1C), but increasing HCP uptake had minimal effect (Fig. 1D). Conclusions: Our findings support the current prophylaxis guidance. Promptly implementing prophylaxis reduces resident cases and hospitalizations. Continuing prophylaxis until cases are no longer identified reduces cases and hospitalizations.
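A deterministic SEAIR model of the kind described can be sketched in a few lines. The compartment structure follows the abstract, but every parameter value below is an illustrative assumption, not a value fitted in the study, and prophylaxis is crudely reduced to a single susceptibility multiplier:

```python
def simulate_seair(beta=0.6, sigma=0.5, gamma=0.25, rel_inf=0.5,
                   p_asym=0.33, days=120, dt=0.1, suscept_mult=1.0):
    """Deterministic SEAIR epidemic in a closed population (fractions).

    suscept_mult < 1 represents antiviral chemoprophylaxis reducing
    susceptibility to infection; all rates are per day and purely
    illustrative. Returns the cumulative fraction ever infected.
    """
    s, e, a, i = 0.99, 0.01, 0.0, 0.0  # seed 1% exposed
    cum = e
    for _ in range(int(days / dt)):
        foi = suscept_mult * beta * (i + rel_inf * a)  # force of infection
        new_e = foi * s * dt          # S -> E
        e_out = sigma * e * dt        # E -> A or I
        s -= new_e
        e += new_e - e_out
        a += p_asym * e_out - gamma * a * dt        # A -> R
        i += (1 - p_asym) * e_out - gamma * i * dt  # I -> R
        cum += new_e
    return cum

baseline = simulate_seair()                      # no prophylaxis
prophylaxis = simulate_seair(suscept_mult=0.4)   # assumed 60% reduction
```

With these assumed values the basic reproduction number is roughly beta × ((1 − p_asym) + rel_inf × p_asym) / gamma ≈ 2, so the prophylaxis run is subcritical and accumulates far fewer infections, qualitatively matching the modeled benefit of prompt, high-uptake prophylaxis.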
Background: Antibiotics alone are often insufficient to treat recurrent C. difficile infection (rCDI) because they have no activity against C. difficile spores that germinate within a disrupted microbiome. SER-109, an investigational, oral, microbiome therapeutic composed of purified Firmicutes spores, was designed to reduce rCDI through microbiome repair. We report an integrated efficacy analysis through week 24 for SER-109 from phase 3 studies, ECOSPOR III and ECOSPOR IV. Methods: ECOSPOR III was a randomized, placebo-controlled phase 3 trial conducted at 56 US or Canadian sites that included 182 participants with ≥2 CDI recurrences, confirmed via toxin EIA testing. Participants were stratified by age (<65 years or ≥65 years) and antibiotic regimen (vancomycin, fidaxomicin) and were randomized 1:1 to placebo or SER-109 groups. ECOSPOR IV was an open-label, single-arm study conducted at 72 US or Canadian sites including 263 participants with rCDI enrolled in 2 cohorts: (1) rollover participants from ECOSPOR III who experienced on-study recurrence diagnosed by toxin EIA (n = 29) and (2) participants with ≥1 CDI recurrence (diagnosed by PCR or toxin EIA), inclusive of the current episode (n = 234). In both studies, the investigational product was administered orally as 4 capsules over 3 consecutive days following symptom resolution after standard-of-care antibiotics. The primary efficacy end point was rCDI (recurrent toxin-positive diarrhea requiring treatment) through week 8. Other end points included CDI recurrence rates and safety through 24 weeks. Results: In total, 349 participants received at least 1 dose of SER-109 in ECOSPOR III or ECOSPOR IV (mean age, 64.2 years; 68.8% female). Overall, 77 participants (22.1%) enrolled with their first CDI recurrence. Four participants received blinded SER-109 in ECOSPOR III followed by a second dose of open-label SER-109 in ECOSPOR IV.
Overall, the proportion of participants who received any dose of SER-109 and had rCDI at week 8 was 9.5% (33 of 349; 95% CI, 6.6%–13.0%), and the CDI recurrence rate remained low through 24 weeks (15.2%, 53 of 349; 95% CI, 11.6%–19.4%), corresponding to sustained clinical response rates of 90.5% (95% CI, 87.0%–93.4%) and 84.8% (95% CI, 80.6%–88.4%), respectively (Fig. 1). Most rollover participants (25 of 29, 86.2%) were from the placebo arm; 13.8% had rCDI by week 8. Conclusions: In this integrated analysis, rCDI rates were low and the clinical response was durable in participants who received the investigational microbiome therapeutic SER-109, with sustained clinical response rates of 90.5% and 84.8% at weeks 8 and 24, respectively. These data further support the potential benefit of microbiome repair with SER-109 following antibiotics for rCDI to prevent recurrence in high-risk patients.
Financial support: This study was funded by Seres Therapeutics.
Background: Clostridioides difficile infection (CDI) often recurs in patients aged ≥65 years and those with comorbidities. Clinical trials often exclude patients with a history of immunosuppression, malignancy, renal insufficiency, or other comorbidities. In a phase 3 trial (ECOSPOR III), SER-109 was superior to placebo in reducing recurrent CDI (rCDI) risk at week 8 and was well tolerated. We report integrated safety data for SER-109 in a broad patient population through week 24 from phase 3 studies: ECOSPOR III and ECOSPOR IV. Methods: ECOSPOR III was a double-blind, placebo-controlled trial conducted in participants with ≥2 CDI recurrences randomized 1:1 to placebo or SER-109. ECOSPOR IV was an open-label, single-arm study conducted in 263 patients with rCDI enrolled in 2 cohorts: (1) rollover participants from ECOSPOR III with on-study recurrence and (2) participants with ≥1 CDI recurrence, inclusive of the current episode. In both studies, the investigational product was administered as 4 oral capsules over 3 days. Treatment-emergent adverse events (TEAEs) were collected through week 8; serious TEAEs and TEAEs of special interest (ie, bacteremia, abscess, meningitis) were collected through week 24. Results: In total, 349 participants received SER-109 in ECOSPOR III and/or ECOSPOR IV (mean age, 64.2 years; 68.8% female). Chronic conditions included cardiac disease (31.2%), immunocompromise or immunosuppression (21.2%), diabetes (18.9%), and renal impairment or failure (13.2%). Overall, 221 (63.3%) of 349 participants who received SER-109 experienced TEAEs through week 24. Most were mild to moderate and gastrointestinal. The most common (>5% of participants) treatment-related TEAEs were flatulence, abdominal pain and distension, decreased appetite, constipation, nausea, fatigue, and diarrhea. No participants experienced a treatment-related TEAE leading to study withdrawal.
Invasive infections were observed in 28 participants (8%); in those with identified pathogens, the organisms were not SER-109 product species, and all infections were deemed unrelated to treatment by the investigators. There were 11 deaths (3.2%) and 48 participants (13.8%) with serious TEAEs, none of which were deemed treatment related. There were no clinically important differences in the safety profile across subgroups of sex, race, prior antibiotic regimen, or number of CDI recurrences. No safety signals were observed in participants with renal impairment or failure, diabetes, or cardiac disease, or in immunocompromised or immunosuppressed participants. Conclusions: In this integrated analysis of phase 3 trials, SER-109, an investigational microbiome therapeutic, was well tolerated in this vulnerable patient population with prevalent comorbidities. No infections, including those with identified pathogens, were attributed to SER-109 or its product species. This safety profile might be expected because this purified product is composed of spore-forming Firmicutes normally abundant in the healthy microbiome.
Financial support: This study was funded by Seres Therapeutics.
The aim of this study was to identify and prioritize strategies for strengthening public health system resilience for pandemics, disasters, and other emergencies using a scorecard approach.
The United Nations Public Health System Resilience Scorecard (Scorecard) was applied across 5 workshops in Slovenia, Turkey, and the United States of America. The workshops focused on participants reviewing and discussing 23 questions/indicators. A Likert-type scale was used for scoring, with 0 the lowest score and 5 the highest. The workshop scores were analyzed and discussed by participants to prioritize areas of need and develop resilience strategies. Data from all workshops were aggregated, analyzed, and interpreted to develop priorities representative of the participating locations.
Eight themes emerged representing the need for better integration of public health and disaster management systems. These include: assessing community disease burden; embedding long-term recovery groups in emergency systems; exploring mental health care needs; examining ecosystem risks; evaluating reserve funds; identifying what crisis communication strategies worked well; providing non-medical services; and reviewing resilience of existing facilities, alternate care sites, and institutions.
The Scorecard is an effective tool for establishing baseline resilience and prioritizing actions. The strategies identified reflect the areas most in need of investment to improve public health system resilience.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Aerial application of an herbicide mixture of triclopyr, dicamba, picloram, and aminopyralid is used to control dense infestations of exotic conifers, notably lodgepole pine (Pinus contorta Douglas ex Loudon), in New Zealand. The rates of herbicide applied to control these tree weeds have the potential for off-target impacts through persistence in the forest floor, soil, and water. Persistence of three of these herbicides was investigated in cast needles, forest floor (litter, fermented humic layer [LFH]), and soil following their operational aerial application (triclopyr: 18 kg ai ha−1; dicamba: 5 kg ai ha−1; picloram: 2 kg ai ha−1) at three sites across New Zealand (KF, MD, GE) with dense invasions of P. contorta. Water was collected from a local stream at two sites (KF, MD) in the days and months after spraying. Active ingredients detected across all sites in cast needles, LFH, and mineral soil generally reflected their application rates, with total amounts comprising 81% triclopyr, 14% dicamba, and 5% picloram. Most of the active ingredients were detected in the LFH (59%), a heavy lignin-rich layer of dead needles overlaying the soil. All three herbicides persisted in this layer, at all sites, for up to 2 yr (at study termination). Only triclopyr was detected in mineral soil, where it declined to below detection levels (0.2 mg kg−1) within 1 yr. All three herbicides were detected in stream water on the day of spray application at KF, and during a rainfall event 1 mo later. However, amounts did not exceed New Zealand environmental and drinking water standards, an outcome attributed to a 30-m no-spray buffer zone used at this site. At MD, herbicides were detectable in water up to 4 mo after spraying, with amounts exceeding New Zealand drinking water standards on one occasion, 1 mo after spray application. No buffer zones were used at the MD site.
Terrorist incidents lead to a range of mental health outcomes for people affected, sometimes extending years after the event. Secondary stressors can exacerbate them, and social support can provide mitigation and aid recovery. There is a need to better understand distress and mitigating factors among survivors of the Manchester Arena attack in 2017.
We explored three questions. First, what experiences of distress did participants report? Second, how might secondary stressors have influenced participants’ psychosocial recoveries? Third, what part has social support played in the relationships between distress and participants’ recovery trajectories?
We conducted a cross-sectional online survey of a convenience sample of survivors of the Manchester Arena bombing (N = 84) in January 2021 (3 years 8 months post-incident), and a longitudinal study of the same participants’ scores on mental health measures over 3 years from September 2017.
Survivors’ mental well-being scores in early 2021 were significantly lower than general population norms. Longitudinal follow-up provided evidence of enduring distress. Secondary stressors, specifically disruptions to close relationships, were associated with greater post-event distress and slower recovery. We found an indirect relationship between identifying with, and receiving support from, others present at the event and mental well-being >3 years later.
The Arena attack has had an enduring impact on mental health, even in survivors who had a mild response to the event. The quality of close relationships is pivotal to long-term outcome. Constructive support from family, friends, and people with shared experiences is key to the social cure processes that facilitate coping and recovery.
Behavioural and Psychological Symptoms of Dementia (BPSD) include a range of neuropsychiatric disturbances such as agitation, aggression, depression, and psychotic symptoms. These common symptoms can impact patients’ functioning and quality of life. Antipsychotic medication can be prescribed to alleviate some symptoms, but this comes with significant risks including cerebrovascular events and increased mortality. We aimed to review antipsychotic prescribing of the Harrogate Older Adult Community Mental Health Team (CMHT); to measure compliance with NICE guidance and local policy and thus improve the prescribing and monitoring process.
Using electronic patient records, we identified all patients under the care of the CMHT with a diagnosis of dementia currently receiving antipsychotic treatment: a total of 55 patients. A random sample of 24 patients was reviewed; their records were hand-searched for relevant information.
The standards measured were derived from the NICE Guideline (NG97) June 2018: ‘Dementia: assessment, management and support for people living with dementia and their carers’ as well as local trust guidance.
All 24 patients were receiving antipsychotics for severe distress or aggression. 88% of patients had an assessment of sources of distress before treatment was started, but only 42% had a non-pharmacological intervention before antipsychotic treatment was started; once antipsychotic treatment had started, this increased to 58%. For some patients, the reason for not receiving a non-pharmacological intervention was the urgency of treatment or placement on a waiting list for occupational therapy, but for most the reason was not explicitly documented.
For 63%, there was evidence of a discussion of the risks of treatment with the patient, carer, or family member. 63% had initial baseline blood tests and 54% had a baseline ECG. Of the patients who did not have initial monitoring, a suitable reason was documented for just over 60%. Only 33% of patients who had antipsychotic treatment for over 12 weeks had a trial of discontinuation or dose reduction. Fewer than 22% of patients had physical health monitoring at one year of treatment.
There were shortfalls in several areas including the offer of non-pharmacological interventions, regular review of the ongoing need for antipsychotics, and physical health monitoring.
Introduction of a checklist before antipsychotics are prescribed is recommended, to include discussion of risks and benefits, non-pharmacological interventions, and initial monitoring. Also recommended is a system to identify when monitoring and review of antipsychotics are due.
We present the results of a theoretical investigation into the existence, evolution and excitation of resonant triads of nonlinear free-surface gravity waves confined to a cylinder of finite depth. It is well known that resonant triads are impossible for gravity waves in laterally unbounded domains; we demonstrate, however, that horizontal confinement of the fluid may induce resonant triads for particular fluid depths. For any three correlated wave modes arising in a cylinder of arbitrary cross-section, we prove necessary and sufficient conditions for the existence of a depth at which nonlinear resonance may arise, and show that the resultant critical depth is unique. We enumerate the low-frequency triads for circular cylinders, including a new class of resonances between standing and counter-propagating waves, and also briefly discuss annular and rectangular cylinders. Upon deriving the triad amplitude equations for a finite-depth cylinder of arbitrary cross-section, we deduce that the triad evolution is always periodic, and determine parameters controlling the efficiency of energy exchange. In order to excite a particular triad, we explore the influence of external forcing; in this case, the triad evolution may be periodic, quasi-periodic or chaotic. Finally, our results have potential implications for resonant water waves in man-made and natural basins, such as industrial-scale fluid tanks, harbours and bays.
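The resonance conditions underlying this analysis take the standard triad form; the notation below is ours and may differ from the paper's conventions:

```latex
% Frequency-closure condition for a gravity-wave triad in a cylinder of
% depth h, with the finite-depth dispersion relation; the admissible
% wavenumbers k_n are discrete eigenvalues fixed by the cross-section.
\omega_1 \pm \omega_2 = \omega_3,
\qquad
\omega_n^2 = g \, k_n \tanh(k_n h) .
```

Because the wavenumbers $k_n$ are fixed by the lateral geometry, the depth $h$ is the only free parameter in the dispersion relation: sweeping $h$ moves each $\omega_n$ along its dispersion curve until, for suitable mode combinations, the frequency condition is met at a single critical depth, consistent with the uniqueness result stated above.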
Social learning is a critical adaptation for dealing with different forms of variability. Uncertainty is a severe form of variability where the space of possible decisions or probabilities of associated outcomes are unknown. We identified four theoretically important sources of uncertainty: temporal environmental variability; payoff ambiguity; selection-set size; and effective lifespan. When these combine, it is nearly impossible to fully learn about the environment. We develop an evolutionary agent-based model to test how each form of uncertainty affects the evolution of social learning. Agents perform one of several behaviours, modelled as a multi-armed bandit, to acquire payoffs. All agents learn about behavioural payoffs individually through an adaptive behaviour-choice model that uses a softmax decision rule. Use of vertical and oblique payoff-biased social learning evolved to serve as a scaffold for adaptive individual learning – they are not opposite strategies. Different types of uncertainty had varying effects. Temporal environmental variability suppressed social learning, whereas larger selection-set size promoted social learning, even when the environment changed frequently. Payoff ambiguity and lifespan interacted with other uncertainty parameters. This study begins to explain how social learning can predominate despite highly variable real-world environments when effective individual learning helps individuals recover from learning outdated social information.
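The softmax decision rule used in the behaviour-choice model above can be sketched as follows; the temperature parameter and the payoff estimates in the usage note are illustrative assumptions, not the study's fitted values:

```python
import math
import random

def softmax_probs(q_values, temperature=0.5):
    """Map estimated behavioural payoffs to choice probabilities.

    Lower temperature concentrates probability on the highest-payoff
    behaviour; higher temperature approaches uniform exploration.
    """
    m = max(q_values)  # shift by the max for numerical stability
    weights = [math.exp((q - m) / temperature) for q in q_values]
    total = sum(weights)
    return [w / total for w in weights]

def choose_behaviour(q_values, temperature=0.5, rng=random):
    """Sample one bandit arm (behaviour) from the softmax distribution."""
    probs = softmax_probs(q_values, temperature)
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]
```

An agent holding payoff estimates such as [1.0, 0.4, 0.1] would choose the first behaviour most often while still sampling the others, which is the property that lets adaptive individual learning recover from outdated socially acquired payoff information.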