Metabolic and bariatric surgery (MBS) is safe and efficacious for adolescents with severe obesity. Pairing MBS with behavioral lifestyle interventions may be effective for optimizing treatment outcomes. However, no standardized program exists. Adolescent perspectives are critical to understanding how to design interventions to enhance engagement, sustain motivation, and meet informational needs for pre- and post-MBS self-management behaviors. The aim of this study was to develop an MBS lifestyle support intervention built on evidence-based content with input from adolescents and their families.
Methods:
A mixed-methods design identified adolescent preferences for MBS lifestyle support. Data were collected from a racially and ethnically diverse sample of adolescents (N = 17; 76% female, 24% male; 41.2% non-Hispanic Black, 41.2% Hispanic/Latino, 11.8% non-Hispanic White, 5.8% Other) and their mothers (N = 13; 38.4% Hispanic) recruited from an MBS clinic. Quantitative surveys and qualitative interviews assessed preferred types of pre- and post-MBS content, modality, frequency, and delivery platforms to inform the design of the intervention. Mixed-methods data were triangulated to provide a comprehensive understanding of adolescent and parent preferences.
Results:
Adolescents prioritized eating well, managing stress, and maintaining motivation as desired support strategies. Parents identified parental support groups and nutrition guidance as priorities. Peer support and social media platforms were identified as key approaches for boosting motivation and engagement.
Conclusions:
The patient voice is an important first step in understanding how, and whether, behavioral lifestyle programs combined with MBS for weight management can be optimized. Adolescent preferences may enhance program fit and identify the health behavior supports needed to sustain behavior change.
The COVID-19 pandemic has had major direct (e.g., deaths) and indirect (e.g., social inequities) effects in the United States. While the public health response to the pandemic featured some important successes (e.g., universal masking, and the rapid development and approval of vaccines and therapeutics), there were systemic failures (e.g., inadequate public health infrastructure) that overshadowed these successes. Key deficiencies in the U.S. response were shortages of personal protective equipment (PPE) and supply chain failures. Recommendations are provided for mitigating supply shortages and supply chain failures in healthcare settings in future pandemics. Key recommendations for preventing shortages of essential components of infection prevention and control include increasing the stockpile of PPE in the U.S. Strategic National Stockpile, increasing the transparency of the Stockpile, invoking the Defense Production Act at an early stage, and rapid FDA/EPA/OSHA review and authorization of products not approved in the U.S. Recommendations are also provided for mitigating shortages of diagnostic testing, medications and medical equipment.
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19 and furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 In the present, the COVID-19 pandemic exposed many of the pre-existing problems within the US healthcare system, which included (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care to non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to a reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic accelerated cooperation between government entities and the healthcare system, resulting in swift implementation of mitigation measures, new therapies and vaccinations at unprecedented speeds, despite our fragmented healthcare delivery system and political divisions. Still, widespread misinformation or disinformation and political divisions contributed to eroded trust in the public health system and prevented an even uptake of mitigation measures, vaccines and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising microbial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
The Society for Healthcare Epidemiology in America (SHEA) strongly supports modernization of data collection processes and the creation of publicly available data repositories that include a wide variety of data elements and mechanisms for securely storing both cleaned and uncleaned data sets that can be curated as clinical and research needs arise. These elements can be used for clinical research and quality monitoring and to evaluate the impacts of different policies on different outcomes. Achieving these goals will require dedicated, sustained and long-term funding to support data science teams and the creation of central data repositories that include data sets that can be “linked” via a variety of different mechanisms and also data sets that include institutional and state and local policies and procedures. A team-based approach to data science is strongly encouraged and supported to achieve the goal of a sustainable, adaptable national shared data resource.
Isolation of an unusual organism, Achromobacter xylosoxidans, from 2 cardiac surgical patients on the same day prompted an investigation to search for cases and a cause. An extensive review demonstrated a pseudo-outbreak related to practices adopted to conserve laboratory saline, which was in short supply because of supply chain disruptions during the coronavirus disease 2019 pandemic.
Glowacki recognizes the importance of norms in enabling war and peace, but does not focus on the cultural evolutionary mechanisms by which these norms are maintained. We highlight how group-structured cultural selection shapes the scale and nature of peaceful intergroup interactions. The mechanistic perspective reveals that there are many more cases of peaceful intergroup relations than the current account implies.
Psychological and cultural evolutionary accounts of human sociality propose that beliefs in punitive and monitoring gods that care about moral norms facilitate cooperation. While there is some evidence to suggest that belief in supernatural punishment and monitoring generally induce cooperative behaviour, the effect of a deity's explicitly postulated moral concerns on cooperation remains unclear. Here, we report a pre-registered set of analyses to assess whether perceiving a locally relevant deity as moralistic predicts cooperative play in two permutations of two economic games using data from up to 15 diverse field sites. Across games, results suggest that gods’ moral concerns do not play a direct, cross-culturally reliable role in motivating cooperative behaviour. The study contributes substantially to the current literature by testing a central hypothesis in the evolutionary and cognitive science of religion with a large and culturally diverse dataset using behavioural and ethnographically rich methods.
The prevalence of obesity among pre-school-aged children in the USA remains unacceptably high. Here, we examine the impact of Healthy Caregivers-Healthy Children (HC2) Phase 2, a childcare centre (CCC)-based obesity prevention intervention on changes in the CCC nutrition and physical activity environment over 2 school years.
Design:
This was a cluster-randomised trial with twelve CCC in the HC2 intervention arm and twelve in the control arm. The primary outcome was change in the Environment and Policy Assessment and Observation (EPAO) tool over 2 school years (Fall 2015, Spring 2016 and Spring 2017). Changes in EPAO physical activity and nutrition scores were analysed via (1) random-effects mixed models and (2) mixed models to determine the effect of HC2 v. control.
Setting:
The study was conducted in twenty-four CCC serving low-income, ethnically diverse families in Miami-Dade County.
Participants:
Intervention CCC received (1) teachers/parents/children curriculum, (2) snack, beverage, physical activity, and screen time policies, and (3) menu modifications.
Results:
Two-year EPAO nutrition score changes in intervention CCC were almost twice that of control CCC. The EPAO physical activity environment scores only slightly improved in intervention CCC v. control CCC. Intervention CCC showed higher combined EPAO physical activity and nutrition scores compared to control CCC over the 2-year study period (β = 0·09, P = 0·05).
Conclusions:
Obesity prevention programmes can have a positive impact on the CCC nutrition environment and can promote healthy weight in early childhood. CCC may need consistent support to improve the physical activity environment to ensure the policies remain intact.
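The cluster-randomised analysis described above pairs a repeated-measures design with random effects for each childcare centre. As a rough illustration of how such a model is fitted, the following Python sketch uses simulated data; the variable names (epao_score, arm, time, ccc_id), the simulated effect sizes, and the random-intercept specification are all assumptions for illustration, not the authors' actual analysis code.

```python
# Minimal sketch of a random-effects mixed-model analysis for a
# cluster-randomised trial, analogous in shape to the EPAO analysis above.
# All variable names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 24 childcare centres (CCC), 12 per arm, each measured at
# three time points (e.g., Fall 2015, Spring 2016, Spring 2017).
records = []
for ccc in range(24):
    arm = 1 if ccc < 12 else 0          # 1 = intervention, 0 = control
    centre_effect = rng.normal(0, 0.5)  # random intercept: clustering by CCC
    for time in range(3):
        score = (10 + 0.5 * time        # secular improvement in all centres
                 + 0.9 * arm * time     # extra improvement under intervention
                 + centre_effect + rng.normal(0, 1))
        records.append({"ccc_id": ccc, "arm": arm, "time": time,
                        "epao_score": score})
df = pd.DataFrame(records)

# Random-intercept mixed model: the arm-by-time interaction estimates
# whether scores improve faster in intervention centres than in controls.
result = smf.mixedlm("epao_score ~ arm * time", df,
                     groups=df["ccc_id"]).fit()
print(result.summary())
```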
Background: Antibiotics targeted against Clostridioides difficile bacteria are necessary, but insufficient, to achieve a durable clinical response because they have no effect on C. difficile spores that germinate within a disrupted microbiome. ECOSPOR-III evaluated SER-109, an investigational, biologically derived microbiome therapeutic of purified Firmicute spores, for treatment of recurrent C. difficile infection (rCDI). Herein, we present the interim analysis in the intention-to-treat (ITT) population at 8 and 12 weeks. Methods: Adults ≥18 years with rCDI (≥3 episodes in 12 months) were screened at 75 US and Canadian sites. CDI was defined as ≥3 unformed stools per day for <48 hours with a positive C. difficile assay. After completion of 10–21 days of vancomycin or fidaxomicin, adults with symptom resolution were randomized 1:1 to SER-109 (4 capsules × 3 days) or matching placebo and stratified by age (≥ or <65 years) and antibiotic received. Primary objectives were safety and efficacy at 8 weeks. The primary efficacy endpoint was rCDI (recurrent toxin-positive diarrhea requiring treatment); secondary endpoints included efficacy at 12 weeks after dosing. Results: Overall, 287 participants were screened and 182 were randomized (59.9% female; mean age, 65.5 years). The most common reason for screen failure was a negative C. difficile toxin assay. A significantly lower proportion of SER-109 participants had rCDI after dosing compared to placebo at week 8 (11.1% vs 41.3%, respectively; relative risk [RR], 0.27; 95% confidence interval [CI], 0.15–0.51; P < 0.001). Efficacy rates were significantly higher with SER-109 vs placebo in both stratified age groups (Figure 1). SER-109 was well tolerated, with a safety profile similar to placebo. The most common treatment-emergent adverse events (TEAEs) were gastrointestinal and were mainly mild to moderate. No serious TEAEs, infections, deaths, or drug discontinuations were deemed related to study drug. Conclusions: SER-109, an oral live microbiome therapeutic, achieved high rates of sustained clinical response with a favorable safety profile. By enriching for Firmicute spores, SER-109 achieves high efficacy while mitigating the risk of transmitting infectious agents beyond donor screening alone. SER-109 represents a major paradigm shift in the clinical management of patients with recurrent CDI. Clinicaltrials.gov identifier: NCT03183128. These data were previously presented as a late breaker at the American College of Gastroenterology 2020 meeting.
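The headline effect size in this abstract can be reproduced from the reported proportions. The sketch below is a back-of-the-envelope check only: the per-arm counts (10 of 90 on SER-109, 38 of 92 on placebo) are inferred from the reported percentages and the 1:1 randomization of 182 participants, and the confidence interval uses the standard Katz log method, which may differ slightly from the trial's pre-specified analysis.

```python
# Reproduce the reported relative risk (RR) and approximate 95% CI.
# Event counts are inferred from the abstract (11.1% and 41.3% after 1:1
# randomization of 182 participants) -- assumptions, not published counts.
import math

events_ser109, n_ser109 = 10, 90    # assumed: 10/90 = 11.1% recurrence
events_placebo, n_placebo = 38, 92  # assumed: 38/92 = 41.3% recurrence

p1 = events_ser109 / n_ser109
p2 = events_placebo / n_placebo
rr = p1 / p2                        # relative risk of recurrence

# Katz log method: standard error of log(RR), then exponentiate the
# log-scale confidence limits.
se_log_rr = math.sqrt((1 - p1) / events_ser109 + (1 - p2) / events_placebo)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# -> RR = 0.27, 95% CI 0.14-0.51, close to the reported 0.27 (0.15-0.51);
#    the small difference reflects the inferred counts and CI method.
```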
Driving is complex, requiring adequate attention and concentration, memory, insight and understanding, judgement, planning and the ability to self-monitor1. Psychiatric illness, and associated medications, may affect patients’ ability to drive safely. The Driver and Vehicle Licensing Agency (DVLA) is responsible for determining individuals’ safety to drive and produces guidance specific to psychiatric disorders. Patients must comply with relevant guidance, and clinicians must determine patients’ driving status and offer appropriate advice about medications and any need to inform the DVLA. This audit aimed to determine compliance with DVLA guidance on a single inpatient psychiatric ward within Merseycare NHS Foundation Trust, UK.
Method
A retrospective review of electronic patient records was completed. Clerical staff identified all patients admitted to Windsor House from 1/8/20–30/11/20 (n = 42). Data relating to driving status and driving advice were collected onto individual patient audit proformas, and uploaded to the online Audit Management and Tracking (AMaT) system.
Result
100% of patients had diagnoses that would require the DVLA to be informed, and 100% were prescribed medication with potential side effects that could impair one’s ability to drive safely, such as dizziness, drowsiness or impaired concentration2. Driving status was documented for only 12 patients (29%), and type of vehicle driven for only 6 patients (1 of whom had an HGV licence).
Discussion of DVLA guidance by the mental health team within the last 3 months was documented for 17% of patients. Of these patients, appropriate driving advice was given to 86%. All patients advised to cease driving were willing to do so. No patients were advised about the side effects of medications on driving. No notes evidenced whether the DVLA had been informed of patients’ admission, diagnosis or medication regimes.
Conclusion
Discussing driving status and DVLA advice with psychiatric patients is important but may not always happen in inpatient settings, despite most patients having a relevant diagnosis. Failure to determine driving status may mean some patients are not given appropriate guidance as required. Counselling on medication side effects in relation to driving should be encouraged, as the majority of patients are taking prescribed medication that can potentially impair driving. Recommendations to improve compliance include: adding “driving status” to admission clerking and ward review proformas; educating staff to actively discuss driving with inpatients; creating discharge checklists that prompt discussion of driving status, medications and driving advice; and re-auditing in 6 months’ time.
The Montreal Cognitive Assessment (MoCA) is routinely used during the early assessment of people after stroke to indicate cognitive effects and inform clinical decision-making.
Aim:
The purpose of this study was to examine the relationship between cognition in the first week post-stroke and personal and instrumental activities of daily skills at 1 month and 3 months post-stroke.
Method:
A prospective cohort study consecutively recruited people admitted to the acute stroke ward. Acute cognitive status was measured using the MoCA within 1 week post-stroke onset. Functional outcomes were measured using the Functional Independence Measure (FIM) and the Australian Modified Lawton’s Instrumental Activities of Daily Living Scale (Lawton’s) at 1 month and 3 months post-stroke.
Results:
Fifty participants with predominantly mild stroke (n = 47) and a mean age of 69.8 years achieved a mean MoCA score of 23.1. Controlling for age, the MoCA was associated with the overall FIM score at 1 month (P = 0.02). The association with the Lawton’s at 1 month approached significance (P = 0.06), but the MoCA was not associated with either outcome at 3 months. A MoCA score of less than 23 was indicative of lower scores on both outcomes.
Conclusions:
A low MoCA score within 1 week of stroke may indicate need for support or rehabilitation due to early impacts on personal activities of daily living, but is not associated with poor functional outcomes at 3 months.
The primary objective of this study was to evaluate the feasibility and acceptability of Mindfulness-based Wellness and Resilience (MBWR), a brief mindfulness-based intervention designed to enhance resilience and delivered to interdisciplinary primary care teams.
Background:
Burnout is a pervasive, international problem affecting the healthcare workforce, characterized by emotional exhaustion, depersonalization, and decreased professional effectiveness. Delivery models of mindfulness-based resilience interventions that enhance feasibility for onsite delivery, consider cultural considerations specific to primary care, and utilize team processes that are integral to primary care are now needed.
Methods:
We conducted a mixed-methods feasibility and acceptability trial of MBWR. Primary feasibility and acceptability outcomes were assessed by the number of participants recruited, the percentage of MBWR treatment completers, the attrition rate during the 8-week intervention, and four items on a Likert-type scale. Secondary outcomes of perceived effects were measured by focus groups, an online survey, and self-reported questionnaires, including the Brief Resilience Scale, the Five Facet Mindfulness Questionnaire-Short Form, and the Self-Compassion Scale-Short Form. Participants included 31 healthcare providers on interdisciplinary primary care teams employed at a safety-net medical center. In the MBWR group, 68% identified as Latinx, compared to 64% in the control group.
Findings:
All criteria for feasibility were met and participants endorsed high levels of satisfaction and acceptability. The results of this study suggest that MBWR provides multiple perceived benefits to the individual healthcare provider, cohesion of the healthcare team, and enhanced patient care. MBWR may be a feasible and acceptable method to integrate mindfulness, resilience, and teamwork training into the primary care setting.
Inflammation of the mammary gland following bacterial infection, commonly known as mastitis, affects all mammalian species. Although the aetiology and epidemiology of mastitis in the dairy cow are well described, the genetic factors mediating resistance to mammary gland infection are not well known, due in part to the difficulty of obtaining robust phenotypic information from sufficiently large numbers of individuals. To address this problem, an experimental mammary gland infection study was undertaken using a Friesian-Jersey crossbred F2 herd. A total of 604 animals received an intramammary infusion of Streptococcus uberis in one gland, and the clinical response over 13 milkings was used for linkage mapping and genome-wide association analysis. A quantitative trait locus (QTL) for clinical mastitis status was detected on bovine chromosome 11 using microsatellite and Affymetrix 10K SNP markers, and exome and genome sequence data from the six F1 sires of the experimental animals were then used to examine this region in more detail. A total of 485 sequence variants were typed in the QTL interval, and association mapping using these and an additional 37 986 genome-wide markers from the Illumina SNP50 bovine SNP panel revealed association with markers encompassing the interleukin-1 gene cluster locus. This study highlights a region on bovine chromosome 11, consistent with earlier studies, as conferring resistance to experimentally induced mammary gland infection, and newly prioritises the IL1 gene cluster for further analysis in genetic resistance to mastitis.
Deciphering the folding pathways and predicting the structures of complex three-dimensional biomolecules is central to elucidating biological function. RNA is single-stranded, which gives it the freedom to fold into complex secondary and tertiary structures. These structures endow RNA with the ability to perform complex chemistries and functions ranging from enzymatic activity to gene regulation. Given that RNA is involved in many essential cellular processes, it is critical to understand how it folds and functions in vivo. Within the last few years, methods have been developed to probe RNA structures in vivo and genome-wide. These studies reveal that RNA often adopts very different structures in vivo and in vitro, and provide profound insights into RNA biology. Nonetheless, both in vitro and in vivo approaches have limitations: studies in the complex and uncontrolled cellular environment make it difficult to obtain insight into RNA folding pathways and thermodynamics, and studies in vitro often lack direct cellular relevance, leaving a gap in our knowledge of RNA folding in vivo. This gap is being bridged by biophysical and mechanistic studies of RNA structure and function under conditions that mimic the cellular environment. To date, most artificial cytoplasms have used various polymers as molecular crowding agents and a series of small molecules as cosolutes. Studies under such in vivo-like conditions are yielding fresh insights, such as cooperative folding of functional RNAs and increased activity of ribozymes. These observations are accounted for in part by molecular crowding effects and interactions with other molecules. In this review, we report milestones in RNA folding in vitro and in vivo and discuss ongoing experimental and computational efforts to bridge the gap between these two conditions in order to understand how RNA folds in the cell.
The main objective of our target article was to sketch the empirical case for the importance of selection at the level of groups on cultural variation. Such variation is massive in humans, but modest or absent in other species. Group selection processes acting on this variation provide a framework for developing explanations of the unusual level of cooperation between non-relatives found in our species. Our case for cultural group selection (CGS) followed Darwin’s classic syllogism regarding natural selection: if variation exists at the level of groups, if this variation is heritable, and if it plays a role in the success or failure of competing groups, then selection will operate at the level of groups. We outlined the relevant domains where such evidence can be sought and characterized the main conclusions of work in those domains. Most commentators agree that CGS plays some role in human evolution, although some were considerably more skeptical. Some contributed additional empirical cases. Some raised issues of the scope of CGS explanations versus competing ones.
The gender dimension of science and technology has become one of the most important and debated issues worldwide, impacting society at every level. A variety of international initiatives on the subject have been undertaken, including the continued monitoring of the status of women in science by the UNESCO Institute for Statistics (UIS), the annual “Education at a Glance” reports by the Organisation for Economic Co-operation and Development (OECD), and field-related working groups and networks that collect data in a consistent manner. The majority of international organizations have made clear statements about their anti-discrimination policies (independently of their main field(s) of action), including the International Council for Science, whose regulations are followed by the IAU. Gender equality at large is one of the eight United Nations Millennium Development Goals, which clearly calls for action related to science, technology and gender.
Human cooperation is highly unusual. We live in large groups composed mostly of non-relatives. Evolutionists have proposed a number of explanations for this pattern, including cultural group selection and extensions of more general processes such as reciprocity, kin selection, and multi-level selection acting on genes. Evolutionary processes are consilient; they affect several different empirical domains, such as patterns of behavior and the proximal drivers of that behavior. In this target article, we sketch the evidence from five domains that bear on the explanatory adequacy of cultural group selection and competing hypotheses to explain human cooperation. Does cultural transmission constitute an inheritance system that can evolve in a Darwinian fashion? Are the norms that underpin institutions among the cultural traits so transmitted? Do we observe sufficient variation at the level of groups of considerable size for group selection to be a plausible process? Do human groups compete, and do success and failure in competition depend upon cultural variation? Do we observe adaptations for cooperation in humans that most plausibly arose by cultural group selection? If the answer to one of these questions is “no,” then we must look to other hypotheses. We present evidence, including quantitative evidence, that the answer to all of the questions is “yes” and argue that we must take the cultural group selection hypothesis seriously. If culturally transmitted systems of rules (institutions) that limit individual deviance organize cooperation in human societies, then it is not clear that any extant alternative to cultural group selection can be a complete explanation.