Childhood bullying is a public health priority. We evaluated the effectiveness and costs of KiVa, a whole-school anti-bullying program that targets the peer context.
Methods
A two-arm pragmatic multicenter cluster randomized controlled trial with embedded economic evaluation. Schools were randomized to the KiVa intervention or usual practice (UP), stratified on school size and Free School Meals eligibility. KiVa was delivered by trained teachers across one school year. Follow-up was at 12 months post randomization. The primary outcome was student-reported bullying victimization; secondary outcomes were self-reported bullying perpetration, participant roles in bullying, empathy, and the teacher-reported Strengths and Difficulties Questionnaire. Outcomes were analyzed using multilevel linear and logistic regression models.
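As an illustrative sketch only (the dataset, column names and model specification below are hypothetical, not the trial's analysis code), outcome models of this general form can be fitted in Python with statsmodels: a multilevel linear model with a school-level random intercept for a continuous secondary outcome, and a logistic model with school-clustered standard errors as a simpler stand-in for the multilevel logistic model described above.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format student-level data: one row per student, with
# columns arm (0 = usual practice, 1 = KiVa), school, empathy, victimised, etc.
df = pd.read_csv("kiva_trial.csv")

# Continuous secondary outcome: multilevel linear model with a school random intercept.
empathy_fit = smf.mixedlm("empathy ~ arm + empathy_baseline",
                          data=df, groups=df["school"]).fit()

# Binary primary outcome: logistic regression with school-clustered standard
# errors, a simpler stand-in for a full multilevel logistic model.
victim_fit = smf.logit("victimised ~ arm + victimised_baseline", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})

print(empathy_fit.summary())
print(victim_fit.summary())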
Findings
Between 8/11/2019 and 12/02/2021, 118 primary schools were recruited across four trial sites, with 11 111 students included in the primary analysis (KiVa: n = 5944, 49.6% female; UP: n = 5167, 49.0% female). At baseline, 21.6% of students in the UP group and 20.3% in the KiVa group reported being bullied, falling to 20.7% in the UP group and 17.7% in the KiVa group at follow-up (odds ratio 0.87; 95% confidence interval 0.78 to 0.97; p = 0.009). Students in the KiVa group had significantly higher empathy and fewer peer problems. We found no differences in bullying perpetration, school wellbeing, or emotional or behavioral problems. A priori subgroup analyses revealed no differences in effectiveness by socioeconomic gradient or by gender. KiVa cost £20.78 more per pupil than usual practice in the first year and £1.65 more per pupil in subsequent years.
Interpretation
The KiVa anti-bullying program is effective at reducing bullying victimization, with small to moderate effects of public health importance.
Funding
The study was funded by the UK National Institute for Health and Care Research (NIHR) Public Health Research program (17-92-11). Intervention costs were funded by the Rayne Foundation, GwE North Wales Regional School Improvement Service, Children's Services, Devon County Council and HSBC Global Services (UK) Ltd.
A surveillance system for measuring patient-level antimicrobial adverse drug events (ADEs) may support stewardship activities; however, design and implementation questions remain. In this national survey, stewardship experts favored simple, laboratory-based ADE definitions, although there were tensions between feasibility, the ability to identify attribution without chart review, and the importance of specific ADEs.
A Nebraska statewide webinar series was initiated during the coronavirus disease 2019 (COVID-19) pandemic for long-term care (LTC) and acute care/outpatient (AC) facilities. An impact survey was completed by 48 of 96 AC and 109 of 429 LTC facilities. The majority reported increased regulatory awareness (AC: 65%, LTC: 54%) and updated COVID-19 (AC: 61%, LTC: 69%) and general infection prevention (AC: 61%, LTC: 60%) policies.
‘Inhalants’ have been associated with poorer mental health in adolescence, but little is known of associations with specific types of inhalants.
Aims
We aimed to investigate associations of using volatile substances, nitrous oxide and alkyl nitrates with mental health problems in adolescence.
Method
We conducted a cross-sectional analysis using data from 13- to 14-year-old adolescents across England and Wales collected between September 2019 and March 2020. Multilevel logistic regression examined associations between lifetime use of volatile substances, nitrous oxide and alkyl nitrates with self-reported symptoms of probable depression, anxiety, conduct disorder and auditory hallucinations.
Results
Of the 6672 adolescents in the study, 5.1% reported use of nitrous oxide, 4.9% volatile solvents and 0.1% alkyl nitrates. After accounting for multiple testing, adolescents who had used volatile solvents were significantly more likely than those who had not to report a probable depressive disorder (odds ratio = 4.59, 95% CI 3.58, 5.88), anxiety disorder (odds ratio = 3.47, 95% CI 2.72, 4.43) or conduct disorder (odds ratio = 7.52, 95% CI 5.80, 9.76), as well as auditory hallucinations (odds ratio = 5.35, 95% CI 4.00, 7.17). Nitrous oxide use was significantly associated with probable depression and conduct disorder but not with anxiety disorder or auditory hallucinations. Alkyl nitrate use was rare and not associated with mental health outcomes. Adjustment for use of other inhalants, tobacco and alcohol resulted in marked attenuation, but adjustment for socioeconomic disadvantage had little effect.
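As a minimal, hedged sketch of the multiple-testing step (the abstract does not state which correction was used; Bonferroni is shown only as one common choice, and the p-values are placeholders rather than study results):

from statsmodels.stats.multitest import multipletests

raw_p = [0.001, 0.004, 0.020, 0.300]  # placeholder p-values, one per outcome/exposure pair
reject, adjusted_p, _, _ = multipletests(raw_p, alpha=0.05, method="bonferroni")
print(list(zip(raw_p, adjusted_p, reject)))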
Conclusion
To our knowledge, this study provides the first general population evidence that volatile solvents and nitrous oxide are associated with probable mental health disorders in adolescence. These findings require replication, ideally with prospective designs.
College student food insecurity (FI) is a public health concern. Programming and policies to support students have expanded, but utilisation is often limited. The aim of this study was to summarise the barriers to accessing college FI programming, guided by the social ecological model (SEM) framework. A scoping review of peer-reviewed literature included an electronic search conducted in MEDLINE, ERIC, and PubMed databases, with a secondary search in Google Scholar. Of the 138 articles identified, 18 met eligibility criteria and were included. Articles primarily encompassed organisational-level barriers (17/18), followed by individual (15/18), relationship (15/18), community (9/18), and policy (6/18) levels. Individual barriers included seven themes: Knowledge of Process, Awareness, Limited Time or Schedules, Personal Transportation, Internal Stigma, Perception of Need, and Type of Student. Four relationship barriers were identified: External Stigma, Comparing Need, Limited Availability Causes Negative Perceptions, and Staff. Ten barrier themes comprised the organisational level: Application Process, Operational Process, Location, Hours of Operation, Food Quality, Food Quantity, Food Desirability or Variety of Food, Marketing Materials, Awareness of the Program, and COVID-19 Restrictions. Two barrier themes were identified at the community level, Public Transportation and Awareness of SNAP, while one barrier theme, SNAP Eligibility and Process, encompassed the policy level. Higher education stakeholders should seek to overcome these barriers to the use of food programmes as a means of addressing college FI. This review offers recommendations for overcoming these barriers at each SEM level.
To describe neutropenic fever management practices among healthcare institutions.
Design:
Survey.
Participants:
Members of the Society for Healthcare Epidemiology of America Research Network (SRN) representing healthcare institutions within the United States.
Methods:
An electronic survey was distributed to SRN representatives, with questions pertaining to demographics, antimicrobial prophylaxis, supportive care, and neutropenic fever management. The survey was distributed from fall 2022 through spring 2023.
Results:
In total, 40 complete responses were recorded (54.8% response rate), with respondent institutions accounting for approximately 15.7% of 2021 US hematologic malignancy hospitalizations and 14.9% of 2020 US bone marrow transplantations. Most institutions had guidelines for neutropenic fever management (35, 87.5%) and prophylaxis (31, 77.5%), and first-line treatment included IV antipseudomonal antibiotics (35, 87.5% cephalosporin; 5, 12.5% penicillin; 0, 0% carbapenem).
We observed significant heterogeneity in treatment course decisions, with roughly half (18, 45.0%) of respondents continuing antibiotics until neutrophil recovery and the remainder having criteria for de-escalation prior to neutrophil recovery. Respondents were more willing to de-escalate prior to neutrophil recovery in patients with identified clinical (27, 67.5% with pneumonia) or microbiological (30, 75.0% with bacteremia) sources after dedicated treatment courses.
Conclusions:
We found substantial variation in the practice of de-escalating empiric antibiotics relative to neutrophil recovery, highlighting a need for more robust evidence for, and adoption of, this practice. No respondents used carbapenems as first-line therapy, which compares favorably with prior survey studies conducted in other countries.
During the coronavirus disease 2019 pandemic, mathematical modeling has been widely used to understand epidemiological burden, trends, and transmission dynamics, to facilitate policy decisions, and, to a lesser extent, to evaluate infection prevention and control (IPC) measures. This review highlights the added value of using conventional epidemiology and modeling approaches to address the complexity of healthcare-associated infections (HAI) and antimicrobial resistance. It demonstrates how epidemiological surveillance data and modeling can be used to infer transmission dynamics in healthcare settings and to forecast healthcare impact, how modeling can be used to improve the validity of interpretation of epidemiological surveillance data, how modeling can be used to estimate the impact of IPC interventions, and how modeling can be used to guide IPC and antimicrobial treatment and stewardship decision-making. There are several priority areas for expanding the use of modeling in healthcare epidemiology and IPC. Importantly, modeling should be viewed as complementary to conventional healthcare epidemiological approaches, and this requires collaboration and active coordination between IPC, healthcare epidemiology, and mathematical modeling groups.
Several factors shape the neurodevelopmental trajectory. A key area of focus in neurodevelopmental research is to estimate the factors that have maximal influence on the brain and can tip the balance from typical to atypical development.
Methods
Utilizing a dissimilarity maximization algorithm on the dynamic mode decomposition (DMD) of the resting state functional MRI data, we classified subjects from the cVEDA neurodevelopmental cohort (n = 987, aged 6–23 years) into homogeneously patterned DMD (representing typical development in 809 subjects) and heterogeneously patterned DMD (indicative of atypical development in 178 subjects).
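As a hedged illustration of the DMD step only (the dissimilarity maximization classifier and cVEDA preprocessing are not reproduced; the function below is a generic rank-truncated exact DMD, not the study's code):

import numpy as np

def dmd(X, r=10):
    """Exact dynamic mode decomposition of a (regions x timepoints) matrix X,
    truncated to rank r. Returns DMD eigenvalues and spatial modes."""
    X1, X2 = X[:, :-1], X[:, 1:]                      # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U_r, s_r, V_r = U[:, :r], s[:r], Vh[:r].conj().T  # rank-r truncation
    A_tilde = U_r.conj().T @ X2 @ V_r / s_r           # reduced linear operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ V_r @ np.diag(1.0 / s_r) @ W         # exact DMD modes
    return eigvals, modes

# Example with random data standing in for one subject's regional BOLD time series.
eigvals, modes = dmd(np.random.randn(90, 200), r=10)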
Results
Significant DMD differences were primarily identified in default mode network (DMN) regions across these groups (p < 0.05, Bonferroni corrected). While the groups were comparable in cognitive performance, the atypical group had more frequent exposure to adversities and higher levels of abuse (p < 0.05, Bonferroni corrected). Upon evaluating brain-behavior correlations, we found that correlation patterns between adversity and DMN dynamic modes exhibited age-dependent variations for atypical subjects, hinting at differential utilization of the DMN due to chronic adversities.
Conclusion
Adversities (particularly abuse) maximally influence the DMN during neurodevelopment and lead to failure to develop a coherent DMN system. While the DMN's integrity is preserved in typical development, atypically developing individuals show contrasting, age-dependent variability. The flexibility of the DMN might be a compensatory mechanism that protects an individual in an abusive environment. However, such adaptability might deprive the neural system of the faculties of normal functioning and may incur long-term effects on the psyche.
Neurocognitive decline is prevalent in patients with metastatic cancers and is attributed to various disease, treatment, and individual factors. Whether the presence of brain metastases (BrMets) contributes to neurocognitive decline is unclear. The aims of this study were to examine neurocognitive performance in BrMets patients and to compare findings to patients with advanced metastatic cancer without BrMets. Here, we present baseline findings from an ongoing, prospective longitudinal study.
Participants and Methods:
English-speaking adults with advanced metastatic cancers were recruited from the brain metastases and lung clinics at the Princess Margaret Cancer Centre. Participants completed standardized tests (WTAR, HVLT-R, BVMT-R, COWAT, Trailmaking test, WAIS-IV Digit Span) and questionnaires (FACT-Cog v3, EORTC-QLQ C30 and BN20, PROMIS Depression(8a) and Anxiety(6a)) prior to cranial radiotherapy for those who required it. Test scores were converted to z-scores based on published normative data and averaged to create a composite neurocognitive performance score and domain scores for memory, attention/working memory, processing speed and executive function. Neurocognitive impairment was defined according to International Cancer and Cognition Task Force criteria. Univariate and multivariate regressions were used to identify individual, disease and treatment variables that predict cognitive performance.
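As a minimal sketch of the scoring logic only (the normative means/SDs and thresholds below are illustrative assumptions, not the published norms used in the study):

import numpy as np

# Hypothetical normative (mean, SD) pairs; the study used published norms.
norms = {"HVLT_R_total": (26.0, 4.5), "TrailsA_seconds": (30.0, 10.0)}

def to_z(raw, test, higher_is_worse=False):
    mean, sd = norms[test]
    z = (raw - mean) / sd
    return -z if higher_is_worse else z   # flip timed tests so higher = better

def composite(z_scores):
    return float(np.mean(z_scores))       # average of test z-scores

def impaired(z_scores, mild=-1.5, severe=-2.0):
    # One common reading of the ICCTF criteria (an assumption, not a quotation):
    # two or more tests at or below -1.5 SD, or any single test at or below -2.0 SD.
    z = np.asarray(z_scores)
    return (np.sum(z <= mild) >= 2) or bool(np.any(z <= severe))

z = [to_z(22, "HVLT_R_total"), to_z(55, "TrailsA_seconds", higher_is_worse=True)]
print(composite(z), impaired(z))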
Results:
In total, 76 patients with BrMets were included (mean (SD) age: 63.2 (11.7) years; 53% male). Overall, 61% experienced neurocognitive impairment; impairment rates varied across domains (38% memory, 39% executive functioning, 13% attention/working memory, 8% processing speed). BrMets quantity, volume, and location were not associated with neurocognitive performance. Better performance status (ECOG; β [95% CI]: -0.38 [-0.70, -0.05], p=0.021), higher premorbid IQ (0.34 [0.10, 0.58], p=0.005) and greater cognitive concerns (0.02 [-3.9e-04, 0.04], p=0.051) were associated with better neurocognitive performance in univariate analyses. Only premorbid IQ (0.37 [0.14, 0.60], p=0.003) and cognitive concerns (0.02 [0.0004, 0.03], p=0.05) remained significant in multivariate analysis. We also recruited 31 patients with metastatic non-small cell lung cancer (mNSCLC) and no known BrMets (age: 67.5 (8.3); 32% male) and compared them to the subgroup of BrMets patients in our sample with mNSCLC (N=32; age: 67.8 (11.7); 53% male). We found no differences in impairment rates (BrMets/non-BrMets: cognitive composite, 59%/55%; memory, 31%/32%; executive functioning, 35%/29%; attention/working memory, 16%/13%; processing speed, 7%/6%; Wilcoxon rank-sum test, all p-values > 0.5). The presence or absence of BrMets did not predict neurocognitive performance. Among patients with mNSCLC, higher education (0.11 [0.03, 0.18], p=0.004), higher premorbid IQ (0.36 [0.12, 0.61], p=0.003), fewer days since primary diagnosis (-0.0029 [-0.0052, -0.0005], p=0.015), fewer pack-years of smoking history (-0.01 [-0.02, -0.001], p=0.027) and greater cognitive concerns (0.02 [7e-5, 0.04], p=0.045) were associated with better neurocognitive performance in univariate analyses; only premorbid IQ (0.26 [0.02, 0.51], p=0.04) and cognitive concerns (0.02 [0.01, 0.04], p=0.02) remained significant in multivariate analysis.
Conclusions:
Cognitive impairment is prevalent in patients with advanced metastatic cancers, particularly affecting memory and executive functioning. However, 39% of patients in our sample were not impaired in any domain. We found no associations between the presence of BrMets and neurocognitive function in patients with advanced cancers prior to cranial radiation. Premorbid IQ, a proxy for cognitive reserve, was associated with cognitive outcomes in our sample. Our longitudinal study will allow us to identify risk and resilience factors associated with neurocognitive changes in patients with metastatic cancers to better inform therapeutic interventions in this population.
In the UK, over 12,400 cases of head and neck cancer are reported each year (2021). Pharyngolaryngeal biopsy under local anesthetic (OLB) may improve the speed of diagnosis and treatment of head and neck cancers. The Scottish Health Technologies Group (SHTG) published advice on this technology in 2018. Since then, additional evidence has been published, warranting a health technology assessment (HTA) for Wales. The aim of this review was to provide an update on the clinical and cost effectiveness of OLB compared with biopsy in an operating theatre under general anesthetic (OTB), to inform decision making in Wales.
Methods
A rapid review of relevant databases was undertaken, covering clinical evidence, health economics and patient perspectives published since 2018 and relevant to Wales. Health Technology Wales (HTW) developed a de novo cost-utility analysis comparing OLB to OTB over a lifetime horizon. Inputs were sourced from the SHTG budget impact analysis and updated with values more relevant to a Welsh setting.
Results
From consultation to biopsy procedure, the mean number of days was 1.3 for OLB compared with 17.4 for OTB (p < 0.05). The mean time from consultation to start of treatment was 27 days for OLB compared with 41.5 days for OTB (p < 0.05). The economic analysis produced an ICER of GBP21,011 (EUR23,824) in a population of 2,183 at-risk patients. Because OLB was associated with lower costs (GBP816 [EUR925] per person) and slightly fewer quality-adjusted life years (-0.04) than OTB, this ICER represents cost savings per QALY forgone, and OLB was therefore considered a cost-effective diagnostic strategy.
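As a hedged arithmetic sketch of how this south-west-quadrant ICER is formed (the inputs are the rounded per-person figures quoted above, so the ratio only approximates the reported GBP21,011):

delta_cost = -816.0   # GBP per person: OLB minus OTB (a saving)
delta_qaly = -0.04    # QALYs per person: OLB minus OTB (a small loss)
icer = delta_cost / delta_qaly
print(round(icer))    # roughly 20,400 GBP saved per QALY forgone with rounded inputs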
Conclusions
HTW guidance recommended the use of OLB within the diagnostic pathway for head and neck cancers in Wales, given its potential to reduce the time to diagnosis and treatment in a cost-saving way. For people with a positive test, OLB is sufficient to confirm a diagnosis, but it should not be used to rule out a diagnosis.
Background:
Neutropenic fever management decisions are complex and result in prolonged durations of broad-spectrum antibiotics. Strategies for antibiotic stewardship in this context have been studied, including de-escalation of antibiotics prior to resolution of neutropenia, but their implementation is unclear. Here, we present the first survey study to describe real-world neutropenic fever management practices in US healthcare institutions, with particular emphasis on de-escalation strategies after initiation of broad-spectrum antibiotics.
Methods:
Using REDCap, we conducted a survey of US healthcare institutions through the SHEA Research Network (SRN). Questions pertained to antimicrobial prophylaxis and supportive care in the management of oncology patients and to neutropenic fever management (including specific antimicrobial choices and clinical scenarios). Hematologic malignancy hospitalization (2020) and bone-marrow transplantation (2016–2020) volumes were obtained from CMS and Health Resources & Services Administration databases, respectively.
Results:
Overall, 23 complete responses were recorded (response rate, 35.4%). Collectively, these entities account for ~11.0% of hematologic malignancy hospitalizations and 13.3% of bone marrow transplantations nationwide. Of 23 facilities, 19 had institutional guidelines for neutropenic fever management and 18 had institutional guidelines for prophylaxis, with similar definitions of neutropenic fever. First-line treatment universally utilized antipseudomonal broad-spectrum IV antibiotics (20 of 23 use a cephalosporin, 3 of 23 use a penicillin agent, and no respondents use a carbapenem). Fluoroquinolone prophylaxis was common for leukemia induction patients (18 of 23) but was mixed for bone-marrow transplantation (10 of 23). We observed significant heterogeneity in treatment decisions. For stable neutropenic fever patients with no clinical source of infection identified, 13 of 23 respondents continued IV antibiotics until ANC (absolute neutrophil count) recovery; the remainder had criteria for de-escalation back to prophylaxis before this point (eg, a fever-free period). Respondents were more willing to de-escalate prior to ANC recovery in patients with identified clinical sources (14 of 23 de-escalate in patients with pneumonia) or microbiological sources (15 of 23 de-escalate in patients with bacteremia) after dedicated treatment courses. In free-text responses, several respondents described opportunities for more systematic de-escalation for antimicrobial stewardship in these scenarios.
Conclusions:
Our results illustrate the real-world management of neutropenic fever in US hospitals, including initiation of therapy, prophylaxis, and treatment duration. We found significant heterogeneity in de-escalation of empiric antibiotics relative to ANC recovery, highlighting a need for more robust evidence for and adoption of this practice.
Data from a national survey of 348 U.S. sports field managers were used to examine the effects of participation in Cooperative Extension events on the adoption of turfgrass weed management practices. Of the respondents, 94% had attended at least one event in the previous 3 yr. Of this 94%, 97% reported adopting at least one practice as a result of knowledge gained at an Extension turfgrass event. Half of the respondents had adopted four or more practices; a third adopted five or more practices. Nonchemical, cultural practices were the most-adopted practices (65% of respondents). Multiple regression analysis was used to examine factors explaining practice adoption and Extension event attendance. Compared to attending one event, attending three events increased total adoption by an average of one practice. Attending four or more events increased total adoption by two practices. Attending four or more events (compared to one event) increased the odds of adopting six individual practices by 3- to 6-fold, depending on the practice. This suggests that practice adoption could be enhanced by encouraging repeat attendance among past Extension event attendees. Manager experience was a statistically significant predictor of the number of Extension events attended but a poor direct predictor of practice adoption. Experience does not appear to increase adoption directly, but indirectly, via its impact on Extension event attendance. In addition to questions about weed management generally, the survey asked questions specifically about annual bluegrass management. Respondents were asked to rank seven sources of information for their helpfulness in managing annual bluegrass. There was no single dominant information source, but Extension was ranked more than any other source as the most helpful (by 22% of the respondents) and was ranked among the top three by 53%, closely behind field representative/local distributor sources at 54%.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Cannabis has been associated with poorer mental health, but little is known of the effect of synthetic cannabinoids or cannabidiol (often referred to as CBD).
Aims
To investigate associations of cannabis, synthetic cannabinoids and cannabidiol with mental health in adolescence.
Method
We conducted a cross-sectional analysis with 13- to 14-year-old adolescents across England and Wales in 2019–2020. Multilevel logistic regression was used to examine the association of lifetime use of cannabis, synthetic cannabinoids and cannabidiol with self-reported symptoms of probable depression, anxiety, conduct disorder and auditory hallucinations.
Results
Of the 6672 adolescents who participated, 5.2% reported using cannabis, 1.9% reported using cannabidiol and 0.6% reported using synthetic cannabinoids. After correction for multiple testing, adolescents who had used these substances were significantly more likely to report a probable depressive, anxiety or conduct disorder, as well as auditory hallucinations, than those who had not. Adjustment for socioeconomic disadvantage had little effect on the associations, but adjustment for weekly tobacco use resulted in marked attenuation. The association of cannabis use with probable anxiety and depressive disorders was weaker in those who reported using cannabidiol than in those who did not. There was little evidence of an interaction between synthetic cannabinoids and cannabidiol.
Conclusions
To our knowledge, this study provides the first general population evidence that synthetic cannabinoids and cannabidiol are associated with probable mental health disorders in adolescence. These associations require replication, ideally with prospective cohorts and stronger study designs.
Despite over a decade of both quantitative and qualitative studies, food insecurity among US college/university students remains a pervasive problem within higher education. The purpose of this perspective piece was to highlight research gaps in the area of college food insecurity and provide rationale for the research community to focus on these gaps going forward. A group of food insecurity researchers from a variety of higher education institutions across the United States identified five thematic areas of research gaps: screening and estimates of food insecurity; longitudinal changes in food insecurity; impact of food insecurity on broader health and academic outcomes; evaluation of impact, sustainability and cost effectiveness of existing programmes and initiatives; and state and federal policies and programmes. Within these thematic areas, nineteen specific research gaps were identified that have limited or no peer-reviewed, published research. These research gaps result in a limited understanding of the magnitude, severity and persistence of college food insecurity, the negative short- and long-term impacts of food insecurity on health, academic performance and overall college experience, and effective solutions and policies to prevent or meaningfully address food insecurity among college students. Research in these identified priority areas may help accelerate action and interdisciplinary collaboration to alleviate food insecurity among college students and play a critical role in informing the development or refinement of programmes and services that better support college student food security needs.
Legislators must decide when, if ever, to cosponsor legislation. Scholars have shown that legislators strategically time their positions on salient issues of national importance, but we know little about the timing of position-taking for routine bills or what this activity looks like in state legislatures. We argue that legislators’ cosponsorship decision-making depends on the type of legislation and the partisan dynamics among the current cosponsors. Members treat everyday legislation as generalized position-taking motivated by reelection, yet for key legislation, legislators are policy-oriented. Using a new dataset of over 73,000 bills introduced in both chambers of the Texas state legislature in the 75th to 86th regular sessions (1997–2020), we use pooled Cox proportional hazards models to evaluate the dynamics of when legislators legislate, comparing all bills introduced with a subset of key bills. The results show that legislators time their cosponsorship activity in response to electoral vulnerability, partisanship, and the dynamics of the chamber in which they serve.
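As an illustrative sketch only (the file, column names and covariates are hypothetical, and this is not the authors' specification), a Cox proportional hazards model for time to cosponsorship can be fitted in Python with lifelines:

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical long-format data: one row per legislator-bill pair, with the
# number of days until cosponsorship (or censoring) and an event indicator.
df = pd.read_csv("cosponsorship.csv")

cph = CoxPHFitter()
cph.fit(
    df[["days_to_cosponsor", "cosponsored",
        "electoral_margin", "same_party_as_sponsor", "key_bill"]],
    duration_col="days_to_cosponsor",
    event_col="cosponsored",
)
cph.print_summary()  # hazard ratios for the electoral and partisan covariates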
This analysis examined the impact of a digital therapeutic for treating chronic insomnia (currently marketed as Somryst®, at the time called Sleep Healthy Using The Internet [SHUTi]) on healthcare resource use (HCRU) by comparing patients treated with the digital cognitive behavioral therapy for insomnia (dCBTi) to patients not treated with dCBTi but treated with insomnia medications.
Methods
A retrospective observational study using health claims data was conducted in two cohorts across the United States: patients who registered for dCBTi (cases) between June 1, 2016 and October 31, 2018 (index date) and patients who did not register for dCBTi but initiated a second prescription for an insomnia medication in the same period (controls). The observation period was 16–24 months. No other inclusion/exclusion criteria were applied. Control patients were matched using nearest-neighbor within-caliper matching without replacement. Incidence rates for each HCRU encounter type were calculated using a negative binomial model for both cohorts. Costs were estimated by multiplying HCRU by published average costs for each medical resource.
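As a hedged sketch of this design (column names and the propensity model are hypothetical, and this is not the study's pipeline): greedy 1:1 nearest-neighbor caliper matching without replacement on a propensity score, followed by a negative binomial model for encounter counts.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("claims.csv")  # hypothetical claims extract, one row per patient
ps_fit = smf.logit("case ~ age + female + baseline_visits", data=df).fit()
df["ps"] = ps_fit.predict(df)

caliper = 0.2 * df["ps"].std()
pool = df[df["case"] == 0].copy()
keep = []
for idx, row in df[df["case"] == 1].iterrows():
    dist = (pool["ps"] - row["ps"]).abs()
    if len(dist) and dist.min() <= caliper:
        j = dist.idxmin()
        keep.extend([idx, j])
        pool = pool.drop(j)                 # without replacement

matched = df.loc[keep]
nb_fit = smf.glm("er_visits ~ case", data=matched,
                 family=sm.families.NegativeBinomial(),
                 offset=np.log(matched["months_observed"])).fit()
print(np.exp(nb_fit.params["case"]))        # incidence rate ratio, cases vs. controls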
Results
The analysis included 248 cases (median age 56.5 years, 57.3% female, 52.4% treated with sleep-related medications) and 248 matched controls (median age 55.0 years, 56.0% female, 100.0% treated with sleep-related medications). Over the 24 months post-initiation, cases had significantly fewer inpatient stays (55% lower; IRR: 0.45; 95% CI: 0.28–0.73; P=0.001), significantly fewer emergency department (ED) visits without inpatient admission (59% lower; IRR: 0.41; 95% CI: 0.27–0.63; P<0.001), and significantly fewer hospital outpatient visits (36% lower; IRR: 0.64; 95% CI: 0.49–0.82; P<0.001). There was also a trend toward fewer ambulatory surgical center visits (23% lower; IRR: 0.77; 95% CI: 0.52–1.14; P=0.197) and fewer office visits (7% lower; IRR: 0.93; 95% CI: 0.81–1.07; P=0.302) with the use of SHUTi. Use of sleep medications was more than four times greater in controls than in cases, with 9.6 (95% CI: 7.88–11.76) and 2.4 (95% CI: 1.91–2.95) prescriptions per patient, respectively (P<0.001). All-cause per-patient HCRU costs were $8,202 lower over 24 months for cases vs. controls, driven primarily by lower costs for hospitalizations (-$4,996 per patient) and hospital outpatient visits (-$2,003 per patient).
Conclusions
Patients with chronic insomnia who used a digital CBTi treatment had significant and durable real-world reductions in hospital inpatient stays, ED visits, hospital outpatient visits, and office visits compared to matched controls treated with medications.
Racial and ethnic minority groups have higher rates of SARS-CoV-2 infection, severe illness, and death; however, they receive monoclonal antibody (mAb) treatment at lower rates than non-Hispanic White patients. We report data from a systematic approach to improve equitable provision of COVID-19 neutralizing monoclonal antibody treatment.
Methods:
Treatment was administered at a community health urgent care clinic affiliated with a safety-net urban hospital. The approach included a stable treatment supply, a same-day test and treat model, a referral process, patient outreach, and financial support. We analyzed the race/ethnicity data descriptively and compared proportions using a chi-square test.
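As a minimal sketch of the chi-square comparison of proportions (the counts below are placeholders, not the study's data):

import numpy as np
from scipy.stats import chi2_contingency

# Rows: Hispanic vs. not Hispanic; columns: mAb-treated cohort vs. county positive cases.
table = np.array([[1128, 36500],
                  [1396, 63500]])           # placeholder counts
chi2, p, dof, expected = chi2_contingency(table)
print(chi2, p)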
Results:
Over 17 months, 2524 patients received treatment. Compared with the demographics of county COVID-19-positive cases, a greater proportion of patients who received mAb treatment were Hispanic (44.7% treated vs. 36.5% of positive cases, p < 0.001), a lower proportion were non-Hispanic White (40.7% treated vs. 46.3% of positive cases, p < 0.001), an equal proportion were Black (8.2% treated vs. 7.4% of positive cases, p = 0.13), and proportions were similar for patients of other races.
Discussion:
Implementation of multiple systematic strategies to administer COVID-19 monoclonal antibodies resulted in an equitable racial/ethnic distribution of treatment.
Food insecurity on college campuses is a major public health problem and has been documented for the last decade. Sufficient food access is a crucial social determinant of health; thus, campuses across the country have implemented various programmes, systems and policies to enhance access to food, including food pantries, campus gardens, farmers' markets, meal share or voucher programmes, mobile food applications, campus food gleaning, food recovery efforts, meal deliveries and task forces/working groups. However, little is understood about how best to address food insecurity and support students who are struggling with basic needs. The impact of food insecurity on students' academic and social success, in addition to their overall well-being, should be investigated and prioritised at each higher education institution. This is especially true for marginalised students, such as minority or first-generation students, who are at heightened risk of food insecurity. In order to create a culture of health equity, in which the most at-risk students are provided resources and opportunities to achieve optimal well-being, higher education institutions must prioritise mitigating food insecurity on the college campus. Higher education institutions could benefit from adopting comprehensive and individualised approaches to promoting food security for marginalised students in order to facilitate equal opportunity for optimal scholastic achievement among students of all socio-demographic backgrounds.
The context- and person-specific nature of the Mental Capacity Act 2005 (MCA) in England and Wales means inherent indeterminacy characterises decision-making in the Court of Protection (CoP), not least regarding conflicting values and the weight that should be accorded to competing factors. This paper explores how legal professionals frame and influence the MCA's deliberative and adjudicative processes in the social space of the courtroom through a thematic analysis of semi-structured interviews with legal practitioners specialising in mental capacity law and retired judges from the CoP and the Courts of Appeal with specific experience of adjudicating mental capacity disputes. The concept of the ‘human element’ offers important new insight into how legal professionals perform their roles and justify their activities in the conduct of legal proceedings. The ‘human element’ takes effect in two ways: first, it operates as an overarching normative prism that accounts for what good practice demands of legal professionals in mental capacity law; secondly, it explains how these professionals orientate these norms in the day-to-day conduct of their work. The ‘human element’ further presents challenges that demand practical negotiation in relation to countervailing normative commitments to objectivity and socio-institutional expectations around professional hierarchies, expertise, and evidential thresholds.