Background: Neutropenic fever management decisions are complex and result in prolonged durations of broad-spectrum antibiotics. Strategies for antibiotic stewardship in this context have been studied, including de-escalation of antibiotics before resolution of neutropenia, but their implementation is unclear. Here, we present the first survey study to describe real-world neutropenic fever management practices in US healthcare institutions, with particular emphasis on de-escalation strategies after initiation of broad-spectrum antibiotics. Methods: Using REDCap, we conducted a survey of US healthcare institutions through the SHEA Research Network (SRN). Questions pertained to antimicrobial prophylaxis and supportive care in the management of oncology patients and to neutropenic fever management (including specific antimicrobial choices and clinical scenarios). Hematologic malignancy hospitalization (2020) and bone-marrow transplantation (2016–2020) volumes were obtained from CMS and Health Resources & Services Administration databases, respectively. Results: Overall, 23 complete responses were recorded (response rate, 35.4%). Collectively, these entities account for ~11.0% of hematologic malignancy hospitalizations and 13.3% of bone-marrow transplantations nationwide. Of 23 facilities, 19 had institutional guidelines for neutropenic fever management and 18 had institutional guidelines for prophylaxis, with similar definitions of neutropenic fever. First-line treatment universally utilized antipseudomonal broad-spectrum IV antibiotics (20 of 23 use a cephalosporin, 3 of 23 use a penicillin agent, and no respondents use a carbapenem). Fluoroquinolone prophylaxis was common for leukemia induction patients (18 of 23) but mixed for bone-marrow transplantation (10 of 23). We observed significant heterogeneity in treatment decisions.
For stable neutropenic fever patients with no clinical source of infection identified, 13 of 23 respondents continued IV antibiotics until ANC (absolute neutrophil count) recovery. The remainder had criteria for de-escalation back to prophylaxis before this point (eg, a fever-free period). Respondents were more willing to de-escalate before ANC recovery in patients with an identified clinical source (14 of 23 would de-escalate in patients with pneumonia) or microbiological source (15 of 23 would de-escalate in patients with bacteremia) after dedicated treatment courses. In free-text responses, several respondents described opportunities for more systematic de-escalation to support antimicrobial stewardship in these scenarios. Conclusions: Our results illustrate the real-world management of neutropenic fever in US hospitals, including initiation of therapy, prophylaxis, and treatment duration. We found significant heterogeneity in de-escalation of empiric antibiotics relative to ANC recovery, highlighting a need for more robust evidence for, and adoption of, this practice.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Cannabis has been associated with poorer mental health, but little is known of the effect of synthetic cannabinoids or cannabidiol (often referred to as CBD).
To investigate associations of cannabis, synthetic cannabinoids and cannabidiol with mental health in adolescence.
We conducted a cross-sectional analysis with 13- to 14-year-old adolescents across England and Wales in 2019–2020. Multilevel logistic regression was used to examine the association of lifetime use of cannabis, synthetic cannabinoids and cannabidiol with self-reported symptoms of probable depression, anxiety, conduct disorder and auditory hallucinations.
Of the 6672 adolescents who participated, 5.2% reported using cannabis, 1.9% reported using cannabidiol and 0.6% reported using synthetic cannabinoids. After correction for multiple testing, adolescents who had used these substances were significantly more likely to report a probable depressive, anxiety or conduct disorder, as well as auditory hallucinations, than those who had not. Adjustment for socioeconomic disadvantage had little effect on associations, but adjustment for weekly tobacco use resulted in marked attenuation of associations. The association of cannabis use with probable anxiety and depressive disorders was weaker in those who reported using cannabidiol than in those who did not. There was little evidence of an interaction between synthetic cannabinoids and cannabidiol.
To our knowledge, this study provides the first general population evidence that synthetic cannabinoids and cannabidiol are associated with probable mental health disorders in adolescence. These associations require replication, ideally with prospective cohorts and stronger study designs.
Despite over a decade of both quantitative and qualitative studies, food insecurity among US college/university students remains a pervasive problem within higher education. The purpose of this perspective piece was to highlight research gaps in the area of college food insecurity and provide rationale for the research community to focus on these gaps going forward. A group of food insecurity researchers from a variety of higher education institutions across the United States identified five thematic areas of research gaps: screening and estimates of food insecurity; longitudinal changes in food insecurity; impact of food insecurity on broader health and academic outcomes; evaluation of impact, sustainability and cost effectiveness of existing programmes and initiatives; and state and federal policies and programmes. Within these thematic areas, nineteen specific research gaps were identified that have limited or no peer-reviewed, published research. These research gaps result in a limited understanding of the magnitude, severity and persistence of college food insecurity, the negative short- and long-term impacts of food insecurity on health, academic performance and overall college experience, and effective solutions and policies to prevent or meaningfully address food insecurity among college students. Research in these identified priority areas may help accelerate action and interdisciplinary collaboration to alleviate food insecurity among college students and play a critical role in informing the development or refinement of programmes and services that better support college student food security needs.
Legislators must decide when, if ever, to cosponsor legislation. Scholars have shown legislators strategically time their positions on salient issues of national importance, but we know little about the timing of position-taking for routine bills or what this activity looks like in state legislatures. We argue that legislators’ cosponsorship decision-making depends on the type of legislation and the partisan dynamics among the current cosponsors. Members treat everyday legislation as generalized position-taking motivated by reelection, yet for key legislation, legislators are policy-oriented. With a new dataset of over 73,000 bills introduced in both chambers of the Texas state legislature in the 75th to 86th regular sessions (1997–2020), we use pooled Cox proportional hazard models to evaluate the dynamics of when legislators legislate, comparing all bills introduced with a subset of key bills. The results show that legislators time their cosponsorship activity in response to electoral vulnerability, partisanship, and the dynamics of the chamber in which they serve.
This analysis examined the impact of a digital therapeutic for treating chronic insomnia (currently marketed as Somryst®, at the time called Sleep Healthy Using The Internet [SHUTi]) on healthcare resource use (HCRU) by comparing patients treated with the digital cognitive behavioral therapy for insomnia (dCBTi) to patients not treated with dCBTi but given insomnia medications.
A retrospective observational study using health claims data was conducted in two cohorts across the United States: patients who registered for dCBTi (cases) between June 1, 2016 and October 31, 2018 (index date) vs. patients who did not register for dCBTi but initiated a second prescription for an insomnia medication in the same time period (controls). The observation period was 16–24 months. No other inclusion/exclusion criteria were used. Control patients were matched using a nearest-neighbor, within-caliper matching approach without replacement. Incidence rates for each HCRU encounter type were calculated using a negative binomial model for both cohorts. Costs were estimated by multiplying HCRU by published average costs for each medical resource.
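The matching step described here can be illustrated with a greedy nearest-neighbor, within-caliper, without-replacement routine. This is a simplified sketch using made-up propensity scores, not the study's actual procedure:

```python
def caliper_match(cases, controls, caliper):
    """Greedy 1:1 nearest-neighbor matching without replacement.

    cases, controls: dicts mapping patient id -> propensity score.
    Each case is paired with the closest still-unmatched control whose
    score lies within `caliper`; cases with no eligible control go unmatched.
    """
    available = dict(controls)
    matches = {}
    for case_id, score in sorted(cases.items(), key=lambda kv: kv[1]):
        best_id, best_dist = None, None
        for ctrl_id, ctrl_score in available.items():
            dist = abs(ctrl_score - score)
            if dist <= caliper and (best_dist is None or dist < best_dist):
                best_id, best_dist = ctrl_id, dist
        if best_id is not None:
            matches[case_id] = best_id
            del available[best_id]   # without replacement
    return matches

# Hypothetical scores: case 1 matches control 10 (|0.52 - 0.50| <= 0.05);
# case 2 finds no control within the caliper and is left unmatched.
print(caliper_match({1: 0.50, 2: 0.90}, {10: 0.52, 11: 0.70}, 0.05))
```

Published implementations often match on the logit of the propensity score with a caliper set to a fraction of its standard deviation; the fixed caliper above is purely for illustration.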
We evaluated 248 cases (median age 56.5 years, 57.3% female, 52.4% treated with sleep-related medications) and 248 matched controls (median age 55.0 years, 56.0% female, 100.0% treated with sleep-related medications). Over the 24 months post-initiation, cases had a significantly lower incidence of inpatient stays (55% lower; IRR: 0.45; 95% CI: 0.28–0.73; P=0.001), significantly fewer emergency department (ED) visits without inpatient admission (59% lower; IRR: 0.41; 95% CI: 0.27–0.63; P<0.001), and significantly fewer hospital outpatient visits (36% lower; IRR: 0.64; 95% CI: 0.49–0.82; P<0.001). There were also trends toward fewer ambulatory surgical center visits (23% lower; IRR: 0.77; 95% CI: 0.52–1.14; P=0.197) and fewer office visits (7% lower; IRR: 0.93; 95% CI: 0.81–1.07; P=0.302) with the use of SHUTi. Use of sleep medications was more than four times greater in controls vs. cases, with 9.6 (95% CI: 7.88–11.76) and 2.4 (95% CI: 1.91–2.95) prescriptions/patient, respectively (P<0.001). All-cause per-patient HCRU costs were $8,202 lower over 24 months for cases vs. controls, driven primarily by lower incidences of hospitalizations (−$4,996 per patient) and hospital outpatient visits (−$2,003 per patient).
Patients with chronic insomnia who used a digital CBTi treatment had significant and durable real-world reductions in hospital inpatient stays, ED visits, hospital outpatient visits, and office visits compared to matched controls treated with medications.
Racial and ethnic minority groups have higher rates of SARS-CoV-2 infection, severe illness, and death; however, they receive monoclonal antibody (mAb) treatment at lower rates than non-Hispanic White patients. We report data from a systematic approach to improve equitable provision of COVID-19 neutralizing monoclonal antibody treatment.
Treatment was administered at a community health urgent care clinic affiliated with a safety-net urban hospital. The approach included a stable treatment supply, a same-day test and treat model, a referral process, patient outreach, and financial support. We analyzed the race/ethnicity data descriptively and compared proportions using a chi-square test.
Over 17 months, 2524 patients received treatment. Compared to the demographics of county COVID-19-positive cases, a greater proportion of patients who received mAb treatment were Hispanic (44.7% treatment vs. 36.5% positive cases, P < 0.001), a lower proportion were White non-Hispanic (40.7% treatment vs. 46.3% positive cases, P < 0.001), an equal proportion were Black (8.2% treatment vs. 7.4% positive cases, P = 0.13), and proportions were likewise equal for patients of other races.
Implementation of multiple systematic strategies to administer COVID-19 monoclonal antibodies resulted in an equitable race/ethnic distribution of treatment.
Food insecurity on college campuses is a major public health problem and has been documented for the last decade. Sufficient food access is a crucial social determinant of health, thus campuses across the country have implemented various programmes, systems and policies to enhance access to food which have included food pantries, campus gardens, farmers’ markets, meal share or voucher programmes, mobile food applications, campus food gleaning, food recovery efforts, meal deliveries and task force/working groups. However, little is understood about how to best address food insecurity and support students who are struggling with basic needs. The impact of food insecurity on students’ academic and social success, in addition to their overall well-being, should be investigated and prioritised at each higher education institution. This is especially true for marginalised students, such as minority or first-generation students, who are at heightened risk for food insecurity. In order to create a culture of health equity, in which most at-risk students are provided resources and opportunities to achieve optimal well-being, higher education institutions must prioritise mitigating food insecurity on the college campus. Higher education institutions could benefit from adopting comprehensive and individualised approaches to promoting food security for marginalised students in order to facilitate equal opportunity for optimal scholastic achievement among students of all socio-demographic backgrounds.
The context- and person-specific nature of the Mental Capacity Act 2005 (MCA) in England and Wales means inherent indeterminacy characterises decision-making in the Court of Protection (CoP), not least regarding conflicting values and the weight that should be accorded to competing factors. This paper explores how legal professionals frame and influence the MCA's deliberative and adjudicative processes in the social space of the courtroom through a thematic analysis of semi-structured interviews with legal practitioners specialising in mental capacity law and retired judges from the CoP and the Courts of Appeal with specific experience of adjudicating mental capacity disputes. The concept of the ‘human element’ offers important new insight into how legal professionals perform their roles and justify their activities in the conduct of legal proceedings. The ‘human element’ takes effect in two ways: first, it operates as an overarching normative prism that accounts for what good practice demands of legal professionals in mental capacity law; secondly, it explains how these professionals orientate these norms in the day-to-day conduct of their work. The ‘human element’ further presents challenges that demand practical negotiation in relation to countervailing normative commitments to objectivity and socio-institutional expectations around professional hierarchies, expertise, and evidential thresholds.
Blast polytrauma is among the most serious mechanisms of injury confronted by medical providers. There are currently no specific studies or guidelines that define risk factors for mortality in the context of pediatric blast injuries or describe pediatric blast injury profiles.
The objectives of this study were to evaluate risk factors for pediatric mortality and to describe differences in injury profiles between explosions related to terrorism versus unrelated to terrorism within the pediatric population.
A PRISMA systematic review and meta-analysis was performed in which articles published from 2000 to 2021 were extracted from PubMed. Mortality and injury profile data were extracted from articles that met inclusion criteria. A bivariate unadjusted odds ratio (OR) analysis was performed to establish protective and harmful factors associated with mortality and to describe the injury profiles of blasts related to terrorism. Statistical significance was established at P < .05.
Thirty-eight articles were included, describing a total of 222,638 unique injuries. Factors associated with increased mortality included explosions related to terrorism (OR = 32.73; 95% CI, 28.80-37.21; P < .05) and explosions involving high-grade explosives utilized in the Global War on Terror ([GWOT] OR = 1.28; 95% CI, 1.04-1.44; P < .05). Factors associated with decreased mortality included resuscitation in a North Atlantic Treaty Organization (NATO)-affiliated combat trauma hospital (OR = 0.48; 95% CI, 0.37-0.62; P < .05); fireworks as the explosive (OR = 3.20×10⁻⁵; 95% CI, 2.00×10⁻⁶-5.16×10⁻⁴; P < .05); and explosions occurring in the United States (OR = 2.40×10⁻⁵; 95% CI, 1.51×10⁻⁶-3.87×10⁻⁴; P < .05). On average, victims of explosions related to terrorism were 10.30 years old (SD = 2.73), with 68.96% (SD = 17.58%) of victims reported as male. Compared to victims of explosions unrelated to terrorism, victims of terrorism-related explosions had a higher incidence of thoracoabdominal trauma (30.2% versus 8.6%), a similar incidence of craniocerebral trauma (39.5% versus 43.1%), and a lower incidence of extremity trauma (31.8% versus 48.3%).
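The unadjusted ORs reported in such bivariate analyses are 2×2-table quantities. A minimal sketch of the computation with a Wald 95% CI (the counts below are hypothetical, not the meta-analysis data):

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Unadjusted OR from a 2x2 table:
        a = exposed & died,    b = exposed & survived
        c = unexposed & died,  d = unexposed & survived
    95% CI via SE(log OR) = sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 60 deaths / 440 survivors among terror-related blasts
# vs 20 deaths / 980 survivors among non-terror blasts.
or_, lo, hi = odds_ratio(60, 440, 20, 980)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The Wald interval breaks down when any cell is zero (as can happen with very rare exposures like the fireworks stratum above); meta-analyses typically apply a continuity correction or exact methods in that case.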
Explosions related to terrorism are associated with increased mortality and unique injury profiles compared to explosions unrelated to terrorism in the pediatric population. Such findings are important for optimizing disaster medical education of pediatric providers in preparation for and management of acute sequelae of blast injuries—terror-related and otherwise.
Background: Healthcare facilities have experienced many challenges during the COVID-19 pandemic, including limited personal protective equipment (PPE) supplies. Healthcare personnel (HCP) rely on PPE, vaccines, and other infection control measures to prevent SARS-CoV-2 infections. We describe PPE concerns reported by HCP who had close contact with COVID-19 patients in the workplace and tested positive for SARS-CoV-2. Methods: The CDC collaborated with Emerging Infections Program (EIP) sites in 10 states to conduct surveillance for SARS-CoV-2 infections in HCP. EIP staff interviewed HCP with positive SARS-CoV-2 viral tests (ie, cases) to collect data on demographics, healthcare roles, exposures, PPE use, and concerns about their PPE use during COVID-19 patient care in the 14 days before the HCP's SARS-CoV-2 positive test. PPE concerns were qualitatively coded as being related to supply (eg, low quality, shortages); use (eg, extended use, reuse, lack of fit test); or facility policy (eg, lack of guidance). We calculated and compared the percentages of cases reporting each concern type during the initial phase of the pandemic (April–May 2020), during the first US peak of daily COVID-19 cases (June–August 2020), and during the second US peak (September 2020–January 2021). We compared percentages using mid-P or Fisher exact tests (α = 0.05). Results: Among 1,998 HCP cases occurring during April 2020–January 2021 who had close contact with COVID-19 patients, 613 (30.7%) reported ≥1 PPE concern (Table 1). The percentage of cases reporting supply or use concerns was higher during the first peak period than the second peak period (supply concerns: 12.5% vs 7.5%; use concerns: 25.5% vs 18.2%). Conclusions: Although lower percentages of HCP cases overall reported PPE concerns after the first US peak, our results highlight the importance of developing capacity to produce and distribute PPE during times of increased demand.
The difference we observed among selected groups of cases may indicate that PPE access and use were more challenging for some, such as nonphysicians and nursing home HCP. These findings underscore the need to ensure that PPE is accessible and used correctly by HCP for whom use is recommended.
Animal and human data demonstrate independent relationships between fetal growth, hypothalamic-pituitary-adrenal axis (HPA-A) function and adult cardiometabolic outcomes. While the association between fetal growth and adult cardiometabolic outcomes is well established, the role of the HPA-A in these relationships is unclear. This study aims to determine whether HPA-A function mediates or moderates this relationship. Approximately 2900 pregnant women were recruited between 1989 and 1991 in the Raine Study. Detailed anthropometric data were collected at birth (per cent optimal birthweight [POBW]). The Trier Social Stress Test was administered to the offspring (Generation 2; Gen2) at 18 years, and HPA-A responses were classified (reactive responders [RR], anticipatory responders [AR] and non-responders [NR]). Cardiometabolic parameters (BMI, systolic BP [sBP] and LDL cholesterol) were measured at 20 years. Regression modelling demonstrated linear associations between POBW and both BMI and sBP; quadratic associations were observed for LDL cholesterol. For every 10% increase in POBW, there was a 0.54 unit increase in BMI (standard error [SE] 0.15) and a 0.65 unit decrease in sBP (SE 0.34). The interaction between participants' fetal growth and HPA-A phenotype was strongest for sBP in young adulthood. Interactions for BMI and LDL cholesterol were non-significant. Decomposition of the total effect revealed no causal evidence of mediation or moderation.
Developmental adversities early in life are associated with later psychopathology. Clustering may be a useful approach to group multiple diverse risks together and study their relation with psychopathology. We aimed to generate risk clusters of children, adolescents, and young adults, based on adverse environmental exposure and developmental characteristics, and to examine the association of risk clusters with manifest psychopathology. Participants (n = 8300) between 6 and 23 years were recruited from seven sites in India. We administered questionnaires to elicit history of previous exposure to adverse childhood environments, family history of psychiatric disorders in first-degree relatives, and a range of antenatal and postnatal adversities. We used these variables to generate risk clusters. The Mini-International Neuropsychiatric Interview-5 was administered to evaluate manifest psychopathology. Two-step cluster analysis revealed two clusters designated as high-risk cluster (HRC) and low-risk cluster (LRC), comprising 4197 (50.5%) and 4103 (49.5%) participants, respectively. HRC had higher frequencies of family history of mental illness, antenatal and neonatal risk factors, developmental delays, history of migration, and exposure to adverse childhood experiences than LRC. There were significantly higher risks of any psychiatric disorder [relative risk (RR) = 2.0, 95% CI 1.8–2.3], externalizing (RR = 4.8, 95% CI 3.6–6.4) and internalizing disorders (RR = 2.6, 95% CI 2.2–2.9), and suicidality (RR = 2.3, 95% CI 1.8–2.8) in HRC. Social-environmental and developmental factors could classify Indian children, adolescents and young adults into homogeneous clusters at high or low risk of psychopathology. These biopsychosocial determinants of mental health may have practice, policy and research implications for people in low- and middle-income countries.
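Relative risks like those above follow directly from the two clusters' outcome proportions. A minimal sketch with a Wald 95% CI on the log scale (the event counts below are hypothetical, chosen only to reproduce an RR near 2):

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed, z=1.96):
    """RR = (a/n1) / (c/n2) for exposed vs unexposed cohorts,
    with a Wald 95% CI using SE(log RR) = sqrt(1/a - 1/n1 + 1/c - 1/n2)."""
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se = math.sqrt(1/events_exposed - 1/n_exposed
                   + 1/events_unexposed - 1/n_unexposed)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 840/4197 with a disorder in the high-risk cluster
# vs 410/4103 in the low-risk cluster gives an RR of about 2.
rr, lo, hi = relative_risk(840, 4197, 410, 4103)
print(f"RR {rr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

Unlike the odds ratio, the RR compares risks directly, which is why it is the natural effect measure for a cohort-style comparison of two clusters.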
Evidence for risk of dying by suicide and other causes following discharge from in-patient psychiatric care throughout adulthood is sparse.
To estimate risks of all-cause mortality, natural and external-cause deaths, suicide and accidental, alcohol-specific and drug-related deaths in working-age and older adults within a year post-discharge.
Using interlinked general practice, hospital, and mortality records in the Clinical Practice Research Datalink we delineated a cohort of discharged adults in England, 2001–2018. Each patient was matched to up to 20 general population comparator patients. Cumulative incidence (absolute risks) and hazard ratios (relative risks) were estimated separately for ages 18–64 and ≥65 years with additional stratification by gender and practice-level deprivation.
The 1-year cumulative incidence of dying post-discharge was 2.1% among working-age adults (95% CI 2.0–2.3) and 14.1% (95% CI 13.6–14.5) among older adults. Suicide risk was particularly elevated in the first 3 months, with hazard ratios of 191.1 (95% CI 125.0–292.0) among working-age adults and 125.4 (95% CI 52.6–298.9) in older adults. Older patients were vulnerable to dying by natural causes within 3 months post-discharge. Risk of dying by external causes was greater among discharged working-age adults in the least deprived areas. Relative risk of suicide in discharged working-age women relative to their general population peers was double the equivalent male risk elevation.
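As a rough illustration of the headline figure, cumulative incidence over a fixed window reduces to a simple proportion when censoring is ignored. A deliberately simplified sketch (the study itself used matched-cohort time-to-event methods, and the counts below are hypothetical):

```python
import math

def cumulative_incidence(deaths, n, z=1.96):
    """One-year cumulative incidence as a plain proportion (no censoring),
    with a Wald 95% CI clipped to [0, 1]. This ignores loss to follow-up,
    which proper cumulative-incidence estimators account for."""
    p = deaths / n
    se = math.sqrt(p * (1 - p) / n)
    return p, max(p - z * se, 0.0), min(p + z * se, 1.0)

# Hypothetical: 210 deaths among 10,000 discharged working-age adults
# gives a 1-year cumulative incidence of 2.1%.
p, lo, hi = cumulative_incidence(210, 10_000)
print(f"{100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")
```

With censoring or competing risks present, a Kaplan-Meier or Aalen-Johansen estimator would replace this naive proportion.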
Recently discharged adults at any age are at increased risk of dying from external and natural causes, indicating the importance of close monitoring and provision of optimal support to all such patients, particularly during the first 3 months post-discharge.
The prevalence of serious psychological distress (SPD) was elevated during the COVID-19 pandemic in the USA, but the relationships of SPD during the pandemic with pre-pandemic SPD, pre-pandemic socioeconomic status, and pandemic-related social stressors remain unexamined.
A probability-based sample (N = 1751) of the US population age 20 and over was followed prospectively from February 2019 (T1), with subsequent interviews in May 2020 (T2) and August 2020 (T3). Multinomial logistic regression was used to assess prospective relationships between T1 SPD and experiences of disruption of employment, health care, and childcare at T2. Binary logistic regression was then used to assess relationships of T1 SPD, T1 socioeconomic status, and T2 pandemic-related stressors with T3 SPD.
At T1, SPD was associated with age, race/ethnicity, and household income. SPD at T1 predicted disruption of employment (OR 4.5, 95% CI 1.4–3.8) and health care (OR 3.2, 95% CI 1.4–7.1) at T2. SPD at T1 (OR 10.2, 95% CI 4.5–23.3), low household income at T1 (OR 2.6, 95% CI 1.1–6.4), disruption of employment at T2 (OR 3.2, 95% CI 1.4–7.6), and disruption of healthcare at T2 (OR 3.3, 95% CI 1.5–7.2) were all significantly associated with elevated risk for SPD at T3.
Elevated risk for SPD during the COVID-19 pandemic is related to multiple psychological and social pathways that are likely to interact over the life course. Policies and interventions that target individuals with pre-existing mental health conditions as well as those experiencing persistent unemployment should be high priorities in the mental health response to the pandemic.
Researchers, clinicians and patients are increasingly using real-time monitoring methods to understand and predict suicidal thoughts and behaviours. These methods involve frequently assessing suicidal thoughts, but it is not known whether asking about suicide repeatedly is iatrogenic. We tested two questions about this approach: (a) does repeatedly assessing suicidal thinking over short periods of time increase suicidal thinking, and (b) is more frequent assessment of suicidal thinking associated with more severe suicidal thinking? In a real-time monitoring study (n = 101 participants, n = 12 793 surveys), we found no evidence to support the notion that repeated assessment of suicidal thoughts is iatrogenic.
Healthcare personnel with severe acute respiratory coronavirus virus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
North-western Arabia is marked by thousands of prehistoric stone structures. Of these, the monumental, rectilinear type known as mustatils has received only limited attention. Recent fieldwork in AlUla and Khaybar Counties, Saudi Arabia, demonstrates that these monuments are architecturally more complex than previously supposed, featuring chambers, entranceways and orthostats. These structures can now be interpreted as ritual installations dating back to the late sixth millennium BC, with recent excavations revealing the earliest evidence for a cattle cult in the Arabian Peninsula. As such, mustatils are amongst the earliest stone monuments in Arabia and globally one of the oldest monumental building traditions yet identified.
Surface energy-balance models are commonly used in conjunction with satellite thermal imagery to estimate supraglacial debris thickness. Removing the need for local meteorological data in the debris thickness estimation workflow could improve the versatility and spatiotemporal application of debris thickness estimation. We evaluate the use of regional reanalysis data to derive debris thickness for two mountain glaciers using a surface energy-balance model. Results forced using ERA-5 agree with AWS-derived estimates to within 0.01 ± 0.05 m for Miage Glacier, Italy, and 0.01 ± 0.02 m for Khumbu Glacier, Nepal. ERA-5 data were then used to estimate spatiotemporal changes in debris thickness over a ~20-year period for Miage Glacier, Khumbu Glacier and Haut Glacier d'Arolla, Switzerland. We observe significant increases in debris thickness at the terminus of Haut Glacier d'Arolla and at the margins of the expanding debris cover at all glaciers. While simulated debris thickness was underestimated compared to point measurements in areas of thick debris, our approach can reconstruct glacier-scale debris thickness distribution and its temporal evolution over multiple decades. We find significant changes in debris thickness over areas of thin debris, which are susceptible to high ablation rates and where current knowledge of debris evolution is limited.
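The inversion at the heart of this kind of workflow balances the surface energy flux against heat conduction through the debris layer; if a linear temperature gradient is assumed between the debris surface and the underlying melting ice, thickness falls out algebraically. A minimal sketch under those assumptions (the conductivity value and fluxes are illustrative, not the paper's parameters):

```python
def debris_thickness(net_flux_wm2, surface_temp_c, k_wmk=0.96, ice_temp_c=0.0):
    """Invert a linear-gradient conduction model for debris thickness.

    Assumes the energy-balance residual at the surface (net radiation minus
    turbulent fluxes, in W m^-2) is conducted through the debris to ice at
    the melting point:
        G = k * (Ts - Ti) / h   =>   h = k * (Ts - Ti) / G
    k ~ 0.96 W m^-1 K^-1 is an illustrative debris thermal conductivity.
    """
    return k_wmk * (surface_temp_c - ice_temp_c) / net_flux_wm2

# Illustrative numbers: a 12 degC surface with 96 W m^-2 conducted downward
# implies roughly 0.12 m of debris under these assumptions.
h = debris_thickness(96.0, 12.0)
print(f"{h:.2f} m")
```

Real implementations iterate the full surface energy balance (radiation, turbulent fluxes, and sometimes a nonlinear temperature profile) rather than taking the residual flux as given, but the thickness-conduction relation above is the quantity being solved for.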
The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
To quantify the prevalence of depression, anxiety and mental well-being before and during the COVID-19 pandemic. Also, to identify groups at risk of depression and/or anxiety during the pandemic.
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale.
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, at 24% (95% CI 23–26%) compared with a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. This is important for planning current mental health provisions and for long-term impact beyond this pandemic.