Identifying youths most at risk of COVID-19-related mental illness is essential for developing effective targeted interventions.
To compare trajectories of mental health throughout the pandemic in youth with and without prior mental illness and identify those most at risk of COVID-19-related mental illness.
Data were collected from individuals aged 18–26 years (N = 669) from two existing cohorts: IMAGEN, a population-based cohort; and ESTRA/STRATIFY, clinical cohorts of individuals with pre-existing diagnoses of mental disorders. Repeated COVID-19 surveys and standardised mental health assessments were used to compare trajectories of mental health symptoms from before the pandemic through to the second lockdown.
Mental health trajectories differed significantly between cohorts. In the population cohort, depression and eating disorder symptoms increased by 33.9% (95% CI 31.78–36.57) and 15.6% (95% CI 15.39–15.68) during the pandemic, respectively. By contrast, these remained high over time in the clinical cohort. Conversely, trajectories of alcohol misuse were similar in both cohorts, decreasing continuously (a 15.2% decrease) during the pandemic. Pre-pandemic symptom severity predicted the observed mental health trajectories in the population cohort. Surprisingly, being relatively healthy predicted increases in depression and eating disorder symptoms and in body mass index. By contrast, those initially at higher risk for depression or eating disorders reported a lasting decrease.
Healthier young people may be at greater risk of developing depressive or eating disorder symptoms during the COVID-19 pandemic. Targeted mental health interventions considering prior diagnostic risk may be warranted to help young people cope with the challenges of psychosocial stress and reduce the associated healthcare burden.
To examine differences in noticing and use of nutrition information between jurisdictions with and without mandatory menu labelling policies, and to examine how these differences vary among sociodemographic groups.
Cross-sectional data from the International Food Policy Study (IFPS) online survey.
IFPS participants from Australia, Canada, Mexico, United Kingdom and USA in 2019.
Adults aged 18–99; n 19 393.
Participants in jurisdictions with mandatory policies were significantly more likely to notice and use nutrition information, order something different, eat less of their order and change restaurants than participants in jurisdictions without policies. For noticing nutrition information, the differences between policy groups were greatest comparing older to younger age groups and comparing high education (difference of 10·7 %, 95 % CI 8·9, 12·6) to low education (difference of 4·1 %, 95 % CI 1·8, 6·3). For using nutrition information, differences were greatest comparing high education (difference of 4·9 %, 95 % CI 3·5, 6·4) to low education (difference of 1·8 %, 95 % CI 0·2, 3·5). Mandatory labelling was associated with an increase in ordering something different among the majority ethnicity group and a decrease among the minority ethnicity group. For changing the restaurant visited, differences were greater for medium and high education compared to low education, and greater for higher compared to lower income adequacy.
Participants living in jurisdictions with mandatory nutrition information in restaurants were more likely to report noticing and using nutrition information, as well as greater efforts to modify their consumption. However, the magnitudes of these differences were relatively small.
Rapid antigen detection tests (Ag-RDT) for SARS-CoV-2 with emergency use authorization generally include a condition of authorization to evaluate the test’s performance in asymptomatic individuals when used serially. We aim to describe a novel study design that was used to generate regulatory-quality data to evaluate the serial use of Ag-RDT in detecting SARS-CoV-2 virus among asymptomatic individuals.
This prospective cohort study used a siteless, digital approach to assess longitudinal performance of Ag-RDT. Individuals over 2 years old from across the USA with no reported COVID-19 symptoms in the 14 days prior to study enrollment were eligible to enroll in this study. Participants throughout the mainland USA were enrolled through a digital platform between October 18, 2021 and February 15, 2022. Participants were asked to test using Ag-RDT and molecular comparators every 48 hours for 15 days. Enrollment demographics, geographic distribution, and SARS-CoV-2 infection rates are reported.
A total of 7361 participants enrolled in the study, and 492 participants tested positive for SARS-CoV-2, including 154 who were asymptomatic and tested negative to start the study. This exceeded the initial enrollment goals of 60 positive participants. We enrolled participants from 44 US states, and geographic distribution of participants shifted in accordance with the changing COVID-19 prevalence nationwide.
The digital site-less approach employed in the “Test Us At Home” study enabled rapid, efficient, and rigorous evaluation of rapid diagnostics for COVID-19 and can be adapted across research disciplines to optimize study enrollment and accessibility.
Increasing the availability of lower energy food options is a promising public health approach. However, it is unclear to what extent consumers later ‘compensate’ for reductions in energy intake caused by selecting lower energy food options, and whether these effects differ by socio-economic position (SEP). Our objective was to examine the impact of increasing the availability of lower energy meal options on immediate meal energy intake and subsequent energy intake in participants of higher v. lower SEP. In a within-subjects design, seventy-seven UK adults ordered meals from a supermarket ready meal menu with standard (30 %) and increased (70 %) availability of lower energy options. The meals were delivered to be consumed at home, with meal intake measured using the Digital Photography of Foods Method. Post-meal compensation was measured using food diaries to determine self-reported energy intake after the meal and the next day. Participants consumed significantly less energy (196 kcal (820 kJ), 95 % CI 138, 252) from the menu with increased availability of lower energy options v. the standard availability menu (P < 0·001). There was no statistically significant evidence that this reduction in energy intake was substantially compensated for (33 % compensated, P = 0·57). The effects of increasing the availability of lower energy food items were similar in participants from lower and higher SEP. Increasing the availability of lower energy food options is likely to be an effective and equitable approach to reducing energy intake which may contribute to improving diet and population health.
Psychiatric hospitalization is a major driver of cost in the treatment of schizophrenia. Here, we asked whether a technology-enhanced approach to relapse prevention could reduce days spent in a hospital after discharge.
The Improving Care and Reducing Cost (ICRC) study was a quasi-experimental clinical trial in outpatients with schizophrenia conducted between 26 February 2013 and 17 April 2015 at 10 different sites in the USA in an outpatient setting. Patients were between 18 and 60 years old with a diagnosis of schizophrenia, schizoaffective disorder, or psychotic disorder not otherwise specified. Patients received usual care or a technology-enhanced relapse prevention program during a 6-month period after discharge. The health technology program included in-person, individualized relapse prevention planning with treatments delivered via smartphones and computers, as well as a web-based prescriber decision support program. The main outcome measure was days spent in a psychiatric hospital during 6 months after discharge.
The study included 462 patients, of whom 438 had complete baseline data and were thus used for propensity matching and analysis. Control participants (N = 89; 37 females) were enrolled first and received usual care for relapse prevention, followed by 349 participants (128 females) who received technology-enhanced relapse prevention. During the 6-month follow-up, 43% of control and 24% of intervention participants were hospitalized (χ2 = 11.76, p < 0.001). Days of hospitalization were reduced by approximately 5 days (mean days: b = −4.58, 95% CI −9.03 to −0.13, p = 0.044) in the intervention condition compared to control.
These results suggest that technology-enhanced relapse prevention is an effective and feasible way to reduce rehospitalization days among patients with schizophrenia.
Portion sizes of many foods have increased over time. However, the size of effect that reducing food portion sizes has on daily energy intake and body weight is less clear. We used a systematic review methodology to identify eligible articles that used an experimental design to manipulate portion size served to human participants and measured energy intake for a minimum of 1 d. Searches were conducted in September 2020 and again in October 2021. Fourteen eligible studies contributing eighty-five effects were included in the primary meta-analysis. There was a moderate-to-large reduction in daily energy intake when comparing smaller v. larger portions (Standardised Mean Difference (SMD) = –0·709 (95 % CI: –0·956, –0·461), approximately 235 kcal (983·24 kJ)). Larger reductions to portion size resulted in larger decreases in daily energy intake. There was evidence of a curvilinear relationship between portion size and daily energy intake; reductions to daily energy intake were markedly smaller when reducing portion size from very large portions. In a subset of studies that measured body weight (four studies contributing five comparisons), being served smaller v. larger portions was associated with less weight gain (0·58 kg). Reducing food portion sizes may be an effective population-level strategy to prevent weight gain.
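The standardised mean difference (SMD) and the kcal-to-kJ conversion used above are straightforward to compute. A minimal sketch, using illustrative numbers rather than the review's actual data (the group means, standard deviations, and sample sizes below are invented for demonstration):

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    # Pooled standard deviation for two independent groups
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def smd(mean1, sd1, n1, mean2, sd2, n2):
    # Cohen's-d-style standardised mean difference
    return (mean1 - mean2) / pooled_sd(sd1, n1, sd2, n2)

KCAL_TO_KJ = 4.184  # 1 kcal = 4.184 kJ, the conversion used in the abstract

# Illustrative values only: daily intake on smaller vs larger portions
d = smd(1800, 320, 40, 2035, 340, 40)
print(round(d, 3))                 # negative sign: smaller portions -> lower intake
print(round(235 * KCAL_TO_KJ, 2))  # 983.24 kJ, matching the reported conversion
```

With these made-up inputs the SMD comes out near −0.7, of the same order as the pooled estimate reported above, which is a useful sanity check on the magnitude of the effect.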
We explore the long-term environmental and human history of a small outer coast archipelago on the Northwest Coast in western Canada. Using relative sea-level change, we reconstruct ancient landscapes to design archaeological surveys that document a rich archaeological record spanning at least 11 000 years and demonstrate the cultural centrality of this geographically marginal landscape.
The SARS-CoV-2 pandemic has highlighted the need for rapid creation and management of ICU field hospitals with effective remote monitoring, which depends on the rapid deployment and integration of an Electronic Health Record (EHR). We describe the use of simulation to evaluate a rapidly scalable hub-and-spoke model for EHR deployment and monitoring using asynchronous training.
We adapted existing commercial EHR products to serve as the point of entry from a simulated hospital and a separate system for tele-ICU support and monitoring of the interfaced data. To train our users we created a modular video-based curriculum to facilitate asynchronous training. Effectiveness of the curriculum was assessed through completion of common ICU documentation tasks in a high-fidelity simulation. Additional endpoints include assessment of EHR navigation, user satisfaction (Net Promoter), system usability (System Usability Scale-SUS), and cognitive load (NASA-TLX).
A total of 5 participants achieved 100% task completion in all domains except ventilator data (91%). The systems demonstrated high satisfaction (Net Promoter = 65.2), acceptable usability (SUS = 66.5), and acceptable cognitive load (NASA-TLX = 41.5), with higher cognitive load correlating with the number of screens employed.
Clinical usability of a comprehensive and rapidly deployable EHR was acceptable in an intensive care simulation which was preceded by < 1 hour of video education about the EHR. This model should be considered in plans for integrated clinical response with remote and accessory facilities.
The novel coronavirus (SARS-CoV-2) has produced a considerable public health burden but the impact that contracting the disease has on mental health is unclear. In this observational population-based cohort study, we examined longitudinal changes in psychological distress associated with testing positive for coronavirus disease 2019 (COVID-19).
Participants (N = 8002; observations = 139 035) were drawn from 23 waves of the Understanding America Study, a nationally representative probability-based online panel of American adults followed-up every 2 weeks from 1 April 2020 to 15 February 2021. Psychological distress was assessed using the standardized total score on the Patient Health Questionnaire-4.
Over the course of the study, 576 participants reported testing positive for COVID-19. Using regression analysis including individual and time-fixed effects we found that psychological distress increased by 0.29 standard deviations (p < 0.001) during the 2-week period when participants first tested positive for COVID-19. Distress levels remained significantly elevated (d = 0.16, p < 0.01) for a further 2 weeks, before returning to baseline levels. Coronavirus symptom severity explained changes in distress attributable to COVID-19, whereby distress was more pronounced among those whose symptoms were more severe and were slower to subside.
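The regression with individual and time fixed effects described above can be illustrated with the standard two-way within transformation: demean the outcome and the treatment indicator within person and within wave, then run OLS on the demeaned data. A minimal sketch on simulated panel data (all variable names and values are invented, not the study's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_waves = 200, 10

# Simulated balanced panel: person effects, wave effects, and a "tested positive" shock
person_eff = rng.normal(0, 1, n_people)
wave_eff = rng.normal(0, 0.3, n_waves)
positive = rng.random((n_people, n_waves)) < 0.05  # indicator of a positive test
true_beta = 0.29                                   # distress bump when positive (SD units)
distress = (person_eff[:, None] + wave_eff[None, :]
            + true_beta * positive + rng.normal(0, 0.5, (n_people, n_waves)))

def demean(x):
    # Two-way within transformation: remove person and wave means, add back grand mean
    return (x - x.mean(axis=1, keepdims=True)
              - x.mean(axis=0, keepdims=True) + x.mean())

y, d = demean(distress), demean(positive.astype(float))
beta_hat = (d * y).sum() / (d * d).sum()  # within (fixed-effects) OLS estimator
print(round(beta_hat, 2))  # should recover a value close to 0.29
```

Because the simulated data contain genuine person and wave effects, naive pooled OLS would be confounded by them; the within transformation removes both, which is the logic behind the fixed-effects estimate reported in the abstract.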
This study indicates that testing positive for COVID-19 is associated with an initial increase in psychological distress that diminishes quickly as symptoms subside. Although COVID-19 may not produce lasting psychological distress among the majority of the general population it remains possible that a minority may suffer longer-term mental health consequences.
Coordinated specialty care (CSC) is widely accepted as an evidence-based treatment for first episode psychosis (FEP). The NAVIGATE intervention from the Recovery After an Initial Schizophrenia Episode Early Treatment Program (RAISE-ETP) study is a CSC intervention which offers a suite of evidence-based treatments shown to improve engagement and clinical outcomes, especially in those with shorter duration of untreated psychosis (DUP). Coincident with the publication of this study, legislation was passed by the United States Congress in 2014–15 to fund CSC for FEP via a Substance Abuse and Mental Health Services Administration (SAMHSA) block grant set-aside for each state. In Michigan (MI) the management of this grant was delegated to Network180, the community mental health authority in Kent County, with the goal of making CSC more widely available to the 10 million people in MI. Limited research describes the outcomes of implementation of CSC into community practices with no published accounts evaluating the use of the NAVIGATE intervention in a naturalistic setting. We describe the outcomes of NAVIGATE implementation in the state of MI.
In 2014, 3 centers in MI were selected and trained to provide NAVIGATE CSC for FEP. In 2016 a 4th center was added, and 2 existing centers were expanded to provide additional access to NAVIGATE. Inclusion: age 18–31, served in 1 of 4 FEP centers in MI. Data collection began in 2015 for basic demographics, global illness (CGI q3 mo), hospital/ED use and work/school (SURF q3 mo) and was expanded in 2016 to include further demographics, diagnosis, DUP, vital signs; and in 2018 for clinical symptoms with the modified Colorado Symptom Inventory (mCSI q6 mo), reported via an online portal. This analysis used data until 12/31/19. Mixed effects models adjusted by age, sex and race were used to account for correlated data within patients.
N=283 had useable demographic information and were included in the analysis. Age at enrollment was 21.6 ± 3.0 yrs; 74.2% male; 53.4% Caucasian, 34.6% African American; 12.9 ± 1.7 yrs of education (N=195). 18 mo retention was 67% with no difference by sex or race. CGI scores decreased 20% from baseline (BL) to 18 mo (BL=3.5, N=134; 15–18 mo=2.8, N=60). Service utilization via the SURF was measured at BL (N=172) and 18 mo (N=72): psychiatric hospitalizations occurred in 37% at BL and 6% at 18 mo (p<0.01); ER visits occurred in 40% at BL and 13% at 18 mo (p<0.01). 44% were working or in school at BL and 68% at 18 mo (p<0.01). 21% were on antipsychotics (AP) at BL (N=178) and 85% at 18 mo (N=13) with 8% and 54% on long acting injectable-AP at BL and 18 mo, respectively. Limitations include missing data and lack of a control group.
The implementation of the NAVIGATE CSC program for FEP in MI resulted in meaningful clinical improvement for enrollees. Further support could make this evidence-based intervention available to more people with FEP.
Supported by funds from the SAMHSA Medicaid State Block Grant set-aside awarded to Network180 (Achtyes, Kempema). The funders had no role in the design of the study, the analysis or the decision to publish the results.
In April 2019, the U.S. Fish and Wildlife Service (USFWS) released its recovery plan for the jaguar Panthera onca after several decades of discussion, litigation and controversy about the status of the species in the USA. The USFWS estimated that potential habitat, south of the Interstate-10 highway in Arizona and New Mexico, had a carrying capacity of c. six jaguars, and so focused its recovery programme on areas south of the USA–Mexico border. Here we present a systematic review of the modelling and assessment efforts over the last 25 years, with a focus on areas north of Interstate-10 in Arizona and New Mexico, outside the recovery unit considered by the USFWS. Despite differences in data inputs, methods, and analytical extent, the nine previous studies found support for potential suitable jaguar habitat in the central mountain ranges of Arizona and New Mexico. Applying slightly modified versions of the USFWS model and recalculating an Arizona-focused model over both states provided additional confirmation. Extending the area of consideration also substantially raised the carrying capacity of habitats in Arizona and New Mexico, from six to 90 or 151 adult jaguars, using the modified USFWS models. This review demonstrates the crucial ways in which choosing the extent of analysis influences the conclusions of a conservation plan. More importantly, it opens a new opportunity for jaguar conservation in North America that could help address threats from habitat losses, climate change and border infrastructure.
The COVID-19 pandemic has had a range of negative social and economic effects that may contribute to a rise in mental health problems. In this observational population-based study, we examined longitudinal changes in the prevalence of mental health problems from before to during the COVID-19 crisis and identified subgroups that are psychologically vulnerable during the pandemic.
Participants (N = 14 393; observations = 48 486) were adults drawn from wave 9 (2017–2019) of the nationally representative United Kingdom Household Longitudinal Study (UKHLS) and followed-up across three waves of assessment in April, May, and June 2020. Mental health problems were assessed using the 12-item General Health Questionnaire (GHQ-12).
The population prevalence of mental health problems (GHQ-12 score ⩾3) increased by 13.5 percentage points from 24.3% in 2017–2019 to 37.8% in April 2020 and remained elevated in May (34.7%) and June (31.9%) 2020. All sociodemographic groups examined showed statistically significant increases in mental health problems in April 2020. The increase was largest among those aged 18–34 years (18.6 percentage points, 95% CI 14.3–22.9%), followed by females and high-income and education groups. Levels of mental health problems subsequently declined between April and June 2020 but remained significantly above pre-COVID-19 levels. Additional analyses showed that the rise in mental health problems observed throughout the COVID-19 pandemic was unlikely to be due to seasonality or year-to-year variation.
This study suggests that a pronounced and prolonged deterioration in mental health occurred as the COVID-19 pandemic emerged in the UK between April and June 2020.
Reducing food portion size could reduce energy intake. However, it is unclear at what point consumers respond to reductions by increasing intake of other foods. We predicted that a change in served portion size would only result in significant additional eating within the same meal if the resulting portion size was no longer visually perceived as ‘normal’. Participants in two crossover experiments (Study 1: n 45; Study 2: n 37; adults, 51 % female) were served different-sized lunchtime portions on three occasions that were perceived by a previous sample of participants as ‘large-normal’, ‘small-normal’ and ‘smaller than normal’, respectively. Participants were able to serve themselves additional helpings of the same food (Study 1) or dessert items (Study 2). In Study 1 there was a small but significant increase in additional intake when participants were served the ‘smaller than normal’ compared with the ‘small-normal’ portion (m difference = 161 kJ, P = 0·002, d = 0·35), but there was no significant difference between the ‘small-normal’ and ‘large-normal’ conditions (m difference = 88 kJ, P = 0·08, d = 0·24). A similar pattern was observed in Study 2 (m difference = 149 kJ, P = 0·06, d = 0·18; m difference = 83 kJ, P = 0·26, d = 0·10). However, smaller portion sizes were each associated with a significant reduction in total meal intake. The findings provide preliminary evidence that reductions that result in portions appearing ‘normal’ in size may limit additional eating, but confirmatory research is needed.
Pesticide bans in Canada have resulted in a requirement for municipal turfgrass managers to use cultural methods of weed control to provide a safe playing surface for athletes. A field study was conducted to determine if overseeding provides enough competition to decrease weed populations in Kentucky bluegrass athletic turf typically used in municipal parks for recreation. Perennial ryegrass was overseeded at 2, 4, and 8 kg/100 m2 in May, July, or September, and all permutations of these timings in nonirrigated and irrigated trials at the Guelph Turfgrass Institute (GTI) field station in Guelph, and on in-use soccer fields at the University of Guelph campus and in the town of Oakville, Ontario, Canada over 2 yr. Plant cover by species was recorded every other month using a randomized point quadrat method throughout the growing seasons of 2005 and 2006. Weed populations were not affected by overseeding in 2005, a dry growing season. However, when weed populations were high and normal growing conditions existed in 2006, overseeding applications in May/July/September at 4 and 8 kg/100 m2 decreased perennial weed cover, specifically white clover in the irrigated trial and dandelion in the nonirrigated trial at the GTI. An increase in perennial ryegrass was observed in all plots that received an overseeding treatment. Treatments applied on the in-use soccer fields in Oakville and Guelph, which included May/September and May only overseedings, had no effect on weed populations or perennial ryegrass populations compared to the weedy control. Over the short term, high-rate and frequent overseeding with perennial ryegrass appears to provide competition against perennial weeds when weed cover is high and should be considered an important part of a weed management program for municipal turfgrass managers.
Sea turtles host a diverse array of epibionts, yet it is not well understood what factors influence epibiont community composition. To test whether epibiont communities of sea turtles are influenced by the hosts’ nesting or foraging habitats, we characterized the epibiota of leatherback, olive ridley and green turtles nesting at a single location on the Pacific coast of Costa Rica. We also compared the epibiota of these turtles to conspecific populations nesting elsewhere in the East Pacific. If epibiont communities are influenced by nesting habitats, we predicted that sympatrically nesting turtles would have comparable epibiont taxa. Alternatively, if epibiont communities are influenced by foraging habitats, we predicted the diversity of epibiont taxa should reflect the type and diversity of the hosts’ foraging habitats. We identified 18 epibiont taxa from 18 leatherback, 19 olive ridley and six green turtles. Epibiont diversity was low on leatherbacks (four taxa), but higher for olive ridley and green turtles (12 and nine epibiont taxa respectively). The epibiont communities of olive ridley and green turtles were not statistically different, but both were different from leatherbacks. In addition, conspecific sea turtles from other nesting locations hosted more similar epibiont communities than sympatrically nesting, non-conspecifics. We conclude that epibiont diversity of nesting sea turtles is partially linked to the diversity of their foraging habitats. We also conclude that the surface properties of the skin and carapace of these turtles may contribute to the uniqueness of leatherback turtle epibiont communities and the similarities between olive ridley and green turtle epibiont communities.
In this paper we consider the relationship between the Assouad and box-counting dimensions and how both behave under the operation of taking products. We introduce the notion of ‘equi-homogeneity’ of a set, which requires a uniformity in the cardinality of local covers at all length-scales and at all points, and we show that a large class of homogeneous Moran sets have this property. We prove that the Assouad and box-counting dimensions coincide for sets that have equal upper and lower box-counting dimensions provided that the set ‘attains’ these dimensions (analogous to ‘s-sets’ when considering the Hausdorff dimension), and the set is equi-homogeneous. Using this fact we show that for any α ∈ (0, 1) and any β, γ ∈ (0, 1) such that β + γ ⩾ 1 we can construct two generalised Cantor sets C and D such that dim_B C = αβ, dim_B D = αγ, and dim_A C = dim_A D = dim_A(C × D) = dim_B(C × D) = α.
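For context, the construction realises extreme behaviour within the standard product inequalities for these dimensions. A sketch of the well-known bounds (stated here for totally bounded sets E and F; subscript A denotes Assouad dimension, subscript B box-counting dimension):

```latex
\dim_A(E \times F) \;\le\; \dim_A E + \dim_A F,
\qquad
\overline{\dim}_B(E \times F) \;\le\; \overline{\dim}_B E + \overline{\dim}_B F.
% The sets C and D above show the Assouad bound can be far from sharp:
% \dim_A C = \dim_A D = \alpha, yet \dim_A(C \times D) = \alpha rather than 2\alpha.
```

The example is striking because the product's Assouad dimension attains only α, the smallest value permitted (it cannot fall below the Assouad dimension of either factor), rather than anything approaching the upper bound 2α.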
Transition-metal dichalcogenides (TMDCs) are compounds consisting of a transition-metal M (Ti, Hf, Zr, V, Nb, Ta, Mo, W, Tc, Re, Pd, Pt) and chalcogen atoms X (S, Se, Te). There are approximately 60 compounds in the metal chalcogenide family, and two-thirds of them are in the form of layered structures where the in-plane bonds are strong (covalent), and the out-of-plane bonds are weak (van der Waals). This provides a means to mechanically or chemically thin (exfoliate) these materials down to a single atomic two-dimensional (2D) layer. While graphene, the 2D form of graphite, is metallic, the layered metal chalcogenides cover a wide range of electrical properties, from true metals (NbS2) and superconductors (TaS2) to semiconductors (MoS2) with a wide range of bandgaps and offsets. Multiple techniques are currently being developed to synthesize large-area monolayers, including alloys, and lateral and vertical heterostructures. The wide range of properties and the ability to tune them on an atomic scale has led to numerous applications in electronics, optoelectronics, sensors, and energy. This article provides an introduction to TMDCs, serving as a background for the articles in this issue of MRS Bulletin.
The use of smaller dishware as a way of reducing food consumption has intuitive appeal and is recommended to the general public. Recent experimental studies have failed to find an effect of plate size on food intake, although the methods used across studies have varied. The aim of the present study was to examine the effect that bowl size had on snack food consumption in a ‘typical’ snacking context (snacking while watching television).
Sixty-one adult participants served themselves and ate popcorn while watching television. Participants were randomly assigned to serve themselves with and eat from either a small or a large bowl.
The use of a smaller bowl size did not reduce food consumption. Unexpectedly, participants in the small bowl condition tended to consume more popcorn (34·0 g) than participants in the large bowl condition (24·9 g; 37 % increase, d=0·5), although the statistical significance of this difference depended on whether analyses were adjusted to account for participant characteristics (e.g. gender) associated with food intake (P=0·02) or not (P=0·07).
Counter to widely held belief, the use of a smaller bowl did not reduce snack food intake. Public health recommendations advising the use of smaller dishware to reduce food consumption are premature, as this strategy may not be effective.