Although non-suicidal self-injury (NSSI) is an issue of major concern to colleges worldwide, we lack detailed information about the epidemiology of NSSI among college students. The objectives of this study were to present the first cross-national data on the prevalence of NSSI and NSSI disorder among first-year college students and their association with mental disorders.
Data come from a survey of the entering class in 24 colleges across nine countries participating in the World Mental Health International College Student (WMH-ICS) initiative, assessed in web-based self-report surveys (20 842 first-year students). Using retrospective age-of-onset reports, we investigated time-ordered associations between NSSI and Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) mood (major depressive and bipolar disorder), anxiety (generalized anxiety and panic disorder), and substance use disorders (alcohol and drug use disorder).
NSSI lifetime and 12-month prevalence were 17.7% and 8.4%. A positive screen of 12-month DSM-5 NSSI disorder was 2.3%. Of those with lifetime NSSI, 59.6% met the criteria for at least one mental disorder. Temporally primary lifetime mental disorders predicted subsequent onset of NSSI [median odds ratio (OR) 2.4], but these primary lifetime disorders did not consistently predict 12-month NSSI among respondents with lifetime NSSI. Conversely, even after controlling for pre-existing mental disorders, NSSI consistently predicted later onset of mental disorders (median OR 1.8) as well as 12-month persistence of mental disorders among students with a generalized anxiety disorder (OR 1.6) and bipolar disorder (OR 4.6).
NSSI is common among first-year college students and is a behavioral marker of various common mental disorders.
To determine whether cascade reporting is associated with a change in meropenem and fluoroquinolone consumption.
A quasi-experimental study was conducted using an interrupted time series to compare antimicrobial consumption before and after the implementation of cascade reporting.
A 399-bed, tertiary-care, Veterans’ Affairs medical center.
Antimicrobial consumption data across 8 inpatient units were extracted from the Centers for Disease Control and Prevention (CDC) National Healthcare Safety Network (NHSN) antimicrobial use (AU) module from April 2017 through March 2019, reported as antimicrobial days of therapy (DOT) per 1,000 days present (DP).
Cascade reporting is a strategy of reporting antimicrobial susceptibility test results in which secondary agents are only reported if an organism is resistant to primary, narrow-spectrum agents. A multidisciplinary team developed cascade reporting algorithms for gram-negative bacteria based on local antibiogram and infectious diseases practice guidelines, aimed at restricting the use of fluoroquinolones and carbapenems. The algorithms were implemented in March 2018.
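The selective-release rule at the heart of cascade reporting can be sketched in a few lines. This is a minimal illustration, assuming hypothetical agent tiers; the agent names and the primary/secondary grouping here are illustrative and not the study's actual algorithms.

```python
# Minimal sketch of a cascade-reporting rule for a single isolate.
# The primary/secondary tiers below are illustrative assumptions,
# not the study's actual algorithms.

PRIMARY = ["ceftriaxone", "cefepime"]        # narrower-spectrum agents, always reported
SECONDARY = ["meropenem", "ciprofloxacin"]   # released only on resistance to all primaries

def cascade_report(susceptibilities):
    """Return the agents to report: all tested primary agents, plus
    secondary agents only if the isolate is resistant to every primary.
    `susceptibilities` maps agent name -> 'S' (susceptible) or 'R' (resistant)."""
    report = {a: susceptibilities[a] for a in PRIMARY if a in susceptibilities}
    if all(susceptibilities.get(a) == "R" for a in PRIMARY):
        report.update({a: susceptibilities[a]
                       for a in SECONDARY if a in susceptibilities})
    return report

# A pan-susceptible isolate: secondary agents stay suppressed.
print(cascade_report({"ceftriaxone": "S", "cefepime": "S",
                      "meropenem": "S", "ciprofloxacin": "S"}))
# Resistant to all primaries: secondary agents are released.
print(cascade_report({"ceftriaxone": "R", "cefepime": "R",
                      "meropenem": "S", "ciprofloxacin": "S"}))
```

By hiding the broad-spectrum result unless it is clinically needed, the report itself nudges prescribers toward the narrower agent.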
Following the implementation of cascade reporting, mean monthly meropenem (P =.005) and piperacillin/tazobactam (P = .002) consumption decreased and cefepime consumption increased (P < .001). Ciprofloxacin consumption decreased by 2.16 DOT per 1,000 DP per month (SE, 0.25; P < .001). Clostridioides difficile rates did not significantly change.
Ciprofloxacin consumption significantly decreased after the implementation of cascade reporting. Mean meropenem consumption decreased after cascade reporting was implemented, but we observed no significant change in the slope of consumption. Cascade reporting may be a useful strategy to optimize antimicrobial prescribing.
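The interrupted time-series design behind these comparisons is usually analyzed with segmented regression, which separates an immediate level change at the intervention from a change in slope. A minimal sketch on synthetic monthly consumption values (not the study's data):

```python
# Segmented (interrupted time-series) regression sketch: estimate the
# level change and slope change at an intervention month via OLS.
# The monthly DOT/1,000-DP series below is synthetic.
import numpy as np

def its_fit(y, intervention_idx):
    """Fit y = b0 + b1*t + b2*post + b3*(t - t0)*post.
    b2 is the immediate level change at the intervention,
    b3 the change in monthly slope."""
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention_idx).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - intervention_idx) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta  # [intercept, pre-slope, level change, slope change]

# Synthetic series: flat at 50, drops by 10 at month 12, then falls 2/month.
y = [50.0] * 12 + [40.0 - 2.0 * k for k in range(12)]
b0, b1, level, slope = its_fit(y, 12)
print(round(level, 1), round(slope, 1))  # -10.0 -2.0
```

In practice the published analyses also model autocorrelation and report standard errors, but the level/slope decomposition is the core idea.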
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Materials and Methods
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All received combinatorial pharmacogenomic testing and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 question), or symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both week 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without risking symptom degradation among patients.
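The relative risks reported above come from standard 2×2-table calculations. A minimal sketch with a Wald confidence interval on the log scale, using illustrative counts rather than the GUIDED trial's data:

```python
# Relative risk with a Wald 95% CI from a 2x2 table.
# The counts below are illustrative, not the trial's data.
import math

def relative_risk(a, b, c, d):
    """a/b: events/non-events in the exposed group; c/d: in the unexposed.
    Returns (RR, lower, upper) using the log-RR standard error."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(40, 160, 20, 180)   # 20% vs 10% event risk
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An interval excluding 1.0, as in the significant week-8 comparison above, indicates a risk difference unlikely to be due to chance at the 5% level.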
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach along with implementation of a “Swiss Cheese” risk mitigation model allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified with 235 self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
Among 353 healthcare personnel in a longitudinal cohort in 4 hospitals in Atlanta, Georgia (May–June 2020), 23 (6.5%) had severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies. Spending >50% of a typical shift at the bedside (OR, 3.4; 95% CI, 1.2–10.5) and Black race (OR, 8.4; 95% CI, 2.7–27.4) were associated with SARS-CoV-2 seropositivity.
Archaeology field schools provide unique opportunities for firsthand exposure, team-based learning, and pre-professional experience. A participant's decision to pursue a career in archaeology may reflect initial fieldwork group experiences and individual interactions with field school leaders and staff. Today, safety, security, and equity policies along with staff and operational procedures that support them are essential for instructing and inspiring all who wish to experience archaeological fieldwork. Drawing on three decades of field school participation and administration, the author describes specific examples of fieldwork learning contexts as well as insights into operating a safe, secure, and welcoming field school. Conclusions include general guidelines that are applicable and desirable for short-term, season-long, or special skills field schools.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19) with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return to work policies. Each section highlights three critical healthcare epidemiology research questions with detailed description provided in supplementary materials. This research agenda calls for translational studies from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaborations across various disciplines, expertise and across diverse geographic locations will be critical.
A study was conducted at the Louisiana State University Agricultural Center’s H. Rouse Caffey Rice Research Station in 2017 and 2018 to evaluate the interaction between a prepackage mixture of clomazone plus pendimethalin applied at 0, 760, 1,145, or 1,540 g ai ha−1 mixed with propanil at 0, 1,120, 2,240, or 4,485 g ai ha−1. A synergistic response occurred when barnyardgrass was treated with all rates of clomazone plus pendimethalin mixed with either rate of propanil evaluated at 56 d after treatment (DAT). Unlike barnyardgrass, an antagonistic response occurred in yellow nutsedge used as a control when treated with 760 and 1,540 g ha−1 of clomazone plus pendimethalin mixed with 1,120 or 2,240 g ha−1 of propanil at 28 DAT; however, 1,145 g ha−1 of clomazone plus pendimethalin mixed with 4,485 g ha−1 of propanil resulted in a neutral interaction. At 28 DAT, rice flatsedge treated with all herbicide mixtures resulted in neutral interactions. The synergism of clomazone plus pendimethalin applied at 1,540 g ha−1 mixed with propanil applied at 2,240 or 4,485 g ha−1 to control barnyardgrass resulted in an increased rough rice yield compared with 760 or 1,145 g ha−1 of clomazone plus pendimethalin mixed with propanil applied at 1,120 or 2,240 g ha−1. These results indicate that if barnyardgrass and rice flatsedge are present in a rice field, the prepackage mixture of clomazone plus pendimethalin mixed with propanil can be an option for growers. However, if yellow nutsedge infests the area, other herbicides may be needed.
This study examined longitudinal associations between performance on the Rey–Osterrieth Complex Figure–Developmental Scoring System (ROCF-DSS) at 8 years of age and academic outcomes at 16 years of age in 133 children with dextro-transposition of the great arteries (d-TGA).
The ROCF-DSS was administered at the age of 8 and the Wechsler Individual Achievement Test, First and Second Edition (WIAT/WIAT-II) at the ages of 8 and 16, respectively. ROCF-DSS protocols were classified by Organization (Organized/Disorganized) and Style (Part-oriented/Holistic). Two-way univariate (ROCF-DSS Organization × Style) ANCOVAs were computed with 16-year academic outcomes as the dependent variables and socioeconomic status (SES) as the covariate.
The Organization × Style interaction was not statistically significant. However, ROCF-DSS Organization at 8 years was significantly associated with Reading, Math, Associative, and Assembled academic skills at 16 years, with better organization predicting better academic performance.
Performance on the ROCF-DSS, a complex visual-spatial problem-solving task, in children with d-TGA can forecast academic performance in both reading and mathematics nearly a decade later. These findings may have implications for identifying risk in children with other medical and neurodevelopmental disorders affecting brain development.
Optical tracking systems typically trade off between astrometric precision and field of view. In this work, we showcase a networked approach to optical tracking using very wide field-of-view imagers that have relatively low astrometric precision on the scheduled OSIRIS-REx slingshot manoeuvre around Earth on 22 Sep 2017. As part of a trajectory designed to get OSIRIS-REx to NEO 101955 Bennu, this flyby event was viewed from 13 remote sensors spread across Australia and New Zealand to promote triangulatable observations. Each observatory in this portable network was constructed to be as lightweight and portable as possible, with hardware based on the successful design of the Desert Fireball Network. Over a 4-h collection window, we gathered 15 439 images of the night sky in the predicted direction of the OSIRIS-REx spacecraft. Using a specially developed streak detection and orbit determination data pipeline, we detected 2 090 line-of-sight observations. Our fitted orbit was determined to be within about 10 km of orbital telemetry along the observed 109 262 km length of the OSIRIS-REx trajectory, demonstrating the impressive capability of a networked approach to Space Surveillance and Tracking.
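The geometric core of combining line-of-sight observations from multiple stations is least-squares triangulation: find the point that minimizes the summed squared perpendicular distance to each observed ray. A minimal sketch with synthetic geometry (this is not the paper's pipeline, which fits a full orbit rather than a single point):

```python
# Least-squares triangulation of one 3-D point from several stations'
# line-of-sight unit vectors. Geometry below is synthetic.
import numpy as np

def triangulate(origins, directions):
    """Solve for the point minimising summed squared perpendicular
    distance to each line (origin + t * direction)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

target = np.array([1000.0, 2000.0, 3000.0])
origins = [np.array([0.0, 0.0, 0.0]),
           np.array([500.0, 0.0, 0.0]),
           np.array([0.0, 800.0, 0.0])]
directions = [target - p for p in origins]   # perfect lines of sight
print(np.round(triangulate(origins, directions), 1))
```

Wider baselines between stations make the rays intersect at steeper angles, which is why geographically spread observatories compensate for each imager's low astrometric precision.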
As the pathophysiology of Covid-19 emerges, this paper describes dysphagia as a sequela of the disease, including its diagnosis and management, hypothesised causes, symptomatology in relation to viral progression, and concurrent variables such as intubation, tracheostomy and delirium, at a tertiary UK hospital.
During the first wave of the Covid-19 pandemic, 208 out of 736 patients (28.9 per cent) admitted to our institution with SARS-CoV-2 were referred for swallow assessment. Of the 208 patients, 102 were admitted to the intensive treatment unit for mechanical ventilation support, of which 82 were tracheostomised. The majority of patients regained near normal swallow function prior to discharge, regardless of intubation duration or tracheostomy status.
Dysphagia is prevalent in patients admitted either to the intensive treatment unit or the ward with Covid-19 related respiratory issues. This paper describes the crucial role of intensive swallow rehabilitation to manage dysphagia associated with this disease, including therapeutic respiratory weaning for those with a tracheostomy.
This work investigated the photophysical pathways for light absorption, charge generation, and charge separation in donor–acceptor nanoparticle blends of poly(3-hexylthiophene) and indene-C60-bisadduct. Optical modeling combined with steady-state and time-resolved optoelectronic characterization revealed that the nanoparticle blends experience a photocurrent limited to 60% of a bulk solution mixture. This discrepancy resulted from imperfect free charge generation inside the nanoparticles. High-resolution transmission electron microscopy and chemically resolved X-ray mapping showed that enhanced miscibility of materials did improve the donor–acceptor blending at the center of the nanoparticles; however, a residual shell of almost pure donor still restricted charge generation from these nanoparticles.
Associations of socioenvironmental features like urbanicity and neighborhood deprivation with psychosis are well-established. An enduring question, however, is whether these associations are causal. Genetic confounding could occur due to downward mobility of individuals at high genetic risk for psychiatric problems into disadvantaged environments.
We examined correlations of five indices of genetic risk [polygenic risk scores (PRS) for schizophrenia and depression, maternal psychotic symptoms, family psychiatric history, and zygosity-based latent genetic risk] with multiple area-, neighborhood-, and family-level risks during upbringing. Data were from the Environmental Risk (E-Risk) Longitudinal Twin Study, a nationally-representative cohort of 2232 British twins born in 1994–1995 and followed to age 18 (93% retention). Socioenvironmental risks included urbanicity, air pollution, neighborhood deprivation, neighborhood crime, neighborhood disorder, social cohesion, residential mobility, family poverty, and a cumulative environmental risk scale. At age 18, participants were privately interviewed about psychotic experiences.
Higher genetic risk on all indices was associated with riskier environments during upbringing. For example, participants with higher schizophrenia PRS (OR = 1.19, 95% CI = 1.06–1.33), depression PRS (OR = 1.20, 95% CI = 1.08–1.34), family history (OR = 1.25, 95% CI = 1.11–1.40), and latent genetic risk (OR = 1.21, 95% CI = 1.07–1.38) had accumulated more socioenvironmental risks for schizophrenia by age 18. However, associations between socioenvironmental risks and psychotic experiences mostly remained significant after covariate adjustment for genetic risk.
Genetic risk is correlated with socioenvironmental risk for schizophrenia during upbringing, but the associations between socioenvironmental risk and adolescent psychotic experiences appear, at present, to exist above and beyond this gene-environment correlation.
Cognitive behavior therapy (CBT) is effective for most patients with a social anxiety disorder (SAD) but a substantial proportion fails to remit. Experimental and clinical research suggests that enhancing CBT using imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally-based CBT (VB-CBT) on pre-registered outcomes.
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE (n = 53) or VB (n = 54) CBT, with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12, 2-hour, weekly sessions of IE-CBT or VB-CBT plus 1-month follow-up.
Intention-to-treat analyses showed very large within-treatment effect sizes on social interaction anxiety at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome or clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one being withdrawn at 1-month follow-up).
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
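The "very large within-group effect sizes" above are Cohen's d values for pre-to-post change. A minimal sketch of one common convention (standardizing the mean change by the baseline standard deviation; other conventions divide by the SD of the change scores), using synthetic symptom scores:

```python
# Within-group effect size (Cohen's d for pre-to-post change).
# Scores below are synthetic, not the trial's data.
import math

def cohens_d_within(pre, post):
    """d = mean(pre - post) / sd(pre). One common convention; some
    analyses instead divide by the sd of the change scores."""
    diffs = [a - b for a, b in zip(pre, post)]
    mean_diff = sum(diffs) / len(diffs)
    mean_pre = sum(pre) / len(pre)
    sd_pre = math.sqrt(sum((x - mean_pre) ** 2 for x in pre) / (len(pre) - 1))
    return mean_diff / sd_pre

pre = [62, 58, 66, 60, 64, 59]    # baseline symptom scores
post = [40, 38, 45, 36, 42, 41]   # post-treatment scores
print(round(cohens_d_within(pre, post), 2))
```

Values of d above roughly 0.8 are conventionally labeled "large", so the 2.09–2.62 range reported here reflects substantial symptom reduction in both arms.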
Even though sub-Saharan African women spend millions of person-hours per day fetching water and pounding grain, to date, few studies have rigorously assessed the energy expenditure costs of such domestic activities. As a result, most analyses that consider head-hauling water or hand pounding of grain with a mortar and pestle (pilão use) employ energy expenditure values derived from limited research. The current paper compares estimated energy expenditure values from heart rate monitors v. indirect calorimetry in order to understand some of the limitations with using such monitors to measure domestic activities.
This confirmation study estimates the metabolic equivalent of task (MET) value for head-hauling water and hand-pounding grain using both indirect calorimetry and heart rate monitors under laboratory conditions.
The study was conducted in Nampula, Mozambique.
Forty university students in Nampula city who recurrently engaged in water-fetching activities.
Including all participants, the mean MET value for head hauling 20 litres (20·5 kg, including container) of water (2·7 km/h, 0 % slope) was 4·3 (sd 0·9) and 3·7 (sd 1·2) for pilão use. Estimated energy expenditure predictions from a mixed model were found to correlate with observed energy expenditure (r2 0·68, r 0·82). Re-estimating the model with pilão use data excluded improved the fit substantially (r2 0·83, r 0·91).
The current study finds that heart rate monitors are suitable instruments for providing accurate quantification of energy expenditure for some domestic activities, such as head-hauling water, but are not appropriate for quantifying expenditures of other activities, such as hand-pounding grain.
Spatially and temporally unpredictable rainfall patterns presented food production challenges to small-scale agricultural communities, requiring multiple risk-mitigating strategies to increase food security. Although site-based investigations of the relationship between climate and agricultural production offer insights into how individual communities may have created long-term adaptations to manage risk, the inherent spatial variability of climate-driven risk makes a landscape-scale perspective valuable. In this article, we model risk by evaluating how the spatial structure of ancient climate conditions may have affected the reliability of three major strategies used to reduce risk: drawing upon social networks in time of need, hunting and gathering of wild resources, and storing surplus food. We then explore how climate-driven changes to this reliability may relate to archaeologically observed social transformations. We demonstrate the utility of this methodology by comparing the Salinas and Cibola regions in the prehispanic U.S. Southwest to understand the complex relationship among climate-driven threats to food security, risk-mitigation strategies, and social transformations. Our results suggest key differences in how communities buffered against risk in the Cibola and Salinas study regions, with the structure of precipitation influencing the range of strategies to which communities had access through time.
Archaeologists have struggled to combine remotely sensed datasets with preexisting information for landscape-level analyses. In the American Southeast, for example, analyses of lidar data using automated feature extraction algorithms have led to the identification of over 40 potential new pre-European-contact Native American shell ring deposits in Beaufort County, South Carolina. Such datasets are vital for understanding settlement distributions, yet a comprehensive assessment requires remotely sensed and previously surveyed archaeological data. Here, we use legacy data and airborne lidar-derived information to conduct a series of point pattern analyses using spatial models that we designed to assess the factors that best explain the location of shell rings. The results reveal that ring deposit locations are highly clustered and best explained through a combination of environmental conditions such as distance to water and elevation as well as social factors.
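A common first step in the kind of point pattern analysis described above is a nearest-neighbour test for clustering, such as the Clark–Evans index, where R < 1 indicates clustering, R ≈ 1 spatial randomness, and R > 1 dispersion. A minimal sketch on synthetic points (not the shell-ring locations, and simpler than the spatial models the authors fit):

```python
# Clark-Evans nearest-neighbour index sketch: R < 1 -> clustered,
# R ~ 1 -> random, R > 1 -> dispersed. Points below are synthetic.
import math

def clark_evans(points, area):
    """R = observed mean nearest-neighbour distance / expected under
    complete spatial randomness, where expected = 1 / (2 * sqrt(density))."""
    n = len(points)
    nn = []
    for i, (x1, y1) in enumerate(points):
        d = min(math.hypot(x1 - x2, y1 - y2)
                for j, (x2, y2) in enumerate(points) if j != i)
        nn.append(d)
    observed = sum(nn) / n
    expected = 1.0 / (2.0 * math.sqrt(n / area))
    return observed / expected

# Two tight clumps in a 100 x 100 window -> strongly clustered (R << 1).
clumped = [(10 + dx, 10 + dy) for dx in range(3) for dy in range(3)]
clumped += [(80 + dx, 80 + dy) for dx in range(3) for dy in range(3)]
print(round(clark_evans(clumped, 100 * 100), 2))
```

The paper's point-process models go further by conditioning clustering on covariates such as distance to water and elevation, but an index like this is how "highly clustered" is typically established first.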
Acute cannabis administration can produce transient psychotic-like effects in healthy individuals. However, the mechanisms through which this occurs and which factors predict vulnerability remain unclear. We investigate whether cannabis inhalation leads to psychotic-like symptoms and speech illusion; and whether cannabidiol (CBD) blunts such effects (study 1) and adolescence heightens such effects (study 2).
Two double-blind placebo-controlled studies, assessing speech illusion in a white noise task, and psychotic-like symptoms on the Psychotomimetic States Inventory (PSI). Study 1 compared effects of Cann-CBD (cannabis containing Δ-9-tetrahydrocannabinol (THC) and negligible levels of CBD) with Cann+CBD (cannabis containing THC and CBD) in 17 adults. Study 2 compared effects of Cann-CBD in 20 adolescents and 20 adults. All participants were healthy individuals who currently used cannabis.
In study 1, relative to placebo, both Cann-CBD and Cann+CBD increased PSI scores but not speech illusion. No differences between Cann-CBD and Cann+CBD emerged. In study 2, relative to placebo, Cann-CBD increased PSI scores and incidence of speech illusion, with the odds of experiencing speech illusion 3.1 (95% CIs 1.3–7.2) times higher after Cann-CBD. No age group differences were found for speech illusion, but adults showed heightened effects on the PSI.
Inhalation of cannabis reliably increases psychotic-like symptoms in healthy cannabis users and may increase the incidence of speech illusion. CBD did not influence psychotic-like effects of cannabis. Adolescents may be less vulnerable to acute psychotic-like effects of cannabis than adults.
Hyperspectral soft X-ray emission (SXE) and cathodoluminescence (CL) spectrometry have been used to investigate a carbonaceous-rich geological deposit to understand the crystallinity and morphology of the carbon and the associated quartz. Panchromatic CL maps show both the growth of the quartz and the evidence of recrystallization. A fitted CL map reveals the distribution of Ti4+ within the grains and shows subtle growth zoning, together with radiation halos from 238U decay. The sensitivity of the SXE spectrometer to carbon, together with the anisotropic X-ray emission from highly orientated pyrolytic graphite, has enabled the C Kα peak shape to be used to measure the crystal orientation of individual graphite regions. Mapping has revealed that most grains are predominantly of a single orientation, and a number of graphite grains have been investigated to demonstrate the application of this new SXE technique. A peak fitting approach to analyzing the SXE spectra was developed to project the C Kα 2pz and 2p(x+y) orbital components of the graphite. The shape of these two end-member components is comparable to those produced by electron density of states calculations. The angular sensitivity of the SXE spectrometer has been shown to be comparable to that of electron backscatter diffraction.