Climate change is increasing the frequency of extreme weather events, such as drought and heat waves. In this paper, we assess the impact of drought and high temperatures on the employment outcomes of working-age individuals in South Africa between 2008 and 2017. We merge high-resolution weather data with detailed individual-level survey data on labor market outcomes, and estimate causal impacts using a fixed effects framework. We find that increases in the occurrence of drought reduce overall employment. These effects are concentrated in the tertiary sector, amongst informal workers, and in provinces with a higher reliance on tourism. Taken together, our results suggest that the impacts of climate change will be felt unequally by South Africa's workers.
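A minimal sketch of the kind of two-way fixed effects specification described above, assuming a person-year panel; all file and column names (panel.csv, employed, drought_freq, heat_days, person_id, year) are hypothetical placeholders, not the authors' data or code.

```python
# Sketch only: two-way fixed effects with hypothetical column names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("panel.csv")  # hypothetical person-year panel

# Individual and year fixed effects enter as dummy sets; standard errors
# are clustered at the person level.
model = smf.ols(
    "employed ~ drought_freq + heat_days + C(person_id) + C(year)", data=df
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["person_id"]})
print(result.params[["drought_freq", "heat_days"]])
```

With many individuals, a within (demeaning) estimator is the practical route; the dummy-variable form is shown only because it makes the fixed effects explicit.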
To analyze the frequency and rates of community respiratory virus infections detected in patients at the National Institutes of Health Clinical Center (NIHCC) between January 2015 and March 2021, comparing the trends before and during the coronavirus disease 2019 (COVID-19) pandemic.
We conducted a retrospective study comparing frequency and rates of community respiratory viruses detected in NIHCC patients between January 2015 and March 2021. Test results from nasopharyngeal swabs and washes, bronchoalveolar lavages, and bronchial washes were included in this study. Results from viral-challenge studies and repeated positives were excluded. A quantitative data analysis was completed using cross tabulations. Comparisons were performed using mixed models, applying the Dunnett correction for multiplicity.
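The Dunnett correction above handles many-to-one comparisons against a single control period. As a rough illustration of that step alone (the study's actual inference used mixed models, and all numbers below are fabricated), SciPy ships an implementation in scipy.stats.dunnett (SciPy 1.11+):

```python
# Illustration of Dunnett many-to-one comparisons on fabricated weekly rates.
import numpy as np
from scipy.stats import dunnett

rng = np.random.default_rng(0)
pre_pandemic = rng.normal(1.4, 0.4, size=52)     # hypothetical weekly % positive
late_prepandemic = rng.normal(1.2, 0.4, size=52)
pandemic_year = rng.normal(0.3, 0.1, size=52)

# Each later period is compared against the pre-pandemic control with a
# family-wise adjustment for the two simultaneous comparisons.
res = dunnett(late_prepandemic, pandemic_year, control=pre_pandemic)
print(res.pvalue)
```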
Frequency of all respiratory pathogens declined from an annual range of 0.88%–1.97% between January 2015 and March 2020 to 0.29% between April 2020 and March 2021. Individual viral pathogens declined sharply in frequency during the same period, with no cases of influenza A/B or parainfluenza and 1 case of respiratory syncytial virus (RSV). Rhino/enterovirus detection continued, but with a substantially lower frequency of 4.27% between April 2020 and March 2021, compared with an annual range of 8.65%–18.28% between January 2015 and March 2020.
The decrease in viral respiratory infections detected in NIHCC patients during the pandemic was likely due to the layered COVID-19 prevention and mitigation measures implemented in the community and the hospital. Hospitals should consider continuing the use of nonpharmaceutical interventions in the future to prevent nosocomial transmission of respiratory viruses during times of high community viral load.
Ethnohistoric accounts indicate that the people of Australia's Channel Country engaged in activities rarely recorded elsewhere on the continent, including food storage, aquaculture and possible cultivation, yet there has been little archaeological fieldwork to verify these accounts. Here, the authors report on a collaborative research project initiated by the Mithaka people addressing this lack of archaeological investigation. The results show that Mithaka Country has a substantial and diverse archaeological record, including numerous large stone quarries, multiple ritual structures and substantial dwellings. Our archaeological research revealed previously unknown aspects, such as the scale of Mithaka quarrying, which could stimulate re-evaluation of Aboriginal socio-economic systems in parts of ancient Australia.
Herbivore distribution throughout Africa is strongly linked to mean annual precipitation. We use that relationship to predict functional group composition of herbivore communities during the last glacial maximum (ca. 21 ka) on the now submerged Palaeo-Agulhas Plain (PAP), South Africa. We used metabolic large herbivore biomass (MLHB) from 39 South African protected areas, in five functional groups (characterized by behavior and physiology). We examined how modern factors influenced MLHB and considered the effects of biome, annual rainfall, percentage winter rainfall, and protected area size. Overall, biome was the most important factor influencing the relationship between MLHB and rainfall. In general, MLHB increased with rainfall, but not for the grassland biome. Outside grasslands, most functional groups’ metabolic biomass increased with increasing rainfall, irrespective of biome, except for medium-sized social mixed feeder species in savanna and thicket. Protected area size was influential for medium-sized social mixed feeders and large browsers and rainfall influenced medium-sized social mixed feeders, offering some perspectives on spatial constraints on past large herbivore biomass densities. These results improve our understanding of the likely herbivore community composition and relative biomass structure on the PAP, an essential driver of how early humans utilized large mammals as a food resource.
To assess the relationship between food insecurity, sleep quality, and days with mental and physical health issues among college students.
An online survey was administered. Food insecurity was assessed using the ten-item Adult Food Security Survey Module. Sleep was measured using the nineteen-item Pittsburgh Sleep Quality Index (PSQI). Mental health and physical health were measured using three items from the Healthy Days Core Module. Multivariate logistic regression was conducted to assess the relationship between food insecurity, sleep quality, and days with poor mental and physical health.
Twenty-two higher education institutions.
College students (n 17 686) enrolled at one of twenty-two participating universities.
Compared with food-secure students, those classified as food insecure (43·4 %) had higher PSQI scores indicating poorer sleep quality (P < 0·0001) and reported more days with poor mental (P < 0·0001) and physical (P < 0·0001) health as well as days when mental and physical health prevented them from completing daily activities (P < 0·0001). Food-insecure students had higher adjusted odds of having poor sleep quality (adjusted OR (AOR): 1·13; 95 % CI 1·12, 1·14), days with poor physical health (AOR: 1·01; 95 % CI 1·01, 1·02), days with poor mental health (AOR: 1·03; 95 % CI 1·02, 1·03) and days when poor mental or physical health prevented them from completing daily activities (AOR: 1·03; 95 % CI 1·02, 1·04).
College students report high levels of food insecurity, which is associated with poorer mental and physical health and poorer sleep quality. Multi-level policy changes and campus wellness programmes are needed to prevent food insecurity and improve student health-related outcomes.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of approximately 22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, including 235 self-reports. The mean number of tests per week was 3500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available for contact tracing, a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
ABSTRACT IMPACT: This is the first examination of risk factors for prolonged opioid use after an ICU stay and will inform efforts to strengthen prescribing guidelines and care transition models for patients after critical illness. OBJECTIVES/GOALS: The majority of patients in intensive care units (ICU) receive opioids during admission, and up to 25% receive a prescription at discharge. However, transitions of care and prolonged opioid use after discharge remain poorly characterized. We aim to characterize risk factors for prolonged opioid use after an ICU stay. METHODS/STUDY POPULATION: A retrospective study using insurance claims from Optum Clinformatics® Data Mart was conducted for opioid-naive adult patients (18-64 years) with an ICU admission from 2010 to 2019. The primary outcome was new persistent opioid use, defined as a continued prescription fill 91-180 days after discharge, in addition to a fill in the first 90 days. The primary exposure was an opioid fill at discharge. The ICU admission was characterized using the Clinical Classification System from the Agency for Healthcare Research and Quality, based on patients’ primary diagnosis code. Diagnoses were combined into 11 groups highlighting the affected organ system/mechanism of injury. Logistic regression evaluated the associations of patient demographic and clinical characteristics with the probability of persistent opioid use. RESULTS/ANTICIPATED RESULTS: In this cohort of 90,721 patients discharged from the ICU, 3.3% continued to fill opioids at 6 months. An opioid prescription fill (OR 3.1; 95% CI 2.8 - 3.3) and benzodiazepine prescription fill (OR 1.6; 95% CI 1.4 - 1.8) within 3 days of ICU discharge were each significantly associated with the development of new persistent opioid use. Patient diagnosis groups of Musculoskeletal/Trauma (OR 2.3; 95% CI 2.0 - 2.6), Neoplasms (OR 1.6; 95% CI 1.5 - 1.9), and GI/Hepatobiliary (OR 1.5; 95% CI 1.3 - 1.8) were significantly more likely to develop new persistent use when compared to the Cardiovascular diagnosis group. DISCUSSION/SIGNIFICANCE OF FINDINGS: Opioid prescriptions at discharge after an ICU stay increase the odds of prolonged opioid use. These results will inform efforts to strengthen prescribing guidelines and care models after a critical illness. Further work will characterize the trajectory of prescribing and patient exposure to high-risk prescribing after ICU discharge.
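As a sketch of the kind of model named in the methods (not the authors' code), a logistic regression with the cardiovascular group as the reference category might look as follows; the file icu_cohort.csv and the columns persistent_use, opioid_fill, benzo_fill, and dx_group are hypothetical.

```python
# Hedged sketch: logistic regression for new persistent opioid use.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

claims = pd.read_csv("icu_cohort.csv")  # hypothetical analytic file

fit = smf.logit(
    "persistent_use ~ opioid_fill + benzo_fill"
    " + C(dx_group, Treatment(reference='Cardiovascular'))",
    data=claims,
).fit()
print(np.exp(fit.params))  # exponentiated coefficients, i.e., odds ratios
```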
Antisaccade tasks can be used to index cognitive control processes, e.g. attention, behavioral inhibition, working memory, and goal maintenance in people with brain disorders. Though diagnoses of schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BDP) are typically considered to be distinct entities, previous work shows patterns of cognitive deficits differing in degree, rather than in kind, across these syndromes.
Large samples of individuals with psychotic disorders were recruited through the Bipolar-Schizophrenia Network on Intermediate Phenotypes 2 (B-SNIP2) study. Anti- and pro-saccade task performances were evaluated in 189 people with SZ, 185 people with SAD, 96 people with BDP, and 279 healthy comparison participants. Logistic functions were fitted to each group's antisaccade speed-performance tradeoff patterns.
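A speed-performance tradeoff function of this kind maps response latency onto probability of a correct antisaccade. A minimal curve-fitting sketch on fabricated data (not the study's fitting procedure, whose parameterization may differ):

```python
# Fit a four-parameter logistic speed-performance tradeoff to fabricated data.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, lower, upper, t50, slope):
    """Probability correct as a logistic function of response latency t (ms)."""
    return lower + (upper - lower) / (1.0 + np.exp(-(t - t50) / slope))

latency = np.linspace(150, 600, 40)                  # hypothetical latency bins
p_correct = logistic(latency, 0.1, 0.9, 300.0, 40.0)
p_correct += np.random.default_rng(1).normal(0, 0.03, latency.size)  # noise

params, _ = curve_fit(logistic, latency, p_correct, p0=[0.1, 0.9, 300.0, 40.0])
print(dict(zip(["lower", "upper", "t50", "slope"], params)))
```

Group differences can then be read off the fitted parameters, e.g. a lower asymptote (more errors even at long latencies) or a shallower slope (less benefit from slowing down).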
Psychosis groups had higher antisaccade error rates than the healthy group, with SZ and SAD participants committing 2 times as many errors, and BDP participants committing 1.5 times as many errors. Latencies on correctly performed antisaccade trials in SZ and SAD were longer than in healthy participants, although error trial latencies were preserved. Parameters of speed-performance tradeoff functions indicated that compared to the healthy group, SZ and SAD groups had optimal performance characterized by more errors, as well as less benefit from prolonged response latencies. Prosaccade metrics did not differ between groups.
With basic prosaccade mechanisms intact, the higher speed-performance tradeoff cost for antisaccade performance in psychosis cases indicates a deficit that is specific to the higher-order cognitive aspects of saccade generation.
Previous research on the depression scale of the Patient Health Questionnaire (PHQ-9) has found that different latent factor models have maximized empirical measures of goodness-of-fit. The clinical relevance of these differences is unclear. We aimed to investigate whether depression screening accuracy may be improved by employing latent factor model-based scoring rather than sum scores.
We used an individual participant data meta-analysis (IPDMA) database compiled to assess the screening accuracy of the PHQ-9. We included studies that used the Structured Clinical Interview for DSM (SCID) as a reference standard and split those into calibration and validation datasets. In the calibration dataset, we estimated unidimensional, two-dimensional (separating cognitive/affective and somatic symptoms of depression), and bi-factor models, and the respective cut-offs to maximize combined sensitivity and specificity. In the validation dataset, we assessed the differences in (combined) sensitivity and specificity between the latent variable approaches and the optimal sum score (⩾10), using bootstrapping to estimate 95% confidence intervals for the differences.
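Maximizing combined sensitivity and specificity over candidate cut-offs is equivalent to maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch of that selection step on fabricated scores (the latent-variable scoring itself is not reproduced here):

```python
# Choose the cut-off maximizing Youden's J on fabricated PHQ-style scores.
import numpy as np

rng = np.random.default_rng(2)
scores = np.clip(rng.normal(8, 5, 1000).round(), 0, 27)  # hypothetical PHQ-9
# Fabricated diagnoses, more likely at higher scores.
depressed = rng.random(1000) < 1.0 / (1.0 + np.exp(-(scores - 10.0) / 2.0))

best_cut, best_j = None, -np.inf
for cut in range(0, 28):
    positive = scores >= cut
    sens = (positive & depressed).sum() / depressed.sum()
    spec = (~positive & ~depressed).sum() / (~depressed).sum()
    if sens + spec - 1 > best_j:
        best_cut, best_j = cut, sens + spec - 1
print(best_cut, best_j)
```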
The calibration dataset included 24 studies (4378 participants, 652 major depression cases); the validation dataset 17 studies (4252 participants, 568 cases). In the validation dataset, optimal cut-offs of the unidimensional, two-dimensional, and bi-factor models had higher sensitivity (by 0.036, 0.050, 0.049 points, respectively) but lower specificity (0.017, 0.026, 0.019, respectively) compared to the sum score cut-off of ⩾10.
In a comprehensive dataset of diagnostic studies, scoring using complex latent variable models does not meaningfully improve the screening accuracy of the PHQ-9 compared with the simple sum score approach.
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r² = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h² ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
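As a point of orientation for the heritability estimates above: in the classical twin design, heritability can be approximated from the monozygotic and dizygotic twin correlations via Falconer's formula. The study's quantitative genetic analyses were presumably full biometric (ACE-type) models, so this is only the back-of-envelope version:

```latex
% Falconer's approximation from twin correlations (orientation only).
h^2 \approx 2\,(r_{MZ} - r_{DZ}), \qquad
c^2 \approx 2\,r_{DZ} - r_{MZ}, \qquad
e^2 \approx 1 - r_{MZ}
```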
Air pollution is linked to mortality and morbidity. Since humans spend nearly all their time indoors, improving indoor air quality (IAQ) is a compelling approach to mitigate air pollutant exposure. To assess interventions, relying on clinical outcomes may require prolonged follow-up, which hinders feasibility. Thus, identifying biomarkers that respond to changes in IAQ may be useful to assess the effectiveness of interventions.
We conducted a narrative review by searching several databases to identify studies published over the last decade that measured the response of blood, urine, and/or salivary biomarkers to variations (natural and intervention-induced) in indoor air pollutant exposure.
Numerous studies reported on associations between IAQ exposures and biomarkers with heterogeneity across study designs and methods. This review summarizes the responses of 113 biomarkers described in 30 articles. The biomarkers which most frequently responded to variations in indoor air pollutant exposures were high sensitivity C-reactive protein (hsCRP), von Willebrand Factor (vWF), 8-hydroxy-2′-deoxyguanosine (8-OHdG), and 1-hydroxypyrene (1-OHP).
This review will guide the selection of biomarkers for translational studies evaluating the impact of indoor air pollutants on human health.
The relationship between nutrition and behavioural health (BH) outcomes has been established in the literature. However, the relationship between nutrition and anxiety is unclear. Furthermore, the relationship between nutrition and BH outcomes has not been examined in a US Army Soldier population. This study sought to understand the relationship between Soldiers’ nutritional intake and both anxiety and depression.
This cross-sectional study utilised multivariable logistic regression analyses to examine the relationship between nutritional intake and BH outcomes.
The study utilised data collected in 2018 during a BH epidemiological consultation conducted at one Army installation.
Participants were 7043 US Army Soldiers at one Army installation.
Of the Soldiers completing the survey, 12 % (n 812) screened positive for anxiety and 11 % (n 774) for depression. The adjusted odds of anxiety were significantly higher among Soldiers who reported low fruit intake compared with Soldiers who reported high fruit intake (adjusted OR (AOR) 1·36; 95 % CI 1·04, 1·79). The adjusted odds of depression were higher for Soldiers who reported low fruit intake (AOR 1·35; 95 % CI 1·01, 1·79) and/or low green vegetable intake (AOR 1·37; 95 % CI 1·02, 1·83). Lastly, the adjusted odds of depression were lower for Soldiers who reported low sugary drink intake (AOR 0·62; 95 % CI 0·48, 0·81).
This study is the first to examine the important connection between nutritional intake and anxiety and depression at a US military installation. The information learned from this study has implications for enhancing Soldiers’ nutritional knowledge and BH, ultimately improving Soldiers’ health and medical readiness.
OBJECTIVES/GOALS: Breast cancer metastases are stochastic and difficult to detect. Therapy is often ineffective due to phenotypic changes of tumor cells at these sites. We engineered a synthetic metastatic niche to study the role of phenotypic transitions in the microenvironment on tumor cell phenotype. METHODS/STUDY POPULATION: The engineered metastatic niche is composed of a porous polycaprolactone scaffold implanted subcutaneously in Balb/c mice. The mice received an orthotopic inoculation of 4T1 cells (murine triple negative breast cancer) in the fourth right mammary fat pad and the disease was allowed to progress for 7-21 days (pre-metastatic to overt metastatic disease). The scaffolds and lungs (native metastatic site) were explanted and analyzed by single cell RNA-seq via Drop-seq. Cell phenotypes were identified and tracked over time with the Seurat and Monocle3 pipelines. Assessment of the impact of these cell populations on tumor cell phenotype was conducted through Transwell co-cultures. RESULTS/ANTICIPATED RESULTS: Healthy scaffolds are primarily composed of macrophages, dendritic cells, and fibroblasts – consistent with a foreign body response. Despite differences in the lung and scaffold prior to tumor inoculation, both tissues were marked by >5-fold increase in neutrophils/MDSCs. Additionally, 79% of genes at the scaffold that significantly changed over time were also identified in the lung, indicating key similarities in niche maturation. However, many immune cells at the scaffold had distinct phenotypes, with pro-inflammatory/cytotoxic characteristics. These changes clearly impacted tumor cell phenotype, as cells from the scaffold increased tumor cell migration and apoptosis in vitro. DISCUSSION/SIGNIFICANCE OF IMPACT: Early phenotypic changes at the engineered metastatic niche can identify signs of metastasis prior to colonization of tumor cells. Furthermore, dynamics of immune and stromal cells change throughout niche maturation, influencing tumor cell phenotype and may suggest targeted therapies. CONFLICT OF INTEREST DESCRIPTION: Lonnie Shea, Jacqueline Jeruss, and Grace Bushnell are named inventors on patents or patent applications.
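The single-cell analysis here used the Seurat and Monocle3 pipelines (R). Purely for orientation, a rough Python analog of the clustering step, using scanpy and a hypothetical counts directory, might look like:

```python
# Rough scanpy analog of a standard single-cell clustering pass (orientation
# only; the study used Seurat/Monocle3, and the data path is hypothetical).
import scanpy as sc

adata = sc.read_10x_mtx("scaffold_counts/")     # hypothetical count matrix
sc.pp.filter_cells(adata, min_genes=200)        # drop low-quality barcodes
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.pp.pca(adata)
sc.pp.neighbors(adata)
sc.tl.leiden(adata)                             # candidate cell phenotypes
sc.tl.rank_genes_groups(adata, "leiden")        # marker genes per cluster
```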
Evidence suggests that healthy older adults with subjective memory complaints are at increased risk of dementia. Subjective Cognitive Impairment (SCI) may precede Mild Cognitive Impairment (MCI) in the clinical continuum of Alzheimer's disease (AD). Attentional deficits may be present early in AD, and associated functional changes have been reported in both MCI and AD. In the present study, activation during divided attention in SCI subjects was investigated using functional magnetic resonance imaging (fMRI). Additionally, amyloid uptake was investigated using 11C-PIB with positron emission tomography (PET).
Brain activation in 11 SCI subjects and 10 controls was compared during a divided attention task using fMRI. Additionally, five SCI subjects and 14 cognitively normal healthy controls underwent 11C-PIB PET scanning. Criteria for diagnosis of SCI were:
1. self-reported memory complaints,
2. objectively normal cognition on detailed neurocognitive testing,
3. absence of psychiatric or causative physical illness,
4. normal activities of daily living and
5. absence of MCI or dementia.
There were no differences in performance between SCI and control groups in terms of cognitive or behavioural measures. However, SCI subjects had increased activation in the left medial temporal lobe and in the bilateral thalamus, posterior cingulate and caudate. One SCI subject and one control subject had a pattern of 11C-PIB uptake similar to that seen in AD.
The activation changes identified in SCI may relate to compensatory increased activation in the face of early AD pathology. Larger, longitudinal studies are needed to determine the extent and significance of PIB uptake in SCI.
‘Munchausen's syndrome by proxy’ characteristically describes women alleged to have fabricated or induced illnesses in children under their care, purportedly to attract attention. Where conclusive evidence exists, the condition's aetiology remains speculative; where such evidence is lacking, diagnosis hinges upon denial of wrong-doing (conduct also compatible with innocence). How might investigators obtain objective evidence of guilt or innocence? Here, we examine the case of a woman convicted of poisoning a child. She served a prison sentence but continues to profess her innocence. Using a modified fMRI protocol (previously published in 2001), we scanned the subject while she affirmed her account of events and that of her accusers. We hypothesized that she would exhibit longer response times, in association with greater activation of ventrolateral prefrontal and anterior cingulate cortices, when endorsing those statements she believed to be false (i.e., when she ‘lied’). The subject was scanned 4 times at 3 Tesla. Results revealed significantly longer response times and relatively greater activation of ventrolateral prefrontal and anterior cingulate cortices when she endorsed her accusers' version of events. Hence, while we have not ‘proven’ that this subject is innocent, we demonstrate that her behavioural and functional anatomical parameters behave as if she were.
Depression is a major public health problem in European countries, and health systems need to ensure access to effective psychological and pharmacological treatments. Research suggests that improvements in depression care require “complex interventions” that implement change in several areas simultaneously.
We describe an observational study of the implementation of a “stepped care” model to provide care for all adults presenting with a new case of depression in a mixed urban-rural area of Scotland with a population of 76,000 people.
A team of 5.2 clinicians provided care for about 1,000 new cases of depression each year. “Guided Self-Help” was the baseline intervention for all patients, supplemented where necessary with pharmacological treatment and Cognitive Behavioural or Interpersonal Therapy.
Service delivery systems were reformed to provide: specialist treatment in primary care settings using primarily non-medical clinicians, comprehensive electronic clinical records, continuous outcome monitoring and intensive investment in staff training and support.
Clinical outcomes (measured by the Personal Health Questionnaire, Social and Work Adjustment Scale and EQ-5D) showed significant improvement despite relatively brief clinician contact (2.5 hours over 4.6 contacts). Savings of more than 50% were made on the antidepressant drug budget. Service user satisfaction ratings were high.
Population needs for depression care can be met using “stepped care” models such as that described above. A randomised controlled study of this approach would be required to fully test the model.
We show that the isomorphism problems for left distributive algebras, racks, quandles and kei are as complex as possible in the sense of Borel reducibility. These algebraic structures are important for their connections with the theory of knots, links and braids. In particular, Joyce showed that a quandle can be associated with any knot, and this serves as a complete invariant for tame knots. However, such a classification of tame knots heuristically seemed to be unsatisfactory, due to the apparent difficulty of the quandle isomorphism problem. Our result confirms this view, showing that, from a set-theoretic perspective, classifying tame knots by quandles replaces one problem with (a special case of) a much harder problem.
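For readers outside descriptive set theory, the ordering invoked here is standard Borel reducibility; in the usual notation (ours, not quoted from the paper), for equivalence relations $E$ on $X$ and $F$ on $Y$ with $X, Y$ standard Borel spaces:

```latex
E \leq_B F \iff \exists\, f \colon X \to Y \text{ Borel such that }
x_1 \mathrel{E} x_2 \Longleftrightarrow f(x_1) \mathrel{F} f(x_2).
```

'As complex as possible' then says that these isomorphism relations lie $\leq_B$-above the isomorphism relation of every class of countable structures, i.e., they are Borel complete in the sense of Friedman and Stanley.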
In this paper, we revisit our previous work in which we derive an effective macroscale description suitable to describe the growth of biological tissue within a porous tissue-engineering scaffold. The underlying tissue dynamics is described as a multiphase mixture, thereby naturally accommodating features such as interstitial growth and active cell motion. Via a linearization of the underlying multiphase model (whose nonlinearity poses a significant challenge for such analyses), we obtain, by means of multiple-scale homogenization, a simplified macroscale model that nevertheless retains explicit dependence on both the microscale scaffold structure and the tissue dynamics, via so-called unit-cell problems that provide permeability tensors to parameterize the macroscale description. In our previous work, the cell problems retain macroscale dependence, posing significant challenges for computational implementation of the eventual macroscopic model; here, we obtain a decoupled system whereby the quasi-steady cell problems may be solved separately from the macroscale description. Moreover, we indicate how the formulation is influenced by a set of alternative microscale boundary conditions.
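To fix ideas for readers less familiar with multiple-scale homogenization, the generic shape of a homogenized closure of this type is sketched below. This is the textbook Darcy-type form for periodic Stokes flow, shown only to fix notation; the paper's multiphase model yields its own, more elaborate, macroscale system and cell problems.

```latex
% Generic homogenized closure (orientation only, not the paper's equations).
\langle \mathbf{u} \rangle = -\frac{\mathsf{K}}{\mu}\,\nabla p, \qquad
\mathsf{K}_{ij} = \frac{1}{|\Omega|} \int_{\Omega_f} w^{(j)}_i \,\mathrm{d}V,
```

where each $w^{(j)}$ solves a quasi-steady Stokes-type problem on the periodic unit cell $\Omega$ (fluid part $\Omega_f$) driven by a unit macroscopic pressure gradient in the $j$-th direction. Decoupling, as described above, means these cell problems can be solved once, independently of the macroscale fields.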
Item 9 of the Patient Health Questionnaire-9 (PHQ-9) queries about thoughts of death and self-harm, but not suicidality. Although it is sometimes used to assess suicide risk, most positive responses are not associated with suicidality. The PHQ-8, which omits Item 9, is thus increasingly used in research. We assessed equivalency of total score correlations and the diagnostic accuracy to detect major depression of the PHQ-8 and PHQ-9.
We conducted an individual patient data meta-analysis. We fit bivariate random-effects models to assess diagnostic accuracy.
16 742 participants (2097 major depression cases) from 54 studies were included. The correlation between PHQ-8 and PHQ-9 scores was 0.996 (95% confidence interval 0.996 to 0.996). The standard cutoff score of 10 for the PHQ-9 maximized sensitivity + specificity for the PHQ-8 among studies that used a semi-structured diagnostic interview reference standard (N = 27). At cutoff 10, the PHQ-8 was less sensitive by 0.02 (−0.06 to 0.00) and more specific by 0.01 (0.00 to 0.01) among those studies (N = 27), with similar results for studies that used other types of interviews (N = 27). For all 54 primary studies combined, across all cutoffs, the PHQ-8 was less sensitive than the PHQ-9 by 0.00 to 0.05 (0.03 at cutoff 10), and specificity was within 0.01 for all cutoffs (0.00 to 0.01).
PHQ-8 and PHQ-9 total scores were similar. Sensitivity may be minimally reduced with the PHQ-8, but specificity is similar.