A recent suicidal drive hypothesis posits that psychotic experiences (PEs) may serve to externalize internally generated and self-directed threat (i.e., self-injurious/suicidal behavior [SIB]) in order to optimize survival; however, it must first be demonstrated that such internal threat can both precede and inform PEs. The current study conducted the first known bidirectional analysis of SIB and PEs to test whether SIB could be considered a plausible antecedent for PEs. Prospective data were utilized from the Environmental Risk (E-Risk) Longitudinal Twin Study, a nationally representative birth cohort of 2232 twins, which captured SIB (any self-harm or suicide attempt) and PEs at ages 12 and 18 years. Cross-lagged panel models demonstrated that the association between SIB at age 12 and PEs at age 18 was as strong as the association between PEs at age 12 and SIB at age 18. Indeed, the best representation of the data was a model in which these paths were constrained to be equal (OR = 2.48, 95% CI = 1.63–3.79). Clinical interview case notes for those who reported both SIB and PEs at age 18 revealed that PEs were explicitly characterized by SIB/threat/death-related content for 39% of cases. These findings justify further investigation of the suicidal drive hypothesis.
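A minimal sketch of the cross-lagged logic, using two adjusted logistic regressions on simulated data. The published study fitted formal cross-lagged panel models with paths constrained to equality; all variable names and effect sizes below are hypothetical placeholders, not the study's data.

```python
# Sketch of a cross-lagged analysis for two binary measures (SIB and
# PEs at ages 12 and 18). Simulated data; names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2232
df = pd.DataFrame({
    "sib12": rng.binomial(1, 0.1, n),
    "pe12": rng.binomial(1, 0.1, n),
})
# Simulate age-18 outcomes with cross-lagged dependence for illustration.
df["pe18"] = rng.binomial(1, 0.05 + 0.15 * df["sib12"] + 0.10 * df["pe12"])
df["sib18"] = rng.binomial(1, 0.05 + 0.15 * df["pe12"] + 0.10 * df["sib12"])

# Cross-lagged path 1: SIB at 12 -> PEs at 18, adjusting for PEs at 12.
m1 = smf.logit("pe18 ~ sib12 + pe12", data=df).fit(disp=False)
# Cross-lagged path 2: PEs at 12 -> SIB at 18, adjusting for SIB at 12.
m2 = smf.logit("sib18 ~ pe12 + sib12", data=df).fit(disp=False)

# Odds ratios with 95% CIs for the two cross-lagged coefficients.
for name, m, term in [("SIB12 -> PE18", m1, "sib12"),
                      ("PE12 -> SIB18", m2, "pe12")]:
    or_ = np.exp(m.params[term])
    lo, hi = np.exp(m.conf_int().loc[term])
    print(f"{name}: OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```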
Communities with high levels of social capital enjoy an array of positive economic and community development outcomes. We assess the role of several key community characteristics, including the strength of government institutions, in explaining local social capital variation. The analysis draws on data from United States counties and includes regression modelling and a Blinder–Oaxaca decomposition to explore differences in social capital across an area’s metropolitan status and region. The data show that social capital determinants vary by place due both to the endowment levels of these determinants and to the productive value of their coefficients. For example, the coefficient productive values of government capacity explain some differences in social capital levels across metropolitan status (but not across region). Concurrently, variations in government capacity endowment levels help explain some differences in social capital levels across region (but not across metropolitan status).
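A compact sketch of a two-fold Blinder–Oaxaca decomposition, which splits a mean group gap in an outcome into an "endowments" component (differences in characteristic levels) and a "coefficients" component (differences in the productive value of those characteristics). The data, column names, and group definitions below are hypothetical placeholders.

```python
# Two-fold Blinder-Oaxaca decomposition of a gap in a social capital
# index between metro and non-metro counties. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "metro": rng.binomial(1, 0.5, n),
    "gov_capacity": rng.normal(0, 1, n),
    "income": rng.normal(0, 1, n),
})
df["social_capital"] = (0.5 * df["gov_capacity"] + 0.3 * df["income"]
                        + 0.4 * df["metro"] + rng.normal(0, 1, n))

def fit(group):
    X = sm.add_constant(group[["gov_capacity", "income"]])
    return sm.OLS(group["social_capital"], X).fit(), X.mean()

res_a, xbar_a = fit(df[df["metro"] == 1])   # group A: metro
res_b, xbar_b = fit(df[df["metro"] == 0])   # group B: non-metro

gap = (df.loc[df["metro"] == 1, "social_capital"].mean()
       - df.loc[df["metro"] == 0, "social_capital"].mean())
# Endowments: gap attributable to differences in characteristic levels,
# valued at group B's coefficients.
endowments = (xbar_a - xbar_b) @ res_b.params
# Coefficients: gap attributable to differences in returns ("productive
# value") on those characteristics, at group A's mean levels.
coefficients = xbar_a @ (res_a.params - res_b.params)
print(f"gap={gap:.3f}  endowments={endowments:.3f}  "
      f"coefficients={coefficients:.3f}")
```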
Twenty-six percent of children experience a traumatic event by the age of 4. Negative events during childhood have deleterious correlates later in life, including antisocial behavior, but the mechanisms underlying this relation are unclear. We explored deficits in neurocognitive functioning, specifically problems in passive avoidance (a construct with elements of inhibitory control and learning), as a potential acquired mediator of the pathway between cumulative early childhood adversity from birth to age 7 and later antisocial behavior through age 18, using prospective longitudinal data from 585 participants. Path analyses showed that cumulative early childhood adversity predicted impaired passive avoidance during adolescence and increased antisocial behavior during late adolescence. Furthermore, poor neurocognition, namely passive avoidance, predicted later antisocial behavior and significantly mediated the relation between cumulative early childhood adversity and later antisocial behavior. This research has implications for understanding the development of later antisocial behavior and points to a potential target for neurocognitive intervention within the pathway from cumulative early childhood adversity to later antisocial behavior.
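A simplified illustration of the mediation logic behind such a path analysis: the indirect effect is estimated as the product of the adversity-to-mediator and mediator-to-outcome paths, with a bootstrap confidence interval. Data are simulated and variable names hypothetical; this is not the study's actual model.

```python
# Does a passive-avoidance deficit mediate the adversity -> antisocial
# behavior pathway? Indirect effect a*b with a bootstrap CI.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 585
adversity = rng.normal(0, 1, n)
passive_avoid_errors = 0.4 * adversity + rng.normal(0, 1, n)  # mediator
antisocial = (0.3 * passive_avoid_errors + 0.2 * adversity
              + rng.normal(0, 1, n))

def ab(idx):
    x, m, y = adversity[idx], passive_avoid_errors[idx], antisocial[idx]
    a = sm.OLS(m, sm.add_constant(x)).fit().params[1]  # path a: X -> M
    # Path b: M -> Y, controlling for X.
    b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit().params[1]
    return a * b

idx_all = np.arange(n)
point = ab(idx_all)
boot = [ab(rng.choice(idx_all, n, replace=True)) for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {point:.3f} "
      f"(95% bootstrap CI {lo:.3f}, {hi:.3f})")
```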
Introduction: The administration of naloxone therapy is restricted by scope of practice to Advanced Life Support (ALS) in many Emergency Medical Services (EMS) systems throughout the United States. In Delaware’s two-tiered EMS system, Basic Life Support (BLS) often arrives on-scene prior to ALS, but BLS providers were not previously authorized to administer naloxone. Through a BLS naloxone pilot study, the researchers sought to evaluate BLS naloxone administration and timing compared to ALS.
Hypothesis: After undergoing specialized training, BLS providers would be able to appropriately administer naloxone to opioid overdose patients in a more timely manner than ALS providers.
Methods: This was a retrospective, observational study using data collected from February 2014 through May 2015 during a statewide BLS naloxone pilot program. A total of 14 of 72 state BLS agencies participated in the study. Pilot BLS agencies attended a training session on the indications and administration of naloxone and were then authorized to carry and administer the drug. Researchers then compared vital signs and the time from BLS arrival to naloxone administration by BLS and ALS. Data were analyzed using paired and independent-sample t-tests, as well as chi-square tests, as appropriate.
Results: A total of 131 incidents of naloxone administration were reviewed. Of those, 62 patients received naloxone by BLS (pilot group) and 69 patients received naloxone by ALS (control group). After naloxone administration, BLS patients showed improvements in heart rate (HR; P < .01), respiratory rate (RR; P < .01), and pulse oximetry (SpO2; P < .01); ALS patients also showed improvement in RR (P < .01) and in SpO2 (P = .005). There was no significant improvement in HR for ALS patients (P = .189).
There was a significant difference between the two groups in the time from BLS arrival to naloxone administration, with shorter times in the BLS group than in the ALS group (1.9 minutes versus 9.8 minutes; P < .01); BLS administration was 7.8 minutes faster than ALS administration (95% CI, 6.2-9.3 minutes).
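A sketch of the two kinds of comparisons reported above: a paired t-test for pre/post vitals within one group and an independent-samples t-test for BLS versus ALS timing. All values are simulated placeholders calibrated loosely to the abstract's summary figures, not the study data.

```python
# Paired t-test (pre/post respiratory rate) and Welch's independent
# t-test (BLS vs ALS minutes to naloxone). Simulated placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Respiratory rate before/after naloxone in one group (paired design).
rr_pre = rng.normal(8, 2, 62)
rr_post = rr_pre + rng.normal(4, 2, 62)
t_paired, p_paired = stats.ttest_rel(rr_pre, rr_post)

# Minutes from arrival to naloxone: BLS-administered vs ALS-administered.
t_bls = rng.normal(1.9, 1.0, 62)
t_als = rng.normal(9.8, 3.0, 69)
t_ind, p_ind = stats.ttest_ind(t_bls, t_als, equal_var=False)

print(f"paired RR change: t={t_paired:.2f}, p={p_paired:.3g}")
print(f"BLS vs ALS timing: t={t_ind:.2f}, p={p_ind:.3g}, "
      f"mean diff={t_als.mean() - t_bls.mean():.1f} min")
```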
Conclusion: Patients treated by BLS agencies carrying naloxone improved similarly to, and received naloxone therapy sooner than, those who awaited ALS arrival. All EMS systems should consider allowing BLS to carry and administer naloxone for effective and potentially faster naloxone administration when treating respiratory compromise related to opioid overdose.
In this article, two white, Western female researchers reflect on the methodological, ethical, and practical dilemmas experienced while conducting social science fieldwork in Botswana for their doctoral degrees. In addition, their shared research assistant examines her role as a social and cultural interlocutor, which was essential to the researchers’ successful navigation in their various field sites. Drawing on distinct but common experiences conducting research in northern and western regions of rural Botswana, the authors reflexively consider a series of interwoven issues tied to their positionalities: the disparity in benefits and return on research investment between the researcher and research participants; the nature of commodified or transactional relations, especially in an impoverished region highly dependent on foreign tourists; the complex nature of researcher–research assistant relationships; and the contradictory dynamics of being female researchers in a patriarchal society while also embodying privileges of whiteness and Western nationality. Building on these reflections, the authors engage with current debates in the social sciences to argue that researcher reflexivity is not an adequate end point and should result in engagement with ethical and epistemological questions regarding the decolonization of research practices more broadly.
Background: Resource utilisation for infants with single ventricle CHD remains high, without well-studied ways to decrease the economic burden. Same-day discharge following cardiac catheterisation has been shown to be safe and effective in children with CHD, but those with single ventricle physiology are commonly excluded. The purpose of this study was to investigate the economic implications of planned same-day discharge following cardiac catheterisation versus universal overnight hospital admission in infants with single ventricle CHD.
Methods and Results:
A probabilistic decision-tree analysis with sensitivity analyses was performed. All included patients were categorised into four possible outcomes: discharge, readmission following discharge (within 48 hours), observation, and prolonged hospitalisation. Baseline probabilities at each node of the tree were then combined with the cost data to evaluate the comparative dominance of one decision (immediate discharge) versus the other (routine admission). Patients discharged on the same day as the procedure accrued the lowest attributed hospital cost ($5469), while patients readmitted to the hospital had the highest attributed cost ($11,851). Currently, no other studies have assessed the cost of hospitalisation following cardiac catheterisation in this population. Thus, we allowed for a wide range of cost variation, but same-day discharge dominated the decision outcome with a lower economic burden.
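The arithmetic behind a decision tree like this is a probability-weighted sum of branch costs per strategy. In the sketch below, only the two cost anchors ($5,469 same-day discharge; $11,851 readmission) come from the abstract; all probabilities and the remaining costs are illustrative placeholders.

```python
# Expected cost of each decision-tree strategy: sum over outcome
# branches of probability x cost. Probabilities are placeholders.
def expected_cost(branches):
    """branches: list of (probability, cost) pairs summing to 1."""
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * c for p, c in branches)

same_day = expected_cost([
    (0.90, 5_469),    # discharged same day, no return (abstract cost)
    (0.05, 11_851),   # readmitted within 48 h (abstract cost)
    (0.05, 8_000),    # observation (placeholder cost)
])
routine_admit = expected_cost([
    (0.95, 7_500),    # uneventful overnight stay (placeholder cost)
    (0.05, 10_000),   # prolonged hospitalisation (placeholder cost)
])
print(f"same-day discharge: ${same_day:,.0f}  "
      f"routine admission: ${routine_admit:,.0f}")
```

Sensitivity analysis then re-runs this calculation over ranges of the probabilities and costs to check whether the cheaper strategy ("dominance") changes.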
Conclusion: Same-day discharge following routine cardiac catheterisation in patients with single ventricle physiology is less costly than universal overnight admission. This demonstrates an important cost-limiting step in a complex population of patients who have high resource utilisation.
While child poverty is a significant risk factor for poor mental health, the developmental pathways involved with these associations are poorly understood. To advance knowledge about these important linkages, the present study examined the developmental sequelae of childhood exposure to poverty in a multiyear longitudinal study. Here, we focused on exposure to poverty, neurobiological circuitry connected to emotion dysregulation, later exposure to stressful life events, and symptoms of psychopathology. We grounded our work in a biopsychosocial perspective, with a specific interest in “stress sensitization” and emotion dysregulation. Motivated by past work, we first tested whether exposure to poverty was related to changes in the resting-state coupling between two brain structures centrally involved with emotion processing and regulation (the amygdala and the ventromedial prefrontal cortex; vmPFC). As predicted, we found lower household income at age 10 was related to lower resting-state coupling between these areas at age 15. We then tested if variations in amygdala–vmPFC connectivity interacted with more contemporaneous stressors to predict challenges with mental health at age 16. In line with past reports showing risk for poor mental health is greatest in those exposed to early and then later, more contemporaneous stress, we predicted and found that lower vmPFC–amygdala coupling in the context of greater contemporaneous stress was related to higher levels of internalizing and externalizing symptoms. We believe these important interactions between neurobiology and life history are an additional vantage point for understanding risk and resiliency, and suggest avenues for prediction of psychopathology related to early life challenge.
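A sketch of the two-step stress-sensitization analysis described above: an income-to-coupling regression, then a coupling-by-stress interaction predicting symptoms, where a significant negative interaction means low coupling plus high contemporaneous stress predicts more symptoms. Data are simulated and all variable names hypothetical.

```python
# Step 1: early income -> later amygdala-vmPFC resting-state coupling.
# Step 2: coupling x contemporaneous stress -> symptoms at 16.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 200
df = pd.DataFrame({
    "income_age10": rng.normal(0, 1, n),
    "stress_age16": rng.normal(0, 1, n),
})
df["coupling_age15"] = 0.3 * df["income_age10"] + rng.normal(0, 1, n)
df["symptoms_age16"] = (-0.2 * df["coupling_age15"] * df["stress_age16"]
                        + 0.3 * df["stress_age16"] + rng.normal(0, 1, n))

step1 = smf.ols("coupling_age15 ~ income_age10", data=df).fit()
step2 = smf.ols("symptoms_age16 ~ coupling_age15 * stress_age16",
                data=df).fit()
print("income -> coupling:", round(step1.params["income_age10"], 3))
print("coupling x stress:",
      round(step2.params["coupling_age15:stress_age16"], 3))
```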
Children with congenital heart disease are at high risk for malnutrition. Standardisation of feeding protocols has shown promise in decreasing some of this risk. With little standardisation between institutions’ feeding protocols and no understanding of protocol adherence, it is important to analyse the efficacy of individual aspects of the protocols.
Adherence to and deviation from a feeding protocol in high-risk congenital heart disease patients between December 2015 and March 2017 were analysed. Associations between adherence to and deviation from the protocol and clinical outcomes were also assessed. The primary outcome was change in weight-for-age z score between time intervals.
Increased adherence to and decreased deviation from individual instructions of a feeding protocol improved patients' change in weight-for-age z score between birth and hospital discharge (p = 0.031). Secondary outcomes such as markers of clinical severity and nutritional delivery were not statistically different between groups with high or low adherence or deviation rates.
High-risk feeding protocol adherence and fewer deviations are associated with weight gain independent of their influence on nutritional delivery and caloric intake. Future studies assessing the efficacy of feeding protocols should include measures of adherence and deviation that are not limited to caloric delivery and illness severity.
Objective: To test the effect of a behavioural economics intervention in two food pantries on the nutritional quality of the foods available at the pantries and of the foods selected by adults visiting them.
Design: An intervention (SuperShelf) was implemented in two food pantries (Sites A and B), with two other pantries (Sites C and D) serving as a control for pantry outcomes. The intervention aimed to increase the amount and variety of healthy foods (supply), as well as the appeal of healthy foods (demand), using behavioural economics strategies. Assessments included baseline and 4-month follow-up client surveys, client cart inventories, pantry inventories and environmental assessments. A fidelity score (range 0–100) was assigned to each intervention pantry to measure the degree of implementation. A Healthy Eating Index-2010 (HEI-2010) score (range 0–100) was generated for each client cart and pantry.
Setting: Four Minnesota food pantries, USA.
Participants: Clients visiting intervention pantries before (n 71) and after (n 70) the intervention.
Results: Fidelity scores differed by intervention site (Site A=82, Site B=51). At Site A, in adjusted models, client cart HEI-2010 scores increased on average by 11·8 points (P<0·0001), whereas there was no change at Site B. HEI-2010 pantry environment scores increased in intervention pantries (Site A=8 points, Site B=19 points) and decreased slightly in control pantries (Site C=−4 points, Site D=−3 points).
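A minimal sketch of the adjusted pre/post comparison of cart-level HEI-2010 scores at a single site, assuming an ordinary regression with a pre/post indicator and a hypothetical household-size covariate; data, covariates, and column names are placeholders rather than the study's models.

```python
# Adjusted pre/post difference in client-cart HEI-2010 at one site.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
pre = pd.DataFrame({"post": 0, "hei": rng.normal(50, 10, 71),
                    "hh_size": rng.integers(1, 7, 71)})
post = pd.DataFrame({"post": 1, "hei": rng.normal(62, 10, 70),
                     "hh_size": rng.integers(1, 7, 70)})
carts = pd.concat([pre, post], ignore_index=True)

model = smf.ols("hei ~ post + hh_size", data=carts).fit()
print(f"adjusted change: {model.params['post']:.1f} points, "
      f"p={model.pvalues['post']:.2g}")
```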
Conclusions: When implemented as intended, SuperShelf has the potential to improve the nutritional quality of foods available to and selected by pantry clients.
Objectives: Down syndrome (DS) is a population with known hippocampal impairment, with studies showing that individuals with DS display difficulties in spatial navigation and in remembering arbitrary bindings. Recent research has also demonstrated the importance of the hippocampus for novel word-learning. Based on these data, we aimed to determine whether individuals with DS show deficits in learning new labels and whether they may benefit from encoding conditions thought to be less reliant on hippocampal function (i.e., fast mapping). Methods: In the current study, we examined immediate, 5-min, and 1-week delayed word-learning across two learning conditions (explicit encoding vs. fast mapping). These conditions were examined across groups (twenty-six 3- to 5-year-old typically developing children and twenty-six 11- to 28-year-old individuals with DS with comparable verbal and nonverbal scores on the Kaufman Brief Intelligence Test – second edition) and in reference to sleep quality. Results: Both individuals with and without DS showed retention after a 1-week delay, and, contrary to our expectations, we found no benefit of the fast mapping condition in either group. Eye-tracking data showed that preferential eye movements to target words were not present immediately but emerged after 1 week in both groups. Furthermore, sleep measures collected via actigraphy did not relate to retention in either group. Conclusions: This study presents novel data on long-term knowledge retention in reference to sleep patterns in DS and adds to a body of knowledge helping us to understand the processes of word-learning in typical and atypically developing populations. (JINS, 2018, 24, 955–965)
The rumen microbiome has the important task of supplying ruminants with most of their dietary requirements and is responsible for up to 90% of their metabolic needs. This tremendous feat is possible due to the large diversity of microorganisms in the rumen, which is considered one of the most diverse ecosystems on the planet in terms of species diversity and functional richness. From the moment feed is ingested, it enters a vast cascade in which specialized microorganisms degrade specific components of the feed, turning them into molecules that are in turn utilized as anabolic precursors and energy sources by the animal. The output of this degradation process not only affects the animal but also has an extensive impact on the environment: some of the byproducts emitted as waste, such as methane, act as greenhouse gases that contribute greatly to global warming. Recent technological advances developed to study this community have enabled a broader overview of its vast taxonomic and functional diversity, leading to a better understanding of its ecology and function. This deeper understanding encompasses the forces that shape the microbiome's composition, the variation among animals, the stability of its key components, and the processes of succession on short and long time scales, such as primary colonization and diurnal oscillations. These collective understandings have helped provide insights into the potential effects of these forces on the outputs observed from the animal itself. In recent years, a growing body of evidence has demonstrated the link between the microbiome and the productivity and environmental impact of the host animal, placing rumen microbiome studies at the forefront of animal agricultural research. In this review, we focus on natural variations in community composition that are not the result of different management or feed but rather are intrinsic features of the animals themselves. We characterize the rumen microbiome and its potential impact on its host, as well as the barriers to implementing current knowledge to modulate the microbiome, and point toward potential avenues for overcoming these hurdles.
Background: Delirium is heterogeneous and can vary by etiology.
Objective: We sought to determine how delirium subtyped by etiology affected six-month function and cognition.
Design: Prospective cohort study.
Setting: Tertiary care, academic medical center.
Participants: A total of 228 hospitalized patients aged >65 years admitted from the emergency department (ED).
Measurements: The modified Brief Confusion Assessment Method was used to determine delirium in the ED. Delirium etiology was determined by three trained physician reviewers using the Delirium Etiology Checklist. Pre-illness and six-month function and cognition were determined using the Older American Resources and Services Activities of Daily Living (OARS ADL) questionnaire and the short-form Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Multiple linear regression was performed to determine whether delirium etiology subtypes were associated with six-month function and cognition, adjusted for baseline OARS ADL and IQCODE. Two-factor interactions were incorporated to determine whether pre-illness function or cognition modified the relationships between delirium subtypes and six-month function and cognition.
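A sketch of the adjusted model with a two-factor interaction, as described above: an etiology indicator crossed with baseline function, predicting six-month function while adjusting for baseline function and cognition. Data are simulated and column names hypothetical, not the study's dataset.

```python
# Etiology x baseline-function interaction predicting six-month
# OARS ADL, adjusted for baseline OARS ADL and IQCODE. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 228
df = pd.DataFrame({
    "metabolic": rng.binomial(1, 0.3, n),      # etiology subtype flag
    "oars_baseline": rng.normal(20, 5, n),     # pre-illness function
    "iqcode_baseline": rng.normal(3.3, 0.4, n) # pre-illness cognition
})
df["oars_6mo"] = (df["oars_baseline"] - 2.0 * df["metabolic"]
                  + 0.15 * df["metabolic"] * df["oars_baseline"]
                  + rng.normal(0, 3, n))

model = smf.ols(
    "oars_6mo ~ metabolic * oars_baseline + iqcode_baseline", data=df
).fit()
# The interaction term tests whether the effect of metabolic-disturbance
# delirium on six-month function depends on pre-illness function.
print(model.summary().tables[1])
```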
Results: In patients with poorer pre-illness function only, delirium secondary to metabolic disturbance (β coefficient = −2.9 points; 95% CI: −5.6 to −0.3) and organ dysfunction (β coefficient = −4.3 points; 95% CI: −7.2 to −1.4) was significantly associated with poorer six-month function. In patients with intact cognition only, delirium secondary to central nervous system insults was significantly associated with poorer cognition (β coefficient = 0.69; 95% CI: 0.19 to 1.20).
Conclusions: Delirium is heterogeneous, and different etiologies may have different prognostic implications. Furthermore, the effect of these delirium etiologies on outcome may depend on the patient's pre-illness functional status and cognition.
Background: Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
Objective: To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
Methods: American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal-procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
The primary outcome of interest was 30-day SSI rate.
Results: A total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
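A sketch of the risk-factor model behind odds ratios like these: a logistic regression of 30-day SSI on candidate predictors, with each OR recovered as exp(coefficient). The data are simulated with hypothetical column names; this is not the NSQIP file or the study's model specification.

```python
# Logistic regression of SSI on candidate risk factors; OR = exp(beta).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 5893
df = pd.DataFrame({
    "disseminated_cancer": rng.binomial(1, 0.1, n),
    "asa_ge3": rng.binomial(1, 0.5, n),
    "copd": rng.binomial(1, 0.05, n),
    "op_hours": rng.gamma(3, 1, n),
})
logit_p = (-2.6 + 0.5 * df["disseminated_cancer"] + 0.34 * df["asa_ge3"]
           + 0.47 * df["copd"] + 0.1 * df["op_hours"])
df["ssi"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("ssi ~ disseminated_cancer + asa_ge3 + copd + op_hours",
                  data=df).fit(disp=False)
odds_ratios = np.exp(model.params)   # OR for each predictor
ci = np.exp(model.conf_int())        # 95% CI on the OR scale
print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))
```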
Conclusions: Patients with disseminated cancer are at higher risk for developing SSI. ASA score ≥3, COPD, and longer duration of surgery predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) in generating risk-adjusted outcomes.
The need for increased monitoring and evaluation within the conservation sector has been well documented, and includes the monitoring and evaluation of training activities. We evaluated the impacts of a long-term training programme in Mauritius, using a questionnaire and semi-structured key informant interviews to develop a theory of change from the perspective of the trainers, and validated it against participants' perceptions of the benefits of training. Our findings indicated that an important outcome of training was to increase participants' belief that they could effect change, also called perception of control; this is related to an increase in a trainee's practical skills, which enables them to become more effective in their work. However, if a trainee's work environment was negative, the impact of training on practical skills, job performance and perception of control was lower. Neither the acquisition of conservation theory nor the opportunity to network was perceived by participants as improving their conservation performance, despite trainers anticipating that these matters would be important. Perception of control and work environment should therefore be considered when designing conservation training programmes, and the effectiveness of teaching conservation theory and networking should be examined further.
The developmental course of daily functioning prior to first psychosis-onset remains poorly understood. This study explored age-related periods of change in social and role functioning. The longitudinal study included youth aged 12–23 years (mean follow-up = 1.19 years) at clinical high risk (CHR) for psychosis (converters [CHR-C], n = 83; nonconverters [CHR-NC], n = 275) and a healthy control group (n = 164). Mixed-model analyses were performed to determine age-related differences in social and role functioning. We limited our analyses to functioning before psychosis conversion; thus, data of CHR-C participants gathered after psychosis onset were excluded. In controls, social and role functioning improved over time. From at least age 12, functioning in CHR was poorer than in controls, and this lag persisted over time. Between ages 15 and 18, social functioning in CHR-C stagnated and diverged from that of CHR-NC, who continued to improve (p = .001). Subsequently, CHR-C lagged behind in improvement between ages 21 and 23, further distinguishing them from CHR-NC (p < .001). A similar period of stagnation was apparent for role functioning, but to a lesser extent (p = .007). The results remained consistent when we accounted for the time to conversion. Our findings suggest that CHR-C start lagging behind CHR-NC in social and role functioning in adolescence, followed by a period of further stagnation in adulthood.
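A sketch of the mixed-model structure such an analysis implies: repeated functioning scores nested within participants, fixed effects for age and group, a group-by-age interaction to test diverging trajectories, and a random intercept per participant. Data are simulated and names hypothetical; the study's exact model specification may differ.

```python
# Mixed model: social functioning ~ age x group with random intercepts
# per participant. Simulated trajectories for three groups.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
rows = []
for pid in range(300):
    group = rng.choice(["control", "CHR-NC", "CHR-C"])
    base = {"control": 8.0, "CHR-NC": 6.5, "CHR-C": 6.3}[group]
    slope = {"control": 0.25, "CHR-NC": 0.20, "CHR-C": 0.05}[group]
    intercept = rng.normal(0, 0.8)  # person-level random intercept
    for age in (12, 15, 18, 21, 23):
        rows.append({"pid": pid, "group": group, "age": age,
                     "social": base + intercept + slope * (age - 12)
                               + rng.normal(0, 0.5)})
df = pd.DataFrame(rows)

# The group x age terms test whether CHR-C trajectories diverge.
model = smf.mixedlm("social ~ age * C(group, Treatment('control'))",
                    data=df, groups=df["pid"]).fit()
print(model.summary())
```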
This study assessed the strength of the association between socioeconomic status (SES) and low birth weight (LBW) and preterm birth (PTB) in Southwestern Ontario. Utilizing perinatal and neonatal databases at the London Health Sciences Centre, maternal postal codes were entered into a Geographic Information System to determine home neighbourhoods. Neighbourhoods were defined by dissemination areas (DAs). Median household income for each DA was extracted from the latest Canadian Census and linked to each mother. All singleton infants born between February 2009 and February 2014 were included. Of 26,654 live singleton births, 6.4% were LBW and 9.7% were PTB. Top risk factors for LBW were maternal amphetamine use, chronic hypertension, and maternal marijuana use (ORs: 17.51, 3.18, and 2.72, respectively); previously diagnosed diabetes, maternal narcotic use, and insulin-controlled gestational diabetes predicted PTB (ORs: 17.95, 2.69, and 2.42, respectively). Overall, SES had little impact on adverse birth outcomes, although low maternal education increased the likelihood of an LBW neonate (OR: 1.01).
In this article, I present the results of an analysis of codex-style polychrome ceramics recovered from excavations of commoner households at the Late Postclassic center of Tututepec (Yucu Dzaa), Oaxaca, Mexico. Employing a basic semiotic approach, I examine the images depicted on these materials to draw inferences regarding the salient themes they expressed and how these themes related to broader social discourse and political ideology. In short, I argue that the data suggest an articulation between the popular ideologies of commoners and ideals that were promoted by the site's ruling elites, a concordance that likely arose through dialogic social processes rather than coercion or false consciousness. This state of affairs may have contributed to the success of the polity, which was the center of a powerful territorial empire at the time of Spanish contact.