Loss-of-control (LOC) eating commonly develops during adolescence, and it predicts full-syndrome eating disorders and excess weight gain. Although negative emotions and emotion dysregulation are hypothesized to precede and predict LOC eating, they are rarely examined outside the self-report domain. Autonomic indices, including heart rate (HR) and heart rate variability (HRV), may provide information about stress and capacity for emotion regulation in response to stress.
We studied whether autonomic indices predict LOC eating in real-time in adolescents with LOC eating and body mass index (BMI) ⩾70th percentile. Twenty-four adolescents aged 12–18 (67% female; BMI percentile mean ± standard deviation = 92.6 ± 9.4) who reported at least twice-monthly LOC episodes wore biosensors to monitor HR, HRV, and physical activity for 1 week. They reported their degree of LOC after all eating episodes on a visual analog scale (0–100) using a smartphone.
Adjusting for physical activity and time of day, higher HR and lower HRV predicted higher self-reported LOC after eating. Parsing between- and within-subjects effects, there was a significant, positive, within-subjects association between pre-meal HR and post-meal LOC rating. However, there was no significant within-subjects effect for HRV, nor were there between-subjects effects for either electrophysiologic variable.
Findings suggest that autonomic indices may either be a marker of risk for subsequent LOC eating or contribute to LOC eating. Linking physiological markers with behavior in the natural environment can improve knowledge of illness mechanisms and provide new avenues for intervention.
To overcome grass supply shortages on the main grazing block, some pasture-based dairy farmers are using zero-grazing (also known as ‘cut and carry’), whereby cows are periodically housed and fed fresh grass harvested from external land blocks. To determine the effect of zero-grazing on cow performance, two early-lactation experiments were conducted with autumn- and spring-calving dairy cows. Cows were assigned to one of two treatments in a randomized complete block design. The two treatments were zero-grazing (ZG) and grazing (G). The ZG group were housed and fed zero-grazed grass, while the G group grazed outdoors at pasture. Both treatments were fed perennial ryegrass (Lolium perenne L.) from the same paddock. In experiment 1, 24 Holstein Friesian cows (n = 12 per treatment) were studied over a 35-day experimental period in autumn and offered fresh grass, grass silage, ground maize and concentrates. In experiment 2, 30 Holstein Friesian cows (n = 15 per treatment) were studied over a 42-day experimental period and offered fresh grass and concentrates. Average dry matter intake and milk yield were similar for ZG and G in both experiments. Likewise, ZG did not have an effect on milk composition, body condition or locomotion. Zero-grazing had no effect on total nitrogen excretion or nitrogen utilization efficiency in either experiment, or on rumen pH and ammonia concentration in experiment 1. While zero-grazing may enable farmers to supply fresh grass to early-lactation cows in spring and autumn, results from this study suggest that there are no additional benefits to cow performance in comparison to well-managed grazed grass.
Caregivers of patients with cancer are at significant risk for existential distress. Such distress negatively impacts caregivers’ quality of life and capacity to serve in their role as healthcare proxies, and ultimately, contributes to poor bereavement outcomes. Our team developed Meaning-Centered Psychotherapy for Cancer Caregivers (MCP-C), the first targeted psychosocial intervention that directly addresses existential distress in caregivers.
Nine caregivers of patients with glioblastoma multiforme (GBM) enrolled in a pilot randomized controlled trial evaluating the feasibility, acceptability, and effects of MCP-C, and completed in-depth interviews about their experience in the therapy. One focus group with three MCP-C interventionists was also completed.
Four key themes emerged from interviews: (1) MCP-C validated caregivers’ experience of caregiving; (2) MCP-C helped participants reframe their “caregiving identity” as a facet of their larger self-identity, by placing caregiving in the context of their life's journey; (3) MCP-C enabled caregivers to find ways to assert their agency through caregiving; and (4) the structure and sequence of sessions made MCP-C accessible and feasible. Feedback from interventionists highlighted several potential manual changes and overall ways in which MCP-C can help facilitate caregivers’ openness to discussing death and engaging in advance care planning discussions with the patient.
Significance of results
The overarching goal of MCP-C is to allow caregivers to concurrently experience meaning and suffering; the intervention does not seek to deny the reality of challenges endured by caregivers, but instead to foster a connection to meaning and purpose alongside their suffering. Through in-depth interviews with caregivers and a focus group with MCP interventionists, we have refined and improved our MCP-C manual so that it can most effectively assist caregivers in experiencing meaning and purpose, despite inevitable suffering.
The National Institutes of Health launched the NIH Centers for Accelerated Innovation and the Research Evaluation and Commercialization Hubs programs to develop approaches and strategies to promote academic entrepreneurship and translate research discoveries into products and tools to help patients. The two programs collectively funded 11 sites at individual research institutions or consortia of institutions around the United States. Sites provided funding, project management, and coaching to funded investigators and commercialization education programs open to their research communities.
We implemented an evaluation program that included longitudinal tracking of funded technology development projects and commercialization outcomes; interviews with site teams, funded investigators, and relevant institutional and innovation ecosystem stakeholders; and analysis and review of administrative data.
As of May 2021, interim results for 366 funded projects show that these technologies have received nearly $1.7 billion in follow-on funding to date. There were 88 start-ups formed, a 40% Small Business Innovation Research/Small Business Technology Transfer application success rate, and 17 licenses with small and large businesses. Twelve technologies are currently in clinical testing and three are on the market.
Best practices used by the sites included milestone-based project management by leadership teams, external advisory boards that evaluated funding applications for commercial as well as scientific merit, sustained engagement with the academic community to shift attitudes about commercialization, application processes synced with education programs, and the provision of project managers with private-sector product development expertise to coach funded investigators.
Although the DSM-5 was adopted in 2013, the validity of the new substance use disorder (SUD) diagnosis and craving criterion has not been investigated systematically across substances.
Adults (N = 588) who engaged in binge drinking or illicit drug use and endorsed at least one DSM-5 SUD criterion were included. DSM-5 SUD criteria were assessed for alcohol, tobacco, cannabis, cocaine, heroin, and opioids. Craving was considered positive if “wanted to use so badly that could not think of anything else” (severe craving) or “felt a very strong desire or urge to use” (moderate craving) was endorsed. Baseline information on substance-related variables and psychopathology was collected, and electronic daily assessment queried substance use for the following 90 days. For each substance, logistic regression estimated the association between craving and validators, i.e. variables expected to be related to craving/SUD, and whether association with the validators differed for DSM-5 SUD diagnosed with craving as a criterion v. without.
Across substances, craving was associated with most baseline validators (p values < 0.05); neither moderate nor severe craving consistently showed greater associations. Baseline craving predicted subsequent use [odds ratios (OR): 4.2 (alcohol) – 234.3 (heroin); p's ⩽ 0.0001], with stronger associations for moderate than severe craving (p's < 0.05). Baseline DSM-5 SUD showed stronger associations with subsequent use when diagnosed with craving than without (p's < 0.05).
The DSM-5 craving criterion as operationalized in this study is valid. Including craving improves the validity and clinical relevance of DSM-5 SUD diagnoses, since craving may cause impaired control over use and contribute to the development and maintenance of SUD.
How does a ‘space culture’ emerge and evolve, and how can archaeologists study such a phenomenon? The International Space Station Archaeological Project seeks to analyse the social and cultural context of an assemblage relating to the human presence in space. Drawing on concepts from contemporary archaeology, the project pursues a unique perspective beyond sociological or ethnographical approaches. Semiotic analysis of material culture and proxemic analysis of embodied space can be achieved using NASA's archives of documentation, images, video and audio media. Here, the authors set out a method for the study of this evidence. Understanding how individuals and groups use material culture in space stations, from discrete objects to contextual relationships, promises to reveal intersections of identity, nationality and community.
Cultural evolutionary theory conceptualises culture as an information-transmission system whose dynamics take on evolutionary properties. Within this framework, however, innovation has been likened to random mutations, reducing its occurrence to chance or fortuitous transmission error. In introducing the special collection on children and innovation, we here place object play and play objects – especially functional miniatures – from carefully chosen archaeological contexts in a niche construction perspective. Given that play, including object play, is ubiquitous in human societies, we suggest that plaything construction, provisioning and use have, over evolutionary timescales, paid substantial selective dividends via ontogenetic niche modification. Combining findings from cognitive science, ethology and ethnography with insights into hominin early developmental life-history, we show how play objects and object play probably had decisive roles in the emergence of innovative capabilities. Importantly, we argue that closer attention to play objects can go some way towards addressing changes in innovation rates that occurred throughout human biocultural evolution and why innovations are observable within certain technological domains but not others.
Chaff lining and chaff tramlining are harvest weed seed control (HWSC) systems that involve the concentration of chaff material containing weed seed into narrow (20 to 30 cm) rows between or on the harvester wheel tracks during harvest. These lines of chaff are left intact in the fields through subsequent cropping seasons on the assumption that the chaff environment is unfavorable for weed seed survival. The effect of the chaff row environment on weed seed survival was examined in field studies, and chaff response studies determined the influence of increasing amounts of chaff on weed seedling emergence. The objectives of these studies were to determine the influences of (1) chaff lines on the summer–autumn seed survival of selected weed species and (2) chaff type and amount on rigid ryegrass seedling emergence. There was frequently no difference (P > 0.05) in seed survival of four weed species (rigid ryegrass, wild oat, annual sowthistle, and turnip weed) when seeds were placed beneath or beside chaff lines. In one instance, wild oat seed survival was increased (P < 0.05) when seeds were placed beneath compared to beside a chaff line. The pot studies determined that increasing amounts of chaff consistently resulted in decreasing numbers of rigid ryegrass seedlings emerging through chaff material. The suppression of emergence broadly followed a linear relationship in which there was approximately a 2.0% reduction in emergence with every 1,000 kg ha⁻¹ increase in chaff material. This relationship was consistent across wheat, barley, canola, and lupin chaff types, indicating that the physical presence of the chaff was more important than chaff type. These studies suggested that chaff lines may not affect the survival over summer–autumn of the contained weed seeds but that the subsequent emergence of weed seedlings will be restricted by high amounts of chaff (>40,000 kg ha⁻¹).
The purpose of this article was to determine the impact of employing a telephone clinic for follow-up of patients with stable lateral skull-base tumours.
An analysis of 1515 patients in the national lateral skull-base service was performed, and 148 patients enrolled in the telephone clinic to date were identified. The length of time that patients waited for the results of their follow-up scans and the travel distance saved by patients not having to attend the hospital for their results were determined.
The mean time from scan to receiving results was 30.5 ± 32 days, 14 days sooner than in the face-to-face group (p = 0.0016). The average round-trip distance travelled by patients to the hospital for results of their scans was 256 ± 131 km.
The telephone clinic led to a significant reduction in time until patients received their scan results and helped reduce travel distance and clinic numbers in traditional face-to-face clinics.
Inflammation may contribute to the high prevalence of depressive symptoms seen in lung cancer. “Sickness behavior” is a cluster of symptoms induced by inflammation that are similar to, but distinct from, depressive symptoms. The Sickness Behavior Inventory-Revised (SBI-R) was developed to measure sickness behavior. We hypothesized that the SBI-R would demonstrate adequate psychometric properties in association with inflammation.
Participants with stage IV lung cancer (n = 92) were evaluated for sickness behavior using the SBI-R. Concomitant assessments were made of depression (Patient Health Questionnaire-9, Hospital Anxiety and Depression Scale) and inflammation [C-reactive protein (CRP)]. Classical test theory (CTT) was applied, and multivariate models were created to explain SBI-R associations with depression and inflammation. Factor analysis was also used to identify the underlying factor structure of the hypothesized construct of sickness behavior. A longitudinal analysis was conducted for a subset of participants.
The sample mean for the 12-item SBI-R was 8.3 (6.7), with a range from 0 to 33. The SBI-R demonstrated adequate internal consistency, with a Cronbach's coefficient of 0.85, which did not increase by more than 0.01 with any single-item removal. This analysis examined factor loadings onto a single factor extracted using the principal components method. Eleven items had factor loadings that exceeded 0.40. SBI-R total scores were significantly correlated with depressive symptoms (r = 0.78, p < 0.001) and CRP (r = 0.47, p < 0.001). Multivariate analyses revealed that inflammation and depressive symptoms explained 67% of SBI-R variance.
Significance of results
The SBI-R demonstrated adequate reliability and construct validity in this patient population with metastatic lung cancer. The observed findings suggest that the SBI-R can meaningfully capture the presence of sickness behavior and may facilitate a greater understanding of inflammatory depression.
The Late Triassic fauna of the Lossiemouth Sandstone Formation (LSF) from the Elgin area, Scotland, has been pivotal in expanding our understanding of Triassic terrestrial tetrapods. Frustratingly, due to their odd preservation, interpretations of the Elgin Triassic specimens have relied on destructive moulding techniques, which only provide incomplete, and potentially distorted, information. Here, we show that micro-computed tomography (μCT) could revitalise the study of this important assemblage. We describe a long-neglected specimen that was originally identified as a pseudosuchian archosaur, Ornithosuchus woodwardi. μCT scans revealed dozens of bones belonging to at least two taxa: a small-bodied pseudosuchian and a specimen of the procolophonid Leptopleuron lacertinum. The pseudosuchian skeleton possesses a combination of characters that are unique to the clade Erpetosuchidae. As a basis for investigating the phylogenetic relationships of this new specimen, we reviewed the anatomy, taxonomy and systematics of other erpetosuchid specimens from the LSF (all previously referred to Erpetosuchus). Unfortunately, due to the differing representation of the skeleton in the available Erpetosuchus specimens, we cannot determine whether the erpetosuchid specimen we describe here belongs to Erpetosuchus granti (to which we show it is closely related) or if it represents a distinct new taxon. Nevertheless, our results shed light on rarely preserved details of erpetosuchid anatomy. Finally, the unanticipated new information extracted from both previously studied and neglected specimens suggests that fossil remains may be much more widely distributed in the Elgin quarries than previously recognised, and that the richness of the LSF might have been underestimated.
Introduction: The opioid crisis has reached epidemic levels in Canada, driven in large part by prescription drug use. Emergency physicians are frequent prescribers of opioids; therefore, the emergency department (ED) represents an important setting for potential intervention to encourage rational and safe prescribing. The objective of this study was to systematically review the literature on interventions aimed to influence opioid prescribing in the ED. Methods: Electronic searches of Medline and Cochrane were conducted and reference lists were hand-searched. All quantitative studies published in English from 2009 to 2019 were eligible for inclusion. Two reviewers independently screened the search output to identify potentially eligible studies, the full texts of which were retrieved and assessed for inclusion. Outcomes of interest included opioid prescribing rate (proportion of ED visits resulting in an opioid prescription at discharge), morphine milligram equivalents per prescription and variability among prescribers. Results: The search strategy yielded 797 potentially relevant citations. After eliminating duplicate citations and studies that did not meet eligibility criteria, 34 potentially relevant studies were retrieved in full text. Of these, 28 studies were included in the review. The majority (26, 92.9%) of studies were based in the United States and two (7.1%) were from Australia. Four (14.3%) were randomized controlled trials. The interventions were classified into six categories: prescribing guidelines (n = 10), regulation/rescheduling of opioids (n = 6), prescribing data transparency (n = 4), education (n = 4), care coordination (n = 3), and electronic medical record changes (n = 1). The majority of interventions reduced the opioid prescribing rate from the ED (21/28, 75.0%), although regulation/rescheduling of opioids had mixed effectiveness, with 3/6 (50%) studies reporting a small increase in the opioid prescribing rate post-intervention. 
Education had small yet consistent effects on reducing the opioid prescribing rate. Conclusion: A variety of interventions have attempted to improve opioid prescribing from the ED. These interventions include prescribing guidelines, regulation/rescheduling, data transparency, education, care coordination, and electronic medical record changes. The majority of interventions reduced the opioid prescribing rate; however, regulation/rescheduling of opioids demonstrated mixed effectiveness.
The aim of the study was to assess the experiences of discrimination as reported by people with mental health problems and to explore the impact of hospitalisation.
306 people with mental health problems provided sociodemographic data and data on discrimination using the Discrimination and Stigma Scale version 12 (DISC-12), with the domains negative experienced discrimination, anticipated discrimination, overcoming stigma and discrimination, and positive experienced discrimination. Logistic regression analysis was used to test the impact of hospitalisation on discrimination, controlling for age, gender, education, employment, diagnosis and having been prescribed medication.
Hospitalisation had a major impact on negative discrimination: people were more likely to be treated unfairly in making or keeping friends, in marriage or divorce, by people in their neighbourhood, in social life, by mental health staff and in terms of privacy, if they had been hospitalised. They were also more likely to be avoided or shunned by people who knew about the mental health problem. People with a history of hospitalisation also reported more anticipated discrimination: they had stopped themselves more often from having a close personal relationship and concealed their mental health problem from others more often than those without a history of hospitalisation. However, people who had been hospitalised also experienced more positive discrimination, in terms of being treated more positively in getting welfare benefits or disability pensions and in housing.
Findings suggest that treatment in hospital contributed more to experienced discrimination than treatment in the community.
Conservation practitioners use a wide range of sources to inform decisions, but studies report that personal experience is usually most important; scientific papers and unpublished research are rarely referred to. For site-based conservation practitioners, day-to-day decisions are typically made within a context of earlier decisions taken at two levels: strategic decisions that define the aims and policies of the wider organisation; and management planning decisions which outline the objectives for a site and the actions needed to achieve them. Even where decisions are underpinned by scientific evidence, personal judgement is valuable in ensuring management actions are tailored to the specific site. The integration of scientific evidence into conservation decision-making could be improved. We suggest two main approaches. First, increase the synthesis, translation and exchange of scientific research into easily accessible, practical information. Second, ensure that decision-making processes involve skilled ecological advisors and scientists who keep up to date with relevant literature and are able to advise on site-specific evidence-based solutions.
This study investigated the attitudes of medical students towards psychiatry, both as a subject on the medical curriculum and as a career choice. Three separate questionnaires previously validated on medical student populations were administered prior to and immediately following an 8-week clinical training programme. The results indicate that the perception of psychiatry was positive prior to clerkship and became even more so on completion of training. On completion of the clerkship, there was a rise in the proportion of students who indicated that they might choose a career in psychiatry. Attitudes toward psychiatry correlated positively with psychiatry examination results, and those who intended to specialise in psychiatry achieved significantly higher scores in the psychiatry examination.
The present functional magnetic resonance imaging (fMRI) study used an emotional Stroop task to investigate neural changes in relation to mood-biased processing in depression, before and after cognitive behavioral therapy (CBT).
Sixteen unmedicated patients (mean age 40 years) fulfilling DSM-IV criteria for unipolar major depression underwent fMRI prior to and after 16 once-weekly sessions of CBT. Sixteen matched healthy volunteers were scanned at similar time intervals. In the emotional Stroop task, negative and neutral words were presented in various colors, and volunteers had to name the color of the words. Latencies were recorded to determine behavioral emotional interference effects. MRI images were acquired using clustered image acquisition. Whole-brain and region-of-interest analyses examined the neural basis of interference and mood-biased processing.
At baseline, patients displayed increased latencies when color-naming negative words, in comparison to neutral words and relative to healthy volunteers. After treatment, latencies did not significantly differ between groups. With regard to neural activity, depressed patients showed increased activation at baseline in the amygdala, dorsolateral prefrontal cortex (DLPFC), and ventrolateral prefrontal cortex (VLPFC), which normalized after CBT. Additionally, hyperactivation in the rostral anterior cingulate at baseline was positively correlated with symptom reduction after CBT.
Evidence was found for an emotional interference effect during acute states of depression, which improved following CBT. This effect was associated with increased activity in the amygdala, DLPFC and VLPFC, which normalized after treatment. CBT appears to affect both behavioral biases and the neural circuits involved in processing negative information.
Although genetic and environmental factors operating before or around the time of birth have been demonstrated to be relevant to the aetiology of the major psychoses, a seasonal variation in the rates of admission of such patients has long been recognised. Few studies have compared first admissions and readmissions. This study examined seasonal variation in admissions for the major psychoses and compared diagnostic categories by admission status. Patients admitted to Irish psychiatric inpatient facilities between 1989 and 1994 with an ICD-9/10 diagnosis of schizophrenia or affective disorder were identified from the National Psychiatric Inpatient Reporting System (NPIRS). The data were analysed using a hierarchical log linear model, the chi-square test, a Kolmogorov-Smirnov (KS)-type statistic, and the method of Walter and Elwood. The hierarchical log linear model demonstrated significant interactions between the month of admission and admission order (change in scaled deviance 28.77, df = 11, P < 0.003). Both first admissions with mania and readmissions with bipolar affective disorder exhibited significant seasonality. In contrast, only first admissions with schizophrenia showed significant seasonal effects. Although first admissions with mania and readmissions with bipolar disorder both show seasonality, seasonal influences appear to be more relevant to the onset of schizophrenia than to subsequent relapse.
We sought to explore whether obstetric complications (OCs) are more likely to occur in the presence of familial/genetic susceptibility for schizophrenia or whether they themselves represent an independent environmental risk factor for schizophrenia.
The presence of OCs was assessed through maternal interview on 216 subjects, comprising 36 patients with schizophrenia from multiply affected families, 38 of their unaffected siblings, 31 schizophrenic patients with no family history of psychosis, 51 of their unaffected siblings and 60 normal comparison subjects. We examined the familiality of OCs and whether OCs were commoner in the patient and sibling groups than in the control group.
OCs tended to cluster within families, especially in multiply affected families. Patients with schizophrenia, especially those from multiply affected families, had a significantly higher rate of OCs compared to normal comparison subjects, but there was no evidence for an elevated rate of OCs in unaffected siblings.
Our data provide little evidence for a link between OCs and genetic susceptibility to schizophrenia. If high rates of OCs are related to schizophrenia genes, this relationship is weak and will only be detected by very large sample sizes.