Economic arrangements, Ramseyer writes, are structured and implemented with the intent and hope that they will be carried out with 'care, intelligence, discretion, and effort.' Yet entrepreneurs work with partial information about the products, and people, they are dealing with. Contracting in Japan illustrates this by examining five sets of negotiations and unusual contractual arrangements among non-specialist businessmen, and women, in Japan. In it, Ramseyer explores how sake brewers were able to obtain and market the necessary, but difficult-to-grow, sake rice that captured the local terroir; how Buddhist temples tried to compensate for rapidly falling donations by negotiating unusual funerary contracts; and how pre-war local elites used leasing instead of loans to fund local agriculture. Ramseyer examines these entrepreneurs, discovering how they structured contracts, made credible commitments, obtained valuable information, and protected themselves from adverse consequences to create, maintain, strengthen, and leverage the social networks in which they operated.
Sequential thermal analysis allows for deconvoluting the refractory nature and complexity of carbon mixtures embedded in mineral matrices for subsequent offline stable carbon and radiocarbon (14C) isotope analyses. Originally developed to separate Holocene from more ancient sedimentary organic matter to improve dating of marine sediments, the Ramped Pyrolysis and Oxidation (RPO) apparatus, or, informally, the “dirt burner,” is now used to address pressing questions in the broad field of biogeochemistry. Growing interest in the community now necessitates improved handling and procedures for routine analyses of difficult sample types. Here we report on advances in CO2 purification during sample processing and modifications to the instrumentation at the National Ocean Sciences Accelerator Mass Spectrometry (NOSAMS) facility, and we introduce sodium bicarbonate procedural standards with differing natural abundance 14C signatures for blank assessment. Measurements from different environmental samples are used to compare the procedure to the different generations of sequential thermal analyses. With this study, we aim to improve the standardization of the procedures and prepare this instrumentation for future innovations in online stable carbon isotope and direct AMS-interface measurements.
We present U–Pb dates from peridotitic pyrope-rich garnet from four mantle xenoliths entrained in a kimberlite from Bultfontein, South Africa. Garnet dates magmatic emplacement due to the high mantle residence temperatures of the source material prior to eruption, which were most likely above the closure temperature for the pyrope U–Pb system. We determine a U–Pb date of 84.0 ± 8.1 Ma for the emplacement of the Bultfontein kimberlite from garnet in our four xenolith samples. The date reproduces previous dates obtained from other mineral-isotope systems (chiefly Rb–Sr in phlogopite). Garnet can be dated despite extremely low concentrations of U (median ∼0.05 μg/g), because concentrations of common Pb are often low or non-detectable. This means that sub-concordant garnets can be dated with moderate precision using very large laser-ablation spots (130 μm) measured by quadrupole inductively coupled plasma – mass spectrometry (LA-Q-ICP-MS). Our strategy demonstrates successful U–Pb dating of a U-poor mineral due to high initial ratios of U to common Pb in some grains, and the wide spread of isotopic compositions of grains on a concordia diagram. In addition, the analytical protocol is not complex and uses widely available analytical methods and strategies. This new methodology has some advantages and disadvantages for dating kimberlite emplacement versus established methods (U-based decay systems in perovskite and zircon, or Rb- or K-based systems in phlogopite). However, this method has unique promise for its potential application to detrital diamond prospecting and, more speculatively, to the dating of pyrope inclusions in diamond.
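As a back-of-the-envelope illustration of the dating principle described above (not the authors' data-reduction pipeline), a radiogenic 206Pb*/238U ratio converts to an age via the standard decay equation; the ratio used below is an illustrative value chosen to land near the reported ~84 Ma emplacement age.

```python
import math

# 238U decay constant, per year (Jaffey et al. 1971 value, widely used)
LAMBDA_238 = 1.55125e-10

def pb206_u238_age(ratio_206_238):
    """Convert a radiogenic 206Pb*/238U ratio to an age in years:
    t = ln(1 + 206Pb*/238U) / lambda_238."""
    return math.log(1.0 + ratio_206_238) / LAMBDA_238

# An illustrative ratio of ~0.01312 corresponds to roughly 84 Ma,
# comparable to the Bultfontein emplacement age quoted in the abstract.
age_ma = pb206_u238_age(0.01312) / 1e6
print(round(age_ma, 1))  # → 84.0
```

In practice the authors work with concordia relationships and common-Pb corrections rather than a single-ratio age, but the decay arithmetic underneath is the same.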
Sipping, an early form of alcohol initiation, is associated with aspects of psychopathology and personality that reflect long-term risk for harmful alcohol use. In the Adolescent Brain Cognitive Development (ABCD) cohort (N = 11,872), sipping by age 9–10 was concurrently associated with impulsivity, other aspects of externalizing, and prodromal schizophrenia symptoms. Still, these associations were cross-sectional in nature, leaving open the possibility that these features of psychopathology and personality might not reflect long-term risk for alcohol consumption and related harm across development. Here, we attempted to replicate baseline concurrent associations across three waves of data to extend concurrent associations to prospective ones. Most cross-sectional associations replicated across waves, such that impulsivity, other aspects of externalizing, reward sensitivity (e.g., surgency, sensation seeking), and prodromal schizophrenia symptoms were associated with increased odds of having sipped alcohol by the age of 12. Nevertheless, not all concurrent associations replicated prospectively; impulsigenic features did not reflect long-term risk for sipping. Thus, some psychopathology features appeared to reflect stable risk factors, whereas others appeared to reflect state-dependent risk factors. All told, sipping might not reflect long-term risk for harmful alcohol use, and the nature of sipping may change across development.
Contaminated blood cultures result in extended hospital stays and extended durations of antibiotic therapy. Rapid molecular-based blood culture testing can speed positive culture detection and improve clinical outcomes, particularly when combined with an antimicrobial stewardship program. We investigated the impact of a multiplex polymerase chain reaction (PCR) FilmArray Blood Culture Identification (BCID) system on clinical outcomes associated with contaminated blood cultures.
Methods:
We conducted a retrospective cohort study involving secondary data analysis at a single institution. In this before-and-after study, patients with contaminated blood cultures in the period before PCR BCID was implemented (ie, the pre-PCR period; n = 305) were compared to patients with contaminated blood cultures during the period after PCR BCID was implemented (ie, the post-PCR implementation period; n = 464). The primary exposure was PCR status and the main outcomes of the study were length of hospital stay and days of antibiotic therapy.
Results:
We did not detect a significant difference in adjusted mean length of hospital stay before (10.8 days; 95% confidence interval [CI], 9.8–11.9) and after (11.2 days; 95% CI, 10.2–12.3) the implementation of the rapid BCID panel in patients with contaminated blood cultures (P = .413). Likewise, adjusted mean days of antibiotic therapy between patients in pre-PCR group (5.1 days; 95% CI, 4.5–5.7) did not significantly differ from patients in post-PCR group (5.3 days; 95% CI, 4.8–5.9; P = .543).
Conclusion:
The introduction of a rapid PCR-based blood culture identification system did not improve clinical outcomes, such as length of hospital stay and duration of antibiotic therapy, in patients with contaminated blood cultures.
Postpartum depression (PPD) affects up to one in five mothers and birthing parents, yet as few as 10% access evidence-based treatment. One-day cognitive behavioral therapy (CBT)-based workshops for PPD have the potential to reach large numbers of sufferers and be integrated into stepped models of care.
Methods
This randomized controlled trial of 461 mothers and birthing parents in Ontario, Canada with Edinburgh Postnatal Depression Scale (EPDS) scores ⩾10, age ⩾18 years, and an infant <12 months of age compared the effects of a 1-day CBT-based workshop plus treatment as usual (TAU; i.e. care from any provider(s) they wished) to TAU alone at 12-weeks post-intervention on PPD, anxiety, the mother–infant relationship, offspring behavior, health-related quality of life, and cost-effectiveness. Data were collected via REDCap.
Results
Workshops led to meaningful reductions in EPDS scores (m = 15.77 to 11.22; b = −4.6, p < 0.01) and were associated with three times higher odds of a clinically significant decrease in PPD [odds ratio (OR) 3.00, 95% confidence interval (CI) 1.93–4.67]. Anxiety also decreased and participants had three times the odds of clinically significant improvement (OR 3.20, 95% CI 2.03–5.04). Participants reported improvements in mother–infant bonding, infant-focused rejection and anger, and effortful control in their toddlers. The workshop plus TAU achieved similar quality-adjusted life-years at lower costs than TAU alone.
Conclusions
One-day CBT-based workshops for PPD can lead to improvements in depression, anxiety, and the mother–infant relationship and are cost-saving. This intervention could represent a perinatal-specific option that can treat larger numbers of individuals and be integrated into stepped care approaches at reasonable cost.
This paper proposes a framework for comprehensive, collaborative, and community-based care (C4) for accessible mental health services in low-resource settings. Because mental health conditions have many causes, this framework includes social, public health, wellness and clinical services. It accommodates integration of stand-alone mental health programs with health and non-health community-based services. It addresses gaps in previous models including lack of community-based psychotherapeutic and social services, difficulty in addressing comorbidity of mental and physical conditions, and how workers interact with respect to referral and coordination of care. The framework is based on task-shifting of services to non-specialized workers. While the framework draws on the World Health Organization’s Mental Health Gap Action Program and other global mental health models, there are important differences. The C4 Framework delineates types of workers based on their skills. Separate workers focus on: basic psychoeducation and information sharing; community-level, evidence-based psychotherapeutic counseling; and primary medical care and more advanced, specialized mental health services for more severe or complex cases. This paper is intended for individuals, organizations and governments interested in implementing mental health services. The primary aim is to provide a framework for the provision of widely accessible mental health care and services.
Carer burden is common in younger-onset dementia (YOD), often due to the difficulty of navigating services designed primarily for older people with dementia. Compared to Alzheimer’s disease (AD), the burden is reported to be higher in behavioral variant frontotemporal dementia (bvFTD). However, there is little literature comparing carer burden specifically in YOD. This study hypothesized that carer burden in bvFTD would be higher than in AD.
Design:
Retrospective cross-sectional study.
Setting:
Tertiary neuropsychiatry service in Victoria, Australia.
Participants:
Patient-carer dyads with YOD.
Measurements:
We collected patient data, including behaviors using the Cambridge Behavioral Inventory-Revised (CBI-R). Carer burden was rated using the Zarit Burden Inventory-short version (ZBI-12). Descriptive statistics and Mann-Whitney U tests were used to analyze the data.
Results:
Carers reported high burden (ZBI-12 mean score = 17.2, SD = 10.5), with no significant difference in burden between younger-onset AD and bvFTD. CBI-R stereotypic and motor behaviors, CBI-R everyday skills, and total NUCOG scores differed between the two groups. There was no significant difference in the rest of the CBI-R subcategories, including the behavior-related domains.
Conclusion:
Carers of YOD face high burden and are managing significant challenging behaviors. We found no difference in carer burden between younger-onset AD and bvFTD. This could be due to similarities between the two subtypes in terms of abnormal behavior, motivation, and self-care as measured on the CBI-R, contrary to previous literature. Clinicians should screen for carer burden and associated factors, including behavioral symptoms, in YOD syndromes, as these may contribute to carer burden regardless of dementia subtype.
Despite three decades of research, gaps remain in meeting the needs of people with dementia and their family/friend carers as they navigate the often-tumultuous process of driving cessation. This paper describes the process of using a knowledge-to-action (KTA) approach to develop an educational web-based resource (i.e. toolkit), called the Driving and Dementia Roadmap (DDR), aimed at addressing some of these gaps.
Design:
Aligned with the KTA framework, knowledge creation and action cycle activities informed the development of the DDR. These activities included systematic reviews; meta-synthesis of qualitative studies; interviews and focus groups with key stakeholders; development of a Driving and Dementia Intervention Framework (DD-IF); and a review and curation of publicly available resources and tools. An Advisory Group comprised of people with dementia and family carers provided ongoing feedback on the DDR’s content and design.
Results:
The DDR is a multi-component online toolkit that contains separate portals for current and former drivers with dementia and their family/friend carers. Based on the DD-IF, various topics of driving cessation are presented to accommodate users’ diverse stages and needs in their experiences of decision-making and transitioning to non-driving.
Conclusion:
Guided by the KTA framework that involved a systematic and iterative process of knowledge creation and translation, the resulting person-centered, individualized and flexible DDR can bring much-needed support to help people with dementia and their families maintain their mobility, community access, and social and emotional wellbeing during and post-driving cessation.
Reward processing has been proposed to underpin the atypical social feature of autism spectrum disorder (ASD). However, previous neuroimaging studies have yielded inconsistent results regarding the specificity of atypicalities for social reward processing in ASD.
Aims
Utilising a large sample, we aimed to assess reward processing in response to reward type (social, monetary) and reward phase (anticipation, delivery) in ASD.
Method
Functional magnetic resonance imaging during social and monetary reward anticipation and delivery was performed in 212 individuals with ASD (7.6–30.6 years of age) and 181 typically developing participants (7.6–30.8 years of age).
Results
Across social and monetary reward anticipation, whole-brain analyses showed hypoactivation of the right ventral striatum in participants with ASD compared with typically developing participants. Further, region of interest analysis across both reward types yielded ASD-related hypoactivation in both the left and right ventral striatum. Across delivery of social and monetary reward, hyperactivation of the ventral striatum in individuals with ASD did not survive correction for multiple comparisons. Dimensional analyses of autism and attention-deficit hyperactivity disorder (ADHD) scores were not significant. In categorical analyses, post hoc comparisons showed that ASD effects were most pronounced in participants with ASD without co-occurring ADHD.
Conclusions
Our results do not support current theories linking atypical social interaction in ASD to specific alterations in social reward processing. Instead, they point towards a generalised hypoactivity of ventral striatum in ASD during anticipation of both social and monetary rewards. We suggest this indicates attenuated reward seeking in ASD independent of social content and that elevated ADHD symptoms may attenuate altered reward seeking in ASD.
Adverse childhood experiences (ACE) can affect educational attainments, but little is known about their impact on educational achievements in people at clinical high risk of psychosis (CHR).
Methods
In total, 344 CHR individuals and 67 healthy controls (HC) were recruited as part of the European Community’s Seventh Framework Programme-funded multicenter study the European Network of National Schizophrenia Networks Studying Gene–Environment Interactions (EU-GEI). The brief version of the Child Trauma Questionnaire was used to measure ACE, while educational attainments were assessed using a semi-structured interview.
Results
At baseline, compared with HC, the CHR group spent less time in education and had higher rates of ACE, lower rates of employment, and lower estimated intelligence quotient (IQ). Across both groups, the total number of ACE was associated with fewer days in education and lower level of education. Emotional abuse was associated with fewer days in education in HC. Emotional neglect was associated with a lower level of education in CHR, while sexual abuse was associated with a lower level of education in HC. In the CHR group, the total number of ACE, physical abuse, and physical neglect were significantly associated with unemployment, while emotional neglect was associated with employment.
Conclusions
ACE are strongly associated with developmental outcomes such as educational achievement. Early intervention for psychosis programs should aim at integrating specific interventions to support young CHR people in their educational and vocational recovery. More generally, public health and social interventions focused on preventing ACE (or reducing their impact when ACE occur) are recommended.
Individuals living in residential aged care facilities with cognitive decline are at risk of social isolation and decreased wellbeing. These risks may be exacerbated by decline in communication skills. There is growing awareness that group singing may improve sense of wellbeing for individuals with dementia. However, to date few studies have examined broader rehabilitative effects on skills such as communication of individuals with dementia.
Aims:
To determine the feasibility and acceptability of the MuSic to Connect (MuSiCON) choir and language/communication assessment protocol in people with cognitive impairment living in non-high-care wards of a residential facility.
Methods:
Six individuals with mild-moderate cognitive impairment participated (age range 55–91 years, five female, one male). A mixed method approach was used. Quantitative outcomes included attendance rates, quality of life and communication measures. The qualitative measure was a brief survey of experience completed by participants and carers post-intervention.
Results:
Overall, MuSiCON was perceived as positive and beneficial, with high attendance, perception of improved daily functioning and high therapeutic benefit without harmful effects. While there was no reliable change in communication skills over the course of the six-week intervention, most participants successfully engaged in the conversational task, suggesting it is a suitable and ecologically valid method for data collection.
Conclusions:
The MuSiCON protocol demonstrated feasibility and was well received by participants and staff at the residential facility. A co-design approach is recommended to improve upon feasibility, acceptability and validity of the assessment protocol prior to Phase II testing.
Voters prefer political candidates who are currently in office (incumbents) over new candidates (challengers). Using the premise of query theory (Johnson, Häubl & Keinan, 2007), we clarify the underlying cognitive mechanisms by asking whether memory retrieval sequences affect political decision making. Consistent with predictions, Experiment 1 (N = 256) replicated the incumbency advantage and showed that participants tended to first query information about the incumbent. Experiment 2 (N = 427) showed that experimentally manipulating participants’ query order altered the strength of the incumbency advantage. Experiment 3 (N = 713) replicated Experiment 1 and, in additional experimental conditions, showed that the effects of incumbency can be overridden by more valid cues, like the candidates’ ideology. Participants queried information about ideologically similar candidates earlier and also preferred these ideologically similar candidates. This is initial evidence for a cognitive, memory-retrieval process underlying the incumbency advantage and political decision making.
In October 2010, the provincial government of Ontario, Canada enacted the Open for Business Act (OBA). A central component of the OBA is its provisions aiming to streamline the enforcement of Ontario’s Employment Standards Act (ESA). The OBA’s changes to the ESA are an attempt to manage a crisis of employment standards (ES) enforcement, arising from decades of ineffective regulation, by entrenching an individualised enforcement model. The Act aims to streamline enforcement by screening people assumed to be lacking definitive proof of violations out of the complaints process. The OBA therefore produces a new category of ‘illegitimate claimants’ and attributes administrative backlogs to these people. Instead of improving the protection of workers, the OBA embeds new racialised and gendered modes of exclusion in the ES enforcement process.
This study investigates regional variations in the factors associated with acceptance and actual experience of intimate partner violence (IPV) among married women in northern and southern Nigeria, two regions with distinct socio-cultural and economic differences. Data from the 2018 Demographic and Health Survey are analysed to compare these two regions. The sample comprised married/living-with-partner women within the reproductive age of 15–49. Overall, a positive association exists between IPV experience and IPV acceptance, regardless of which is used as the outcome variable. Contrary to the notion that IPV is prevalent where its acceptance is high, this study finds that the reverse is true. IPV acceptance is significantly higher in the north than in the south (39.4% versus 14.7%), but the reverse is the case for the actual experience of IPV (20.1% versus 24.7%). Being employed and having access to the internet reduce the odds of IPV victimisation for women in the south, but increase them for northern women. Muslims in the north have significantly higher odds of IPV acceptance than their Christian counterparts in the same region, but the reverse is the case in the south. Regional differences also exist in the influence of decision-making, educational difference between spouses, and media exposure. While the cosmopolitan-success and conservative-failure hypothesis explains the regional differences in the acceptance of IPV, it fails to explain differences in the actual experience of IPV. The study provides alternative explanations for the regional differences in the experience and acceptance of IPV in Nigeria, and it points to the need for differing intervention programmes across regions. Notably, the study found that the association between justification of IPV and actual experience of it is bi-directional, and it suggests caution in making causal inferences.
The May 2019 IPBES Global Assessment emphasised the scale of the current biodiversity crisis and the need for transformative change, but highlighted that the tools exist to enable this change. Conservation translocation is an increasingly used tool that involves people deliberately moving and releasing organisms where the primary goal is conservation; it includes species reintroductions, reinforcements, assisted colonisations and ecological replacements. It can be complex, expensive, time-consuming, and sometimes controversial, but when best practice guidelines are followed it can be a very effective conservation method and a way of exciting and engaging people in environmental issues. Conservation translocations have an important role to play not only in improving the conservation status of individual species but also in ecological restoration and rewilding by moving keystone and other influential species. As the climate continues to change, species with poor dispersal abilities or opportunities will be at particular risk. Assisted colonisation, which involves moving species outside their indigenous range, is likely to become an increasingly used method. It is also a tool that may become increasingly used to avoid threats from the transmission of pathogens. Other more radical forms of conservation translocation, such as ecological replacements, multi-species conservation translocations, and the use of de-extinction and genetic interventions, are also likely to be given stronger consideration within the wider framework of ecological restoration. There have been significant advances in the science of reintroduction biology over the last three decades. However, new ways of transferring and sharing such information are needed to enable a wider spectrum of practitioners to have easier access to knowledge and guidance. In the past, the biological considerations of conservation translocations have often heavily outweighed the people considerations.
However, it is increasingly important that socio-economic factors are also built into projects and that relevant experts are involved, to reduce conflict and improve the chances of success. Some level of biological and socio-economic risk will be present for most conservation translocations, but these risks can often be managed through sensitivity, professionalism, and the application of tried-and-tested best practice. Species reintroductions and other forms of conservation translocation will be increasingly important tools if we are to restore, and make more resilient, our damaged ecosystems.
The dominant paradigm of experiments in the social and behavioral sciences views an experiment as a test of a theory, where the theory is assumed to generalize beyond the experiment's specific conditions. According to this view, which Alan Newell once characterized as “playing twenty questions with nature,” theory is advanced one experiment at a time, and the integration of disparate findings is assumed to happen via the scientific publishing process. In this article, we argue that the process of integration is at best inefficient, and at worst it does not, in fact, occur. We further show that the challenge of integration cannot be adequately addressed by recently proposed reforms that focus on the reliability and replicability of individual findings, nor simply by conducting more or larger experiments. Rather, the problem arises from the imprecise nature of social and behavioral theories and, consequently, a lack of commensurability across experiments conducted under different conditions. Therefore, researchers must fundamentally rethink how they design experiments and how the experiments relate to theory. We specifically describe an alternative framework, integrative experiment design, which intrinsically promotes commensurability and continuous integration of knowledge. In this paradigm, researchers explicitly map the design space of possible experiments associated with a given research question, embracing many potentially relevant theories rather than focusing on just one. The researchers then iteratively generate theories and test them with experiments explicitly sampled from the design space, allowing results to be integrated across experiments. Given recent methodological and technological developments, we conclude that this approach is feasible and would generate more-reliable, more-cumulative empirical and theoretical knowledge than the current paradigm—and with far greater efficiency.
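The design-space idea sketched above can be made concrete in a few lines: enumerate the crossed conditions of all candidate experiments, then sample cells from that space rather than hand-picking a single one. The dimensions and sampler below are hypothetical placeholders for illustration, not the authors' specification.

```python
import itertools
import random

# Hypothetical design space for experiments on a single research question:
# each dimension is a factor that some relevant theory deems important.
design_space = {
    "incentive": ["none", "low", "high"],
    "group_size": [2, 8, 32],
    "task": ["routine", "creative"],
}

# Enumerate every cell of the space (the full factorial crossing).
cells = [dict(zip(design_space, combo))
         for combo in itertools.product(*design_space.values())]

# Instead of running one hand-picked cell, sample cells to run, so that
# results from separate experiments remain commensurable by construction.
random.seed(0)
sampled = random.sample(cells, k=5)
print(len(cells))  # 3 * 3 * 2 = 18 cells in total
```

In the integrative paradigm, the sampler would typically be adaptive (e.g., guided by a surrogate model of results so far) rather than uniform, but the key move is the same: the space of conditions is made explicit before any single experiment is privileged.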
Adoption of cover crops in arid agroecosystems has been slow due to concerns regarding limited water resources and possible soil moisture depletion. In irrigated organic systems, potential ecosystem services from cover crops also must be considered in light of the concerns for water conservation. A constructive balance could be achieved with fall-sown small grain cover crops; however, their impacts on irrigated organic systems are poorly understood. Our first objective was to determine the ability of fall-sown small grains [cereal rye (Secale cereale L.), winter wheat (Triticum aestivum L.), barley (Hordeum vulgare L.) and oat (Avena sativa L.)] to suppress winter weeds in an irrigated, organic transition field in the southwestern USA. Small grains were planted following the legume sesbania (Sesbania exaltata (Raf.) Rydb. ex A.W. Hill) during Fall 2012 and Fall 2013. In Spring 2013 and 2014, weed densities and biomass were determined within each cover crop treatment and compared against unplanted controls. Results indicated that both barley and oat were effective in suppressing winter weeds. Our second objective was to compare weed suppression and soil moisture levels among seven barley varieties developed in the western United States. Barley varieties (‘Arivat’, ‘Hayes Beardless’, ‘P919’, ‘Robust’, ‘UC603’, ‘UC937’, ‘Washford Beardless’) were fall-sown in replicated strip plots in Fall 2016. Weed densities were measured in Spring 2017, and volumetric soil moisture near the soil surface (5.1 cm depth) was measured at time intervals beginning in December 2016 and ending in March 2017. With the exception of ‘UC937’, barley varieties caused marked reductions in weed density in comparison with the unplanted control. Soil moisture content for the unplanted control was consistently lower than soil moisture contents for barley plots. Barley variety did not influence volumetric soil moisture.
During the 2017–2018 growing season, we re-examined three barley varieties considered most amenable to the cropping system requirements (‘Robust’, ‘UC603’, ‘P919’), and these varieties were again found to support few weeds (≤ 5.0 weeds m−2). We conclude that several organically certified barley varieties could fill the need for a ‘non-thirsty’ cover crop that suppresses winter weeds in irrigated organic systems in the southwestern United States.