Manen et al. here reply to the critical comment by A. J. Ammerman on their article “The Neolithic Transition in the Western Mediterranean: a complex and non-linear diffusion process—the radiocarbon record revisited,” published in 2019 in Radiocarbon. They also take this opportunity to reaffirm the need to develop novel interpretive frameworks that combine both geo-chronological and cultural data.
Previous studies attest that early bilinguals can modify their perceptual identification according to the fine-grained phonetic detail of the language they believe they are hearing. Following Gonzales et al. (2019), we replicate the double phonemic boundary effect in late learners (LBs) using conceptually based cueing. We administered a forced-choice identification task to 169 native English adult learners of Spanish in two sessions. In both sessions, participants identified the same /b/-/p/ voicing continuum, but language context was cued conceptually through the instructions. The data were analyzed using Bayesian multilevel regression. Learners categorized the continuum in a similar manner when they believed they were hearing English. However, when they believed they were hearing Spanish, “voiceless” responses increased as a function of L2 proficiency. This research demonstrates that the double phonemic boundary effect can be conceptually cued in LBs and supports accounts positing selective activation of independent perception grammars in L2 learning.
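The boundary-shift logic behind a forced-choice voicing task can be illustrated with a logistic psychometric function over voice-onset time (VOT). The sketch below is a hedged illustration only: the VOT values, slope, and boundary locations are invented for the example and are not the study's estimates or its analysis method.

```python
import math

def p_voiceless(vot_ms, boundary_ms, slope=0.5):
    """Probability of a 'voiceless' (/p/) response for a stimulus with the
    given VOT, under a simple logistic psychometric function."""
    return 1.0 / (1.0 + math.exp(-slope * (vot_ms - boundary_ms)))

# Hypothetical boundaries: English places the /b/-/p/ boundary at a longer
# VOT than Spanish, so cueing a "Spanish" context should increase
# 'voiceless' responses for mid-continuum stimuli.
english_boundary, spanish_boundary = 25.0, 10.0  # ms, illustrative
stimulus = 18.0  # a mid-continuum VOT value

p_english = p_voiceless(stimulus, english_boundary)
p_spanish = p_voiceless(stimulus, spanish_boundary)
assert p_spanish > p_english  # the Spanish cue yields more 'voiceless' responses
```

The double phonemic boundary effect corresponds to the same stimulus receiving different response probabilities under the two cued boundaries.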
Repetitive transcranial magnetic stimulation (rTMS) applied to the dorsolateral prefrontal cortex (DLPFC) has proven effective in the treatment of resistant depression. Beyond symptom improvement, studies report positive effects on cognitive functioning, including working memory. However, this effect does not seem to be found in healthy subjects during an N-back task without lures. The objective of our study is therefore to evaluate the impact of rTMS on the DLPFC, a region more sensitive to an N-back task with lures.
A randomized double-blind study was conducted in 30 healthy participants. Intermittent theta burst stimulation (iTBS) was delivered for 5 days at 2 sessions/day to the left DLPFC, targeted by neuronavigation at MNI coordinates (X, Y, Z = –50, 30, 36). We examined the impact of rTMS on participants' behaviour during the N-back task. To this end, participants performed the task, composed of 0-back, 3-back and 3-back-with-lure blocks, during two fMRI sessions (one before and one after the week of active or placebo stimulation). Performance, reaction times and imaging data were collected.
The two groups did not differ in age or gender. At the behavioural level, initial analyses of performance and reaction time show no Group (active/placebo) × Time (before/after rTMS) interaction effect. For the neuroimaging data, a Group × Time interaction analysis taking the lure condition into account will allow us to better understand the impact of rTMS on working memory involving the DLPFC.
To translate the SCOFF questionnaire into French and evaluate its metrological properties for the screening of eating disorders (ED) in a French student population.
The SCOFF questionnaire comprises 5 questions and was developed for the screening of ED; no French version was previously available. The translation and transcultural validation were carried out using international criteria. The validation study employed the Mini International Neuropsychiatric Interview as the gold standard, and the paper-and-pencil form of the French version of the SCOFF questionnaire (QD-TCA) was administered to female students attending their yearly evaluation at the Student Health clinic.
The sample comprised 120 women with a mean age of 20 years (standard deviation (SD) 3.1 years, range 18-35). Thirteen cases (10.8%) of ED were diagnosed (3 cases (2.8%) of anorexia nervosa and 10 cases (8%) of bulimia nervosa). The diagnostic threshold was determined using the receiver operating characteristic (ROC) curve and fixed at two positive answers. The sensitivity of the QD-TCA was 92%, with a specificity of 91.5%. Its positive and negative predictive values for ED were 57.1% and 99%, respectively. Similar results were obtained for anorexia nervosa and bulimia nervosa. The intraclass correlation (R) was 89%.
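The reported accuracy figures follow from the standard 2×2 screening-test formulas. The cell counts below are a reconstruction consistent with the reported percentages (12/13 true positives among the 13 cases, 98/107 true negatives among non-cases), offered only as an illustration of the arithmetic, not as the study's raw data.

```python
# Hypothetical 2x2 table consistent with the reported figures:
# 13 ED cases and 107 non-cases among the 120 students screened.
tp, fn = 12, 1    # screened positive / negative among true ED cases
tn, fp = 98, 9    # screened negative / positive among non-cases

sensitivity = tp / (tp + fn)   # ~0.92: cases correctly flagged
specificity = tn / (tn + fp)   # ~0.915: non-cases correctly cleared
ppv = tp / (tp + fp)           # ~0.571: flagged students who are true cases
npv = tn / (tn + fn)           # ~0.99: cleared students who are truly negative
```

With a low-prevalence condition like ED in a student clinic, a high NPV alongside a modest PPV is exactly the pattern expected of a screening (rather than diagnostic) instrument.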
The French version of the SCOFF questionnaire developed by our team (QD-TCA) seems to be a reliable and practical eating-disorder screening tool in a moderate-risk student setting.
To analyse variations in workers' psychological distress, depression and burnout within a model encompassing both the stress produced by constraints and resources embedded in the structures of daily life (workplace, family, social networks outside the workplace) and workers' individual characteristics (demographics, physical health, psychological traits, lifestyle habits, stressful childhood events).
Data were collected in 2009-2012 from a sample of 64 workplaces in Quebec (Canada), where 2162 employees were surveyed, for a response rate of 73.1%. Multilevel regression models were used to analyse the data.
The variables explain 31.8% of psychological distress (GHQ), 47.7% of depression (BDI) and 48.1% of burnout (MBI). Associations are not the same for each outcome. Skill utilization (BDI, MBI), decision authority (GHQ), abusive supervision (BDI, MBI), conflicts (BDI, MBI) and job security (GHQ, BDI, MBI) are related to the outcomes. In the family domain, living in a couple (BDI, MBI), having minor children (BDI, MBI), family-to-work conflict (MBI), work-to-family conflict (GHQ, BDI, MBI) and strained marital and parental relations (GHQ, BDI) are associated with the outcomes. Social support outside the workplace predicts both psychological distress and depression. Most of the individual characteristics correlate with the three outcomes.
The results of this study suggest expanding approaches in occupational mental health in order to avoid erroneous conclusions about the relationship between work and mental health. Depression and burnout seem to share a similar explanatory structure, while psychological distress appears to be mostly explained by non-work factors.
The death rate due to suicide in elderly people is particularly high. As part of suicide selective prevention measures for at-risk populations, the WHO recommends training “gatekeepers”.
In order to assess the impact of gatekeeper training for members of staff, we carried out a controlled quasi-experimental study over the course of one year, comparing 12 nursing homes where at least 30% of the staff had undergone gatekeeper training with 12 nursing homes without trained staff. We collected data about the residents considered to be suicidal and their management following identification, as well as measures taken at nursing-home level to prevent suicide.
The two nursing home groups did not present significantly different characteristics. In the nursing homes with trained staff, the staff were deemed to be better prepared to approach suicidal individuals. The detection of suicidal residents relied more on the whole staff and less on the psychologist alone when compared to nursing homes without trained staff. A significantly larger number of measures were taken to manage suicidal residents in the trained nursing homes. Suicidal residents were more frequently referred to the psychologist. Trained nursing homes put in place significantly more suicide prevention measures at an institutional level.
Having trained gatekeepers has an impact not only for the trained individuals but also for the whole institution where they work, both in terms of managing suicidal residents and routine suicide prevention measures.
The eastern larch beetle (Dendroctonus simplex Le Conte) is recognized as a serious destructive forest pest in the northern part of North America. Under epidemic conditions, this beetle can attack healthy trees, causing severe damage to larch stands. Dendroctonus species are considered holobionts, as they engage in multipartite interactions with microorganisms such as bacteria, filamentous fungi and yeasts, which are implicated in physiological processes of the insect such as nutrition. These microorganisms also play a key role in the beetle's attack, as they are responsible for detoxifying the subcortical environment and weakening the tree's defense mechanisms. The eastern larch beetle is associated with bacteria and fungi, but their contribution to the beetle's success remains unknown. Here, we investigated the bacterial and fungal microbiota of this beetle pest throughout its ontogeny (pioneer adults, larvae and pupae) by high-throughput sequencing. A successional microbial assemblage was identified across the beetle's developmental stages, reflecting the beetle's requirements. These results indicate that a symbiotic association takes place between the eastern larch beetle and some of these microorganisms, and that this D. simplex symbiotic complex helps the insect colonize its host tree and survive the conditions encountered.
Distributed models and a good knowledge of the catchment studied are required to assess mitigation measures for nitrogen (N) pollution. A set of alternative scenarios (change of crop management practices and different strategies of landscape management, especially different sizes and distribution of set-aside areas) were simulated with a fully distributed model in a small agricultural catchment. The results show that current practices are close to complying with current regulations, which results in a limited effect of the implementation of best crop management practices. The location of set-aside zones is more important than their size in decreasing nitrate fluxes in stream water. The most efficient location is the lower parts of hillslopes, combining the dilution effect due to the decrease of N input per unit of land and the interception of nitrate transferred by sub-surface flows. The main process responsible for the interception effect is probably uptake by grassland and retention in soils since the denitrification load tends to decrease proportionally to N input and, for the scenarios considered, is lower in the interception scenarios than in the corresponding dilution zones.
The Neolithic transition is a particularly favorable field of research for the study of the emergence and evolution of cultures and cultural phenomena. In this framework, high-precision chronologies are essential for decrypting the rhythms of emergence of new techno-economic traits. As part of a project exploring the conditions underlying the emergence and dynamics of the development of the first agro-pastoral societies in the Western Mediterranean, this paper proposes a new chronological modeling. Based on 45 new radiocarbon (14C) dates and on a Bayesian statistical framework, this work examines the rhythms and dispersal paths of the Neolithic economy both on coastal and continental areas. These new data highlight a complex and far less unidirectional dissemination process than that envisaged so far.
An unprecedented outbreak of Ebola virus disease (EVD) occurred in West Africa from March 2014 to January 2016. The French Institute for Public Health implemented strengthened surveillance to identify any imported case early and avoid secondary cases.
Febrile travellers returning from an affected country had to report to the national emergency healthcare hotline. Patients reporting at-risk exposures and fever within 21 days of their last at-risk exposure were defined as possible cases, hospitalised in isolation and tested by real-time polymerase chain reaction. Asymptomatic travellers reporting at-risk exposures were considered contacts and included in a follow-up protocol until the 21st day after their last at-risk exposure.
From March 2014 to January 2016, 1087 patients were notified: 1053 were immediately excluded because they did not match the notification criteria or did not have at-risk exposures; 34 possible cases were tested and excluded following a reliable negative result. Two confirmed cases diagnosed in West Africa were evacuated to France under stringent isolation conditions. Patients returning from Guinea (n = 531; 49%) and Mali (n = 113; 10%) accounted for the highest number of notifications.
No imported case of EVD was detected in France. We are confident that our surveillance system was able to classify patients properly during the outbreak period.
The Brangus breed was developed to combine the superior characteristics of both of its founder breeds, Angus and Brahman. It combines the high adaptability to tropical and subtropical environments, disease resistance, and overall hardiness of Zebu cattle with the reproductive potential and carcass quality of Angus. It is known that the major histocompatibility complex (MHC, also known as bovine leucocyte antigen: BoLA), located on chromosome 23, encodes several genes involved in the adaptive immune response and may be responsible for adaptation to harsh environments. The objective of this work was to evaluate whether the local breed ancestry percentages in the BoLA locus of a Brangus population diverged from the estimated genome-wide proportions and to identify signatures of positive selection in this genomic region. For this, 167 animals (100 Brangus, 45 Angus and 22 Brahman) were genotyped using a high-density single nucleotide polymorphism array. The local ancestry analysis showed that more than half of the haplotypes (55.0%) shared a Brahman origin. This value was significantly different from the global genome-wide proportion estimated by cluster analysis (34.7% Brahman) and from the proportion expected by pedigree (37.5% Brahman). The analysis of selection signatures by genetic differentiation (Fst) and extended haplotype homozygosity-based methods (iHS and Rsb) revealed 10 and seven candidate regions, respectively. The analysis of the genes located within these candidate regions showed mainly genes involved in immune response-related pathways, while other genes and pathways were also observed (cell surface signalling pathways, membrane proteins and ion-binding proteins). Our results suggest that the BoLA region of Brangus cattle may have been enriched with Brahman haplotypes as a consequence of selection processes to promote adaptation to subtropical environments.
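As a hedged illustration of the Fst-based scan mentioned above, a per-SNP fixation index between two founder populations can be computed from allele frequencies. The frequencies below are invented for the example, and published analyses typically use estimators such as Weir-Cockerham rather than this simple (Ht - Hs)/Ht form.

```python
def fst(p1, p2):
    """Per-locus Fst between two populations from allele frequencies,
    using Fst = (Ht - Hs) / Ht with expected heterozygosities."""
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)                    # total expected heterozygosity
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2  # mean within-population heterozygosity
    return (ht - hs) / ht if ht > 0 else 0.0

# Invented allele frequencies for an Angus vs Brahman comparison at three SNPs:
angus = [0.10, 0.50, 0.95]
brahman = [0.90, 0.55, 0.10]
scores = [fst(p1, p2) for p1, p2 in zip(angus, brahman)]
# Strongly differentiated loci (high Fst) are candidate selection signatures,
# while loci with similar frequencies in both breeds score near zero.
```

In a genome-wide scan, such per-locus scores are usually smoothed over windows and outlier regions (here, within BoLA) are flagged as candidates.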
The aim of this study was to assess the seroprevalence of the Toxoplasma gondii parasite in pork produced in France and to determine infection risk factors. An innovative survey was designed based on the annual numbers of slaughtered pigs from intensive and outdoor farms in France. A total of 1549 samples of cardiac fluid were collected from pig hearts to determine seroprevalence using a Modified Agglutination Test. Of those, 160 hearts were bio-assayed in mice to isolate live parasites. The overall seroprevalence among fattening pigs was 2·9%. The adjusted seroprevalence in pigs from intensive farms was 3·0%: highest in sows (13·4%), 2·9% in fattening pigs and 2·6% in piglets. The adjusted seroprevalence in fattening animals from outdoor farms was 6·3%. Strains were isolated from 41 animals, and all were genotyped by Restriction Fragment Length Polymorphism as type II. Risk-factor analysis showed that the risk of infection was more than three times higher for outdoor pigs, and that sows’ risk was almost five times higher than that of fattening animals. This study provides further evidence of extensive pork infection with T. gondii regardless of breeding system, indicating that farm conditions are still insufficient to guarantee ‘Toxoplasma-free pork’.
Quantitative assessment of mitigation measures for nitrogen (N) pollution requires adequate models, good knowledge of catchment functioning and a thorough understanding of agricultural systems and stakeholder constraints. The current paper analyses a set of results from simulations, with two models, of agricultural changes in two catchments in different contexts with different constraints. The results show that reducing N inputs and increasing grassland areas are the most efficient measures, not only because they reduce N fluxes in streams but also because they enhance N use by agriculture and the whole catchment system. Introducing catch crops, hedgerows and riparian buffers are interesting complementary measures but of limited impact when implemented alone. These results are sensitive to the way mitigation measures are translated into model inputs, and their operational implications are discussed.
This article describes nitrogen flows in the environment and points to the specificities of livestock production. Until the beginning of the 20th century, symbiotic fixation and the recycling of animal excreta supplied the nitrogen necessary for soil fertility. In 1913, the Haber-Bosch process enabled the industrial synthesis of ammonia and made fertilisation possible without associating crop production with livestock farming. Nitrogen efficiency in livestock farming is low, with nearly half or more of the inputs lost to the environment. These losses have diverse impacts that intervene at various spatial scales owing to the nitrogen cascade. Quantitative assessment of nitrogen flows at the regional scale started in the early 1980s in Western Europe and North America. These studies provided estimates of the spatial variability of nitrogen discharge within a region. They confirmed the differences between areas with a high animal density, such as Brittany (a western region of France), and other regions. It was also found that the same nitrogen losses could lead to different levels of environmental impact according to the sensitivity of a given environment and its capacity to cope with nitrogen excess. Climate, soil characteristics, animal density, and the proportions of agricultural land under annual and perennial crops are drivers of this sensitivity.
Nitrogen efficiency is the ratio between the nitrogen output in animal products and the input required for livestock production. This ratio is a driver of economic profitability and can be calculated at various levels of the production system: animal, field or farm. Calculated at the animal scale, it is generally low, with less than half of the ingested nitrogen remaining in milk, eggs or meat in the form of proteins, the major part of the nitrogen being released to the environment. Significant gains were achieved in the past via genetic improvement and the adjustment of feed supply. At the farm level, efficiency increases to 45% to 50%, thanks to the recycling of animal excreta as fertilisers. From excretion to land application of manure, nitrogen losses are highly variable depending on the animal species and the manure management system. Given the risks of pollution swapping, all management and handling steps need to be considered. Collective initiatives or local rules on agricultural practices offer new opportunities to restore nitrogen balances at the local territorial level.
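The efficiency described here is simply N output in products divided by N input, computed at whichever scale is of interest. The sketch below uses hypothetical round figures chosen to echo the orders of magnitude in the text (low animal-scale efficiency, roughly 45-50% at farm scale); they are not measurements from the article.

```python
def n_efficiency(n_in_products_kg, n_inputs_kg):
    """Nitrogen efficiency: share of input N recovered in animal products."""
    return n_in_products_kg / n_inputs_kg

# Hypothetical annual N budget (kg N), for illustration only:
animal_level = n_efficiency(25, 100)  # <50% of ingested N ends up in products
farm_level = n_efficiency(48, 100)    # recycling excreta as fertiliser raises
                                      # the whole-farm ratio toward 45-50%
assert farm_level > animal_level
```

The same function applies unchanged at the field or farm scale; only the boundaries of the N budget (what counts as input and as product) change.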
This study aimed to investigate the impact of repeated acidosis challenges (ACs) and the effect of live yeast supplementation (Saccharomyces cerevisiae I-1077, SC) on rumen fermentation, the microbial ecosystem and the inflammatory response. The experimental design involved two groups (SC, n=6; Control, n=6) of rumen-fistulated wethers that were successively exposed to three ACs of 5 days each, preceded and followed by resting periods (RPs) of 23 days. AC diets consisted of 60% wheat-based concentrate and 40% hay, whereas RP diets consisted of 20% concentrate and 80% hay. ACs induced changes in rumen fermentation parameters (pH, lactate and volatile fatty-acid concentrations and proportions) as well as in microbiota composition and diversity. The first challenge drove the fermentation pattern towards propionate. During successive challenges, rumen pH measures worsened in the control group and the fermentation profile was characterised by a higher butyrate proportion and changes in the microbiota. The first AC induced a strong release of rumen histamine and lipopolysaccharide that triggered an increase of acute-phase proteins in the plasma. This inflammatory status was maintained during all AC repetitions. Our study suggests that the response of sheep to an acidosis diet is greatly influenced by the feeding history of individuals. In live yeast-supplemented animals, the first AC was as drastic as in control sheep. However, during subsequent challenges, yeast supplementation helped stabilise fermentation parameters, promoted protozoal numbers and decreased lactate-producing bacteria. At the systemic level, yeast helped normalise the inflammatory status of the animals.