To fill gaps in the available information on the influence of spiritual and religious advisors as informal providers of care for mental health problems in Europe.
Recourse to religious practice or belief when coping with mental health problems was evaluated using data from the ESEMED survey, a stratified, multistage, clustered-area probability sample survey of mental health carried out in six European countries and including 8796 subjects. Between-country differences in sociodemographic characteristics, religious affiliation, prevalence of mental disorders and management of mental disorders were evaluated.
Religion appears to play a limited role in coping with mental health problems in Europe. Only 7.9% of individuals seeking help for such problems turned to a religious advisor. This proportion differed between countries: 13% in Italy, 12.5% in Germany, 10.5% in the Netherlands, 5.8% in France, 4.7% in Belgium and 4% in Spain. In addition, seeking help exclusively from religion was reported by only 1.3% of subjects. Practicing religion at least once a week and considering religion as important in daily life were predictors of using religion rather than conventional health care only. Use of religion was not influenced by gender or age. Non-Christian respondents and individuals with alcohol disorders were more likely to use religion. In Spain, the use of religion was much lower than average.
Unlike the situation in the United States, organised religion does not provide alternative informal mental health care in Europe. At best, it could be considered as an adjunct to conventional care.
Species distribution models (SDMs) are statistical tools used to develop continuous predictions of species occurrence. ‘Integrated SDMs’ (ISDMs) are an elaboration of this approach whose potential advantages include the joint use of opportunistically collected presence-only data and site-occupancy data from planned surveys. These models also account for survey bias and imperfect detection through a hierarchical modelling framework that separately estimates the species–environment response and the detection process. This is particularly helpful for conservation applications and predictions for rare species, where data are often limited and prediction errors may have significant management consequences. Despite this potential importance, ISDMs remain largely untested under a variety of scenarios. We explored key modelling decisions and assumptions in an ISDM using the endangered Baird’s tapir (Tapirus bairdii) as a test species. We found that site area had the strongest effect on the magnitude of population estimates and the underlying intensity surface, driven by estimates of the model intercepts. Selecting a site area that accounted for the individual movements of the species within an average home range led to population estimates that coincided with expert estimates. ISDMs that do not account for the individual movements of species will likely lead to less accurate estimates of species intensity (number of individuals per unit area) and thus of overall population estimates. This bias could be severe and highly detrimental to conservation actions if uninformed ISDMs are used to estimate global populations of threatened and data-deficient species, particularly those that lack natural history and movement information.
However, the ISDM was consistently the most accurate model compared to other approaches, which demonstrates the importance of this new modelling framework and the ability to combine opportunistic data with systematic survey data. Thus, we recommend researchers use ISDMs with conservative movement information when estimating population sizes of rare and data-deficient species. ISDMs could be improved by using a similar parameterization to spatial capture–recapture models that explicitly incorporate animal movement as a model parameter, which would further remove the need for spatial subsampling prior to implementation.
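The core of an ISDM is a single intensity surface feeding two likelihoods: a thinned Poisson process for the opportunistic presence-only records and an occupancy/detection model for the planned surveys. Below is a minimal sketch of that joint likelihood fitted to simulated data; the covariate, thinning rate, detection probability and all numeric values are invented for illustration and are not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(1)

# Hypothetical landscape: n grid cells, one environmental covariate x
n = 500
x = rng.normal(size=n)
site_area = 1.0          # the 'site area' choice the abstract highlights

# Simulate a shared intensity surface and both data streams
beta0, beta1 = -1.0, 0.8
lam = np.exp(beta0 + beta1 * x)                      # individuals per unit area
po_counts = rng.poisson(0.3 * lam * site_area)       # presence-only, thinned by sampling bias
occ = rng.binomial(1, 1 - np.exp(-lam * site_area))  # latent occupancy state
y = rng.binomial(3, 0.6 * occ)                       # detections over 3 survey visits

def negloglik(theta):
    b0, b1, log_thin, logit_p = theta
    lam = np.exp(b0 + b1 * x)
    # Presence-only stream: thinned Poisson counts per cell
    mu = np.exp(log_thin) * lam * site_area
    ll_po = np.sum(po_counts * np.log(mu) - mu)
    # Planned-survey stream: occupancy/detection sharing the same intensity
    psi = 1 - np.exp(-lam * site_area)
    p = expit(logit_p)
    lik_occ = psi * p**y * (1 - p)**(3 - y) + (1 - psi) * (y == 0)
    val = -(ll_po + np.sum(np.log(lik_occ)))
    return val if np.isfinite(val) else 1e12

fit = minimize(negloglik, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 4000, "xatol": 1e-6, "fatol": 1e-6})
b0_hat, b1_hat = fit.x[:2]
```

Note that the intensity intercept is confounded with the thinning rate in the presence-only stream alone; the occupancy stream is what identifies it, which is why the `site_area` assumption the abstract discusses enters both likelihoods and propagates directly into population estimates.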
Major depressive disorder (MDD) is a leading cause of disease burden worldwide, with a lifetime prevalence of 17% in the United States. Here we present the results of the first prospective, large-scale, patient- and rater-blind, randomized controlled trial evaluating the clinical importance of achieving congruence between combinatorial pharmacogenomic (PGx) testing and medication selection for MDD.
1,167 outpatients diagnosed with MDD and an inadequate response to ≥1 psychotropic medications were enrolled and randomized 1:1 to a Treatment as Usual (TAU) arm or PGx-guided care arm. Combinatorial PGx testing categorized medications into three groups based on the level of gene-drug interactions: use as directed, use with caution, or use with increased caution and more frequent monitoring. Patient assessments were performed at weeks 0 (baseline), 4, 8, 12 and 24. Patients, site raters, and central raters were blinded in both arms until after week 8. In the guided-care arm, physicians had access to the combinatorial PGx test result to guide medication selection. Primary outcomes utilized the Hamilton Depression Rating Scale (HAM-D17) and included symptom improvement (percent change in HAM-D17 from baseline), response (≥50% decrease in HAM-D17 from baseline), and remission (HAM-D17<7) at the fully blinded week 8 time point. The durability of patient outcomes was assessed at week 24. Medications were considered congruent with PGx test results if they were in the ‘use as directed’ or ‘use with caution’ report categories, while medications in the ‘use with increased caution and more frequent monitoring’ category were considered incongruent. Patients who started on incongruent medications were analyzed separately according to whether they changed to congruent medications by week 8.
At week 8, symptom improvement for individuals in the guided-care arm was not significantly different from that in TAU (27.2% versus 24.4%, p=0.11). However, individuals in the guided-care arm were more likely than those in TAU to achieve remission (15% versus 10%; p<0.01) and response (26% versus 20%; p=0.01). Remission rates, response rates, and symptom reductions continued to improve in the guided-care arm until the week 24 time point. Congruent prescribing increased to 91% in the guided-care arm by week 8. Among patients who were taking one or more incongruent medication at baseline, those who changed to congruent medications by week 8 demonstrated significantly greater symptom improvement (p<0.01), response (p=0.04), and remission rates (p<0.01) compared to those who persisted on incongruent medications.
Combinatorial PGx testing improves short- and long-term response and remission rates for MDD compared to standard of care. In addition, prescribing congruency with PGx-guided medication recommendations is important for achieving symptom improvement, response, and remission for MDD patients.
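The trial's three primary endpoints are simple functions of the baseline and week-8 HAM-D17 scores. A small illustration with invented scores (not trial data):

```python
# Toy illustration of the trial's primary endpoints from HAM-D17 scores.
# The scores passed in below are invented, not taken from the trial.

def hamd_outcomes(baseline, week8):
    """Return (% improvement, responder?, remitter?) for one patient."""
    pct_improvement = 100.0 * (baseline - week8) / baseline
    response = week8 <= 0.5 * baseline   # >=50% decrease from baseline
    remission = week8 < 7                # HAM-D17 < 7
    return pct_improvement, response, remission

pct, resp, rem = hamd_outcomes(24, 11)   # ~54% improvement
print(resp, rem)  # True False: a responder who has not reached remission
```

The asymmetry at week 8 reported above (significant response and remission differences without a significant mean symptom-improvement difference) is possible precisely because response and remission dichotomize the underlying score.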
Funding Acknowledgements: This study was supported by Assurex Health, Inc.
Pigs selected for high performance may be more at risk of developing diseases. This study aimed to assess the health and performance of two pig lines divergently selected for residual feed intake (RFI) (low RFI (LRFI) v. high RFI (HRFI)) and housed in two contrasting hygiene conditions (poor v. good) using a 2×2 factorial design (n=40/group). The challenge period (Period 1) started at week zero (W0), when 12-week-old pigs were transferred to good or poor housing conditions. At week 6 (W6), half of the pigs in each group were slaughtered. During a recovery period (Period 2), from W6 to W13–W14, the remaining pigs (n=20/group) were transferred to good hygiene conditions before being slaughtered. Blood was collected every 3 weeks (Period 1) or every 2 weeks (Period 2) to assess blood indicators of immune and inflammatory responses. Pulmonary lesions at slaughter and performance traits were evaluated. At W6, pneumonia prevalence was greater for pigs housed in poor than in good conditions (51% v. 8%, P<0.001). Irrespective of hygiene conditions, lung lesion scores were lower for LRFI pigs than for HRFI pigs (P=0.03). At W3, LRFI pigs in poor conditions had the highest number of blood granulocytes (hygiene×line, P=0.03), and at W6, HRFI pigs in poor conditions had the greatest plasma haptoglobin concentrations (hygiene×line, P=0.02). During Period 1, growth rate and gain-to-feed ratio were less affected by poor hygiene in LRFI pigs than in HRFI pigs (hygiene×line, P=0.001 and P=0.02, respectively). LRFI pigs in poor conditions ate more than the other groups (hygiene×line, P=0.002). Irrespective of the line, fasting plasma glucose concentrations were higher in poor conditions, whereas fasting free fatty acid concentrations were lower than in good conditions. At the end of Period 2, pneumonia prevalence was similar for both housing conditions (39% v. 38%).
During Period 2, plasma protein concentrations were greater for pigs housed in poor than in good conditions during Period 1. Immune traits, gain-to-feed ratio, BW gain and feed consumption did not differ during Period 2. Nevertheless, at W12, the BW of HRFI pigs previously housed in poor conditions was 13.4 kg lower than that of HRFI pigs previously housed in good conditions (P<0.001). In conclusion, the health of the more feed-efficient LRFI pigs was less impaired by poor hygiene conditions. This line was able to preserve its health, growth performance and feed intake to a greater extent than the less efficient HRFI line.
This review summarizes the results from the INRA (Institut National de la Recherche Agronomique) divergent selection experiment on residual feed intake (RFI) in growing Large White pigs over nine generations of selection, and discusses the remaining challenges and perspectives for the improvement of feed efficiency in growing pigs. The impacts on growing pigs raised under standard conditions and in alternative situations such as heat stress, inflammatory challenges or lactation have been studied. After nine generations, divergent selection for RFI led to highly significant (P<0.001) line differences for RFI (−165 g/day in the low RFI (LRFI) line compared with the high RFI (HRFI) line) and daily feed intake (−270 g/day). Small responses were observed for growth rate (−12.8 g/day, P<0.05) and body composition (+0.9 mm backfat thickness, P=0.57; −2.64% lean meat content, P<0.001), with a marked response in feed conversion ratio (−0.32 kg feed/kg gain, P<0.001). Reduced ultimate pH and increased lightness of the meat (P<0.001) were observed in LRFI pigs, with minor impact on the sensory quality of the meat. These changes in meat quality were associated with changes in muscle energy metabolism. Reduced maintenance energy requirements (−10% after five generations of selection) and activity (−21% of time standing after six generations of selection) of LRFI pigs greatly contributed to the gain in energy efficiency. However, the impact of selection for RFI on the protein metabolism of the pig remains unclear. Digestibility of energy and nutrients was not affected by selection, either for pigs fed conventional diets or for pigs fed high-fibre diets. A significant improvement in digestive efficiency could likely be achieved by selecting pigs fed high-fibre diets. No convincing genetic or blood biomarker has been identified to explain the differences in RFI, suggesting that pigs have various ways of achieving an efficient use of feed.
No deleterious impact of the selection on sow reproduction performance was observed. The resource allocation theory states that low RFI may reduce the ability to cope with stressors, via the reduction of a buffer compartment dedicated to responses to stress. None of the experiments focussing on the response of pigs to stress or challenges could confirm this theory. Understanding the relationships between RFI and responses to stress and energy-demanding processes, such as immunity and lactation, remains a major challenge for a better understanding of the underlying biological mechanisms of the trait and for reconciling the experimental results with the resource allocation theory.
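Residual feed intake, the selection criterion throughout this work, is by construction the residual of a regression of feed intake on production and maintenance traits: an efficient (low-RFI) pig eats less than predicted for its performance. A minimal sketch with invented data; the covariates, coefficients and units are illustrative, not INRA's actual selection index:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented records for 200 pigs (all coefficients and units illustrative)
n = 200
adg = rng.normal(900, 80, n)       # average daily gain, g/day
backfat = rng.normal(14, 2, n)     # backfat thickness, mm
met_wt = rng.normal(25, 2, n)      # metabolic body weight, kg^0.75
dfi = (1.2 * adg + 40 * backfat + 30 * met_wt
       + rng.normal(0, 120, n))    # observed daily feed intake, g/day

# Expected intake is the linear regression of intake on production and
# maintenance traits; RFI is the residual (observed minus expected)
X = np.column_stack([np.ones(n), adg, backfat, met_wt])
coef, *_ = np.linalg.lstsq(X, dfi, rcond=None)
rfi = dfi - X @ coef
```

By construction RFI averages zero across the population and is uncorrelated with the regression covariates, which is why line differences in RFI can be interpreted independently of growth rate and body composition.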
We analyze photoluminescence (PL) and electroluminescence (EL) using a hyperspectral imager that records spectrally resolved luminescence images of solar cell absorbers. The system is calibrated to yield the luminescence flux in absolute values, which enables quantitative imaging of physical parameters such as the photovoltage with an uncertainty of less than 30 mV. The wide-field illumination, low-power excitation and fast acquisition bring new insights compared with classical setups such as confocal microscopy. Several types of absorbers have been analyzed. For instance, we can investigate spatial fluctuations of the quasi-Fermi level splitting in CIGS polycrystalline absorbers and link those fluctuations to transport properties. The method is general enough that third-generation PV cell absorbers can also be evaluated; we illustrate the potential of our setup by imaging the quasi-Fermi level splitting in intermediate band solar cells. Such techniques, which directly evaluate the performance of photovoltaic absorbers and devices, are needed for fast, high-throughput investigation of combinatorial experiments such as those carried out in the materials genomics programme.
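The quasi-Fermi level splitting Δμ can be read off an absolutely calibrated PL spectrum through the generalized Planck law: in the Boltzmann limit, ln(φ/E²) is linear in photon energy E, with the slope giving the carrier temperature and the intercept giving Δμ. A toy reconstruction on a synthetic spectrum, assuming absorptivity a(E) = 1 and normalising the radiometric prefactor to 1 (both simplifications; all values invented):

```python
import numpy as np

kT = 0.025                       # thermal energy, eV (~290 K)
dmu_true = 1.05                  # assumed quasi-Fermi level splitting, eV

E = np.linspace(1.15, 1.35, 50)  # photon energies in the high-energy tail, eV
phi = E**2 * np.exp((dmu_true - E) / kT)   # synthetic "calibrated" PL flux

# ln(phi / E^2) = (dmu - E)/kT is linear in E:
# slope -> carrier temperature, intercept -> delta-mu
y = np.log(phi / E**2)
slope, intercept = np.polyfit(E, y, 1)
kT_fit = -1.0 / slope
dmu_fit = intercept * kT_fit
print(round(dmu_fit, 3))  # 1.05
```

The absolute calibration matters here: with an unknown instrument factor, the intercept would confound Δμ with the calibration constant, which is why the abstract stresses absolute flux values.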
We examined changes to the behaviour of flour beetles, Tribolium confusum, infected with the rodent stomach worm, the spirurid Protospirura muricola, in the context of the ‘Behavioural Manipulation Hypothesis’. Tribolium confusum infected with the third-stage infective larvae of P. muricola showed consistently altered patterns of behaviour. Relative to uninfected beetles, over a measured time period, infected beetles moved over a shorter distance, moved more slowly when they did move, were more likely to stay in the illuminated area of their environment, were more likely to emerge from darkened areas into illuminated areas, and had significantly shortened longevity. The changes in behaviour, as reflected in effects on speed of movement, were only evident among beetles that actually harboured infective cysts and not among those carrying younger infections, when the larvae within their haemocoels would have been at an earlier stage of development and not yet capable of infecting the definitive murine hosts. We discuss whether these changes would have made the beetles more susceptible to predation by rodents, and specifically by the omnivorous eastern spiny mouse, Acomys dimidiatus, the natural definitive host of this parasite in Egypt, from where the P. muricola isolate originated, and whether they support the Behavioural Manipulation Hypothesis or reflect parasite-induced pathology.
We compared serotype distributions of Streptococcus pneumoniae isolates from patients aged <5 and ⩾5 years with invasive pneumococcal disease in New South Wales, Australia, and antibiotic susceptibilities of isolates from the <5 years age group only, before (2002–2004) and after (2005–2009) introduction of the 7-valent pneumococcal conjugate vaccine (PCV7). Overall, there were significant decreases in the mean annual number of referred isolates (770 vs. 515) and the proportion belonging to PCV7 serotypes (74% vs. 38%), but non-PCV7 serotypes, particularly 19A, increased (5% vs. 18%). All changes were more marked in the <5 years age group. Susceptibility testing of isolates from the <5 years age group showed variation in resistance between serotypes, but significant overall increases in penicillin non-susceptibility (23% vs. 31%), ceftriaxone resistance (2% vs. 12%) and multidrug resistance (4% vs. 7%) rates; erythromycin resistance fell (32% vs. 25%). Continued surveillance is needed to monitor changes following the introduction of 13-valent PCV in 2012.
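The headline before/after change in the PCV7-serotype share is a standard two-proportion comparison. A sketch with hypothetical counts scaled from the reported mean annual isolate numbers (770 v. 515) and proportions (74% v. 38%); the exact cell counts are invented:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: PCV7 vs. non-PCV7 serotypes, before vs. after PCV7
pre = [570, 200]    # ~74% PCV7 of ~770 mean annual isolates (2002-2004)
post = [196, 319]   # ~38% PCV7 of ~515 mean annual isolates (2005-2009)

chi2, p, dof, expected = chi2_contingency([pre, post])
print(p < 0.001)  # True: a decline of this size at this n is highly significant
```

The same construction applies to each of the resistance-rate comparisons quoted in the abstract (penicillin, ceftriaxone, multidrug, erythromycin), substituting the relevant counts.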
Lyme borreliosis (LB) is the most common arthropod-borne disease of humans in the Northern hemisphere. In Europe, the causative agent, the Borrelia burgdorferi sensu lato complex, is principally vectored by Ixodes ricinus ticks. The aim of this study was to identify environmental factors influencing questing I. ricinus nymph abundance and B. burgdorferi s.l. infection in questing nymphs, using a large-scale survey across Scotland. Ticks, host dung and vegetation were surveyed at 25 woodland sites, and climatic variables from a Geographical Information System (GIS) were extracted for each site. A total of 2397 10-m² transect surveys were conducted and 13 250 I. ricinus nymphs counted. Questing nymphs were assayed for B. burgdorferi s.l.; the average infection prevalence was 5·6% (range 0·8–13·9%). More questing nymphs and a higher incidence of B. burgdorferi s.l. infection were found in areas with higher deer abundance and in mixed/deciduous compared with coniferous forests; weaker correlations were found with season, altitude, rainfall and ground vegetation. No correlation was found between nymph abundance and infection prevalence within the ranges encountered. An understanding of the environmental conditions associated with tick abundance and pathogen prevalence may be used to reduce the risk of exposure and to predict future pathogen prevalence and distributions under environmental changes.
The numbers and serotypes of Clostridium perfringens present in the faeces of three groups of hospital patients and young healthy laboratory workers were examined in studies lasting between 10 and 13 weeks.
In one hospital some long-stay geriatric patients carried relatively high numbers of C. perfringens (>10⁷/g) most of the time and it was not unusual in any one week for the majority of these patients to carry the same serotype(s). However, the numbers of C. perfringens in the faeces of young long-stay patients in the same hospital were in the range of 10³–10⁴/g and carriage of common serotypes was not observed. These results were similar to the findings with the young laboratory workers.
This investigation indicates that two of the laboratory criteria often used in the investigation of C. perfringens food poisoning, i.e. faecal counts of ≥10⁵ C. perfringens/g and patients carrying the same serological type, need to be interpreted with caution in suspected outbreaks involving some groups of geriatric long-stay hospital patients.
Comparing pertussis epidemiology over time and between countries is confounded by differences in diagnostic and notification practices. Standardized serological methods applied to population-based samples enhance comparability. The population prevalence of different levels of pertussis toxin IgG (PT IgG) antibody, measured by standardized methods, was compared by age group and region of Australia between 1997/1998 and 2002. The proportion of 5- to 9-year-olds with presumptive recent pertussis infection (based on IgG levels ⩾62·5 ELISA units/ml) significantly decreased in 2002, consistent with notification data for the same period and improved uptake of booster vaccines following the schedule change from whole-cell to acellular vaccine. In contrast, recent presumptive infection significantly increased in adults aged 35–49 years. Population-based serosurveillance using standardized PT IgG antibody assays has the potential to aid interpretation of trends in pertussis incidence in relation to vaccine programmes and between countries.
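The derived quantity in such a serosurvey is the proportion of samples at or above the 62·5 ELISA units/ml threshold, with a confidence interval. A self-contained sketch using a Wilson 95% interval; the sample titres below are invented:

```python
import math

def seroprevalence(titres, cutoff=62.5):
    """Proportion of samples at/above the cutoff, with a 95% Wilson CI.

    `titres` are PT IgG levels in ELISA units/ml; 62.5 is the
    presumptive-recent-infection threshold used in the text."""
    n = len(titres)
    k = sum(t >= cutoff for t in titres)
    p = k / n
    z = 1.959964  # 95% normal quantile
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, (centre - half, centre + half)

# Hypothetical serosurvey sample: 12 of 200 sera above the threshold
sample = [70.0] * 12 + [30.0] * 188
p, (lo, hi) = seroprevalence(sample)
print(round(p, 3))  # 0.06
```

The Wilson interval is preferred over the simple normal approximation here because age-group prevalences near the threshold can be small, where the normal interval misbehaves.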
We analysed 3 independently collected datasets of fully censused helminth burdens in wood mice, Apodemus sylvaticus, testing the a priori hypothesis of Behnke et al. (2005) that the presence of the intestinal nematode Heligmosomoides polygyrus predisposes wood mice to carrying other species of helminths. In Portugal, mice carrying H. polygyrus showed a higher prevalence of other helminths, but the magnitude of the effect was seasonal. In Egham, mice with H. polygyrus showed a higher prevalence of other helminth species, not confounded by other factors. In Malham Tarn, mice carrying H. polygyrus were more likely to be infected with other species, but only among older mice. Allowing for other factors, mice with heavy residual H. polygyrus infections carried more species of other helminths in both the Portugal and Egham data; species richness in Malham was too low to conduct a similar analysis, but as H. polygyrus worm burdens increased, so the prevalence of other helminths also increased. Our results support those of Behnke et al. (2005), providing firm evidence that, at the level of species richness, a highly predictable element of co-infections in wood mice has now been defined: infection with H. polygyrus has detectable consequences for the susceptibility of wood mice to other intestinal helminth species.
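The simplest version of the predisposition test is a 2×2 comparison of other-helminth prevalence between H. polygyrus-positive and -negative mice. A sketch with invented counts (not the Portugal, Egham or Malham data), before any adjustment for the confounders the abstract mentions:

```python
from scipy.stats import fisher_exact

# Hypothetical counts: wood mice cross-classified by H. polygyrus status
# and by carriage of at least one other helminth species (invented numbers
# sketching the predisposition effect described in the text)
#                 other helminths   no other helminths
hp_positive = [       45,                 35        ]
hp_negative = [       30,                 70        ]

odds, p = fisher_exact([hp_positive, hp_negative])
print(odds > 1)  # True: infected mice more often carry other species
```

In practice the abstract's seasonal and age interactions call for a model with those covariates (e.g. logistic regression) rather than the marginal table alone; this sketch shows only the unadjusted association.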
Serological typing was used as an epidemiological tool in the investigation of 524 outbreaks of Clostridium perfringens food poisoning in the United Kingdom and 37 outbreaks in other countries.
Five thousand five hundred and fifty-four (77%) of 7245 strains of C. perfringens associated with 561 outbreaks were typable with the 75 Food Hygiene Laboratory antisera; in 354 (63%) of these outbreaks a specific serotype was established as being responsible for the outbreak.
An assessment is made of the ability of two additional sets of antisera, prepared against 34 American and 34 Japanese strains of C. perfringens, to increase the number of strains which can be typed. The extent of cross-reaction between the three sets of antisera was determined and the results are discussed in relation to the source and history of the type strains.
Effective primary prevention of congenital toxoplasmosis requires up to date information on locally relevant risk factors for infection in pregnant women. In Naples, risk factors for toxoplasma infection were compared in recently infected women (as assessed by detection of specific IgM in serum) and susceptible, IgG negative women. Recent infection was strongly associated with frequency of consumption of cured pork and raw meat. Eating cured pork or raw meat at least once a month increased the risk of toxoplasma infection threefold.
This simple study design for determining locally relevant sources of toxoplasma infection yielded the first report of cured pork as a risk factor for infection. Further research is required to determine cyst viability in cured pork products. Our findings suggest that in southern Italy, cured pork and raw meat should be avoided by susceptible pregnant women.
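The "threefold" risk from this case-control design is an odds ratio from a 2×2 exposure table. A sketch with invented counts chosen to give an odds ratio of about 3 (these are not the Naples data), with a Woolf-type 95% confidence interval:

```python
import math

# Hypothetical 2x2 case-control table for at-least-monthly cured-pork
# consumption (counts invented to illustrate a ~threefold odds ratio)
#              exposed  unexposed
cases =    [    30,       25    ]   # recently infected (IgM-positive) women
controls = [    40,      100    ]   # susceptible (IgG-negative) women

a, b = cases
c, d = controls
or_ = (a * d) / (b * c)                 # odds ratio = ad/bc
se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(round(or_, 1))  # 3.0
```

Because recent infection is rare in the source population, the odds ratio from this IgM-positive vs. IgG-negative comparison approximates the relative risk quoted in the text.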