Abnormal effort-based decision-making represents a potential mechanism underlying motivational deficits (amotivation) in psychotic disorders. Previous research identified effort allocation impairment in chronic schizophrenia and has focused mostly on the physical effort modality. No study has investigated cognitive effort allocation in first-episode psychosis (FEP).
Cognitive effort allocation was examined in 40 FEP patients and 44 demographically-matched healthy controls, using the Cognitive Effort-Discounting (COGED) paradigm, which quantified participants’ willingness to expend cognitive effort in terms of explicit, continuous discounting of monetary rewards based on parametrically-varied cognitive demands (levels N of the N-back task). The relationship between reward-discounting and amotivation was investigated. Group differences in reward-magnitude and effort-cost sensitivity, and the differential associations of these sensitivity indices with amotivation, were explored.
Patients displayed significantly greater reward-discounting than controls. In particular, such discounting was most pronounced in patients with high levels of amotivation, even when N-back performance and reward base amount were taken into consideration. Moreover, patients exhibited reduced reward-benefit sensitivity and effort-cost sensitivity relative to controls, and decreased sensitivity to reward-benefit, but not effort-cost, was correlated with diminished motivation. Reward-discounting and sensitivity indices were generally unrelated to other symptom dimensions, antipsychotic dose and cognitive deficits.
This study provides the first evidence of cognitive effort-based decision-making impairment in FEP, and indicates that decreased effort expenditure is associated with amotivation. Our findings further suggest that abnormal effort allocation and amotivation might primarily be related to blunted reward valuation. Prospective research is required to clarify the utility of effort-based measures in predicting amotivation and functional outcome in FEP.
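The willingness-to-expend-effort measure described above is commonly summarised, in COGED-style analyses, as an area under the curve of subjective value across effort levels. The sketch below is a hypothetical illustration of that convention, not the authors' analysis code; the function name, example indifference points and reward amounts are all assumptions.

```python
# Hypothetical sketch of a COGED-style discounting index (not the study's code).
# An indifference point is the reward at which a participant is indifferent between
# an easy baseline task and a harder N-back level offered for a fixed base reward.

def discounting_auc(indifference_points, base_reward, levels):
    """Area under the curve of subjective value across cognitive-effort levels.

    indifference_points: reward accepted at each N-back level (same order as levels)
    base_reward: fixed reward offered for the effortful option
    levels: parametrically varied cognitive loads, e.g. [2, 3, 4, 5, 6]
    Lower AUC = steeper effort discounting.
    """
    # Subjective value of each effortful option, normalised by the base reward
    sv = [ip / base_reward for ip in indifference_points]
    # Normalise effort levels to [0, 1] for trapezoidal integration
    lo, hi = min(levels), max(levels)
    x = [(l - lo) / (hi - lo) for l in levels]
    return sum((x[i + 1] - x[i]) * (sv[i] + sv[i + 1]) / 2
               for i in range(len(x) - 1))

# A participant who devalues higher loads steeply vs. one who barely discounts:
steep = discounting_auc([1.9, 1.5, 1.0, 0.6, 0.3], base_reward=2.0, levels=[2, 3, 4, 5, 6])
shallow = discounting_auc([2.0, 1.9, 1.8, 1.7, 1.6], base_reward=2.0, levels=[2, 3, 4, 5, 6])
assert steep < shallow  # steeper discounting yields a smaller AUC
```

Greater discounting in patients, as reported above, would correspond to systematically lower AUC values than in controls.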
A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality-of-life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The Graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (EBIC) model selection was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was only weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure and global strength.
Our results suggest a pivotal role of amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
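The network-construction and centrality steps described above can be sketched with off-the-shelf tools. The toy example below uses scikit-learn's graphical LASSO, which tunes the sparsity penalty by cross-validation rather than the EBIC selection used in the study, and synthetic data in place of the clinical variables; it shows how node strength, the centrality index reported above, is derived from the estimated partial-correlation network.

```python
# Illustrative sketch only: synthetic data, CV-tuned penalty instead of EBIC.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
# Toy data standing in for the psychopathology variables (n subjects x p nodes)
X = rng.standard_normal((300, 6))
X[:, 1] += 0.8 * X[:, 0]          # induce one strong conditional dependency

model = GraphicalLassoCV().fit(X)

# Convert the sparse precision matrix to partial correlations (the edge weights)
P = model.precision_
d = np.sqrt(np.diag(P))
pcor = -P / np.outer(d, d)
np.fill_diagonal(pcor, 0.0)

# Node strength centrality: sum of absolute edge weights incident to each node
strength = np.abs(pcor).sum(axis=0)
central_node = int(np.argmax(strength))   # here, one of the two linked nodes
```

In the study's network, the node playing the role of `central_node` was amotivation.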
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
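The heritability estimates discussed above are typically obtained from structural equation (ACE) twin models; the classical Falconer approximation below conveys the underlying logic in its simplest form. The correlations used are illustrative textbook values, not CODATwins results.

```python
# Classical Falconer decomposition of twin correlations into variance components
# (a simplified sketch of the kind of estimate twin studies report; illustrative only).

def falconer_ace(r_mz, r_dz):
    """Estimate additive genetic (A), shared environment (C) and unique
    environment (E) variance shares from MZ and DZ twin correlations."""
    a2 = 2 * (r_mz - r_dz)   # heritability: MZ pairs share ~2x the additive genes of DZ pairs
    c2 = 2 * r_dz - r_mz     # shared (family) environment
    e2 = 1 - r_mz            # unique environment plus measurement error
    return a2, c2, e2

# e.g. adult height often shows r_MZ ~ 0.9 and r_DZ ~ 0.5:
a2, c2, e2 = falconer_ace(0.9, 0.5)  # approximately (0.8, 0.1, 0.1)
```

Tracking how `a2` changes when the correlations are computed per age band is, in essence, how heritability can be seen to change from infancy to old age.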
Norovirus, a major cause of gastroenteritis in people of all ages worldwide, was first reported in South Korea in 1999. The most common causal agents of pediatric acute gastroenteritis are norovirus and rotavirus. While vaccination has reduced the pediatric rotavirus infection rate, norovirus vaccines have not been developed. Therefore, prediction and prevention of norovirus are very important. Norovirus is divided into genogroups GI–GVII, with GII.4 being the most prevalent. However, in 2012–2013, GII.17 showed a higher incidence than GII.4 and a novel variant, GII.P17-GII.17, appeared. In this study, 204 stool samples collected in 2013–2014 were screened by reverse transcriptase-polymerase chain reaction; 11 GI (5.39%) and 45 GII (22.06%) noroviruses were identified. GI.4, GI.5, GII.4, GII.6 and GII.17 were detected. The whole genomes of the three norovirus GII.17 strains were sequenced. The whole genome of GII.17 consists of three open reading frames of 5109, 1623 and 780 bp. Compared with 20 GII.17 strains isolated in other countries, we observed numerous changes in the protruding P2 domain of VP1 in the Korean GII.17 viruses. Our study provided genome information that might aid in epidemic prevention, epidemiological studies and vaccine development.
Introduction: Many drugs, including cannabis and alcohol, cause impairment and contribute to motor vehicle collisions (MVCs). Policy makers require knowledge of the prevalence of drug use in crash-involved drivers, and types of drugs used, in order to develop effective prevention programs. This issue is particularly relevant with the recent legalization of cannabis. We aim to study the prevalence of alcohol, cannabis, sedating medications, and other drugs in injured drivers from 4 Canadian provinces. Methods: This prospective cohort study obtained excess clinical blood samples from consecutive injured drivers who attended a participating Canadian trauma centre following an MVC. Blood samples were analyzed using a broad spectrum toxicology screen capable of detecting cannabinoids, cocaine, amphetamines (including their major analogues), and opioids as well as psychotropic pharmaceuticals (including antihistamines, benzodiazepines, other hypnotics, and sedating antidepressants). Alcohol and cannabinoids were quantified. Health records were reviewed to extract demographic, medical, and MVC information using a standardized data collection tool. Results: This study has been collecting data in 4 trauma centres in British Columbia (BC) since 2011 and was launched in 2 trauma centres in Alberta (AB), 1 in Saskatchewan (SK), and 2 in Ontario (ON) in 2018. In preliminary results from BC (n = 2412), 8% of injured drivers tested positive for THC and 13% for alcohol. Preliminary results from other provinces (n = 301) suggest a regional variation in prevalence of drivers testing positive for THC (10% - 27%), alcohol (17% - 29%), and other drugs. By May 2018, an estimated 4500 cases from BC, 600 from AB, 150 from SK, and 650 from ON will have been analyzed. We will report the prevalence of positive tests for alcohol, THC, other recreational drugs, and sedating medications, pre and post cannabis legalization.
The number of cases with alcohol and/or THC levels above Canadian per se limits will also be reported. Results will be reported according to province, driver sex, age, single vs. multi vehicle crashes, and requirement for hospital admission. Conclusion: This will be among the largest international datasets on drug use by injured drivers. Our findings will provide patterns of drug and alcohol impairment in 4 Canadian provinces pre and post cannabis legalization. The significance of these findings and implication for impaired driving policy and prevention programs in Canada will be discussed.
Increasingly, products are designed for global markets, yet studies of design practices primarily investigate designers from high-income countries. Specifically, the use of prototypes during design is likely affected by the background of the designer and the environment in which they are designing. To broaden our understanding of the extent to which prototyping best practices are used beyond Western designers, in this study, we conducted interviews with novice designers from Ghana, a middle-income country (MIC), to examine how Ghanaian novice designers (upper-level undergraduate students) used prototypes throughout their design courses. We compared the reported use of prototypes to best practice behaviors and analyzed the types of prototypes used. We found evidence that these Ghanaian novice designers used some critical prototyping best practice behaviors, while other behaviors were underutilized, specifically during the front-end phases of design and for the purpose of engaging with stakeholders. Additionally, virtual models dominated their prototyping choices. We discuss likely reasons for these trends based on participants’ design experiences and design contexts.
This study evaluated tumour necrosis factor-α, interleukins 10 and 12, and interferon-γ levels, peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 expression in unilateral sudden sensorineural hearing loss.
Twenty-four patients with unilateral sudden sensorineural hearing loss, and 24 individuals with normal hearing and no history of sudden sensorineural hearing loss (who were attending the clinic for other problems), were enrolled. Peripheral blood mononuclear cells, and clusters of differentiation 11c and 86 were isolated and analysed. Plasma and supernatant levels of tumour necrosis factor-α, interferon-γ, and interleukins 10 and 12 were measured.
There were no significant differences with respect to age and gender. Monocyte population, mean tumour necrosis factor-α level and cluster of differentiation 86 expression were significantly increased in the study group compared to the control group. However, interferon-γ and interleukin 12 levels were significantly decreased. The difference in mean interleukin 10 level was not significant.
Increases in tumour necrosis factor-α level and monocyte population might play critical roles in sudden sensorineural hearing loss. This warrants detailed investigation and further studies on the role of dendritic cells in sudden sensorineural hearing loss.
The present study examined the prevalence of and risk factors for malnutrition in a population-based cohort of women of childbearing age in rural Bangladesh.
A cross-sectional study that collected pre-pregnancy weight, height, and data on selected risk factors for nutritional status of women.
The study was conducted in Sylhet District of Bangladesh.
Study subjects included 13 230 non-pregnant women of childbearing age. Women were classified into underweight (<18·5 kg/m2), normal (18·5–24·9 kg/m2) and overweight/obese (≥25·0 kg/m2) using BMI; and into moderate to severe stunting (<150 cm), mild stunting (150–<155 cm) and normal (≥155 cm) using height. Two multinomial logistic regression models were fitted for BMI: model 1 examined individual and household factors associated with BMI, and model 2 additionally examined the association of community variables. The same analysis was conducted for height.
Prevalence of underweight, overweight/obesity and moderate to severe stunting was 37·0, 7·2 and 48·6 %, respectively. Women’s education and household wealth were inversely related to both underweight status and stunting. Underweight rate was significantly lower in the post-harvest season. Women with any education and who belonged to households with higher wealth were more likely to be overweight/obese.
The study documented high rates of underweight and stunting and a moderate rate of overweight/obesity among rural Bangladeshi women, and recommends the design and implementation of a multidimensional intervention programme, based on individual-, household- and community-level risk factors, that addresses underweight, stunting and overweight/obesity to improve the nutritional status of women of childbearing age in Bangladesh.
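The cut-offs used in this study translate directly into a small classification helper. This is an illustrative encoding of the thresholds reported above, not the authors' analysis code.

```python
# Classification thresholds exactly as reported in the study.

def classify_bmi(bmi):
    """BMI category (kg/m2) per the study's cut-offs."""
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"            # 18.5-24.9 kg/m2
    return "overweight/obese"      # >= 25.0 kg/m2

def classify_height(height_cm):
    """Stunting category per the study's height cut-offs."""
    if height_cm < 150:
        return "moderate to severe stunting"
    if height_cm < 155:
        return "mild stunting"     # 150 to <155 cm
    return "normal"                # >= 155 cm

assert classify_bmi(17.9) == "underweight"
assert classify_height(152.0) == "mild stunting"
```

These categorical outcomes are what the two multinomial logistic regression models described above take as their dependent variables.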
Psychotropic medication use and psychiatric symptoms during pregnancy each are associated with adverse neurodevelopmental outcomes in offspring. Commonly, studies considering medication effects do not adequately assess symptoms, nor evaluate children when the effects are believed to occur, the fetal period. This study examined maternal serotonin reuptake inhibitor and polypharmacy use in relation to serial assessments of five indices of fetal neurobehavior and the Bayley Scales of Infant Development at 12 months in N = 161 socioeconomically advantaged, non-Hispanic White women with a shared risk phenotype: diagnosed major depressive disorder. On average, fetuses showed the expected development over gestation. In contrast, infant average Bayley psychomotor and mental development scores were low (M = 84.10 and M = 89.92, range of normal limits 85–114), with rates of delay more than 2–3 times what would be expected based on this measure's normative data. Controlling for prenatal and postnatal depressive symptoms, prenatal medication effects on neurobehavioral development were largely undetected in the fetus and infant. Mental health care directed primarily at symptoms may not address the additional psychosocial needs of women parenting infants. Speculatively, prenatal serotonin reuptake inhibitor exposure may act as a plasticity rather than risk factor, potentially enhancing receptivity to a nonoptimal postnatal environment in some mother–infant dyads.
Giardia duodenalis is the most common intestinal parasite of humans in the USA, but the risk factors for sporadic (non-outbreak) giardiasis are not well described. The Centers for Disease Control and Prevention and the Colorado and Minnesota public health departments conducted a case-control study to assess risk factors for sporadic giardiasis in the USA. Cases (N = 199) were patients with non-outbreak-associated laboratory-confirmed Giardia infection in Colorado and Minnesota, and controls (N = 381) were matched by age and site. Identified risk factors included international travel (aOR = 13.9; 95% CI 4.9–39.8), drinking water from a river, lake, stream, or spring (aOR = 6.5; 95% CI 2.0–20.6), swimming in a natural body of water (aOR = 3.3; 95% CI 1.5–7.0), male–male sexual behaviour (aOR = 45.7; 95% CI 5.8–362.0), having contact with children in diapers (aOR = 1.6; 95% CI 1.01–2.6), taking antibiotics (aOR = 2.5; 95% CI 1.2–5.0) and having a chronic gastrointestinal condition (aOR = 1.8; 95% CI 1.1–3.0). Eating raw produce was inversely associated with infection (aOR = 0.2; 95% CI 0.1–0.7). Our results highlight the diversity of risk factors for sporadic giardiasis and the importance of non-international-travel-associated risk factors, particularly those involving person-to-person transmission. Prevention measures should focus on reducing risks associated with diaper handling, sexual contact, swimming in untreated water, and drinking untreated water.
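The adjusted odds ratios reported above come from a matched logistic regression, but the generic textbook calculation below shows how a crude odds ratio and its 95% confidence interval are derived from a 2×2 exposure table. The counts are hypothetical, and a crude OR will generally differ from the adjusted ORs the study reports.

```python
# Generic crude odds-ratio calculation with a Woolf (log-scale) 95% CI.
# Hypothetical counts only; not the study's matched, adjusted analysis.
import math

def odds_ratio_ci(a, b, c, d):
    """a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se_log)
    hi = math.exp(math.log(or_) + 1.96 * se_log)
    return or_, lo, hi

# e.g. 30 of 199 cases vs. 10 of 381 controls reporting an exposure:
or_, lo, hi = odds_ratio_ci(30, 169, 10, 371)
```

A CI that excludes 1 (as for all the risk factors listed above) indicates a statistically significant association; an OR below 1 with a CI excluding 1, as for raw produce here, indicates an inverse association.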
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized education training or complex motor skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine if a resuscitation course taught in a spaced format, compared to the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (Paramedics and Emergency Medical Technicians (EMT)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course.
Three months following course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p = 0.012), with no statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p = 0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p = 0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p = 0.831) and adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p = 0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as that from the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention than traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low- (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies submitted to HI exercise were gradually trained for the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower and DE was decreased when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimated with the TFC, ADL and Ti methods for ponies subjected to LI exercise was 66.3%, 60.3% and 64.8%, respectively, while DMD for HI ponies was 64.2%, 60.3% and 65.2%, respectively.
In conclusion, physical exercise influences the GE digestibility of feed in ponies provided with equivalent levels of feed intake. In addition, the comparison of the two markers used for estimating apparent DMD and OMD indicates that externally supplemented Ti, unlike dietary ADL, is a suitable marker to determine digestibility of nutrients in horses performing exercise.
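The two digestibility methods compared in the study can be written in generic textbook form. The numbers below are illustrative, chosen only to land near the roughly 66% DMD reported for LI ponies, and are not the trial data; note that incomplete faecal marker recovery, as found here for ADL (87.8%), biases the marker-based estimate.

```python
# Textbook forms of the two DM digestibility calculations compared in the study.
# Illustrative numbers only, not the trial data.

def dmd_total_collection(dm_intake_kg, dm_faeces_kg):
    """Apparent DM digestibility (%) by total faeces collection (TFC)."""
    return 100 * (dm_intake_kg - dm_faeces_kg) / dm_intake_kg

def dmd_marker(marker_conc_feed, marker_conc_faeces):
    """Apparent DM digestibility (%) from an indigestible marker (e.g. Ti or ADL):
    the marker concentrates in the faeces as digestible DM disappears."""
    return 100 * (1 - marker_conc_feed / marker_conc_faeces)

# e.g. 5.65 kg DM eaten, 1.90 kg DM excreted (roughly 66% DMD):
tfc = dmd_total_collection(5.65, 1.90)
# marker at 0.53 g/kg DM in feed and 1.57 g/kg DM in faeces:
mk = dmd_marker(0.53, 1.57)
```

When faecal marker recovery is below 100%, the faecal marker concentration is underestimated, so the marker method underestimates digestibility, which is consistent with ADL giving lower DMD estimates than TFC in the results above.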
Simulation models are used widely in pharmacology, epidemiology and health economics (HEs). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD) dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) to no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared to no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction of influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentation at the emergency visit and patients’ quality of life. Integrating PK/PD–EPI/HE models is achievable. Whilst further refinement of this novel linkage model to more closely mimic reality is needed, the current study has generated useful insights to support influenza pandemic planning.
The aims of this study were to investigate the effects of either hearing, vision or dual sensory impairment on depressive symptoms and to identify subgroups that are vulnerable and significantly affected.
Data from the 2006–2014 Korean Longitudinal Study of Aging (KLoSA) were used and a total of 5832 individuals were included in this study. Depressive symptoms were assessed using the Center for Epidemiologic Studies Depression (CES-D10) scale. Sensory impairment was assessed according to the levels of self-reported hearing or vision, which were categorised as either good (excellent, very good or good) or poor (fair or poor). The changes in hearing or vision from the records of the previous survey were investigated. Changes from good to poor, indicating new onset, were defined as hearing impairment or vision impairment. Interactions between changes in hearing and vision were considered in the analysis. Dual sensory impairment was indicated when hearing impairment and vision impairment developed at the same time. Demographic, socioeconomic and health-related factors were considered as potential confounders and were adjusted for in the generalised estimating equation model.
Individuals with hearing impairment demonstrated significantly more severe depressive symptoms [β = 0.434, standard error (s.e.) = 0.097, p < 0.001] than those who had good hearing. Those with vision impairment also showed significantly more severe depressive symptoms (β = 0.253, s.e. = 0.058, p < 0.001) than those with good vision. When the interactions between hearing and vision were considered, participants with dual sensory impairment showed significantly more severe depressive symptoms (β = 0.768, s.e. = 0.197, p < 0.001) than those with good hearing and vision. The effects of single and dual sensory impairment on depressive symptoms were significant in both sexes and across age groups, except for vision impairment in male participants.
Hearing, vision and dual sensory impairment are significantly associated with depressive symptoms. Our results suggest that treatment or rehabilitation of either hearing or vision impairment would help prevent depression.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of past 12-month mental disorders in 138 801 participants aged 18–100 derived from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity levels. Satisfaction with conventional care was also compared with CAM contact satisfaction.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, which was two times higher in high-income countries (4.6%; standard error 0.3%) than in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable for different disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% for severe mood disorders, 16.2% for severe anxiety disorders and 22.5% for severe behavioural disorders. Satisfaction with care was comparable with respect to CAM contacts (78.3%) and conventional care (75.6%) in persons that received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary but are in contrast to suggestions that this concerns persons with only mild, transient complaints. There was no indication that persons were less satisfied by CAM visits than by receiving conventional care. We encourage health care professionals in conventional settings to openly discuss the care patients are receiving, whether conventional or not, and their reasons for doing so.
Traumatic events are associated with increased risk of psychotic experiences, but it is unclear whether this association is explained by mental disorders prior to psychotic experience onset.
To investigate the associations between traumatic events and subsequent psychotic experience onset after adjusting for post-traumatic stress disorder and other mental disorders.
We assessed 29 traumatic event types and psychotic experiences from the World Mental Health surveys and examined the associations of traumatic events with subsequent psychotic experience onset with and without adjustments for mental disorders.
Respondents with any traumatic events had three times the odds of subsequently developing psychotic experiences compared with other respondents (OR = 3.1, 95% CI 2.7–3.7), with variability in the strength of association across traumatic event types. These associations persisted after adjustment for mental disorders.
Exposure to traumatic events predicts subsequent onset of psychotic experiences even after adjusting for comorbid mental disorders.
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest, but nonetheless stronger, association of education than of income with treatment raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
Neonatal viability is one of the key factors affecting piglets’ vitality, which ultimately affects the survival and growth of piglets (England, 1974). As colostrum is the only food resource of neonatal piglets, their ability to acquire the colostrum as early as possible after their birth can determine their vitality. Piglets are usually supplied with creep food at some time during the suckling period in order to improve their performance before and after weaning. However, the creep food intake varies between litters and between individuals. Furthermore, the relationship between viability in early life and the acceptance of a new food (e.g. creep food) when piglets first encounter it is not fully understood. The objectives of this study were to investigate factors affecting the neonatal viability of piglets at birth and to identify the relationship between neonatal viability and subsequent creep feeding behaviour by piglets on d14-d15.
Creep food intake of suckling piglets varies considerably between individuals (Pajor et al., 1991). The creep feeding status of individual piglets can be monitored by video recording or by combining the weight of the food removed from the electronic dispensers with monitoring by video recording. However, the analysis of videotapes is time-consuming, which limits its widespread use on-farm. From a practical standpoint, monitoring the food intake of piglets either before or after weaning is important to provide useful information for a management strategy. Therefore, a general, quick and valid method to detect the food intake experience of piglets would be valuable. The aim of this investigation was to determine whether a device that automatically spray-marked piglets at the trough could reliably identify those pigs that had eaten the food in the trough.