A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not yet been applied to FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality of life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The Graphical Least Absolute Shrinkage and Selection Operator (LASSO), combined with extended Bayesian information criterion (EBIC) model selection, was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and only weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure or global strength.
Our results suggest a pivotal role for amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
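The network approach described above combines the graphical LASSO with EBIC-based model selection and then ranks nodes by strength centrality. A minimal Python sketch of that pipeline on simulated data (the study itself may have used dedicated R packages such as qgraph/bootnet; the EBIC form below is one common variant, and all data and variable choices here are illustrative, not from the study):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
# Simulated standardized scores for 5 hypothetical variables, n = 323
X = rng.standard_normal((323, 5))
X[:, 1] += 0.6 * X[:, 0]  # induce one true partial correlation

def ebic(model, X, gamma=0.5):
    """Extended BIC for a Gaussian graphical model (Foygel-Drton form)."""
    n, p = X.shape
    prec = model.precision_
    k = np.count_nonzero(np.triu(prec, 1))  # number of estimated edges
    S = np.cov(X.T, bias=True)
    loglik = n / 2 * (np.linalg.slogdet(prec)[1] - np.trace(S @ prec))
    return -2 * loglik + k * np.log(n) + 4 * gamma * k * np.log(p)

# Select the LASSO penalty (alpha) minimizing EBIC over a small grid
best = min((GraphicalLasso(alpha=a).fit(X) for a in [0.05, 0.1, 0.2, 0.4]),
           key=lambda m: ebic(m, X))

# Edge weights: partial correlations from the selected precision matrix
prec = best.precision_
d = np.sqrt(np.diag(prec))
pcorr = -prec / np.outer(d, d)
np.fill_diagonal(pcorr, 0.0)

# Node strength centrality: sum of absolute edge weights per node
strength = np.abs(pcorr).sum(axis=0)
print(strength)
```

In this sketch the most "central" node would be the one with the largest `strength` value, mirroring how amotivation was identified in the abstract.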
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m²) and size at birth, and additionally to address other research questions such as the long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world that have height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we examined associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twin pairs in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes, and to examine the effects of different exposures across time, geographical regions and socioeconomic status.
We investigated whether neurobehavioral markers of risk for emotion dysregulation were evident among newborns, as well as whether the identified markers were associated with prenatal exposure to maternal emotion dysregulation. Pregnant women (N = 162) reported on their emotion dysregulation prior to a laboratory assessment. The women were then invited to the laboratory to assess baseline respiratory sinus arrhythmia (RSA) and RSA in response to an infant cry. Newborns were assessed after birth via the NICU Network Neurobehavioral Scale. We identified two newborn neurobehavioral factors—arousal and attention—via exploratory factor analysis. Low arousal was characterized by less irritability, excitability, and motor agitation, while low attention was related to a lower threshold for auditory and visual stimulation, less sustained attention, and poorer visual tracking abilities. Pregnant women who reported higher levels of emotion dysregulation had newborns with low arousal levels and less attention. Larger decreases in maternal RSA in response to cry were also related to lower newborn arousal. We provide the first evidence that a woman's emotion dysregulation while pregnant is associated with risks for dysregulation in her newborn. Implications for intergenerational transmission of emotion dysregulation are discussed.
Introduction: Survival from cardiac arrest has been linked to the quality of resuscitation care. Unfortunately, healthcare providers frequently underperform in these critical scenarios, with a well-documented deterioration in skills weeks to months following advanced life support courses. Improving initial training and preventing decay in knowledge and skills are priorities in resuscitation education. The spacing effect has repeatedly been shown to have an impact on learning and retention. Despite its potential advantages, the spacing effect has seldom been applied to organized educational training or complex motor-skill learning, where it has the potential to make a significant impact. The purpose of this study was to determine whether a resuscitation course taught in a spaced format, compared with the usual massed instruction, results in improved retention of procedural skills. Methods: EMS providers (paramedics and emergency medical technicians (EMTs)) were block randomized to receive a Pediatric Advanced Life Support (PALS) course in either a spaced format (four 210-minute weekly sessions) or a massed format (two sequential 7-hour days). Blinded observers used expert-developed 4-point global rating scales to assess video recordings of each learner performing various resuscitation skills before, after and 3 months following course completion. Primary outcomes were performance on infant bag-valve-mask ventilation (BVMV), intraosseous (IO) insertion, infant intubation, and infant and adult chest compressions. Results: Forty-eight of 50 participants completed the study protocol (26 spaced and 22 massed). There was no significant difference between the two groups on testing before and immediately after the course.
Three months after course completion, participants in the spaced cohort scored higher overall for BVMV (2.2 ± 0.13 versus 1.8 ± 0.14, p = 0.012), with no statistically significant differences in scores for IO insertion (3.0 ± 0.13 versus 2.7 ± 0.13, p = 0.052), intubation (2.7 ± 0.13 versus 2.5 ± 0.14, p = 0.249), infant compressions (2.5 ± 0.28 versus 2.5 ± 0.31, p = 0.831) or adult compressions (2.3 ± 0.24 versus 2.2 ± 0.26, p = 0.728). Conclusion: Procedural skills taught in a spaced format result in learning at least as good as the traditional massed format; more complex skills taught in a spaced format may result in better long-term retention than traditional massed training, as there was a clear difference in BVMV and a trend toward a difference in IO insertion.
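The 3-month comparisons above report group means with standard errors. One quick way to sanity-check such summary statistics is a Welch-style t computed directly from them; a minimal sketch (illustrative only — the abstract does not state which test the authors used, so the reported p-values may come from a different procedure):

```python
from math import sqrt

def t_from_summary(m1, se1, m2, se2):
    """Welch-style t statistic computed from two group means and their
    standard errors (illustrative; not necessarily the study's test)."""
    return (m1 - m2) / sqrt(se1 ** 2 + se2 ** 2)

# BVMV at 3 months: spaced 2.2 +/- 0.13 vs massed 1.8 +/- 0.14
t = t_from_summary(2.2, 0.13, 1.8, 0.14)
print(round(t, 2))  # prints 2.09
```

A t around 2.1 with roughly 46 degrees of freedom is consistent with the BVMV difference being the only one to reach significance among the skills listed.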
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when resource allocation on the farm is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in the southwest of the United Kingdom, this paper proposes a novel, information-driven approach for carrying out comprehensive assessments of the economic-environmental trade-offs inherent in pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher level of soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.
Exercise and physical training are known to affect gastrointestinal function and digestibility in horses and can lead to inaccurate estimates of nutrient and energy digestibility when markers are used. The effect of exercise on apparent nutrient digestibility and faecal recoveries of ADL and TiO2 was studied in six Welsh pony geldings subjected to either a low- (LI) or high-intensity (HI) exercise regime according to a cross-over design. Ponies performing LI exercise were walked once per day for 45 min in a horse walker (5 km/h) for 47 consecutive days. Ponies submitted to HI exercise were gradually trained over the same 47 days according to a standardized protocol. Throughout the experiment, the ponies received a fixed level of feed and the daily rations consisted of 4.7 kg DM of grass hay and 0.95 kg DM of concentrate. The diet was supplemented with minerals, vitamins and TiO2 (3.0 g Ti/day). Total tract digestibility of DM, organic matter (OM), CP, crude fat, NDF, ADF, starch, sugar and energy was determined with the total faeces collection (TFC) method. In addition, DM and OM digestibility was estimated using internal ADL and the externally supplemented Ti as markers. Urine was collected on the final 2 days of each experimental period. Exercise did not affect apparent digestibility of CP, crude fat, starch and sugar. Digestibility of DM (DMD), OM (OMD), ADF and NDF tended to be lower and DE was decreased when ponies received the HI exercise regime. For all treatments combined, mean faecal recoveries of ADL and Ti were 87.8±1.7% and 99.3±1.7%, respectively. Ti was not detected in the urine, indicating that intestinal integrity was maintained with exercise. Dry matter digestibility estimated with the TFC, ADL and Ti methods for ponies subjected to LI exercise was 66.3%, 60.3% and 64.8%, respectively, while DMD for HI ponies was 64.2%, 60.3% and 65.2%, respectively.
In conclusion, physical exercise influences the GE digestibility of feed in ponies provided with equivalent levels of feed intake. In addition, the comparison of the two markers used for estimating apparent DMD and OMD indicates that externally supplemented Ti, unlike dietary ADL, is a suitable marker for determining nutrient digestibility in horses performing exercise.
Simulation models are used widely in pharmacology, epidemiology and health economics (HE). However, there have been no attempts to incorporate models from these disciplines into a single integrated model. Accordingly, we explored this linkage to evaluate the epidemiological and economic impact of oseltamivir dose optimisation in supporting pandemic influenza planning in the USA. An HE decision analytic model was linked to a pharmacokinetic/pharmacodynamic (PK/PD)–dynamic transmission model simulating the impact of pandemic influenza with low virulence and low transmissibility, and with high virulence and high transmissibility. The cost-utility analysis was conducted from the payer and societal perspectives, comparing oseltamivir 75 and 150 mg twice daily (BID) with no treatment over a 1-year time horizon. Model parameters were derived from published studies. Outcomes were measured as cost per quality-adjusted life year (QALY) gained. Sensitivity analyses were performed to examine the integrated model's robustness. Under both pandemic scenarios, compared with no treatment, the use of oseltamivir 75 or 150 mg BID led to a significant reduction in influenza episodes and influenza-related deaths, translating to substantial savings of QALYs. Overall drug costs were offset by the reduction of both direct and indirect costs, making these two interventions cost-saving from both perspectives. The results were sensitive to the proportion of inpatient presentation at the emergency visit and to patients’ quality of life. Integrating PK/PD–EPI/HE models is achievable. Whilst further refinement of this novel linkage model to mimic reality more closely is needed, the current study has generated useful insights to support influenza pandemic planning.
A substantial proportion of persons with mental disorders seek treatment from complementary and alternative medicine (CAM) professionals. However, data on how CAM contacts vary across countries, mental disorders and their severity, and health care settings are largely lacking. The aim was therefore to investigate the prevalence of contacts with CAM providers in a large cross-national sample of persons with 12-month mental disorders.
In the World Mental Health Surveys, the Composite International Diagnostic Interview was administered to determine the presence of mental disorders in the past 12 months among 138 801 participants aged 18–100, drawn from representative general population samples. Participants were recruited between 2001 and 2012. Rates of self-reported CAM contacts for each of the 28 surveys across 25 countries and 12 mental disorder groups were calculated for all persons with past 12-month mental disorders. Mental disorders were grouped into mood disorders, anxiety disorders or behavioural disorders, and further divided by severity level. Satisfaction with conventional care was also compared with satisfaction with CAM contacts.
An estimated 3.6% (standard error 0.2%) of persons with a past 12-month mental disorder reported a CAM contact, a rate twice as high in high-income countries (4.6%; standard error 0.3%) as in low- and middle-income countries (2.3%; standard error 0.2%). CAM contacts were largely comparable across disorder types, but particularly high in persons receiving conventional care (8.6–17.8%). CAM contacts increased with increasing mental disorder severity. Among persons receiving specialist mental health care, CAM contacts were reported by 14.0% of those with severe mood disorders, 16.2% with severe anxiety disorders and 22.5% with severe behavioural disorders. Satisfaction with care was comparable between CAM contacts (78.3%) and conventional care (75.6%) in persons who received both.
CAM contacts are common in persons with severe mental disorders, in high-income countries, and in persons receiving conventional care. Our findings support the notion of CAM as largely complementary, but contrast with suggestions that CAM use concerns only persons with mild, transient complaints. There was no indication that persons were less satisfied with CAM visits than with conventional care. We encourage health care professionals in conventional settings to openly discuss the care their patients are receiving, whether conventional or not, and the patients’ reasons for doing so.
Little is known about the combined use of benzodiazepines and antidepressants in older psychiatric patients. This study examined the prescription pattern of concurrent benzodiazepines in older adults treated with antidepressants in Asia, and explored its demographic and clinical correlates.
The data of 955 older adults with any type of psychiatric disorders were extracted from the database of the Research on Asian Psychotropic Prescription Patterns for Antidepressants (REAP-AD) project. Demographic and clinical characteristics were recorded using a standardized protocol and data collection procedure. Both univariate and multiple logistic regression analyses were performed.
The proportion of benzodiazepine and antidepressant combination treatment in this cohort was 44.3%. Multiple logistic regression analysis revealed that higher antidepressant doses, younger age (<65 years), inpatient status, treatment in a public hospital, major comorbid medical conditions, antidepressant type, and country/territory were significantly associated with more frequent co-prescription of benzodiazepines with antidepressants.
Nearly half of the older adults treated with antidepressants in Asia are prescribed concurrent benzodiazepines. Given the potential adverse effects of benzodiazepines, the rationale for co-prescribing benzodiazepines with antidepressants needs to be revisited.
The treatment gap between the number of people with mental disorders and the number treated represents a major public health challenge. We examine this gap by socio-economic status (SES; indicated by family income and respondent education) and service sector in a cross-national analysis of community epidemiological survey data.
Data come from 16 753 respondents with 12-month DSM-IV disorders from community surveys in 25 countries in the WHO World Mental Health Survey Initiative. DSM-IV anxiety, mood, or substance disorders and treatment of these disorders were assessed with the WHO Composite International Diagnostic Interview (CIDI).
Only 13.7% of 12-month DSM-IV/CIDI cases in lower-middle-income countries, 22.0% in upper-middle-income countries, and 36.8% in high-income countries received treatment. Highest-SES respondents were somewhat more likely to receive treatment, but this was true mostly for specialty mental health treatment, where the association was positive with education (highest treatment among respondents with the highest education and a weak association of education with treatment among other respondents) but non-monotonic with income (somewhat lower treatment rates among middle-income respondents and equivalent among those with high and low incomes).
The modest, but nonetheless stronger, association of education than of income with treatment raises questions about a financial-barriers interpretation of the inverse association of SES with treatment, although future within-country analyses that consider contextual factors might document other important specifications. While beyond the scope of this report, such an expanded analysis could have important implications for designing interventions aimed at increasing mental disorder treatment among socio-economically disadvantaged people.
Creep food intake of suckling piglets varies considerably between individuals (Pajor et al., 1991). The creep feeding status of individual piglets can be monitored by video recording, or by combining the weight of food removed from electronic dispensers with video monitoring. However, the analysis of videotapes is time-consuming, which limits its widespread use on farm. From a practical standpoint, monitoring food intake by piglets either before or after weaning is important to provide useful information for a management strategy. A general, quick and valid method to detect the food-intake experience of piglets would therefore be valuable. The aim of this investigation was to determine whether a device that automatically spray-marked piglets at the trough could reliably identify those pigs that had foraged food in the trough.
Neonatal viability is one of the key factors affecting piglets’ vitality, which ultimately affects their survival and growth (England, 1974). As colostrum is the only food resource of neonatal piglets, their ability to acquire colostrum as early as possible after birth can determine their vitality. Piglets are usually supplied with creep food at some time during the suckling period in order to improve their performance before and after weaning. However, creep food intake varies between litters and between individuals. Furthermore, the relationship between viability in early life and the acceptance of a new food (e.g. creep food) when piglets first encounter it is not fully understood. The objectives of this study were to investigate factors affecting the neonatal viability of piglets at birth and to identify the relationship between neonatal viability and subsequent creep feeding behaviour by piglets on d14–d15.
Piglets are usually supplied with solid food - creep food - at a time when most are still obtaining adequate nutrition from milk. Getting piglets started on solid food may help their growth performance both before and after weaning. As young piglets are highly exploratory animals (A'Ness et al., 1997) and food restriction increases the tendency of older pigs to express foraging behaviour (Lawrence et al., 1988), the objective of this experiment was to examine the relative importance of exploratory behaviour and hunger in the initiation of creep feeding by piglets.
Eight litters of Large White x Landrace piglets were used in this study. Each piglet was ear-tagged and weighed within 24 h of birth. When a litter was 16 days old (d16), each piglet was weighed and eight piglets were taken in pairs, between nursings, to one of two experimental pens for 30 min of familiarization and filming, twice each.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Fathers of DZ twins had somewhat longer education than fathers of MZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and in 2000 or later (0.11 years, 95% CI [0.00, 0.22]). The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Little is known about Clostridium difficile infection (CDI) in Asia. The aims of our study were to explore (i) the prevalence, risk factors and molecular epidemiology of CDI and colonization in a tertiary academic hospital in North-Eastern Peninsular Malaysia; (ii) the rate of carriage of C. difficile among the elderly in the region; and (iii) the level of awareness of this infection among hospital staff and students. For stool samples collected from hospital inpatients with diarrhea (n = 76) and healthy community members (n = 138), C. difficile antigen and toxins were tested by enzyme immunoassay. Stool samples were subsequently analyzed by culture and molecular detection of toxin genes, and PCR ribotyping of isolates. To examine awareness among hospital staff and students, participants were asked to complete a self-administered questionnaire. For the hospital and community studies, the prevalence of non-toxigenic C. difficile colonization was 16% and 2%, respectively. The prevalence of CDI among hospital inpatients with diarrhea was 13%. Out of 22 C. difficile strains from hospital inpatients, the toxigenic ribotypes 043 and 017 were most common (both 14%). In univariate analysis, C. difficile colonization in hospital inpatients was significantly associated with greater duration of hospitalization and use of penicillin (both P < 0.05). Absence of these factors was a possible reason for low colonization in the community. Only 3% of 154 respondents in the awareness survey answered all questions correctly. C. difficile colonization is prevalent in a Malaysian hospital setting but not in the elderly community with little or no contact with hospitals. Awareness of CDI is alarmingly poor.
Clonorchis sinensis and Capillaria hepatica are zoonotic parasites that mainly infect the liver and cause serious liver disorders. However, immunological parameters induced by co-infection with these parasites remain unknown. In this study, for the first time, we investigated immunological profiles induced by co-infection with C. hepatica (CH) in C. sinensis (CS)-infected rats (Sprague–Dawley). Rats were infected primarily with 50 metacercariae of C. sinensis; 4 weeks later, they were subsequently infected with 1000 infective C. hepatica eggs. Significantly higher levels of C. sinensis- or C. hepatica-specific IgG antibodies were found in the sera of rats. Interestingly, no cross-reacting antibody was observed between C. sinensis and C. hepatica infections. Significantly raised eosinophil levels were found in the blood of C. sinensis/C. hepatica co-infected rats (CS + CH) compared to the blood of rats infected singly with C. sinensis. Co-infected rats showed significantly higher levels of lymphocyte proliferation and cytokine production compared to a single C. sinensis infection. The worm burden of C. sinensis was significantly reduced in co-infected rats compared to the single C. sinensis infection. These results indicate that the eosinophils, lymphocyte proliferation and cytokine production induced by subsequent infection with C. hepatica in C. sinensis-infected rats might contribute to the observed C. sinensis worm reduction.
Research on the course of post-traumatic stress disorder (PTSD) finds that a substantial proportion of cases remit within 6 months, a majority within 2 years, and a substantial minority persist for many years. Results are inconsistent regarding pre-trauma predictors.
The WHO World Mental Health surveys assessed the lifetime presence and course of DSM-IV PTSD after one randomly selected trauma, allowing retrospective estimates of PTSD duration. Prior traumas, childhood adversities (CAs) and other lifetime DSM-IV mental disorders were examined as predictors using discrete-time person-month survival analysis among the 1575 respondents with lifetime PTSD.
Of cases, 20%, 27% and 50% recovered within 3, 6 and 24 months, respectively, and 77% within 10 years (the longest duration allowing stable estimates). Time-related recall bias was found largely for recoveries after 24 months. Recovery was weakly related to most trauma types, with the exceptions of very low [odds ratio (OR) 0.2–0.3] early recovery (within 24 months) associated with purposefully injuring/torturing/killing and witnessing atrocities, and very low later recovery (25+ months) associated with being kidnapped. The significant ORs for prior traumas, CAs and mental disorders were generally inconsistent between early- and later-recovery models. Cross-validated versions of the final models nonetheless discriminated significantly between the 50% of respondents with the highest and lowest predicted probabilities of both early recovery (66–55% v. 43%) and later recovery (75–68% v. 39%).
We found PTSD recovery trajectories similar to those in previous studies. The weak associations of pre-trauma factors with recovery, also consistent with previous studies, presumably are due to stronger influences of post-trauma factors.
Our intention was to describe and compare the perspectives of national hospice thought leaders, hospice nurses, and former family caregivers on factors that promote or threaten family caregiver perceptions of support.
Nationally recognized hospice thought leaders (n = 11), hospice nurses (n = 13), and former family caregivers (n = 14) participated. Interviews and focus groups were audiotaped and transcribed. Data were coded inductively, and codes were hierarchically grouped by topic. Emergent categories were summarized descriptively and compared across groups.
Four categories linked responses from the three participant groups (95%, 366/384 codes): (1) essentials of skilled communication (30.6%), (2) importance of building authentic relationships (28%), (3) value of expert teaching (22.4%), and (4) critical role of teamwork (18.3%). The thought leaders emphasized communication (44.6%), caregivers stressed expert teaching (51%), and nurses highlighted teamwork (35.8%). Nurses discussed teamwork significantly more than caregivers (z = 2.2786), thought leaders discussed communication more than caregivers (z = 2.8551), and caregivers discussed expert teaching more than thought leaders (z = 2.1693) and nurses (z = 2.4718; all values of p < 0.05).
Significance of Results:
Our findings suggest differences in priorities for caregiver support across family caregivers, hospice nurses, and thought leaders. Hospice teams may benefit from further education and training to help bridge the gap between family-centered hospice care as a clinical ideal and practice in which hospice team members fully support and empower family caregivers as members of the hospice team.
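The z-statistics above compare the proportion of codes falling in a category between participant groups; a pooled two-proportion z-test is one standard way to make such comparisons. A minimal sketch with illustrative counts (the per-group code denominators are not given in the abstract, so the numbers below are hypothetical):

```python
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test, as commonly used to compare
    category frequencies between two groups."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)            # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value via the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, pval

# Hypothetical counts: e.g. 40/112 "teamwork" codes among nurse codes
# versus 20/130 among caregiver codes (illustrative, not from the study)
z, p = two_prop_z(40, 112, 20, 130)
print(round(z, 2), round(p, 4))
```

With such counts, |z| above roughly 1.96 corresponds to p < 0.05 two-sided, matching the threshold the abstract applies to its group comparisons.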