This study investigated the effects of group cognitive behavioural therapy (CBT) for patients with bipolar disorder. The development of CBT for this disorder is relatively under-explored.
Participants with bipolar I or II disorder were treated with group CBT in addition to treatment as usual. The effectiveness of the protocol was explored through sequence analysis of daily mood monitoring prior to, during and after the intervention. In addition, a repeated-measures design was used, assessing symptomatology, dysfunctional attitudes, sense of mastery, psychosocial functioning, and quality of life at the start and end of the intervention, and at follow-up 2 and 12 months later.
The results indicate that variation in mood states diminished over the course of the intervention. There was also a shift from depressive states towards more euthymic states. A greater number of reported lifetime depressive episodes was associated with greater diversity of mood states. Overall psychosocial functioning and self-reported psychological health increased following the intervention. Improvement continued after treatment ended, at the 2-month follow-up and when measured 1 year later, for outcomes representing depression, general psychosocial functioning and self-reported psychological health. Due to the small sample size and the lack of a control group, the results are preliminary.
The results of this pilot study suggest that both offering CBT in a group format and applying sequence analysis to time-series data are helpful routes to explore further when improving standard CBT interventions for patients with bipolar disorder.
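The "diversity of mood states" mentioned above can be quantified in several ways. As an illustrative building block only (not necessarily the authors' exact sequence-analysis procedure), the sketch below computes the Shannon diversity of a daily categorical mood record; the mood labels and the example series are hypothetical.

```python
from collections import Counter
from math import log

def mood_diversity(daily_moods):
    """Shannon diversity (entropy) of a categorical daily mood record.

    Higher values indicate that more distinct mood states occur, and occur
    more evenly, over the monitoring period.
    """
    counts = Counter(daily_moods)
    n = len(daily_moods)
    return -sum((c / n) * log(c / n) for c in counts.values())

# Hypothetical 10-day record: mostly euthymic with two depressive days.
record = ["euthymic", "euthymic", "depressed", "euthymic", "hypomanic",
          "euthymic", "depressed", "euthymic", "euthymic", "euthymic"]
print(round(mood_diversity(record), 3))  # ~0.80 nats
```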
Serum antibodies against an organ-specific CNS antigen, as well as against serotonin and gangliosides (GM1), were analysed by ELISA in 34 patients with schizophrenia, ten patients with schizoaffective psychosis and 13 patients with major depressive disorder. Sixty-two patients with various rheumatic disorders and 32 blood donors were included in the study as controls. Sixty-two percent of the 13 patients with major depressive disorder had antibodies to serotonin and 69% had antibodies to gangliosides, whereas antibody-positive sera were found in only 38% of the 34 patients with schizophrenia. The same antibodies were found in only 6% (antibodies to serotonin) and 13% (antibodies to gangliosides) of the 32 blood donors, and at a similar frequency in patients with schizoaffective psychosis. Organ-specific antibodies to the CNS antigen could not be detected at any significant level in the psychiatric patient group. It is speculated that auto-immune reactions towards a serotonin receptor may be involved in the etiopathogenesis of major depressive disorder.
An experiment was conducted to test the hypothesis that meat products have digestible indispensable amino acid scores (DIAAS) >100 and that various processing methods will increase standardised ileal digestibility (SID) of amino acids (AA) and DIAAS. Nine ileal-cannulated gilts were randomly allotted to a 9 × 8 Youden square design with nine diets and eight 7-d periods. Values for SID of AA and DIAAS for two reference patterns were calculated for salami, bologna, beef jerky, raw ground beef, cooked ground beef and ribeye roast heated to 56, 64 or 72°C. The SID of most AA was not different among salami, bologna, beef jerky and cooked ground beef, but was less (P < 0·05) than the values for raw ground beef. The SID of AA for 56°C ribeye roast was not different from the values for raw ground beef and 72°C ribeye roast, but greater (P < 0·05) than those for 64°C ribeye roast. For older children, adolescents and adults, the DIAAS for all proteins, except cooked ground beef, were >100 and bologna and 64°C ribeye roast had the greatest (P < 0·05) DIAAS. The limiting AA for this age group were sulphur AA (beef jerky), leucine (bologna, raw ground beef and cooked ground beef) and valine (salami and the three ribeye roasts). In conclusion, meat products generally provide high-quality protein with DIAAS >100 regardless of processing. However, overcooking meat may reduce AA digestibility and DIAAS.
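DIAAS is defined by FAO as the lowest digestible indispensable amino acid reference ratio, expressed as a percentage, and the amino acid attaining that minimum is the limiting amino acid. The sketch below illustrates the calculation in outline; the amino acid contents, digestibility coefficients and reference-pattern values are placeholders for illustration, not figures taken from this study.

```python
def diaas(aa_mg_per_g_protein, sid, reference_pattern):
    """Digestible indispensable amino acid score (DIAAS).

    aa_mg_per_g_protein : mg of each indispensable AA per g of protein
    sid                 : standardised ileal digestibility of each AA (0-1)
    reference_pattern   : mg of each AA per g of reference protein

    Returns the DIAAS (%) and the limiting amino acid.
    """
    ratios = {
        aa: 100 * aa_mg_per_g_protein[aa] * sid[aa] / reference_pattern[aa]
        for aa in reference_pattern
    }
    limiting = min(ratios, key=ratios.get)
    return ratios[limiting], limiting

# Hypothetical figures for a single meat product (placeholder values only).
aa  = {"leucine": 80.0, "valine": 50.0, "sulphur_aa": 38.0}   # mg/g protein
sid = {"leucine": 0.95, "valine": 0.93, "sulphur_aa": 0.90}   # digestibility
ref = {"leucine": 61.0, "valine": 40.0, "sulphur_aa": 23.0}   # reference pattern
score, limiting_aa = diaas(aa, sid, ref)
print(f"DIAAS = {score:.0f}% (limiting AA: {limiting_aa})")
```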
Neuropsychological tests are important instruments to determine a cognitive profile, giving insight into the etiology of dementia; however, these tests cannot readily be used in culturally diverse, low-educated populations, due to their dependence upon (Western) culture, education, and literacy. In this review we aim to give an overview of studies investigating domain-specific cognitive tests used to assess dementia in non-Western, low-educated populations. The second aim was to examine the quality of these studies and of the adaptations for culturally, linguistically, and educationally diverse populations.
A systematic review was performed using six databases, without restrictions on the year or language of publication.
Forty-four studies were included, stemming mainly from Brazil, Hong Kong, and Korea, or concerning Hispanics/Latinos residing in the USA. Most studies focused on Alzheimer’s disease (n = 17) or unspecified dementia (n = 16). Memory (n = 18) was studied most often, using 14 different tests. The traditional Western tests in the domains of attention (n = 8) and construction (n = 15) were unsuitable for low-educated patients. There was little variety in instruments measuring executive functioning (two tests, n = 13) and language (n = 12, of which 10 were naming tests). Many studies did not report a thorough adaptation procedure (n = 39) or blinding procedures (n = 29).
Various formats of memory tests seem suitable for low-educated, non-Western populations. Promising tasks in other cognitive domains are the Stick Design Test, Five Digit Test, and verbal fluency test. Further research is needed regarding cross-cultural instruments measuring executive functioning and language in low-educated people.
Forage maize (Zea mays L.) is often grown year after year on the same land on many intensive dairy farms in north-west Europe. This results in agronomic problems such as weed resistance and a decline in soil quality, which may be solved by ley-arable farming. In the current study, forage maize was grown at different nitrogen (N) fertilization levels for 3 years on permanent arable land and on temporary arable land after ploughing out different types of grass–clover swards. Swards differed in management (grazing or cutting) and age (temporary or permanent). Maize yield and soil residual mineral N content were measured after the maize harvest. Maize yield was not affected by the management of the ploughed-out grass–clover swards, but was clearly affected by their age. The N fertilizer replacement value (NFRV) of all ploughed grass–clover swards was >170 kg N/ha in the first year after ploughing. In the third year after ploughing, the NFRV of the permanent sward still exceeded 200 kg N/ha, whereas that of the temporary swards decreased to 30 kg N/ha on average. Soil residual nitrate (NO3−) remained below the local, legal threshold of 90 kg NO3− N/ha, except for the ploughed-out permanent sward in the third year after ploughing (166 kg NO3− N/ha). The current study highlights the potential of forage maize–ley rotations for saving fertilizer N. This is beneficial both for the environment and for the profitability of dairy production in north-western Europe.
Terrorism and natural catastrophes have made disaster preparedness a critical issue. Despite the documented vulnerabilities of children during and following disasters, gaps remain in health care systems regarding pediatric disaster preparedness. This research study examined changes in knowledge acquisition of pediatric disaster preparedness among medical and non-medical personnel at a children’s hospital who completed an online training course of five modules: planning, triage, age-specific care, disaster management, and hospital emergency code response.
A multi-disciplinary team within the Pediatric Disaster Resource and Training Center at Children’s Hospital Los Angeles (Los Angeles, California USA) developed an online training course. Available archival course data from July 2009 to August 2012 were analyzed through linear growth curve multi-level modeling, with module total score as the outcome (0 to 100 points), attempt as the Level 1 variable (any module could be repeated), role in the hospital (medical or non-medical) as the Level 2 variable, and attempt by role as the cross-level effect.
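A linear growth-curve multilevel model of the kind described can be sketched with standard mixed-model software. The following is a minimal illustration, assuming a long-format table with one row per module attempt and hypothetical column names (score, attempt, role, participant_id); it is not the authors' analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per module attempt (hypothetical column names).
df = pd.read_csv("module_attempts.csv")  # score, attempt, role, participant_id

# Level 1: repeated attempts nested within participants (random intercept + slope).
# Level 2: hospital role (medical vs non-medical).
# attempt:role is the cross-level effect described above.
model = smf.mixedlm(
    "score ~ attempt * C(role)",
    data=df,
    groups=df["participant_id"],
    re_formula="~attempt",
)
result = model.fit()
print(result.summary())
```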
A total of 44,115 module attempts by 5,773 course participants (3,686 medical personnel and 2,087 non-medical personnel) were analyzed. The average module total score upon first attempt across all participants ranged from 60.28 to 80.11 points, and participants significantly varied in how they initially scored. On average in the planning, triage, and age-specific care modules: total scores significantly increased per attempt across all participants (average rate of change ranged from 0.59 to 1.84 points) and medical personnel had higher total scores initially and through additional attempts (average difference ranged from 13.25 to 16.24 points). Cross-level effects were significant in the disaster management and hospital emergency code response modules: on average, total scores were initially lower among non-medical personnel compared to medical personnel, but non-medical personnel increased their total scores per attempt by 3.77 points in the disaster management module and 6.40 points in the hospital emergency code response module, while medical personnel did not improve their total scores through additional attempts.
Medical and non-medical hospital personnel alike can acquire knowledge of pediatric disaster preparedness. Key content can be reinforced or improved through successive training in an online course.
Pham PK, Behar SM, Berg BM, Upperman JS, Nager AL. Pediatric Online Disaster Preparedness Training for Medical and Non-Medical Personnel: A Multi-Level Modeling Analysis. Prehosp Disaster Med. 2018;33(4):349–354.
Maternal exposures to fever and infections in pregnancy have been linked to subsequent psychiatric morbidity in the child. This study examined whether fever and common infections in pregnancy were associated with psychosis-like experiences (PLEs) in the child.
A longitudinal study of 46 184 children who participated in the 11-year follow-up of the Danish National Birth Cohort was conducted. Pregnant women were enrolled between 1996 and 2002 and information on fever, genitourinary infections, respiratory tract infection, and influenza-like illness during pregnancy was prospectively collected in two interviews during pregnancy. PLEs were assessed using the seven-item Adolescent Psychotic-Like Symptom Screener in a web-based questionnaire completed by the children themselves at age 11.
PLEs were reported among 11% of the children. Multinomial logistic regression models with probability weights to adjust for potential selection bias due to attrition suggested that maternal fever, genitourinary infections and influenza-like illness were associated with a weak to moderate increased risk of subclinical psychosis-like symptoms in the offspring, whereas respiratory tract infections were not. No clear pattern was observed between the strengths of the associations and the timing of exposure, or the type of psychosis-like symptom.
In this study, maternal exposures to fevers and common infections in pregnancy were generally associated with a subtle excess risk of PLEs in the child. A more pronounced association was found for influenza-like illness under an a priori definition, leaving open the possibility that certain kinds of infections may constitute important risk factors.
Guidelines for the severity classification and treatment of Clostridium difficile infection (CDI) were published by the Infectious Diseases Society of America (IDSA)/Society for Healthcare Epidemiology of America (SHEA) in 2010; however, compliance with and the efficacy of these guidelines have not been widely investigated. The present study assessed compliance with the guidelines and its effect on CDI patient outcomes as compared with before these recommendations. A retrospective study included all adult inpatients with an initial episode of CDI treated in a single academic center from January 2009 to August 2014. Patients after guideline publication were compared with patients treated in 2009–2010. Demographic, clinical, and laboratory data were collected to stratify for disease severity. Outcome measures included compliance with guidelines, mortality, length of stay (LOS), and surgical intervention for CDI. A total of 1021 patients with CDI were included. Based upon the 2010 guidelines, 42 (28·8%) of 146 patients treated in 2009 would have been considered undertreated, and treatment progressively improved over time, as inadequate treatment decreased to 10·0% (15/148 patients) in 2014 (P = 0·0005). Overall, guideline-adherent treatment was associated with a twofold decrease in CDI-attributable mortality (P = 0·006) and a 1·9-day reduction in CDI-related LOS (P = 0·0009) compared with undertreated patients. Compliance with IDSA/SHEA guidelines was associated with a decreased risk of mortality and LOS in hospitalized patients with CDI.
Our understanding of the complex relationship between schizophrenia symptomatology and etiological factors can be improved by studying brain-based correlates of schizophrenia. Research showed that impairments in value processing and executive functioning, which have been associated with prefrontal brain areas [particularly the medial orbitofrontal cortex (MOFC)], are linked to negative symptoms. Here we tested the hypothesis that MOFC thickness is associated with negative symptom severity.
This study included 1985 individuals with schizophrenia from 17 research groups around the world contributing to the ENIGMA Schizophrenia Working Group. Cortical thickness values were obtained from T1-weighted structural brain scans using FreeSurfer. A meta-analysis across sites was conducted over effect sizes from a model predicting cortical thickness by negative symptom score (harmonized Scale for the Assessment of Negative Symptoms or Positive and Negative Syndrome Scale scores).
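A random-effects pooling of per-site effect sizes, of the kind described, can be sketched with a standard DerSimonian–Laird estimator. The per-site standardized betas and sampling variances below are placeholders for illustration, not the ENIGMA data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate of per-site effect sizes (DerSimonian-Laird)."""
    effects, variances = np.asarray(effects), np.asarray(variances)
    w = 1.0 / variances                        # fixed-effect (inverse-variance) weights
    theta_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fe) ** 2)  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-site variance
    w_re = 1.0 / (variances + tau2)
    theta_re = np.sum(w_re * effects) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return theta_re, se_re, tau2

# Placeholder per-site standardized betas and their sampling variances.
betas = [-0.06, -0.11, -0.04, -0.09]
variances = [0.0010, 0.0015, 0.0009, 0.0020]
theta, se, tau2 = dersimonian_laird(betas, variances)
print(f"pooled beta = {theta:.3f} +/- {1.96 * se:.3f} (tau^2 = {tau2:.4f})")
```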
Meta-analytical results showed that left, but not right, MOFC thickness was significantly associated with negative symptom severity (βstd = −0.075; p = 0.019) after accounting for age, gender, and site. This effect remained significant (p = 0.036) in a model including overall illness severity. Covarying for duration of illness, age of onset, antipsychotic medication, or handedness weakened the association of negative symptoms with left MOFC thickness. In a secondary analysis including 10 other prefrontal regions, further associations emerged in the left lateral orbitofrontal gyrus and pars opercularis.
Using an unusually large cohort and a meta-analytical approach, our findings point towards a link between prefrontal thinning and negative symptom severity in schizophrenia. This finding provides further insight into the relationship between structural brain abnormalities and negative symptoms in schizophrenia.
In early gastrulation-stage cattle embryos (Stage 5), four tissues can be discerned: (i) the top layer of the embryonic disc, consisting of embryonic ectoderm (EmE); (ii) the bottom layer of the disc, consisting of mesoderm, endoderm and visceral hypoblast (MEH); (iii) the trophoblast (TB); and (iv) the parietal hypoblast. We performed microsurgery followed by RNA-seq to analyse the transcriptome of these four tissues, as well as of a developmentally earlier pre-gastrulation embryonic disc. The cattle EmE transcriptome was similar at Stages 4 and 5, characterised by the OCT4/SOX2/NANOG pluripotency network. Expression of genes associated with primordial germ cells suggests their presence in the EmE tissue at these stages. Anterior visceral hypoblast genes were transcribed in the Stage 4 disc, but no longer by Stage 5. The Stage 5 MEH layer was equally similar to mouse embryonic and extraembryonic visceral endoderm. Our data suggest that the first mesoderm to invaginate in cattle embryos is fated to become extraembryonic. TGFβ, FGF, VEGF, PDGFA, IGF2, IHH and WNT signals and receptors were expressed; however, the representative members of the FGF families differed from those seen in equivalent tissues of mouse embryos. The TB transcriptome was unique and differed significantly from that of mice. FGF signalling in the TB may be autocrine, with both FGFR2 and FGF2 expressed. Our data revealed a range of potential inter-tissue interactions, highlighted significant differences in early development between mice and cattle, and yielded insight into the developmental events occurring at the start of gastrulation.
We consider the minimization of Dirichlet eigenvalues $\lambda_k$, $k \in \mathbb{N}$, of the Laplacian on cuboids of unit measure in $\mathbb{R}^n$. We prove that any sequence of optimal cuboids in $\mathbb{R}^n$ converges to a cube of unit measure in the sense of Hausdorff as $k \to \infty$. We also obtain an upper bound for that rate of convergence.
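For orientation, the Dirichlet eigenvalues of a cuboid admit an explicit expression, which is what makes this shape optimization problem tractable. The following is the standard separation-of-variables formula, stated here for context in notation that may differ from the paper's.

```latex
% Dirichlet eigenvalues of the cuboid Q = (0,a_1) x ... x (0,a_n),
% with unit measure a_1 a_2 \cdots a_n = 1 (separation of variables):
\[
  \lambda(m_1,\dots,m_n) \;=\; \pi^2 \sum_{i=1}^{n} \frac{m_i^2}{a_i^2},
  \qquad m_i \in \mathbb{N}.
\]
% \lambda_k is obtained by ordering these values increasingly (with
% multiplicity); the minimization is over side lengths a_i > 0 subject
% to a_1 \cdots a_n = 1.
```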
Fe fortification of centrally manufactured and frequently consumed condiments such as bouillon cubes could help prevent Fe deficiency in developing countries. However, Fe compounds that do not cause sensory changes in the fortified product, such as ferric pyrophosphate (FePP), exhibit low absorption in humans. Tetrasodium pyrophosphate (NaPP) can form soluble complexes with Fe, which could increase Fe bioavailability. Therefore, the aim of this study was to investigate Fe bioavailability from bouillon cubes fortified with either FePP only, FePP+NaPP, ferrous sulphate (FeSO4) only, or FeSO4+NaPP. We first conducted in vitro studies using a protocol of simulated digestion to assess the dialysable and ionic Fe, and the cellular ferritin response in a Caco-2 cell model. Second, Fe absorption from bouillon prepared from intrinsically labelled cubes (2·5 mg stable Fe isotopes/cube) was assessed in twenty-four Fe-deficient women, by measuring Fe incorporation into erythrocytes 2 weeks after consumption. Fe bioavailability in humans increased by 46 % (P<0·005) when comparing bouillons fortified with FePP only (4·4 %) and bouillons fortified with FePP+NaPP (6·4 %). Fe absorption from bouillons fortified with FeSO4 only and with FeSO4+NaPP was 33·8 and 27·8 %, respectively (NS). The outcome from the human study is in agreement with the dialysable Fe from the in vitro experiments. Our findings suggest that the addition of NaPP could be a promising strategy to increase Fe absorption from FePP-fortified bouillon cubes and, if confirmed by further research, from other fortified foods with complex food matrices as well.
On the basis of a multidisciplinary approach, we have unravelled the palaeo-earthquake history of a trenched section across the Peel Boundary Fault. On the basis of repeated levelling, the area at present shows one of the largest contrasts in relative motion across the fault. The geological record for the last 25 thousand years, recovered in the trench, shows evidence of two large earthquakes (moment magnitude between 6.0 and 6.6) that occurred within a relatively short timespan around 15 thousand years ago. A third, less severe event occurred somewhere in the mid-Holocene. The time interval between the two large events is in the order of 1500 years, an interval comparable to that between the last volcanic explosions in the nearby Eifel area. Both records together seem to suggest a relation between large-scale faulting and volcanic activity in the nearby Eifel area, but this interpretation is based on one trench only and should be tested by opening more trenches in the zone that is assumed to be affected by these large events.
Current ultra-high-risk (UHR) criteria appear insufficient to predict imminent onset of first-episode psychosis, as a meta-analysis showed that about 20% of patients have a psychotic outcome after 2 years. Therefore, we aimed to develop a stage-dependent predictive model in UHR individuals who were seeking help for co-morbid disorders.
Baseline data on symptomatology, and environmental and psychological factors of 185 UHR patients (aged 14–35 years) participating in the Dutch Early Detection and Intervention Evaluation study were analysed with Cox proportional hazard analyses.
At 18 months, the overall transition rate was 17.3%. The final predictor model included five variables: observed blunted affect [hazard ratio (HR) 3.39, 95% confidence interval (CI) 1.56–7.35, p < 0.001], subjective complaints of impaired motor function (HR 5.88, 95% CI 1.21–6.10, p = 0.02), beliefs about social marginalization (HR 2.76, 95% CI 1.14–6.72, p = 0.03), decline in social functioning (HR 1.10, 95% CI 1.01–1.17, p = 0.03), and distress associated with suspiciousness (HR 1.02, 95% CI 1.00–1.03, p = 0.01). The positive predictive value of the model was 80.0%. The resulting prognostic index stratified the general risk into three risk classes with significantly different survival curves. In the highest risk class, transition to psychosis emerged on average ⩾8 months earlier than in the lowest risk class.
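A predictor model of this kind can be fitted with standard Cox proportional-hazards software, after which the prognostic index (the linear predictor) is cut into risk classes. The sketch below uses the lifelines package with hypothetical column names; it is not the study's analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical baseline data: one row per UHR participant.
df = pd.read_csv("uhr_baseline.csv")
predictors = ["blunted_affect", "impaired_motor", "social_marginalization",
              "social_decline", "suspiciousness_distress"]

cph = CoxPHFitter()
cph.fit(df[predictors + ["months_to_event", "transition"]],
        duration_col="months_to_event", event_col="transition")
cph.print_summary()  # hazard ratios and 95% CIs per predictor

# Prognostic index = linear predictor; stratify into three risk classes.
df["prognostic_index"] = cph.predict_log_partial_hazard(df)
df["risk_class"] = pd.qcut(df["prognostic_index"], 3,
                           labels=["low", "intermediate", "high"])
```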
Predicting a first-episode psychosis in help-seeking UHR patients was improved using a stage-dependent prognostic model including negative psychotic symptoms (observed flattened affect, subjective impaired motor functioning), impaired social functioning and distress associated with suspiciousness. Treatment intensity may be stratified and personalized using the risk stratification.
Previous research has established the relationship between cannabis use and psychotic disorders. Whether cannabis use is related to transition to psychosis in patients at ultra-high risk (UHR) for psychosis remains unclear. The present study aimed to review the existing evidence on the association between cannabis use and transition to psychosis in UHR samples.
A search of PsycINFO, Embase and Medline was conducted from 1996 to August 2015. The search yielded 5559 potentially relevant articles, which were screened on title and abstract. Subsequently, 36 articles were assessed in full text for eligibility. Two random-effects meta-analyses were performed. First, we compared transition rates to psychosis of UHR individuals with lifetime cannabis use with those of non-cannabis-using UHR individuals. Second, we compared transition rates of UHR individuals with a current DSM-IV cannabis abuse or dependence diagnosis with those of lifetime users and non-using UHR individuals.
We found seven prospective studies reporting on lifetime cannabis use in UHR subjects (n = 1171). Of these studies, five also examined current cannabis abuse or dependence. Lifetime cannabis use was not significantly associated with transition to psychosis [odds ratio (OR) 1.14, 95% confidence interval (CI) 0.856–1.524, p = 0.37]. A second meta-analysis yielded an OR of 1.75 (95% CI 1.135–2.710, p = 0.01), indicating a significant association between current cannabis abuse or dependence and transition to psychosis.
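The comparisons above reduce to odds ratios computed from transition counts in exposed and unexposed UHR groups. A minimal sketch of that single-study calculation follows; the counts are made up for illustration and are not the pooled data.

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = transitions among cannabis users,  b = non-transitions among users,
    c = transitions among non-users,       d = non-transitions among non-users.
    """
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo, hi = exp(log(or_) - z * se_log_or), exp(log(or_) + z * se_log_or)
    return or_, lo, hi

# Made-up counts for a single hypothetical UHR cohort.
print(odds_ratio_ci(a=20, b=80, c=60, d=300))  # OR = 1.25, CI roughly 0.71-2.20
```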
Our results show that cannabis use was only predictive of transition to psychosis in those who met criteria for cannabis abuse or dependence, tentatively suggesting a dose–response relationship between current cannabis use and transition to psychosis.
In 2005, the Norwegian Institute of Public Health established a web-based outbreak rapid alert system called Vesuv. The system is used for mandatory outbreak alerts from municipal medical officers, healthcare institutions, and food safety authorities. As of 2013, 1426 outbreaks had been reported, involving 32 913 cases. More than half of the outbreaks occurred in healthcare institutions (759 outbreaks, 53·2%). A total of 474 (33·2%) outbreaks were associated with food or drinking water. The web-based rapid alert system has proved to be a helpful tool by enhancing reporting and enabling rapid and efficient information sharing between different authorities at both the local and national levels. It is also an important tool for event-based reporting, as required by the International Health Regulations (IHR) 2005. Collecting information from all the outbreak alerts and reports in a national database is also useful for analysing trends, such as the occurrence of certain microorganisms, places or sources of infection, or routes of transmission. This can facilitate the identification of specific areas where more general preventive measures are needed.
Studies that address fish welfare before slaughter have concluded that many of the traditional systems used to stun fish, including CO2 narcosis, are unacceptable as they cause avoidable stress before death. One system recommended as a better alternative is electrical stunning; however, the welfare aspects of this method are not yet fully understood. To assess welfare in aquaculture, both behavioural and physiological measurements have been used, but few studies have examined the relationship between these variables. In an on-site study, aversive behaviours and several physiological stress indicators, including plasma levels of cortisol and ions as well as blood physiological variables, were compared in Arctic char (Salvelinus alpinus) stunned with CO2 or electricity. Exposure to water saturated with CO2 triggered aversive struggling and escape responses for several minutes before immobilization, whereas in fish exposed to an electric current immobilization was close to instant. On average, it took 5 min for the fish to recover from electrical stunning, whereas fish stunned with CO2 did not recover. Despite this, the electrically stunned fish had more than double the plasma levels of cortisol compared with fish stunned with CO2. This result is surprising considering that the behavioural reactions were much more pronounced following CO2 exposure. These contradictory results are discussed with regard to animal welfare and physiological stress responses. The present results emphasise the importance of using an integrative and interdisciplinary approach, and of including both behavioural and physiological stress indicators, in order to make accurate welfare assessments of fish in aquaculture.