In Europe, the incidence of psychotic disorder is high in certain migrant and minority ethnic groups (hereafter 'minorities'). However, it is unknown how the incidence pattern for these groups varies across the continent. Our objective was to compare, across sites in France, Italy, Spain, the UK and the Netherlands, the incidence rates for minorities and the incidence rate ratios (IRRs, minorities v. the local reference population).
The European Network of National Schizophrenia Networks Studying Gene–Environment Interactions (EU-GEI) study was conducted between 2010 and 2015. We analyzed data on incident cases of non-organic psychosis (International Classification of Diseases, 10th edition, codes F20–F33) from 13 sites.
The standardized incidence rates for minorities, combined into one category, varied from 12.2 in Valencia to 82.5 per 100 000 in Paris. These rates were generally high at sites with high rates for the reference population, and low at sites with low rates for the reference population. IRRs for minorities (combined into one category) varied from 0.70 (95% CI 0.32–1.53) in Valencia to 2.47 (95% CI 1.66–3.69) in Paris (test for interaction: p = 0.031). At most sites, IRRs were higher for persons from non-Western countries than for those from Western countries, with the highest IRRs for individuals from sub-Saharan Africa (adjusted IRR = 3.23, 95% CI 2.66–3.93).
Incidence rates vary by region of origin, region of destination and their combination. This suggests that they are strongly influenced by the social context.
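The IRRs quoted above follow the standard person-time formulation. As an illustrative sketch only (the counts below are hypothetical, not the study's data), an IRR and its Wald 95% CI, computed on the log scale, can be obtained from case counts and person-years as follows:

```python
import math

def irr_with_ci(cases_a, py_a, cases_b, py_b, z=1.96):
    """Incidence rate ratio (group A v. group B) with a Wald CI.

    The CI is built on the log scale, where the standard error of
    log(IRR) is approximately sqrt(1/cases_a + 1/cases_b).
    """
    rate_a = cases_a / py_a          # incidence rate, group A
    rate_b = cases_b / py_b          # incidence rate, group B
    irr = rate_a / rate_b
    se_log = math.sqrt(1 / cases_a + 1 / cases_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical example: 50 cases in 100 000 person-years v. 25 cases
# in 100 000 person-years gives IRR = 2.0.
irr, lo, hi = irr_with_ci(50, 100_000, 25, 100_000)
```

The published analyses additionally standardized rates and adjusted the IRRs; this sketch shows only the crude two-group calculation.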
The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition, but the relationships among these factors are unclear. In this study, we set out to clarify the relationships between the JTC bias, IQ, psychosis, and polygenic liability to schizophrenia and IQ.
A total of 817 first-episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ) and JTC, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores (PRS) for IQ and schizophrenia.
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. The schizophrenia PRS was not significantly associated with the number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas the IQ PRS significantly predicted the number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with a reduced JTC bias. In controls, a greater JTC bias was associated with a higher level of psychotic-like experiences (PLEs), including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but the bias was not related to delusions in patients.
Our findings suggest that the JTC reasoning bias in psychosis may not be a specific cognitive deficit but rather a manifestation or consequence of general cognitive impairment. In the general population, by contrast, the JTC bias is related to PLEs independently of IQ. This work has the potential to inform interventions targeting cognitive biases in early psychosis.
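The "proportion of the total effect mediated" reported above is commonly estimated by the difference method: regress the outcome on case status with and without the mediator and compare the coefficients. A minimal sketch with hypothetical, deliberately simple data (the study's actual model may have included covariates and a different estimator):

```python
import numpy as np

def proportion_mediated(x, m, y):
    """Difference-method mediation estimate.

    c       = total effect of x on y            (y ~ 1 + x)
    c_prime = direct effect, controlling for m  (y ~ 1 + x + m)
    proportion mediated = (c - c_prime) / c
    """
    ones = np.ones_like(x, dtype=float)
    c = np.linalg.lstsq(np.column_stack([ones, x]), y, rcond=None)[0][1]
    c_prime = np.linalg.lstsq(np.column_stack([ones, x, m]), y,
                              rcond=None)[0][1]
    return (c - c_prime) / c

# Hypothetical data: binary case status x, mediator m driven by x,
# outcome y driven entirely by m (so mediation should be complete).
x = np.array([0., 1., 0., 1., 0., 1.])
m = 2 * x + np.array([0.1, -0.1, 0.2, -0.2, -0.3, 0.3])
y = 3 * m
```

With these constructed data the outcome depends on case status only through the mediator, so the estimated proportion mediated is 1.0; a partial mediation, like the 79% above, would give a value between 0 and 1.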
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
We analysed data from 901 FEP patients and 1235 controls recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models analyses.
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who had never used cannabis compared with those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls showed differences in the depressive dimension related to cannabis use.
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and less negative symptoms, compared with those who never used cannabis or used low-potency types.
Flagellar dyneins are the molecular motors responsible for producing the propagating bending motions of cilia and flagella. They are located within a densely packed and highly organised super-macromolecular cytoskeletal structure known as the axoneme. Using the mesoscale simulation technique Fluctuating Finite Element Analysis (FFEA), which represents proteins as viscoelastic continuum objects subject to explicit thermal noise, we have quantified the constraints on the range of molecular conformations that can be explored by dynein-c within the crowded architecture of the axoneme. We subsequently assess the influence of crowding on the 3D exploration of microtubule-binding sites, and specifically on the axial step length. Our calculations combine experimental information on the shape, flexibility and environment of dynein-c from three distinct sources; negative stain electron microscopy, cryo-electron microscopy (cryo-EM) and cryo-electron tomography (cryo-ET). Our FFEA simulations show that the super-macromolecular organisation of multiple protein complexes into higher-order structures can have a significant influence on the effective flexibility of the individual molecular components, and may, therefore, play an important role in the physical mechanisms underlying their biological function.
Positive symptoms are a useful predictor of aggression in schizophrenia. Although a similar pattern of abnormal brain structures related to both positive symptoms and aggression has been reported, this observation has not yet been confirmed in a single sample.
To study the association between positive symptoms and aggression in schizophrenia on a neurobiological level, a prospective meta-analytic approach was employed to analyze harmonized structural neuroimaging data from 10 research centers worldwide. We analyzed brain MRI scans from 902 individuals with a primary diagnosis of schizophrenia and 952 healthy controls.
The results identified a widespread reduction in cortical thickness in individuals with schizophrenia compared with healthy controls. Two separate meta-regression analyses revealed that a common pattern of reduced cortical gray matter thickness within the left lateral temporal lobe and right midcingulate cortex was significantly associated with both positive symptoms and aggression.
These findings suggested that positive symptoms such as formal thought disorder and auditory misperception, combined with cognitive impairments reflecting difficulties in deploying an adaptive control toward perceived threats, could escalate the likelihood of aggression in schizophrenia.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at candidate sites. As an alternative, we investigated the use of medical devices that support real-time diagnostic decisions to identify patients for trial enrollment.
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or a > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same data sets as the original ACI-TIPI (development set n = 3,453; test set n = 2,315). We also tested both instruments on data from emergency department electrocardiographs across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance with that of cohorts identified from EHR data at the same hospitals.
Receiver operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with a > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
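The ROC areas quoted above have a simple probabilistic reading: the chance that a randomly chosen positive case receives a higher score than a randomly chosen negative one. A minimal sketch of that rank-based (Mann–Whitney) computation, using hypothetical scores rather than the study's data:

```python
def roc_auc(scores_pos, scores_neg):
    """ROC area via the Mann-Whitney interpretation: the probability
    that a randomly chosen positive case scores higher than a randomly
    chosen negative case, counting ties as 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical probabilities from a classifier: perfect separation
# gives AUC = 1.0; an uninformative score gives 0.5.
auc = roc_auc([0.9, 0.8], [0.1, 0.2])
```

This O(n*m) double loop is fine for illustration; production code would use a sorted-rank formulation or a library routine.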
Common mental health problems affect a quarter of the population. Online cognitive–behavioural therapy (CBT) is increasingly used, but the factors modulating response to this treatment modality remain unclear.
This study aims to explore the demographic and clinical predictors of response to one-to-one CBT delivered via the internet.
Real-world clinical outcomes data were collected from 2211 NHS England patients completing a course of CBT delivered by a trained clinician via the internet. Logistic regression analyses were performed using patient and service variables to identify significant predictors of response to treatment.
Multiple patient variables were significantly associated with positive response to treatment including older age, absence of long-term physical comorbidities and lower symptom severity at start of treatment. Service variables associated with positive response to treatment included shorter waiting times for initial assessment and longer treatment durations in terms of the number of sessions.
Knowledge of which patient and service variables are associated with good clinical outcomes can be used to develop personalised treatment programmes, as part of a quality improvement cycle aiming to drive up standards in mental healthcare. This study exemplifies translational research put into practice and deployed at scale in the National Health Service, demonstrating the value of technology-enabled treatment delivery not only in facilitating access to care, but in enabling accelerated data capture for clinical research purposes.
Declaration of interest
A.C., S.B., V.T., K.I., S.F., A.R., A.H. and A.D.B. are employees or board members of the sponsor. S.R.C. consults for Cambridge Cognition and Shire.
Keywords: Anxiety disorders; cognitive behavioural therapies; depressive disorders; individual psychotherapy
To determine the effectiveness of a workplace wellness programme intervention in improving participants’ behaviour towards choosing a healthy diet and the correlation with health indicators.
A retrospective cohort study.
Wellness programme in the Midwest, USA.
Employees (n = 12 636) who participated in a wellness programme for three consecutive years between 2004 and 2013 and who completed web-based health risk questionnaires. The wellness programme included annual health screening, laboratory measures, a health risk questionnaire and a personalized health-care programme. Participants’ food group intakes, BMI and health indicators were compared between the first and last year of participation. McNemar’s non-parametric test was used for paired nominal data, and Pearson correlations were computed for paired food and health indicator measurements. Correlations between dietary intake and BMI, cholesterol and TAG were assessed using Pearson correlations and McNemar’s test.
There were negative correlations between health outcome indicators such as BMI and TAG levels and intakes of fruits, vegetables, grains and dairy, as well as a healthy eating pattern. Additionally, the percentage of employees who increased their consumption of fruits (16·88 v. 12·08 %, P < 0·001), vegetables (15·20 v. 11·44 %, P < 0·001) and dark green leafy vegetables (12·03 v. 7·27 %, P < 0·001) was significantly higher than the percentage of participants who decreased their intake of these food groups at the third-year follow-up.
The wellness programme improved some health indicator parameters and had a positive impact on increasing participants’ intakes of fruits, vegetables and whole grains at the third year of follow-up.
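McNemar's test, used above for the paired yes/no dietary responses, depends only on the discordant pairs: participants who changed in one direction or the other between the first and last year. A minimal sketch of the statistic, with hypothetical counts (not the study's data):

```python
def mcnemar_chi2(b, c, continuity=True):
    """McNemar chi-square statistic for paired nominal data.

    b = discordant pairs that changed no -> yes
    c = discordant pairs that changed yes -> no
    With the continuity correction: (|b - c| - 1)^2 / (b + c);
    without it: (b - c)^2 / (b + c). Compare against the chi-square
    distribution with 1 degree of freedom (critical value 3.84 at
    the 0.05 level).
    """
    num = (abs(b - c) - 1) ** 2 if continuity else (b - c) ** 2
    return num / (b + c)

# Hypothetical: 20 participants increased intake, 10 decreased.
stat = mcnemar_chi2(20, 10)
```

Concordant pairs (no change in either direction) do not enter the statistic at all, which is why the test suits before/after comparisons of the same participants.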
From 1565 to 1570, Spain established no fewer than three networks of presidios (fortified military settlements) across portions of its frontier territories in La Florida and New Spain. Juan Pardo's network of six forts, extending from the Atlantic coast over the Appalachian Mountains, was the least successful of these presidio systems, lasting only from late 1566 to early 1568. The failure of Pardo's defensive network has long been attributed to poor planning and an insufficient investment of resources. Yet recent archaeological discoveries at the Berry site in western North Carolina—the location of both the Native American town of Joara and Pardo's first garrison, Fort San Juan—warrant a reappraisal of this interpretation. While previous archaeological research at Berry concentrated on the domestic compound where Pardo's soldiers resided, the location of the fort itself remained unknown. In 2013, the remains of Fort San Juan were finally identified south of the compound, the first of Pardo's interior forts to be discovered by archaeologists. Data from excavations and geophysical surveys suggest that it was a substantial defensive construction. We attribute the failure of Pardo's network to the social geography of the Native South rather than to an insufficient investment of resources.
To describe community representation in Nepal’s Health Facility Operation and Management Committees (HFMCs) and the degree of influence of community representatives in the HFMC decision-making processes.
Community participation has been recognised as one of the key components for the successful implementation of primary health care (PHC) strategies, following the 1978 Declaration of Alma-Ata. In low- and middle-income countries (LMICs), HFMCs are now widely considered as a mechanism to increase community participation in health through community representation. There is some research examining the implementation process, impact and factors affecting the effectiveness of HFMCs. Despite the documented evidence of the importance of factors such as adequate representation, links with wider community, and decision-making power, there is limited evidence about the nature of community representation and degree of decision making within HFMCs in the PHC setting, particularly in LMICs.
Qualitative interviews with 39 key informants were held to explore different aspects of community representation in HFMCs, and the influence of the HFMC on health facility decision-making processes. In addition, a facility audit at 22 facilities and review of HFMC meeting minutes at six health facilities were conducted.
There was Dalit (a marginalised caste) and Janajati (an ethnic group) representation in 77% and 100% of the committees, respectively. Likewise, there were at least two female members on each committee. However, the HFMC member selection process and decision making within the committees were influenced by powerful elites. The degree of participation through HFMCs appeared to be at the ‘Manipulation’ and ‘Informing’ rungs of Arnstein’s ladder of participation. In conclusion, despite representation of the community on HFMCs, the depth of participation seems low. There is a need to ensure a democratic selection process for committee members and to expand the depth of participation.
Mahr & Csibra (M&C) claim that episodic remembering's autonoetic character serves as an indicator of epistemic authority. This proposal is difficult to reconcile with the existence of confabulation errors – where participants fabricate memories of experiences that never happened to them. Making confabulation errors damages one's epistemic authority, but these false memories have an autonoetic character.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
Conversations about literacy-related matters with parents can help prepare children for formal literacy instruction. We studied these conversations using data gathered from fifty-six US families as they engaged in daily activities at home. Analyzing conversations when children were aged 1;10, 2;6, 3;6, and 4;2, we found that explicit talk about the elements and processes of reading and writing occurred even when children were less than two years old and became more common as children grew older. The majority of literacy-related conversations included talk about alphabet letters. Literacy-related conversations occurred in a variety of contexts, not only book-reading. There were few differences as a function of family socioeconomic status in the proportion of utterances during the sessions that occurred in literacy-related conversations. At older ages, however, children in families of lower socioeconomic status bore more of the conversational burden than children in families of higher status.
In Memory: A Philosophical Study, Bernecker argues for an account of memory causation based on contiguity. This Contiguity View is meant to solve relearning and prompting, two wayward-causation problems plaguing the causal theory of memory. I argue that Bernecker’s Contiguity View fails in this task: contiguity is too weak to prevent relearning and too strong to allow prompting. These failures illustrate a problem inherent in accounts of memory causation. Relearning and prompting are both causal relations, wayward only with respect to our interest in specifying remembering’s requirements. Solving them requires saying more about remembering, not about causation. I conclude by sketching such an account.
Increasingly, ambulance services offer alternatives to transfer to the emergency department (ED), when this is better for patients. The introduction of electronic health records (EHR) in ambulance services is encouraged by national policy across the United Kingdom (UK) but roll-out has been variable and complex.
Electronic Records in Ambulances (ERA) is a two-year study which aims to investigate and describe the opportunities and challenges of implementing EHR and associated technology in ambulances to support a safe and effective shift to out of hospital care, including the implications for workforce in terms of training, role and clinical decision-making skills.
Our study includes a scoping review of relevant issues and a baseline assessment of progress in all UK ambulance services in implementing EHR. These will inform four in-depth case studies of services at different stages of implementation, assessing current usage, and examining context.
The scoping review identified themes including: there are many perceived potential benefits of EHR, such as improved safety and remote diagnostics, but as yet little evidence of them; technical challenges to implementation may inhibit uptake and lead to increased workload in the short term; staff implementing EHR may do so selectively or devise workarounds; and EHR may be perceived as a tool of staff surveillance.
Our scoping review identified some complex issues around the implementation of EHR and the relevant challenges, opportunities and workforce implications. These will help to inform our fieldwork and subsequent data analysis in the case study sites, to begin early in 2017. Lessons learned from the experience of implementing EHR so far should inform future development of information technology in ambulance services, and help service providers to understand how best to maximize the opportunities offered by EHR to redesign care.
The study of evolution most typically involves inferring past events on the basis of evidence from extant organisms. There are a number of challenges associated with this, such as uncertainties about the precise time of origin of character states, the rate of molecular evolution and confounding effects of population processes. Accessing evolutionary information directly from the fossil and sub-fossil record – in fact, any past period from which a measurable change has occurred – is therefore extremely useful in addressing these uncertainties. Museum, archaeology department and herbarium collections are the ‘banks’ of biomolecular information from which our scientific understanding of such processes can be extrapolated. Precautions taken to preserve biological material such as controlled environments, tissue-specific storage materials and the conservation of depositional environments are often conducive to long-term survival of genetic material. Consequently, these biomolecular banks hold material with a wide geographical and temporal range, often outside the typical age range of material used in phylogenetic analyses, as well as genetic diversity that is rare or lost in the living world. The advent of ancient biomolecular analyses in the 1990s was a technological milestone in this respect, in which oligogenic analyses based on one or a few genes enabled the reconstruction of extinct stages of phylogenies, such as the renowned placement of the thylacine among dasyuroid marsupials using evidence from cytochrome b DNA sequences (Krajewski et al. 1992; 1997).
NGS allows deep sequencing of single PCR targets, thereby generating systematic data for thousands or millions of organisms (Sogin et al. 2006). It also facilitates the study of multiple PCR targets (exons, introns, non-coding regions, mRNA transcripts or even complete genomic organization) across organisms, allowing a much greater depth of understanding of genetic phylogenies than could be gained from a handful of genes or simple morphological analysis (Horner et al. 2010). For the most part, NGS technology has been applied to extant species in systematics research. The applicability of NGS to sub-fossil material was first demonstrated by Poinar et al. (2006) in permafrost-preserved mammoth bones. Since then, the application of NGS to generate data directly from historical, archaeological or paleontological sources has held the potential to view genomic evolution in real time.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins presented up to 1.7% greater height and 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to analyze systematically (1) how heritability estimates of height, BMI and their trajectories over the life course vary between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits, and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual-level data on height and weight, including repeated measurements, birth-related traits, background variables, education and smoking. By the end of 2014, 48 projects had joined. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.