Childhood maltreatment (CM) plays an important role in the development of major depressive disorder (MDD). The aim of this study was to examine whether CM severity and type are associated with MDD-related brain alterations, and how they interact with sex and age.
Within the ENIGMA-MDD network, severity and subtypes of CM were assessed using the Childhood Trauma Questionnaire, and structural magnetic resonance imaging data from patients with MDD and healthy controls were analyzed in a mega-analysis comprising a total of 3872 participants aged between 13 and 89 years. Cortical thickness and surface area were extracted at each site using FreeSurfer.
CM severity was associated with reduced cortical thickness in the banks of the superior temporal sulcus and supramarginal gyrus as well as with reduced surface area of the middle temporal lobe. Participants reporting both childhood neglect and abuse had a lower cortical thickness in the inferior parietal lobe, middle temporal lobe, and precuneus compared to participants not exposed to CM. In males only, regardless of diagnosis, CM severity was associated with higher cortical thickness of the rostral anterior cingulate cortex. Finally, a significant interaction between CM and age in predicting thickness was seen across several prefrontal, temporal, and temporo-parietal regions.
Severity and type of CM may impact cortical thickness and surface area. Importantly, CM may influence age-dependent brain maturation, particularly in regions related to the default mode network, perception, and theory of mind.
The Mental Health Act 2001 has introduced significant changes to the process of admission to hospital for individuals affected by mental health disorders. This study aimed to determine whether a newly designed smartphone application could result in an improvement in service users’ knowledge of their rights compared with the paper booklet.
This was a randomized study conducted in an outpatient and day-hospital setting in North Dublin. Participants were randomized to receive the information booklet either as a smartphone application or in paper form. A questionnaire scored from 0 to 10 was devised and was completed at baseline and at 1-week follow-up.
A total of 42 individuals completed the baseline and follow-up questionnaire and of these, 53.7% were female and the mean age was 38.2 years (s.d.±13.5). A total of 34.1% had a diagnosis of a psychotic disorder, 29.3% had a depressive disorder and 22% had bipolar-affective disorder. The mean score before the intervention in the total group was 3.5 (s.d.±2.2) and this increased to 5.8 (s.d.±2.2) at follow-up. Participants randomized to the smartphone application improved by a mean of 2.5 (s.d.±2.5), while those randomized to the booklet improved by a mean of 2.3 (s.d.±2.6); the between-group difference was not statistically significant.
Both forms of the information booklet showed improvement in service users’ knowledge of their legal rights. It is possible that each individual will have preference for either a paper form or a smartphone form and this study suggests that both forms should be offered to each individual service user.
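The non-significance of the between-group difference reported above can be checked directly from the summary statistics. The sketch below applies Welch's two-sample t-test to the reported change scores; the per-arm sizes of 21 are an assumption (the abstract reports 42 completers but not the exact split between arms).

```python
import math

def welch_t_from_summary(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and approximate degrees of freedom,
    computed from group means, standard deviations and sizes."""
    se2 = s1**2 / n1 + s2**2 / n2          # squared standard error of the difference
    t = (m1 - m2) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom
    df = se2**2 / ((s1**2 / n1)**2 / (n1 - 1) + (s2**2 / n2)**2 / (n2 - 1))
    return t, df

# Reported mean improvements: app 2.5 (s.d. 2.5), booklet 2.3 (s.d. 2.6).
# n1 = n2 = 21 is a hypothetical even split of the 42 completers.
t, df = welch_t_from_summary(2.5, 2.5, 21, 2.3, 2.6, 21)
```

With |t| ≈ 0.25 on roughly 40 degrees of freedom, the difference is far from the conventional significance threshold, consistent with the abstract's conclusion.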
As poultry consumption continues to increase worldwide, and as the United States accounts for about one-third of all poultry exports globally, understanding factors leading to poultry-associated foodborne outbreaks in the United States has important implications for food safety. We analysed outbreaks reported to the United States’ Foodborne Disease Outbreak Surveillance System from 1998 to 2012 in which the implicated food or ingredient could be assigned to one food category. Of 1114 outbreaks, poultry was associated with 279 (25%), accounting for the highest number of outbreaks, illnesses, and hospitalizations, and the second highest number of deaths. Of the 149 poultry-associated outbreaks caused by a confirmed pathogen, Salmonella enterica (43%) and Clostridium perfringens (26%) were the most common pathogens. Restaurants were the most commonly reported location of food preparation (37% of poultry-associated outbreaks), followed by private homes (25%), and catering facilities (13%). The most commonly reported factors contributing to poultry-associated outbreaks were food-handling errors (64%) and inadequate cooking (53%). Effective measures to reduce poultry contamination, promote safe food-handling practices, and ensure food handlers do not work while ill could reduce poultry-associated outbreaks and illnesses.
Salmonella enterica causes an estimated 1 million domestically acquired foodborne illnesses annually. Salmonella enterica serovar Enteritidis (SE) is among the three most commonly reported Salmonella serovars. We examined trends in SE foodborne outbreaks from 1973 to 2009 using Joinpoint and Poisson regression. The annual number of SE outbreaks increased sharply in the 1970s and 1980s but declined significantly after 1990. Over the study period, SE outbreaks were most frequently attributed to foods containing eggs. The average rate of SE outbreaks attributed to egg-containing foods reported by states began to decline significantly after 1990, and the proportion of SE outbreaks attributed to egg-containing foods began declining after 1997. Our results suggest that interventions initiated in the 1990s to decrease SE contamination of shell eggs may have been integral to preventing SE outbreaks.
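The Poisson-regression trend analysis mentioned above models annual outbreak counts as a log-linear function of calendar year, so that exp(slope) is the annual rate ratio. A minimal sketch, fit by iteratively reweighted least squares on illustrative (not the paper's) counts:

```python
import numpy as np

def poisson_trend(years, counts, n_iter=25):
    """Log-linear Poisson regression count ~ exp(b0 + b1*year),
    fit by iteratively reweighted least squares (Fisher scoring)."""
    x = np.asarray(years, float)
    x = x - x.mean()                        # centre the year for numerical stability
    y = np.asarray(counts, float)
    X = np.column_stack([np.ones_like(x), x])
    beta = np.array([np.log(y.mean()), 0.0])  # start at the null (intercept-only) model
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        XtWX = X.T @ (X * mu[:, None])      # Fisher information; W = diag(mu) for Poisson
        beta = beta + np.linalg.solve(XtWX, X.T @ (y - mu))
    return beta

# Hypothetical annual outbreak counts showing a post-1990-style decline
years = np.arange(1990, 2000)
counts = [80, 76, 70, 66, 60, 57, 52, 49, 45, 42]
b0, b1 = poisson_trend(years, counts)
rate_ratio = float(np.exp(b1))              # annual multiplicative change in outbreak rate
```

A rate ratio below 1 corresponds to a declining trend; here the fitted ratio is roughly 0.93, i.e. about a 7% drop in outbreaks per year in the toy data.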
For several decades, breeding goals in dairy cattle focussed on increased milk production. However, many functional traits have negative genetic correlations with milk yield, and reductions in genetic merit for health and fitness have been observed. Herd management has been challenged to compensate for these effects and to balance fertility, udder health and metabolic diseases against increased production to maximize profit without compromising welfare. Functional traits, such as direct information on cow health, have also become more important because of growing concern about animal well-being and consumer demands for healthy and natural products. There are major concerns about the impact of drugs used in veterinary medicine on the spread of antibiotic-resistant strains of bacteria that can negatively impact human health. Sustainability and efficiency are also increasingly important because of the growing competition for high-quality, plant-based sources of energy and protein. Disruptions to global environments because of climate change may encourage yet more emphasis on these traits. To be successful, it is vital that there be a balance between the effort required for data recording and subsequent benefits. The motivation of farmers and other stakeholders involved in documentation and recording is essential to ensure good data quality. To keep labour costs reasonable, existing data sources should be used as much as possible. Examples include the use of milk composition data to provide additional information about the metabolic status or energy balance of the animals. Recent advances in the use of mid-infrared spectroscopy to measure milk have shown considerable promise, and may provide cost-effective alternative phenotypes for difficult or expensive-to-measure traits, such as feed efficiency. There are other valuable data sources in countries that have compulsory documentation of veterinary treatments and drug use. 
Additional sources of data outside of the farm include, for example, slaughter houses (meat composition and quality) and veterinary labs (specific pathogens, viral loads). At the farm level, many data are available from automated and semi-automated milking and management systems. Electronic devices measuring physiological status or activity parameters can be used to predict events such as oestrus, and also behavioural traits. Challenges concerning the predictive biology of indicator traits or standardization need to be solved. To develop effective selection programmes for new traits, the development of large databases is necessary so that high-reliability breeding values can be estimated. For expensive-to-record traits, extensive phenotyping in combination with genotyping of females is a possibility.
An a posteriori granddaughter design was applied to estimate quantitative trait loci genotypes of sires with many sons in the US Holstein population. The results of this analysis can be used to determine concordance between specific polymorphisms and segregating quantitative trait loci. Determination of the actual polymorphisms responsible for observed genetic variation should increase the accuracy of genomic evaluations and rates of genetic gain. A total of 52 grandsire families, each with ⩾100 genotyped sons with genetic evaluations based on progeny tests, were analyzed for 33 traits (milk, fat and protein yields; fat and protein percentages; somatic cell score (SCS); productive life; daughter pregnancy rate; heifer and cow conception rates; service-sire and daughter calving ease; service-sire and daughter stillbirth rates; 18 conformation traits; and net merit). Of 617 haplotype segments spanning the entire bovine genome and each including ~5×10⁶ bp, 5 cM and 50 genes, 608 autosomal segments were analyzed. A total of 19 335 unique haplotypes were found among the 52 grandsires. There were a total of 133 chromosomal segment-by-trait combinations, for which the nominal probability of significance for the haplotype effect was <10⁻⁸, which corresponds to genome-wide significance of <10⁻⁴. The number of chromosomal regions that met this criterion by trait ranged from one for rear legs (rear view) to seven for net merit. For each of the putative quantitative trait loci, at least one grandsire family had a within-family contrast with a t-value of >3. Confidence intervals (CIs) were estimated by the nonparametric bootstrap for the largest effect for each of nine traits. The bootstrap distribution generated by 100 samples was bimodal only for net merit, which had the widest 90% CI (eight haplotype segments). This may be due to the fact that net merit is a composite trait. For all other chromosomes, the CI spanned less than a third of the chromosome.
The narrowest CI (a single haplotype segment) was found for SCS. It is likely that analysis by more advanced methods could further reduce CIs at least by half. These results can be used as a first step to determine the actual polymorphisms responsible for observed quantitative variation in dairy cattle.
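The nonparametric bootstrap used for the CIs above resamples the data with replacement, recomputes the statistic of interest on each resample, and reads the CI off the empirical quantiles (the percentile method). A generic sketch on illustrative data; the paper's actual statistic was the location of the largest within-family QTL effect, and it used 100 bootstrap samples:

```python
import random

def bootstrap_ci(data, statistic, n_boot=1000, alpha=0.10, seed=42):
    """Percentile-method nonparametric bootstrap CI for an arbitrary statistic.
    alpha=0.10 gives a 90% interval, matching the abstract's 90% CIs."""
    rng = random.Random(seed)
    n = len(data)
    stats = sorted(
        statistic([data[rng.randrange(n)] for _ in range(n)])  # one resample
        for _ in range(n_boot)
    )
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Illustrative data: haplotype-segment index of a peak effect across resamplings
peaks = [12, 13, 12, 14, 13, 12, 15, 13, 12, 13, 14, 12, 13, 12, 13]
lo, hi = bootstrap_ci(peaks, lambda s: sum(s) / len(s))
```

A tightly clustered statistic yields a narrow interval, which is how a CI as small as a single haplotype segment (as for SCS) can arise.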
The WAIS (West Antarctic Ice Sheet) Divide deep ice core was recently completed to a total depth of 3405 m, ending 50 m above the bed. Investigation of the visual stratigraphy and grain characteristics indicates that the ice column at the drilling location is undisturbed by any large-scale overturning or discontinuity. The climate record developed from this core is therefore likely to be continuous and robust. Measured grain-growth rates, recrystallization characteristics, and grain-size response at climate transitions fit within current understanding. Significant impurity control on grain size is indicated from correlation analysis between impurity loading and grain size. Bubble-number densities and bubble sizes and shapes are presented through the full extent of the bubbly ice. Where bubble elongation is observed, the direction of elongation is preferentially parallel to the trace of the basal (0001) plane. Preferred crystallographic orientation of grains is present in the shallowest samples measured, and increases with depth, progressing to a vertical-girdle pattern that tightens to a vertical single-maximum fabric. This single-maximum fabric switches into multiple maxima as the grain size increases rapidly in the deepest, warmest ice. A strong dependence of the fabric on the impurity-mediated grain size is apparent in the deepest samples.
Depressive symptoms are prominent psychopathological features of Huntington's disease (HD), making a negative impact on social functioning and well-being.
We compared the frequencies of a history of depression, previous suicide attempts and current subthreshold depression between 61 early-stage HD participants and 40 matched controls. The HD group was then split based on the overall HD group's median Hospital Anxiety and Depression Scale-depression score into a group of 30 non-depressed participants (mean 0.8, s.d. = 0.7) and a group of 31 participants with subthreshold depressive symptoms (mean 7.3, s.d. = 3.5) to explore the neuroanatomy underlying subthreshold depressive symptoms in HD using voxel-based morphometry (VBM) and diffusion tensor imaging (DTI).
Frequencies of a history of depression, previous suicide attempts and current subthreshold depressive symptoms were higher in HD than in controls. The severity of current depressive symptoms was also higher in HD, but was not associated with the severity of HD motor signs or disease burden. Compared with the non-depressed HD group, DTI revealed lower fractional anisotropy (FA) values in the frontal cortex, anterior cingulate cortex, insula and cerebellum of the HD group with subthreshold depressive symptoms. In contrast, VBM measures were similar in both HD groups. Neither a history of depression, the severity of HD motor signs nor disease burden correlated with FA values in these regions.
Current subthreshold depressive symptoms in early HD are associated with microstructural changes – without concomitant brain volume loss – in brain regions known to be involved in major depressive disorder, but not those typically associated with HD pathology.
The first direct detection of gravitational waves may be made through observations of pulsars. The principal aim of pulsar timing-array projects being carried out worldwide is to detect ultra-low frequency gravitational waves (f ∼ 10⁻⁹–10⁻⁸ Hz). Such waves are expected to be caused by coalescing supermassive binary black holes in the cores of merged galaxies. It is also possible that a detectable signal could have been produced in the inflationary era or by cosmic strings. In this paper, we review the current status of the Parkes Pulsar Timing Array project (the only such project in the Southern hemisphere) and compare the pulsar timing technique with other forms of gravitational-wave detection such as ground- and space-based interferometer systems.
We report three cases of lateral outfracture of the inferior turbinate, which demonstrate a range of changes in the size, position and shape of the inferior turbinate.
During a study of the validity of computer modelling of nasal airflow, computed tomography scans of the noses of patients who had undergone lateral outfracture of the inferior turbinate were collected. The pre-operative scan was compared with the post-operative scan six weeks later.
In one patient, there was only a small lateral displacement of the inferior turbinate. In the other two cases, appreciable reduction in the volume of one inferior turbinate was noted, in addition to minor changes in the shape.
Lateral outfracture of the inferior turbinate produces varied and inconsistent changes in morphology which may affect the shape, size and position of the turbinate.
We examined reported outbreaks of foodborne shigellosis in the USA from 1998 to 2008 and summarized demographic and epidemiological characteristics of 120 confirmed outbreaks resulting in 6208 illnesses. Most reported foodborne shigellosis outbreaks (n = 70, 58%) and outbreak-associated illnesses (n = 3383, 54%) were restaurant-associated. The largest outbreaks were associated with commercially prepared foods distributed in multiple states and foods prepared in institutional settings. Foods commonly consumed raw were implicated in 29 (24%) outbreaks and infected food handlers in 28 (23%) outbreaks. Most outbreaks (n = 86, 72%) were caused by Shigella sonnei. Targeted efforts to reduce contamination during food handling at multiple points in the food processing and distribution system, including food preparation in restaurants and institutional settings, could prevent many foodborne disease outbreaks and outbreak-related illnesses including those due to Shigella.
Interferon-alpha (IFN-α) treatment for infectious disease and cancer causes high rates of depression and fatigue, and has been used to investigate the impact of inflammatory cytokines on brain and behavior. However, little is known about the transcriptional impact of chronic IFN-α on immune cells in vivo and its relationship to IFN-α-induced behavioral changes.
Genome-wide transcriptional profiling was performed on peripheral blood mononuclear cells (PBMCs) from 21 patients with chronic hepatitis C virus (HCV) either awaiting IFN-α therapy (n=10) or at 12 weeks of IFN-α treatment (n=11).
Significance analysis of microarray data identified 252 up-regulated and 116 down-regulated gene transcripts. Of the up-regulated genes, 2′-5′-oligoadenylate synthetase 2 (OAS2), a gene linked to chronic fatigue syndrome (CFS), was the only gene that was differentially expressed in patients with IFN-α-induced depression/fatigue, and correlated with depression and fatigue scores at 12 weeks (r=0.80, p=0.003 and r=0.70, p=0.017 respectively). Promoter-based bioinformatic analyses linked IFN-α-related transcriptional alterations to transcription factors involved in myeloid differentiation, IFN-α signaling, activator protein-1 (AP1) and cAMP responsive element binding protein/activation transcription factor (CREB/ATF) pathways, which were derived primarily from monocytes and plasmacytoid dendritic cells. IFN-α-treated patients with high depression/fatigue scores demonstrated up-regulation of genes bearing promoter motifs for transcription factors involved in myeloid differentiation, IFN-α and AP1 signaling, and reduced prevalence of motifs for CREB/ATF, which has been implicated in major depression.
Depression and fatigue during chronic IFN-α administration were associated with alterations in the expression (OAS2) and transcriptional control (CREB/ATF) of genes linked to behavioral disorders including CFS and major depression, further supporting an immune contribution to these diseases.
A 64-year-old man had a first generalized seizure. He was seen in an emergency room where magnetic resonance imaging revealed a right parietal meningioma with a diameter of 2.5 cm. He was seen by the neurosurgery team and a craniotomy was performed. He was discharged from the hospital on phenytoin. Three weeks later he reported drowsiness and unsteadiness to his neurologist. In addition, he described episodes of transient sensory disturbance in his left arm, sometimes with twitching movements of the left hand and wrist. These episodes could last ≤3 minutes, and his hand and arm would be weak afterward for several hours. A phenytoin level was measured at 17. Lamotrigine was added to his regimen, and on 400 mg/day the lamotrigine level was 3.8. The dose was increased gradually to 400 mg BID and seizures stopped. Two hours after each dose, however, the patient would become dizzy, nauseated, and encephalopathic for a period of 60–90 minutes.
The patient presented with a convulsive seizure due to a meningioma and developed simple partial sensorimotor seizures arising in the region of the meningioma after its resection. The seizures persisted with phenytoin at a moderately high serum level but responded to lamotrigine as a second agent at just above the maximum tolerated dose. This case illustrates a common course of events for otherwise healthy adults who have seizures due to structural brain disease and are placed on phenytoin and lamotrigine.