Depression and insomnia commonly co-occur. Yet, little is known about the mechanisms through which insomnia influences depression. Recent research and theory highlight reward system dysfunction as a potential mediator of the relationship between insomnia and depression. This study is the first to examine the impact of insomnia on reward learning, a key component of reward system functioning, in clinical depression.
Methods
The sample consisted of 72 veterans with unipolar depression who endorsed sleep disturbance symptoms. Participants completed the Structured Clinical Interview for DSM-IV, self-report measures of insomnia, depression, and reward processing, and a previously validated signal detection task (Pizzagalli et al., 2005, Biological Psychiatry, 57(4), 319–327). Trial-by-trial response bias (RB) estimates calculated for each of the 200 task trials were examined using linear mixed-model analyses to investigate change in reward learning.
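As an illustration of this kind of trial-level analysis, the sketch below fits a linear mixed model with a group-by-trial interaction using statsmodels; the data file and column names (subject, trial, rb, group) are hypothetical, and this is not the authors' analysis code.

```python
# Illustrative sketch of a trial-level linear mixed model for response bias (RB).
# Assumes a hypothetical long-format file with columns: subject, trial (1-200),
# rb (trial-by-trial response bias), and group (Insomnia vs. Hypersomnia/Mixed).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("reward_task_long.csv")  # hypothetical file name

# Random intercept and slope for trial within subject; the group-by-trial
# interaction tests whether the rate of reward learning differs between groups.
model = smf.mixedlm("rb ~ trial * group", data=df,
                    groups=df["subject"], re_formula="~trial")
print(model.fit().summary())
```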
Results
Findings demonstrated a diminished rate and magnitude of reward learning in the Insomnia group relative to the Hypersomnia/Mixed Symptom group across the task. Within the Insomnia group, participants with more severe insomnia evidenced the lowest rates of reward learning, with RB across the task increasing as insomnia severity decreased.
Conclusions
Among individuals with depression, insomnia is associated with decreased ability to learn associations between neutral stimuli and rewarding outcomes and/or modify behavior in response to differential receipt of reward. This attenuated reward learning may contribute to clinically meaningful decreases in motivation and increased withdrawal in this comorbid group. Results extend existing theory by highlighting impairments in reward learning specifically as a potential mediator of the association between insomnia and depression.
Altered neurocognitive function in schizophrenia could reflect both genetic and illness-specific effects.
Objectives
To use functional magnetic resonance imaging to discriminate between the influences of genetic risk for schizophrenia and environmental factors on the neural substrate of verbal fluency, a candidate schizophrenia endophenotype, using a case-control twin design.
Methods
We studied 23 monozygotic twin pairs: 13 pairs discordant for schizophrenia and 10 pairs of healthy volunteer twins. Groups were matched for age, gender, handedness, level of education, parental socio-economic status, and ethnicity. Behavioural performance and regional brain activation during a phonological verbal fluency task were assessed.
Results
Relative to healthy control twins, both patients and their non-psychotic co-twins produced fewer correct responses and showed less activation in the medial temporal region and inferior frontal gyrus. Twins with schizophrenia showed greater activation than both their non-psychotic co-twins and controls in the right lateral temporal cortex, reflecting reduced deactivation during word generation, while their non-psychotic co-twins showed greater activation in the left temporal cortex.
Conclusions
Both genetic vulnerability to schizophrenia and the illness itself were associated with impaired verbal fluency performance and with reduced engagement of the medial temporal region and dorsal inferior frontal gyrus. Schizophrenia was specifically associated with an additional reduction in deactivation in the right temporal cortex.
To investigate associations between schizophrenia candidate gene polymorphisms and regional cortical thickness and volume in patients with schizophrenia and healthy control subjects.
Methods
Genotyping was performed using PCR and pyrosequencing techniques. Cortical morphology was analyzed by processing magnetic resonance brain images with the FreeSurfer software package. General linear model analysis was used to study associations between gene variants and cortical thickness separately in patients and controls. Regional cortical volumes were defined from automatic cortical parcellations. Our first analyses, of 96 patients with schizophrenia and 104 healthy control subjects, demonstrate that polymorphisms in the brain-derived neurotrophic factor (BDNF) gene may be associated with variation in frontal lobe morphology. Associations appear to be stronger in patients with schizophrenia than in healthy controls.
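For illustration, a region-wise general linear model of this kind could be set up as in the following sketch; the input file, column names and numeric genotype coding are assumptions, not the study's actual pipeline.

```python
# Illustrative sketch of region-wise general linear models relating a genotype
# to FreeSurfer cortical thickness, run separately in patients and controls.
# Column names (thickness_*, bdnf_genotype, age, sex, dx) are hypothetical;
# bdnf_genotype is assumed to be coded numerically (e.g. minor-allele count).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("freesurfer_thickness.csv")  # hypothetical table of parcellation stats

regions = [c for c in df.columns if c.startswith("thickness_")]
for dx_group, sub in df.groupby("dx"):         # patients vs. controls
    for region in regions:
        fit = smf.ols(f"{region} ~ bdnf_genotype + age + sex", data=sub).fit()
        print(dx_group, region, round(fit.pvalues["bdnf_genotype"], 4))
```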
Recent findings support sex-specific effects of PDYN polymorphisms on the association with opioid addiction (Clarke et al. 2012). We have demonstrated that PDYN haplotypes that include rs2281285 are associated with alcohol dependence and with the propensity to drink in negative emotional situations (negative craving) (Karpyak et al. 2012). The rs2281285 variant may contribute to the regulation of alternative PDYN mRNA transcription specific to brain area or physiological condition.
Objectives
To investigate sex-specific effects of the PDYN rs2281285 variant on risk for alcohol dependence.
Aims
To examine the association of the PDYN rs2281285 variant with alcohol dependence in male and female subjects.
Methods
rs2281285 was genotyped in an investigation cohort of 816 alcohol-dependent subjects (DSM-IV-TR; 554 males) and 1248 non-alcoholic controls (603 males), and in a replication cohort of 467 alcohol-dependent subjects (347 males) and 431 non-alcoholic controls (224 males). Logistic regression models were used to test for sex-specific associations after controlling for age.
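A minimal sketch of such a sex-stratified logistic regression is shown below; the cohort file and column names (case, rs2281285, age, sex) are hypothetical and the code is illustrative only.

```python
# Illustrative sketch of the sex-stratified logistic regression described above.
# Hypothetical columns: case (1 = alcohol dependent, 0 = control),
# rs2281285 (minor-allele count), age, sex.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pdyn_cohort.csv")  # hypothetical cohort file

# Separate models in males and females, controlling for age;
# exponentiated coefficients give odds ratios.
for sex, sub in df.groupby("sex"):
    fit = smf.logit("case ~ rs2281285 + age", data=sub).fit(disp=False)
    print(sex,
          "OR =", round(float(np.exp(fit.params["rs2281285"])), 3),
          "p =", round(float(fit.pvalues["rs2281285"]), 3))
```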
Results
As previously reported, a significant association of the PDYN rs2281285 variant with alcohol dependence was found in the investigation cohort (p = 0.008, OR = 1.299) but not the replication cohort (p = 0.223, OR = 0.118). However, sex-specific analyses revealed a significant association in males (p = 0.002, OR = 1.493) but not in females (p = 0.684, OR = 1.066) in the investigation cohort, and a trend toward association in males (p = 0.086, OR = 1.352) but not in females (p = 0.808, OR = 0.947) in the replication cohort.
Conclusions
Our findings support an association of the PDYN rs2281285 variant with alcohol dependence in male but not female subjects. Future studies should investigate the functional mechanisms of this effect.
It is widely recognised that people with intellectual disabilities receive a poorer quality of healthcare than their non-disabled counterparts. Training for healthcare professionals in intellectual disability is often scant or non-existent. The purpose of this work is to explore the usefulness of employing actors with intellectual disabilities as simulated patients in the assessment of trainee psychiatrists.
Design/methodology/approach
The development of a structured clinical exam “station” designed to assess the ability of trainee psychiatrists to communicate with a simulated patient played by an actor with an intellectual disability is described. The paper also assesses the potential benefits of this kind of assessment and the experience of actors and examiners taking part in this process.
Findings
The station was found to perform well in discriminating between candidates of various abilities and was well received by actors, examiners and observers. The station is now routinely used in the formal assessment of trainee psychiatrists in the UK.
Practical implications
The use of people with intellectual disabilities in training and assessment appears to be advantageous in terms of improving knowledge, attitudes and skills amongst healthcare professionals and gives increased opportunities for people with intellectual disabilities to undertake valued social roles.
Originality/value
Few institutions currently employ actors with intellectual disabilities as simulated patients as part of their training programmes and as a result there is little in the way of literature on this subject. This paper describes an alternative approach to teaching and assessment which falls in line with recommendations from the UK Department of Health to involve service users in the training of healthcare professionals.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
People with intellectual disabilities living in community settings can experience all types of abuse, including physical abuse (such as the use of restrictive practices), financial abuse by strangers as well as by family and carers, and sexual abuse. Neglect is a relatively common concern, which is perhaps more likely in the community than in institutional settings.
In this presentation we will discuss the fundamental balancing act between paternalism and autonomy that so often arises when supporting people with intellectual disabilities in the community, and how to decide where to draw the line in individual cases. We will consider a range of examples to illustrate this, including unlawful deprivation of liberty, people choosing life partners whom others regard as unsuitable, why families might restrict access to services, and whether giving people more control over their care through direct payments and individual budgets can lead to financial exploitation.
Finally, we will discuss potential approaches to preventing abuse, including robust safeguarding procedures, integrated working between health and social services, a program of Positive Behavioral Support, maximizing communication, promoting access to health care and the recognition of mental health problems, the dissemination of training, and the importance of advocacy and regular review.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
In May 2015, NICE published guidelines for people with intellectual disabilities whose behavior challenges (NG11). Eight quality standards were subsequently developed by NICE to help service providers, health and social care practitioners and commissioners implement the necessary recommendations within the new NG11 guidelines.
Methods
We used a Quality Improvement (QI) methodology including process mapping, driver diagrams, and fortnightly QI team meetings. We conducted a baseline audit of the quality standards and used Plan-Do-Study-Act (PDSA) cycles to pilot interventions generated by the team to improve compliance with the standards.
Results
Baseline compliance with the quality standards was low. We identified four priority areas for intervention: annual physical health checks, recording the indication of medication, multidisciplinary case discussion and concurrent psychosocial interventions for those prescribed medications for challenging behavior. Using a PDSA cycle for each intervention, we have demonstrated improved compliance with the NG11 guidelines. Compliance with recording the indication of medication in case reviews rose from 0% to 100%. At least one target case is discussed at each MDT team meeting. Full results for annual health checks are awaited, but the intervention has already improved uptake from 40% to 70%. Staff and carers' knowledge of psychosocial interventions for people with challenging behavior improved after training.
Conclusions
Quality Improvement methodology was successful in improving adherence to NG11 guidelines. We are currently assessing whether this is leading to reductions in challenging behavior and improvements to people's well-being.
The RemoveDEBRIS mission was the first mission to successfully demonstrate, in orbit, a series of technologies that can be used for the active removal of space debris. The mission started in late 2014 and was sponsored by a grant from the EC, under which a consortium led by the Surrey Space Centre developed the mission from concept to in-orbit demonstrations, concluding in March 2019. Technologies for the capture of large space debris, such as a net and a harpoon, were successfully tested, together with hardware and software for retrieving data on the kinematics of non-cooperative target debris from observations carried out with on-board cameras. The final demonstration consisted of deploying a drag sail to increase the satellite's drag and accelerate its demise.
Shiga toxin-producing Escherichia coli (STEC) is a pathogen that can cause bloody diarrhoea and severe complications. Cases occur sporadically, but outbreaks are also common. Understanding the incubation period distribution and the factors influencing it will help in the investigation of exposures and consequent disease control. We extracted individual patient data for STEC cases associated with outbreaks with a known source of exposure in England and Wales. The incubation period was derived and cases were described according to patient and outbreak characteristics. We tested for heterogeneity in reported incubation period between outbreaks and described the pattern of heterogeneity. We employed a multi-level regression model to examine the relationship of patient characteristics (such as age, gender and reported symptoms) and outbreak characteristics (such as mode of transmission) with the incubation period. A total of 205 cases from 41 outbreaks were included in the study, of which 64 cases (31%) were from a single outbreak. The median incubation period was 4 days. Cases reporting bloody diarrhoea reported shorter incubation periods than cases without bloody diarrhoea, and likewise, cases aged between 40 and 59 years reported shorter incubation periods than other age groups. It is recommended that public health officials consider the characteristics of the cases involved in an outbreak in order to inform the outbreak investigation and the period of exposure to be investigated.
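For readers interested in the modelling approach, the sketch below shows one way a multi-level regression of incubation period with outbreak-level clustering could be specified; the file and variable names are hypothetical, and the log transformation is an assumption rather than necessarily the authors' choice.

```python
# Illustrative sketch of a multi-level model of incubation period with a random
# intercept per outbreak. File and column names are hypothetical; the log
# transformation is an assumption (incubation periods are right-skewed).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stec_cases.csv")  # hypothetical: outbreak_id, incubation_days,
                                    # age_group, sex, bloody_diarrhoea, mode

df["log_incubation"] = np.log(df["incubation_days"])

model = smf.mixedlm(
    "log_incubation ~ age_group + sex + bloody_diarrhoea + mode",
    data=df,
    groups=df["outbreak_id"],  # outbreak-level clustering
)
print(model.fit().summary())
```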
Two fatal drumming-related inhalational anthrax incidents occurred in 2006 and 2008 in the UK. One individual was a drum maker and drummer from the Scottish Borders, most likely infected whilst playing a goat-skin drum contaminated with Bacillus anthracis spores; the second, a drummer and drum maker from East London, likely became infected whilst working with contaminated animal hides.
We have collated epidemiological and environmental data from these incidents and reviewed them alongside three similar contemporaneous incidents in the USA. Sampling operations recovered the causative agent from drums and drum skins and from residences and communal buildings at low levels. From these data, we have considered the nature of the exposures and the number of other individuals likely to have been exposed, either to the primary infection events or to subsequent prolonged environmental contamination (or both).
Despite many individual exposures to widespread low-level spore contamination in private residences and in work spaces for extended periods of time (at least 1 year in one instance), only one other individual acquired an infection (cutaneous). Whilst recognising the difficulty of making definitive inferences from these incidents to specific residual contamination levels, and of extrapolating from them to the risk to public health, we believe it may be useful to reflect on these findings when considering future incident management risk assessments and decisions in similar incidents that result in low-level indoor contamination.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
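As an illustration of deterministic matching, the sketch below links two hypothetical registry extracts on exact identifier fields; the file and field names are assumptions, and real linkage protocols typically apply several identifier combinations in sequence.

```python
# Illustrative sketch of deterministic matching between two registry extracts on
# exact identifier fields. File and field names are hypothetical; real linkage
# protocols typically apply several identifier combinations in sequence.
import pandas as pd

hiv = pd.read_csv("hiv_registry.csv")        # hypothetical extract
hep = pd.read_csv("hepatitis_registry.csv")  # hypothetical extract, with a 'virus' column (HBV/HCV)

keys = ["last_name", "first_name", "dob", "sex"]
matched = hiv.merge(hep, on=keys, how="inner", suffixes=("_hiv", "_hep"))

# Proportion of the HIV registry matched to each hepatitis virus
for virus in ("HBV", "HCV"):
    n = (matched["virus"] == virus).sum()
    print(virus, round(100 * n / len(hiv), 1), "%")
```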
Accurate knowledge of a pathogen's incubation period is essential to inform public health policies and implement interventions that contribute to reducing the burden of disease. The incubation period distribution of campylobacteriosis is currently unknown, with several sources reporting different times. Variation in the distribution could be expected due to host, transmission vehicle and organism characteristics; however, the extent of this variation and the factors influencing it are unclear. We undertook a systematic review of published outbreak studies with well-defined point-source exposures and of human experimental studies to estimate the distribution of the incubation period, and to identify and explain the variation in the distribution between studies. We tested for heterogeneity using I2 and Kolmogorov–Smirnov tests, regressed incubation period against possible explanatory factors, and used hierarchical clustering analysis to define subgroups of studies without evidence of heterogeneity. The mean incubation period of the subgroups ranged from 2·5 to 4·3 days. We observed variation in the distribution of incubation period between studies that was not due to chance. A significant association between mean incubation period and age distribution was observed, with outbreaks involving only children reporting incubation periods 1·29 days longer than outbreaks involving other age groups.
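The sketch below illustrates one way such heterogeneity testing and subgrouping could be implemented, using pairwise two-sample Kolmogorov-Smirnov statistics as distances for hierarchical clustering; the data layout and file name are hypothetical and this is not the authors' code.

```python
# Illustrative sketch of heterogeneity testing and subgrouping: pairwise two-sample
# Kolmogorov-Smirnov statistics between studies are used as distances for
# hierarchical clustering. The data layout is hypothetical.
import pandas as pd
from itertools import combinations
from scipy.stats import ks_2samp
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

df = pd.read_csv("campylobacter_studies.csv")  # hypothetical: study_id, incubation_hours

studies = sorted(df["study_id"].unique())
dist = pd.DataFrame(0.0, index=studies, columns=studies)
for a, b in combinations(studies, 2):
    stat = ks_2samp(df.loc[df["study_id"] == a, "incubation_hours"],
                    df.loc[df["study_id"] == b, "incubation_hours"]).statistic
    dist.loc[a, b] = dist.loc[b, a] = stat

# Group studies into subgroups with similar incubation period distributions.
tree = linkage(squareform(dist.values), method="average")
print(fcluster(tree, t=2, criterion="maxclust"))
```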
Introduction
Patients with advanced malignant and non-malignant disease (advanced disease, AD) who do not want or benefit from aggressive resuscitation may unfortunately receive such treatments if unable to communicate in an emergency. Timely access to patients' resuscitation wishes is imperative for treating physicians and for medical information systems. Our aim was to determine what proportion of emergency department (ED) patients with AD have accurate, readily accessible resuscitation status documentation.
Methods
This cross-sectional, prospective study was conducted at a tertiary care ED during purposefully sampled random accrual times in summer 2016. We enrolled all patients with: 1) palliative care consultation, 2) metastatic malignancy, 3) COPD or CHF on home oxygen, 4) hemodialysis, or 5) advanced neurodegenerative disease/dementia. The primary outcome was the retrieval of any existing resuscitation status documents. Documentation was obtained from a standardized review of forms accompanying the patient ("arrival documents") and the electronic medical record ("EMR"). We measured the time to retrieve this documentation, and interviewed consenting patients to corroborate documentation with their current wishes.
Results
Of 85 enrolled patients, only 33 (39%) had any documentation of resuscitation status: 28 (33%) had goals of care retrieved from the hospital EMR, and 11 (15%) from arrival documents (some had both). Patients from long-term care facilities were more likely to have documentation available than community-living patients (odds ratio 13 [95% CI 2.5-65]). Of the 32 patients who were able to be interviewed, 20 (63%) expressed "do not resuscitate" wishes. Ten of these 20 lacked any documents to support their expressed resuscitation wishes. Previously expressed resuscitation wishes took more than 5 minutes to retrieve in 3 cases when not filed "one click deep" in our EMR.
Conclusion
The majority of patients with AD, including half of those who would not wish resuscitation from cardiorespiratory arrest, did not have goals of care documents readily available upon arrival to the ED. Patients living in the community with AD appear to be at high risk for unwanted resuscitative treatments should they present to hospital in extremis. Having documentation of their goals of care that is easily retrievable from the EMR shows promise, though issues of retrieval, accuracy, and validity remain important considerations.
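For clarity, the sketch below shows how an odds ratio and a Wald-type 95% confidence interval are computed from a 2x2 table; the counts are hypothetical and do not reproduce the study data.

```python
# Illustrative calculation of an odds ratio and Wald-type 95% CI from a 2x2 table.
# The counts below are hypothetical and do not reproduce the study data.
import math

a, b = 13, 3   # long-term care: documentation found / not found (hypothetical)
c, d = 20, 49  # community-living: documentation found / not found (hypothetical)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lower = odds_ratio * math.exp(-1.96 * se_log_or)
upper = odds_ratio * math.exp(1.96 * se_log_or)
print(f"OR = {odds_ratio:.1f}, 95% CI {lower:.1f}-{upper:.1f}")
```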
Signatories to the Convention on Biological Diversity (CBD) agreed to the effective protection of at least 17% of the terrestrial environment by 2020 (Aichi Target 11). Here, we assess the coverage of terrestrial protected areas (land protected by legislation) on the UK's Overseas Territories. These 14 Territories are under the sovereignty of the UK, a signatory of the CBD, and are particularly biodiverse. Eight Territories have protected areas covering 17% or more of their land, but the extent of protection across these Territories as a whole is low, with only 4.8% of this land designated as protected. This protection covered 51% of sites already identified as of conservation importance (Important Bird and Biodiversity Areas), although only 8% of the area of these sites was protected. The expansion of effective protection to meet the 17% target provides an opportunity to capture the most important sites for conservation. Locally led designation will require an improvement in knowledge of the distribution and density of species. This, together with measures to ensure that the protection is enforced and effective, will require provision of resources. This should be seen as an investment in the UK meeting its obligations to Aichi Target 11.
The target article's call to end reliance on acceptability judgments is premature. First, it restricts syntactic inquiry to cases where a semantically equivalent alternative is available. Second, priming studies require groups of participants who are linguistically homogeneous and whose grammar is known to the researcher. These requirements would eliminate two major research areas: syntactic competence in d/Deaf individuals, and language documentation. (We follow the convention of using deaf to describe hearing levels, Deaf to describe cultural identity, and d/Deaf to include both. Our own work has focused on Deaf signers, but the same concerns could apply to other deaf populations.)
Outbreaks of insect pests periodically cause large losses of volume in Canada's forests. Compounded by climate change, these outbreaks create significant challenges for managing the sustainable delivery of ecosystem services. Current methods to monitor damage by these pests involve both field and aerial surveys. While aerial surveys are relatively cost-effective and timely, their consistency and spatial coverage may be insufficient for detailed monitoring across Canada's vast forest-land base. Remote sensing can augment these methods and extend monitoring capabilities in time and space by incorporating knowledge of pest-host interactions and of how damage translates into a remote sensing signal for detection and mapping. This review provides a brief introduction to major forest insect pests in Canada (two bark beetles (Coleoptera: Curculionidae) and six defoliators) and the damage they cause, a synthesis of the literature on aerial survey and remote sensing, and a discussion of how these two approaches could be integrated into future pest monitoring from a Canadian perspective. We offer some lessons learned, outline roles that remote sensing could serve in a management context, and discuss what ongoing and upcoming technological advances may offer to future forest health monitoring.