The ‘jumping to conclusions’ (JTC) bias is associated with both psychosis and general cognition, but the relationship between them is unclear. In this study, we set out to clarify the relationships among the JTC bias, IQ, psychosis, and polygenic liability to schizophrenia and to IQ.
A total of 817 first-episode psychosis patients and 1294 population-based controls completed assessments of general intelligence (IQ) and the JTC bias, and provided blood or saliva samples from which we extracted DNA and computed polygenic risk scores for IQ and schizophrenia.
The estimated proportion of the total effect of case/control differences on JTC mediated by IQ was 79%. The schizophrenia polygenic risk score (PRS) was not significantly associated with the number of beads drawn (B = 0.47, 95% CI −0.21 to 1.16, p = 0.17), whereas the IQ PRS significantly predicted a higher number of beads drawn (B = 0.51, 95% CI 0.25–0.76, p < 0.001) and was thus associated with a reduced JTC bias. In controls, the JTC bias was associated with higher levels of psychotic-like experiences (PLEs), including after controlling for IQ (B = −1.7, 95% CI −2.8 to −0.5, p = 0.006), but it was not related to delusions in patients.
Our findings suggest that the JTC reasoning bias in psychosis might not be a specific cognitive deficit but rather a manifestation, or consequence, of general cognitive impairment. In the general population, by contrast, the JTC bias is related to PLEs independently of IQ. The work has the potential to inform interventions targeting cognitive biases in early psychosis.
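The 79% mediation estimate above follows the standard decomposition: the share of the total case/control effect on beads drawn that disappears once IQ enters the model. A minimal sketch of that calculation on synthetic data (all effect sizes, sample sizes, and variable names here are invented for illustration; this is not the study's analysis, which also produced confidence intervals):

```python
import random

def ols(X, y):
    """Least-squares coefficients via the normal equations, solved by
    Gaussian elimination with partial pivoting (no external libraries)."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Synthetic cohort: cases have lower IQ, and higher IQ means more beads
# drawn (i.e. less JTC); a small direct case effect is also built in.
random.seed(42)
n = 5000
case = [random.randint(0, 1) for _ in range(n)]           # 1 = patient, 0 = control
iq = [100 - 8 * g + random.gauss(0, 10) for g in case]
beads = [2 + 0.2 * q - 0.5 * g + random.gauss(0, 2)
         for q, g in zip(iq, case)]

# Total effect of case status: beads ~ 1 + case
c_total = ols([[1.0, g] for g in case], beads)[1]
# Direct effect with IQ held fixed: beads ~ 1 + case + iq
c_direct = ols([[1.0, g, q] for g, q in zip(case, iq)], beads)[1]
prop_mediated = 1 - c_direct / c_total
print(round(prop_mediated, 2))
```

With the generative values chosen above, the true proportion mediated is 1 − (−0.5)/(−2.1) ≈ 0.76, so the estimate lands near the study's 79%.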
We describe an ultra-wide-bandwidth, low-frequency receiver recently installed on the Parkes radio telescope. The receiver system provides continuous frequency coverage from 704 to 4032 MHz. For much of the band, the system temperature is approximately 22 K and the receiver system remains in a linear regime even in the presence of strong mobile phone transmissions. We discuss the scientific and technical aspects of the new receiver, including its astronomical objectives, as well as the feed, receiver, digitiser, and signal processor design. We describe the pipeline routines that form the archive-ready data products and how those data files can be accessed from the archives. The system performance is quantified, including the system noise and linearity, beam shape, antenna efficiency, polarisation calibration, and timing stability.
Daily use of high-potency cannabis has been reported to carry a high risk for developing a psychotic disorder. However, the evidence is mixed on whether any pattern of cannabis use is associated with a particular symptomatology in first-episode psychosis (FEP) patients.
We analysed data from 901 FEP patients and 1235 controls recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. We used item response modelling to estimate two bifactor models, which included general and specific dimensions of psychotic symptoms in patients and psychotic experiences in controls. The associations between these dimensions and cannabis use were evaluated using linear mixed-effects models analyses.
In patients, there was a linear relationship between the positive symptom dimension and the extent of lifetime exposure to cannabis, with daily users of high-potency cannabis having the highest score (B = 0.35; 95% CI 0.14–0.56). Moreover, negative symptoms were more common among patients who had never used cannabis than among those with any pattern of use (B = −0.22; 95% CI −0.37 to −0.07). In controls, psychotic experiences were associated with current use of cannabis but not with the extent of lifetime use. Neither patients nor controls showed differences in the depressive dimension related to cannabis use.
Our findings provide the first large-scale evidence that FEP patients with a history of daily use of high-potency cannabis present with more positive and fewer negative symptoms than those who never used cannabis or used low-potency types.
We implemented a guideline for appropriate acid suppressant use in hematology-oncology patients. This intervention resulted in a sustained reduction in proton pump inhibitor (PPI) use without an increase in rates of gastrointestinal bleeding. Practice guidelines are effective in reducing PPI use, which is associated with risk of Clostridioides difficile infection.
Impairment in financial capacity is an early sign of cognitive decline and functional impairment in late life. Cognitive impairments such as executive dysfunction are well documented in late-life major depression; however, little progress has been made in assessing associations of these impairments with financial incapacity.
Participants included 95 clinically depressed and 41 nondepressed older adults without dementia. Financial capacity (assessed with the Managing Money scale of the Independent Living Scale), cognitive functioning (comprehensive neuropsychological evaluation), and depression severity (Hamilton Depression Rating Scale – 24) were assessed. T tests were used to assess group differences. Linear regression was used to analyze data.
Depressed participants scored significantly lower on financial capacity (t = 2.98, p < .01). Among depressed participants, executive functioning (B = .24, p < .05) was significantly associated with financial capacity after controlling for age, gender, education, depression severity, and other cognitive domains.
Our results underscore the importance of assessing financial capacity in older depressed adults as they are likely vulnerable to financial abuse even in the absence of dementia. It will be valuable to assess whether treatment for depression is an effective intervention to improve outcomes.
Concerns have repeatedly been expressed about the quality of physical healthcare that people with psychosis receive.
To examine whether the introduction of a financial incentive for secondary care services led to improvements in the quality of physical healthcare for people with psychosis.
Longitudinal data were collected over an 8-year period on the quality of physical healthcare that people with psychosis received from 56 trusts in England before and after the introduction of the financial incentive. Control data were also collected from six health boards in Wales where a financial incentive was not introduced. We calculated the proportion of patients whose clinical records indicated that they had been screened for seven key aspects of physical health and whether they were offered interventions for problems identified during screening.
Data from 17 947 people collected prior to (2011 and 2013) and following (2017) the introduction of the financial incentive in 2014 showed that the proportion of patients who received high-quality physical healthcare in England rose from 12.85% to 31.65% (difference 18.80, 95% CI 17.37–20.21). The proportion of patients who received high-quality physical healthcare in Wales during this period rose from 8.40% to 13.96% (difference 5.56, 95% CI 1.33–10.10).
The results of this study suggest that financial incentives for secondary care mental health services are associated with marked improvements in the quality of care that patients receive. Further research is needed to examine their impact on aspects of care that are not incentivised.
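The before/after figures reported above are contrasts of proportions with confidence intervals. A minimal sketch of a Wald confidence interval for a difference in proportions, using hypothetical counts (the abstract does not give the per-period denominators, so these numbers are invented to roughly echo the English percentages, and this is not the study's exact method):

```python
import math

def prop_diff_ci(x1, n1, x0, n0, z=1.96):
    """Difference in proportions (p1 - p0) with a Wald confidence interval."""
    p1, p0 = x1 / n1, x0 / n0
    diff = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return diff, diff - z * se, diff + z * se

# Hypothetical counts of patients receiving high-quality care,
# before vs after the incentive (denominators are illustrative only)
diff, lo, hi = prop_diff_ci(3165, 10000, 1285, 10000)
print(f"difference = {100 * diff:.2f}% (95% CI {100 * lo:.2f} to {100 * hi:.2f})")
```

The interval narrows as the denominators grow, which is why the England estimate (tens of trusts) is tighter than the Wales one (six health boards).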
Due to concerns over increasing fluoroquinolone (FQ) resistance among gram-negative organisms, our stewardship program implemented a preauthorization use policy. The goal of this study was to assess the relationship between hospital FQ use and antibiotic resistance.
Large academic medical center.
We performed a retrospective analysis of FQ susceptibility of hospital isolates for 5 common gram-negative bacteria: Acinetobacter spp., Enterobacter cloacae, Escherichia coli, Klebsiella pneumoniae, and Pseudomonas aeruginosa. The primary endpoint was the change in FQ susceptibility. A Poisson regression model was used to calculate the rate of change between the preintervention period (1998–2005) and the postimplementation period (2006–2016).
FQ susceptibility declined steeply beginning in 1998, particularly among P. aeruginosa, Acinetobacter spp., and E. cloacae. Our FQ restriction policy reduced FQ use from 173 days of therapy (DOT) per 1,000 patient days to <60 DOT per 1,000 patient days. Fluoroquinolone susceptibility increased for Acinetobacter spp. (rate ratio [RR], 1.038; 95% confidence interval [CI], 1.005–1.072), E. cloacae (RR, 1.028; 95% CI, 1.013–1.044), and P. aeruginosa (RR, 1.013; 95% CI, 1.006–1.020). No significant change in susceptibility was detected for K. pneumoniae (RR, 1.002; 95% CI, 0.996–1.008), and the susceptibility for E. coli continued to decline, although the decline was not as steep (RR, 0.981; 95% CI, 0.975–0.987).
A stewardship-driven FQ restriction program stopped overall declining FQ susceptibility rates for all species except E. coli. For 3 species (ie, Acinetobacter spp, E. cloacae, and P. aeruginosa), susceptibility rates improved after implementation, and this improvement has been sustained over a 10-year period.
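The rate ratios above come from the study's Poisson regression on yearly trends. As a simpler illustration of how a rate ratio and its Wald confidence interval are formed from two rates, here is a sketch with hypothetical isolate counts (this is not the study's data or its full segmented-regression model):

```python
import math

def rate_ratio(events_1, time_1, events_0, time_0, z=1.96):
    """Rate ratio (group 1 vs group 0) with a Wald CI on the log scale.
    The standard error of log(RR) for Poisson counts is sqrt(1/x1 + 1/x0)."""
    rr = (events_1 / time_1) / (events_0 / time_0)
    se = math.sqrt(1 / events_1 + 1 / events_0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical: susceptible isolates per 1,000 tested, post- vs pre-restriction
rr, lo, hi = rate_ratio(720, 1000, 650, 1000)
print(f"RR = {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

An RR above 1 with a CI excluding 1 corresponds to the "susceptibility improved" findings for Acinetobacter spp., E. cloacae, and P. aeruginosa; a CI spanning 1 corresponds to the null K. pneumoniae result.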
The value of the nosological distinction between non-affective and affective psychosis has frequently been challenged. We aimed to investigate the transdiagnostic dimensional structure and associated characteristics of psychopathology at first-episode psychosis (FEP). Regardless of diagnostic categories, we expected that positive symptoms would occur more frequently in ethnic minority groups and in more densely populated environments, and that negative symptoms would be associated with indices of neurodevelopmental impairment.
This study included 2182 FEP individuals recruited across six countries, as part of the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study. Symptom ratings were analysed using multidimensional item response modelling in Mplus to estimate five theory-based models of psychosis. We used multiple regression models to examine demographic and contextual factors associated with symptom dimensions.
A bifactor model, composed of one general factor and five specific dimensions of positive, negative, disorganization, manic and depressive symptoms, best represented the associations among ratings of psychotic symptoms. Positive symptoms were more common in ethnic minority groups. Urbanicity was associated with a higher score on the general factor. Men presented with more negative and fewer depressive symptoms than women. Earlier age at first contact with psychiatric services was associated with higher scores on the negative, disorganized, and manic symptom dimensions.
Our results suggest that the bifactor model of psychopathology holds across diagnostic categories of non-affective and affective psychosis at FEP, and that demographic and contextual determinants map onto both general and specific symptom dimensions. These findings have implications for tailoring symptom-specific treatments and inform research into the mood-psychosis spectrum.
Novel approaches to improving disaster response have begun to include the use of big data and information and communication technology (ICT). However, there remains a dearth of literature on the use of these technologies in disasters. We conducted an integrative literature review on the role of ICT and big data in disasters; 113 studies met our predetermined inclusion criteria. Qualitative methods were the most common (39.8%, n=45), followed by mixed methods (31%, n=35) and quantitative methods (29.2%, n=33). Nearly 80% (n=88) of the studies covered only the response phase of disasters, and only 15% (n=17) addressed disasters in low- and middle-income countries. The 4 most frequently mentioned tools were geographic information systems, social media, patient information, and disaster modeling. We suggest testing ICT and big data tools more widely, especially outside of high-income countries and in nonresponse phases of disasters (eg, disaster recovery), to improve understanding of their utility in disasters. Future studies should also describe the intended users of the tools, as well as implementation challenges, to assist other disaster response professionals in adapting or creating similar tools. (Disaster Med Public Health Preparedness. 2019;13:353–367)
Public involvement in disinvestment decision making in health care is widely advocated, and in some cases legally mandated. However, attempts to involve the public in other areas of health policy have been accused of tokenism and manipulation. This paper presents research into the views of local health care leaders in the English National Health Service (NHS) with regard to the involvement of citizens and local communities in disinvestment decision making. The research includes a Q study and follow-up interviews with a sample of health care clinicians and managers in senior roles in the English NHS. It finds that, whilst initial responses suggest high levels of support for public involvement, further probing of attitudes and experiences reveals greater ambivalence and risk aversion and a far more cautious overall stance. This study has implications for the future of disinvestment activities and public involvement in health care systems faced with increased resource constraint. Recommendations are made for future research and practice.
Echocardiographic screening for rheumatic heart disease in asymptomatic children may result in early diagnosis and prevent progression. Physician-led screening is not feasible in Malawi. Task shifting to mid-level providers such as clinical officers may enable more widespread screening.
With short-course training, clinical officers can accurately screen for rheumatic heart disease using focussed echocardiography.
A total of eight clinical officers completed three half-days of didactics and 2 days of hands-on echocardiography training. Clinical officers were evaluated by performing screening echocardiograms on 20 children with known rheumatic heart disease status. They indicated whether children should be referred for follow-up. Referral was indicated if mitral regurgitation measured more than 1.5 cm or there was any measurable aortic regurgitation. The κ statistic was calculated to measure referral agreement with a paediatric cardiologist. Sensitivity and specificity were estimated using a generalised linear mixed model, and were calculated on the basis of World Heart Federation diagnostic criteria.
The mean κ statistic comparing clinical officer referrals with the paediatric cardiologist was 0.72 (95% confidence interval: 0.62, 0.82). The κ value ranged from a minimum of 0.57 to a maximum of 0.90. For rheumatic heart disease diagnosis, sensitivity was 0.91 (95% confidence interval: 0.86, 0.95) and specificity was 0.65 (95% confidence interval: 0.57, 0.72).
There was substantial agreement between clinical officers and paediatric cardiologists on whether to refer. Clinical officers had a high sensitivity in detecting rheumatic heart disease. With short-course training, clinical officer-led echo screening for rheumatic heart disease is a viable alternative to physician-led screening in resource-limited settings.
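The agreement and accuracy measures reported above can be computed directly from paired referral decisions. A minimal sketch of Cohen's κ and of sensitivity/specificity on hypothetical data (the numbers below are invented, and the study's generalised linear mixed model for pooling across raters is not reproduced here):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    cats = set(rater_a) | set(rater_b)
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

def sensitivity_specificity(truth, screen):
    """Sensitivity and specificity of a screening call against a reference."""
    tp = sum(t and s for t, s in zip(truth, screen))
    fn = sum(t and not s for t, s in zip(truth, screen))
    tn = sum(not t and not s for t, s in zip(truth, screen))
    fp = sum(not t and s for t, s in zip(truth, screen))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical referral decisions for 20 screened children (1 = refer)
cardiologist = [1] * 10 + [0] * 10
officer = [1] * 9 + [0] + [0] * 7 + [1] * 3

kappa = cohens_kappa(cardiologist, officer)
sens, spec = sensitivity_specificity([t == 1 for t in cardiologist],
                                     [o == 1 for o in officer])
print(kappa, sens, spec)
```

On this toy data the officer agrees with the cardiologist on 16 of 20 children, giving κ = 0.6 (conventionally "substantial" agreement), with sensitivity 0.9 and specificity 0.7.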
We trained local public health workers on disaster recovery roles and responsibilities by using a novel curriculum based on a threat and efficacy framework and a training-of-trainers approach. This study used qualitative data to assess changes in perceptions of efficacy toward Hurricane Sandy recovery and willingness to participate in future disaster recoveries.
Purposive and snowball sampling were used to select trainers and trainees from participating local public health departments in jurisdictions impacted by Hurricane Sandy in October 2012. Two focus groups totaling 29 local public health workers were held in April and May of 2015. Focus group participants discussed the content and quality of the curriculum, training logistics, and their willingness to engage in future disaster recovery efforts.
The training curriculum improved participants’ understanding of and confidence in their disaster recovery work and related roles within their agencies (self-efficacy); increased their individual- and agency-level sense of role-importance in disaster recovery (response-efficacy); and enhanced their sense of their agencies’ effective functioning in disaster recovery. Participants suggested further training customization and inclusion of other recovery agencies.
Threat- and efficacy-based disaster recovery trainings show potential to increase public health workers’ sense of efficacy and willingness to participate in recovery efforts. (Disaster Med Public Health Preparedness. 2016;10:615–622)
Electroconvulsive therapy (ECT) prescription rates rise with age, making it important that treatments be made as effective and safe as possible (Plakiotis et al., 2012). Older people are vulnerable to post-treatment confusion and to subsequent deficits in attention, new learning, and autobiographical memory (Gardner and O'Connor, 2008). Strategies to minimize cognitive side-effects include unilateral electrode placement and stimulus dose titration whereby electrical charge is individually calibrated to seizure threshold (Sackeim et al., 2000). It remains the case, however, that threshold levels typically rise over the treatment course, leading to an increase both in delivered charge and the risk of adverse sequelae.
Tic disorders are moderately heritable common psychiatric disorders that can be highly troubling, both in childhood and in adulthood. In this study, we report results obtained in the first epigenome-wide association study (EWAS) of tic disorders. The subjects are participants in surveys at the Netherlands Twin Register (NTR) and the NTR biobank project. Tic disorders were measured with a self-report version of the Yale Global Tic Severity Scale Abbreviated version (YGTSS-ABBR), included in the 8th wave NTR data collection (2008). DNA methylation data consisted of 411,169 autosomal methylation sites assessed by the Illumina Infinium HumanMethylation450 BeadChip Kit (HM450k array). Phenotype and DNA methylation data were available in 1,678 subjects (mean age = 41.5). No probes reached genome-wide significance (p < 1.2 × 10−7). The strongest associated probe was cg15583738, located in an intergenic region on chromosome 8 (p = 1.98 × 10−6). Several of the top-ranking probes (p < 1 × 10−4) were in or nearby genes previously associated with neurological disorders (e.g., GABBR1, BLM, and ADAM10), warranting their further investigation in relation to tic disorders. The top significantly enriched gene ontology (GO) terms among higher-ranking methylation sites included anatomical structure morphogenesis (GO:0009653, p = 4.6 × 10−15), developmental process (GO:0032502, p = 2.96 × 10−12), and cellular developmental process (GO:0048869, p = 1.96 × 10−12). Overall, these results provide a first insight into the epigenetic mechanisms of tic disorders. This is the first study to assess the role of DNA methylation in tic disorders, and it lays the foundations for future work aiming to unravel the biological mechanisms underlying the architecture of this disorder.
The discovery of Neolithic houses at Durrington Walls that are contemporary with the main construction phase of Stonehenge raised questions as to their interrelationship. Was Durrington Walls the residence of the builders of Stonehenge? Were the activities there more significant than simply domestic subsistence? Using lipid residue analysis, this paper identifies the preferential use of certain pottery types for the preparation of particular food groups and differential consumption of dairy and meat products between monumental and domestic areas of the site. Supported by the analysis of faunal remains, the results suggest seasonal feasting and perhaps organised culinary unification of a diverse community.
Gaining vascular access is essential in the resuscitation of critically ill patients. Intraosseous (IO) placement is a fundamentally important alternative to intravenous (IV) access in conditions where IV access delays resuscitation or is not possible. This case report presents a previously unreported example of prehospital misplacement of an IO catheter into the intra-articular space of the knee joint. This report serves to inform civilian and military first responders, as well as emergency medicine physicians, of intra-articular IO line placement as a potential complication of IO vascular access. Infusion of large amounts of fluid into the joint space could damage the joint and be catastrophic to a patient who needs immediate IV fluids or medications. In addition, intra-articular IO placement could result in septic arthritis of the knee.
Grabel Z, DePasse JM, Lareau CR, Born CT, Daniels AH. Intra-articular Placement of an Intraosseous Catheter. Prehosp Disaster Med. 2015;30(1):1-4.
To examine the use of vitamin D supplements during infancy among the participants in an international infant feeding trial.
Information about vitamin D supplementation was collected through a validated FFQ at the age of 2 weeks and monthly between the ages of 1 month and 6 months.
Infants (n 2159) with a biological family member affected by type 1 diabetes and with increased human leucocyte antigen-conferred susceptibility to type 1 diabetes from twelve European countries, the USA, Canada and Australia.
Daily use of vitamin D supplements was common during the first 6 months of life in Northern and Central Europe (>80 % of infants), with somewhat lower rates observed in Southern Europe (>60 %). In Canada, vitamin D supplementation was more common among exclusively breast-fed infants than among other infants (e.g. 71 % v. 44 % at 6 months of age). Less than 2 % of infants in the USA and Australia received any vitamin D supplementation. Across the study, higher gestational age, older maternal age and longer maternal education were associated with greater use of vitamin D supplements.
Most infants in the European countries received vitamin D supplements during the first 6 months of life, whereas in Canada only half did, and in the USA and Australia very few did.