The aim of this study was to apply lessons learned from disaster research response (DR2) efforts following Hurricane Harvey in 2017 to the launch of DR2 activities following the Intercontinental Terminals Company (ITC) fire in Deer Park, Texas, in 2019.
A multidisciplinary group of academic, community, and government partners launched a wide range of DR2 activities.
The DR2 response to Hurricane Harvey focused on enhancing environmental health literacy around clean-up efforts, measuring environmental contaminants in soil and water in impacted neighborhoods, and launching studies to evaluate the health impact of the disaster. The lessons learned after Harvey enabled rapid DR2 activities following the ITC fire, including air monitoring and administering surveys and in-depth interviews with affected residents.
Embedding DR2 activities at academic institutions can enable rapid deployment of lessons learned from one disaster to enhance the response to subsequent disasters, even when those disasters are different. Our experience demonstrates the importance of academic institutions working with governmental and community partners to support timely disaster response efforts. Efforts enabled by such experience include providing health and safety training and consistent and reliable messaging, collecting time-sensitive and critical data in the wake of the event, and launching research to understand health impacts and improve resiliency.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with 15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination made over a 288-MHz band centred at 887.5 MHz.
Introduction: Point-of-care ultrasound (POCUS) has become standard practice in emergency departments ranging from remote rural hospitals to well-resourced academic centres. To facilitate quality assurance, the Canadian Association of Emergency Physicians (CAEP) recommends image archiving. Due in part to poor infrastructure and lack of a national standard, however, archiving remains uncommon. Our objective was to establish a minimum standard archiving protocol for the core emergency department POCUS indications. Methods: Itemization of potential archiving standards was created through an extensive literature review. An online, three-round, modified Delphi survey was conducted with the thirteen POCUS experts on the national CAEP Emergency Ultrasound Committee tasked with representing diverse practice locations and experiences. Participants were surveyed to determine the images or clips, measurements, mode, and number of views that should comprise the minimum standard for archiving. Consensus was pre-defined as 80%. Results: All thirteen experts participated fully in the three rounds. In establishing minimum image archiving standards for emergency department POCUS, complete consensus was achieved for first trimester pregnancy, hydronephrosis, cardiac activity versus standstill, lower extremity deep venous thrombosis, and ultrasound-guided central line placement. Consensus was achieved for the majority of statements regarding abdominal aortic aneurysm, extended focused assessment with sonography in trauma, pericardial effusion, left and right ventricular function, thoracic B-line assessment, cholelithiasis and cholecystitis scans. In total, consensus was reached for 58 of 69 statements (84.1%). This included agreement on 41 of 43 statements (95.3%) describing mandatory images for archiving in the above indications. Conclusion: Our modified Delphi-derived consensus represents the first national standard archiving requirements for emergency department POCUS. 
Depending on the clinical context, additional images may be required beyond this minimum standard to support a diagnosis.
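The consensus proportions reported above (58 of 69 statements, 41 of 43 mandatory-image statements) follow from simple arithmetic against the pre-defined 80% threshold. A minimal illustrative sketch (the helper name is our own, not from the study):

```python
def pct(numerator: int, denominator: int) -> float:
    """Percentage of statements reaching consensus, to one decimal place."""
    return round(100.0 * numerator / denominator, 1)

# 58 of 69 statements overall reached the pre-defined 80% consensus threshold;
# 41 of 43 of the mandatory-image statements did so.
print(pct(58, 69))  # 84.1
print(pct(41, 43))  # 95.3
```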
Night-migratory songbirds appear to sense the direction of the Earth's magnetic field via radical pair intermediates formed photochemically in cryptochrome flavoproteins contained in photoreceptor cells in their retinas. It is an open question whether this light-dependent mechanism could be sufficiently sensitive given the low-light levels experienced by nocturnal migrants. The scarcity of available photons results in significant uncertainty in the signal generated by the magnetoreceptors distributed around the retina. Here we use results from Information Theory to obtain a lower bound estimate of the precision with which a bird could orient itself using only geomagnetic cues. Our approach bypasses the current lack of knowledge about magnetic signal transduction and processing in vivo by computing the best-case compass precision under conditions where photons are in short supply. We use this method to assess the performance of three plausible cryptochrome-derived flavin-containing radical pairs as potential magnetoreceptors.
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices provides unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
We performed a spatial-temporal analysis to assess household risk factors for Ebola virus disease (Ebola) in a remote, severely-affected village. We defined a household as a family's shared living space and a case-household as a household with at least one resident who became a suspect, probable, or confirmed Ebola case from 1 August 2014 to 10 October 2014. We used Geographic Information System (GIS) software to calculate inter-household distances, performed space-time cluster analyses, and developed Generalized Estimating Equations (GEE). Village X consisted of 64 households; 42% of households became case-households over the observation period. Two significant space-time clusters occurred among households in the village; temporal effects outweighed spatial effects. GEE demonstrated that the odds of becoming a case-household increased by 4·0% for each additional person per household (P < 0·02) and 2·6% per day (P < 0·07). An increasing number of persons per household, and to a lesser extent, the passage of time after onset of the outbreak were risk factors for household Ebola acquisition, emphasizing the importance of prompt public health interventions that prioritize the most populated households. Using GIS with GEE can reveal complex spatial-temporal risk factors, which can inform prioritization of response activities in future outbreaks.
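For readers unfamiliar with interpreting output from a logistic GEE, the percent changes in odds quoted above correspond to exponentiated model coefficients. A minimal sketch of that conversion (the coefficient values below are back-derived from the reported percentages for illustration, not taken from the study's fitted model):

```python
import math

def pct_change_in_odds(beta: float) -> float:
    """Percent change in odds per one-unit covariate increase,
    from a logit-link model coefficient: (exp(beta) - 1) * 100."""
    return (math.exp(beta) - 1.0) * 100.0

# Illustrative coefficients chosen to reproduce the reported effects:
# a 4.0% rise in odds per additional household resident and a 2.6%
# rise per day correspond to betas of ln(1.040) and ln(1.026).
beta_household_size = math.log(1.040)
beta_days_elapsed = math.log(1.026)

print(round(pct_change_in_odds(beta_household_size), 1))  # 4.0
print(round(pct_change_in_odds(beta_days_elapsed), 1))    # 2.6
```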
Cognitive deficits in schizophrenia have major functional impacts. Modafinil is a cognitive enhancer whose effect in healthy volunteers is well-described, but whose effects on the cognitive deficits of schizophrenia appear to be inconsistent. Two possible reasons for this are that cognitive test batteries vary in their sensitivity, or that the phase of illness may be important, with patients early in their illness responding better.
A double-blind, randomised, placebo-controlled single-dose crossover study of modafinil 200 mg examined this with two cognitive batteries [MATRICS Consensus Cognitive Battery (MCCB) and Cambridge Neuropsychological Test Automated Battery (CANTAB)] in 46 participants with under 3 years’ duration of DSM-IV schizophrenia, on stable antipsychotic medication. In parallel, the same design was used in 28 age-, sex-, and education-matched healthy volunteers. Uncorrected p values were calculated using mixed effects models.
In patients, modafinil significantly improved CANTAB Paired Associate Learning, non-significantly improved efficiency and significantly slowed performance of the CANTAB Stockings of Cambridge spatial planning task. There was no significant effect on any MCCB domain. In healthy volunteers, modafinil significantly increased CANTAB Rapid Visual Processing, Intra-Extra Dimensional Set Shifting and verbal recall accuracy, and MCCB social cognition performance. The only significant differences between groups were in MCCB visual learning.
As in earlier chronic schizophrenia studies, modafinil failed to produce changes in cognition in early psychosis as measured by MCCB. CANTAB proved more sensitive to the effects of modafinil in participants with early schizophrenia and in healthy volunteers. This confirms the importance of selecting the appropriate test battery in treatment studies of cognition in schizophrenia.
We determined the profitability and risk for spring- and fall-calving beef cows in Tennessee. Simulation models were developed using 19 years of data and considered the seasonality of cattle prices and feed prices for least-cost feed rations to find a distribution of net returns for spring- and fall-calving seasons for two weaning months. Fall calving was more profitable than spring calving for all feed rations and weaning months. Fall calving was also risk preferred over spring calving for all levels of risk aversion. Higher calf prices at weaning were the primary factor influencing the risk efficiency of fall calving.
Major depressive disorder (MDD) is a common and disabling condition with well-established heritability and environmental risk factors. Gene–environment interaction studies in MDD have typically investigated candidate genes, though the disorder is known to be highly polygenic. This study aims to test for interaction between polygenic risk and stressful life events (SLEs) or childhood trauma (CT) in the aetiology of MDD.
The RADIANT UK sample consists of 1605 MDD cases and 1064 controls with SLE data, and a subset of 240 cases and 272 controls with CT data. Polygenic risk scores (PRS) were constructed using results from a mega-analysis on MDD by the Psychiatric Genomics Consortium. PRS and environmental factors were tested for association with case/control status and for interaction between them.
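A polygenic risk score of the kind described here is, at its core, a weighted sum of an individual's risk-allele dosages, with weights taken from discovery-GWAS effect sizes. A minimal sketch, with entirely hypothetical SNP identifiers and weights (not values from the Psychiatric Genomics Consortium mega-analysis):

```python
# Hypothetical discovery-GWAS effect sizes (log odds ratios) per SNP.
gwas_log_or = {"rs0001": 0.05, "rs0002": -0.03, "rs0003": 0.08}

def polygenic_risk_score(dosages: dict) -> float:
    """Weighted sum of risk-allele dosages (0, 1, or 2 copies per SNP),
    restricted to SNPs present in both the genotype and weight sets."""
    return sum(gwas_log_or[snp] * d
               for snp, d in dosages.items()
               if snp in gwas_log_or)

# One individual's allele counts: 2*0.05 + 1*(-0.03) + 0*0.08 = 0.07
individual = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_risk_score(individual), 6))  # 0.07
```

In practice, PRS pipelines also prune or clump SNPs for linkage disequilibrium and apply p-value thresholds before summing; the sketch omits those steps.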
PRS significantly predicted depression, explaining 1.1% of variance in phenotype (p = 1.9 × 10⁻⁶). SLEs and CT were also associated with MDD status (p = 2.19 × 10⁻⁴ and p = 5.12 × 10⁻²⁰, respectively). No interactions were found between PRS and SLEs. Significant PRS × CT interactions were found (p = 0.002), but showed an inverse association with MDD status, as cases who experienced more severe CT tended to have a lower PRS than other cases or controls. This relationship between PRS and CT was not observed in independent replication samples.
CT is a strong risk factor for MDD but may have greater effect in individuals with lower genetic liability for the disorder. Including environmental risk along with genetics is important in studying the aetiology of MDD and PRS provide a useful approach to investigating gene–environment interactions in complex traits.
Paranoia is one of the commonest symptoms of psychosis but has rarely been studied in a population at risk of developing psychosis. Based on existing theoretical models, including the proposed distinction between ‘poor me’ and ‘bad me’ paranoia, we aimed to test specific predictions about associations between negative cognition, metacognitive beliefs and negative emotions and paranoid ideation and the belief that persecution is deserved (deservedness).
We used data from 117 participants from the Early Detection and Intervention Evaluation for people at risk of psychosis (EDIE-2) trial of cognitive–behaviour therapy, comparing them with samples of psychiatric in-patients and healthy students from a previous study. Multi-level modelling was utilized to examine predictors of both paranoia and deservedness, with post-hoc planned comparisons conducted to test whether person-level predictor variables were associated differentially with paranoia or with deservedness.
Our sample of at-risk mental state participants was not as paranoid, but reported higher levels of ‘bad-me’ deservedness, compared with psychiatric in-patients. We found several predictors of paranoia and deservedness. Negative beliefs about self were related to deservedness but not paranoia, whereas negative beliefs about others were positively related to paranoia but negatively with deservedness. Both depression and negative metacognitive beliefs about paranoid thinking were specifically related to paranoia but not deservedness.
This study provides evidence for the role of negative cognition, metacognition and negative affect in the development of paranoid beliefs, which has implications for psychological interventions and our understanding of psychosis.
The quality of the therapeutic alliance (TA) has been invoked to explain the equal effectiveness of different psychotherapies, but prior research is correlational, and does not address the possibility that individuals who form good alliances may have good outcomes without therapy.
We evaluated the causal effect of TA using instrumental variable (structural equation) modelling on data from a three-arm, randomized controlled trial of 308 people in an acute first or second episode of a non-affective psychosis. The trial compared cognitive behavioural therapy (CBT) over 6 weeks plus routine care (RC) v. supportive counselling (SC) plus RC v. RC alone. We examined the effect of TA, as measured by the client-rated CALPAS, on the primary trial 18-month outcome of symptom severity (PANSS), which was assessed blind to treatment allocation.
Both adjunctive CBT and SC improved 18-month outcomes, compared to RC. We showed that, for both psychological treatments, improving TA improves symptomatic outcome. With a good TA, attending more sessions causes a significantly better outcome on PANSS total score [effect size −2.91, 95% confidence interval (CI) −0.90 to −4.91]. With a poor TA, attending more sessions is detrimental (effect size +7.74, 95% CI +1.03 to +14.45).
This is the first ever demonstration that TA has a causal effect on symptomatic outcome of a psychological treatment, and that poor TA is actively detrimental. These effects may extend to other therapeutic modalities and disorders.
Strategies to dissect phenotypic and genetic heterogeneity of major depressive disorder (MDD) have mainly relied on subphenotypes, such as age at onset (AAO) and recurrence/episodicity. Yet, evidence on whether these subphenotypes are familial or heritable is scarce. The aims of this study are to investigate the familiality of AAO and episode frequency in MDD and to assess the proportion of their variance explained by common single nucleotide polymorphisms (SNP heritability).
For investigating familiality, we used 691 families with 2–5 full siblings with recurrent MDD from the DeNt study. We fitted (square root) AAO and episode count in a linear and a negative binomial mixed model, respectively, with family as random effect and adjusting for sex, age and center. The strength of familiality was assessed with intraclass correlation coefficients (ICC). For estimating SNP heritabilities, we used 3468 unrelated MDD cases from the RADIANT and GSK Munich studies. After similarly adjusting for covariates, derived residuals were used with the GREML method in GCTA (genome-wide complex trait analysis) software.
Significant familial clustering was found for both AAO (ICC = 0.28) and episodicity (ICC = 0.07). We calculated from respective ICC estimates the maximal additive heritability of AAO (0.56) and episodicity (0.15). SNP heritability of AAO was 0.17 (p = 0.04); analysis was underpowered for calculating SNP heritability of episodicity.
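The step from ICC to maximal additive heritability uses the fact that full siblings share on average half of additive genetic effects, so the sibling ICC is at most h²/2 and h²_max = 2 × ICC (assuming no shared-environment contribution, hence "maximal"). A minimal sketch of that arithmetic:

```python
def max_additive_heritability(icc: float) -> float:
    """Upper bound on additive heritability from a full-sibling
    intraclass correlation, assuming no shared-environment effect:
    ICC <= h2/2, so h2_max = 2 * ICC."""
    return 2.0 * icc

print(round(max_additive_heritability(0.28), 2))  # 0.56 for age at onset
# 2 * 0.07 = 0.14; the reported 0.15 presumably reflects the unrounded ICC.
print(round(max_additive_heritability(0.07), 2))  # 0.14
```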
AAO and episodicity aggregate in families to a moderate and small degree, respectively. AAO is under stronger additive genetic control than episodicity. Larger samples are needed to calculate the SNP heritability of episodicity. The described statistical framework could be useful in future analyses.
It is uncertain whether antipsychotic long-acting injection (LAI) medication in schizophrenia is associated with better clinical outcomes than oral preparations.
To examine the impact of prior treatment delivery route on treatment outcomes and whether any differences are moderated by adherence.
Analysis of data from two pragmatic 1-year clinical trials in which patients with schizophrenia were randomised to either an oral first-generation antipsychotic (FGA), or a non-clozapine second-generation antipsychotic (SGA, CUtLASS 1 study), or a non-clozapine SGA or clozapine (CUtLASS 2 study).
Across both trials, 43% (n = 155) of participants were prescribed an FGA-LAI before randomisation. At 1-year follow-up they showed less improvement in quality of life, symptoms and global functioning than those randomised from oral medication. This difference was confined to patients rated as less than consistently adherent pre-randomisation. The relatively poor improvement in the patients prescribed an LAI pre-randomisation was ameliorated if they had been randomised to clozapine rather than another SGA. There was no advantage to being randomly assigned from an LAI at baseline to a non-clozapine oral SGA rather than an oral FGA.
A switch at randomisation from an LAI to an oral antipsychotic was associated with poorer clinical and functional outcomes at 1-year follow-up compared with switching from one oral antipsychotic to another. This effect appears to be moderated by adherence, and may not extend to switching to clozapine. This has implications for clinical trial design: the drug from which a participant is randomised may have a greater effect than the drug to which they are randomised.
We describe and compare the epidemiology of catheter-associated urinary tract infection (CAUTI) occurring in non-intensive care unit (ICU) versus ICU wards in a network of community hospitals over a 2-year period. Overall, 72% of cases of CAUTI occurred in non-ICU patients, which indicates that this population is an important target for dedicated surveillance and prevention efforts.