To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Prospective observational study.
Neonatal intensive care unit (NICU).
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
In comparison to fast outbreaks, outbreaks that are “slow and sustained” may be more common in units with strong existing infection prevention practices, where a series of breaches must align to produce a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen allowed previously persistent carriers to safely continue work duties.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
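The multistate approach described above can be illustrated with a small discrete-time sketch: each patient occupies one clinical state per day, and a transition probability matrix propagates the state distribution forward. The states and probabilities below are invented for illustration and are not values from the study.

```python
import numpy as np

# Illustrative daily transition matrix between simplified clinical states.
# These states and probabilities are assumptions for demonstration only.
states = ["moderate", "severe", "recovered", "deceased"]
P = np.array([
    [0.85, 0.10, 0.05, 0.00],   # moderate
    [0.15, 0.75, 0.05, 0.05],   # severe
    [0.00, 0.00, 1.00, 0.00],   # recovered (absorbing)
    [0.00, 0.00, 0.00, 1.00],   # deceased (absorbing)
])

def occupancy(start_state: int, day: int) -> np.ndarray:
    """Probability of being in each state `day` days after starting."""
    dist = np.zeros(len(states))
    dist[start_state] = 1.0
    return dist @ np.linalg.matrix_power(P, day)

# State occupancy 14 days after starting in the "moderate" state.
print(dict(zip(states, occupancy(0, 14).round(3))))
```

Because the matrix acts on whole state distributions, the same machinery yields daily trajectories, expected time in each state, and absorption probabilities without discarding intermediate assessments.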
Otitis externa accounts for 1.1–1.3 per cent of patient presentations in primary care and 25 per cent of urgent referrals to ENT. This study aimed to explore clinical decision-making for otitis externa at the primary–secondary care interface, otitis externa prevalence, and recent trends in antimicrobial resistance in otitis externa-related bacterial isolates and in ototopical prescribing.
This is a mixed-methods study drawing on data from primary and secondary care and open National Health Service sources.
A total of 101 general practitioner survey respondents reported frequently prescribing oral antibiotics for otitis externa. General practitioner consultations for otitis externa increased 25 per cent over 15 years. General practitioner ototopical preparations cost the National Health Service £7 410 440 in 2006 and £11 325 241 in 2016. A total of 162 consecutive hospital otitis externa-related bacterial isolates yielded 128 pseudomonas species, with 18 that were resistant to gentamicin and 7 that were resistant to ciprofloxacin. Ten guidelines reviewed showed systematic inconsistencies.
General practitioners reported regularly prescribing oral antibiotics for otitis externa. Antimicrobial drug resistance is common in otitis externa. The available guidance is suboptimal.
Identifying predictors of patient outcomes evaluated over time may require modeling interactions among variables while addressing within-subject correlation. Generalized linear mixed models (GLMMs) and generalized estimating equations (GEEs) address within-subject correlation, but identifying interactions can be difficult if not hypothesized a priori. We evaluate the performance of several variable selection approaches for clustered binary outcomes to provide guidance for choosing between the methods.
We conducted simulations comparing stepwise selection, penalized GLMM, boosted GLMM, and boosted GEE for variable selection considering main effects and two-way interactions in data with repeatedly measured binary outcomes, and evaluated a two-stage approach to reduce bias and error in parameter estimates. We compared these approaches in real data applications: hypothermia during surgery and treatment response in lupus nephritis.
Penalized and boosted approaches recovered correct predictors and interactions more frequently than stepwise selection. Penalized GLMM recovered correct predictors more often than boosting, but included many spurious predictors. Boosted GLMM yielded parsimonious models and identified correct predictors well at large sample and effect sizes, but required excessive computation time. Boosted GEE was computationally efficient and selected relatively parsimonious models, offering a compromise between computation and parsimony. The two-stage approach reduced the bias and error in regression parameters in all approaches.
Penalized and boosted approaches are effective for variable selection in data with clustered binary outcomes. The two-stage approach reduces bias and error and should be applied regardless of method. We provide guidance for choosing the most appropriate method in real applications.
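The two-stage idea above — select terms with a penalized fit, then refit the selected terms with little or no penalty to reduce shrinkage bias — can be sketched on synthetic data. This illustration uses ordinary L1-penalized logistic regression and ignores within-subject correlation; a full analysis of clustered binary outcomes would use one of the penalized GLMM, boosted GLMM, or boosted GEE approaches compared in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic data: 10 candidate predictors; only x0, x1 and their
# interaction affect the binary outcome. Clustering is ignored here.
n, p = 500, 10
X = rng.normal(size=(n, p))
logit = 1.2 * X[:, 0] - 1.0 * X[:, 1] + 0.8 * X[:, 0] * X[:, 1]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Candidate design matrix: main effects plus all two-way interactions.
inter = np.column_stack([X[:, i] * X[:, j]
                         for i in range(p) for j in range(i + 1, p)])
Z = np.hstack([X, inter])

# Stage 1: L1-penalized logistic regression for variable selection.
sel = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(Z, y)
keep = np.flatnonzero(sel.coef_[0])

# Stage 2: effectively unpenalized refit (very weak L2 penalty) on the
# selected columns only, to reduce shrinkage bias in the estimates.
refit = LogisticRegression(C=1e6, solver="lbfgs", max_iter=2000).fit(Z[:, keep], y)
print("selected columns:", keep)
```

The stage-2 refit is what reduces the bias: stage-1 coefficients are shrunk toward zero by the penalty, so re-estimating the surviving terms without it recovers closer-to-unbiased effect sizes.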
Background: Automated testing instruments (ATIs) are commonly used by clinical microbiology laboratories to perform antimicrobial susceptibility testing (AST), whereas public health laboratories may use established reference methods such as broth microdilution (BMD). We investigated discrepancies in carbapenem minimum inhibitory concentrations (MICs) among Enterobacteriaceae tested by clinical laboratory ATIs and by reference BMD at the CDC. Methods: During 2016–2018, we conducted laboratory- and population-based surveillance for carbapenem-resistant Enterobacteriaceae (CRE) through the CDC Emerging Infections Program (EIP) sites (10 sites by 2018). We defined an incident case as the first isolation of Enterobacter spp (E. cloacae complex or E. aerogenes), Escherichia coli, Klebsiella pneumoniae, K. oxytoca, or K. variicola resistant to doripenem, ertapenem, imipenem, or meropenem from normally sterile sites or urine identified from a resident of the EIP catchment area in a 30-day period. Cases had isolates that were determined to be carbapenem-resistant by clinical laboratory ATI MICs (MicroScan, BD Phoenix, or VITEK 2) or by other methods, using current Clinical and Laboratory Standards Institute (CLSI) criteria. A convenience sample of these isolates was tested by reference BMD at the CDC according to CLSI guidelines. Results: Overall, 1,787 isolates from 112 clinical laboratories were tested by BMD at the CDC. Of these, clinical laboratory ATI MIC results were available for 1,638 (91.7%); 855 (52.2%) from 71 clinical laboratories did not confirm as CRE at the CDC. Nonconfirming isolates were tested on either a MicroScan (235 of 462; 50.9%), BD Phoenix (249 of 411; 60.6%), or VITEK 2 (371 of 765; 48.5%). Lack of confirmation was most common among E. coli (62.2% of E. coli isolates tested) and Enterobacter spp (61.4% of Enterobacter isolates tested) (Fig. 1A), and among isolates testing resistant to ertapenem by the clinical laboratory ATI (52.1%, Fig. 1B). 
Of the 1,388 isolates resistant to ertapenem in the clinical laboratory, 1,006 (72.5%) were resistant only to ertapenem. Of the 855 nonconfirming isolates, 638 (74.6%) were resistant only to ertapenem based on clinical laboratory ATI MICs. Conclusions: Nonconfirming isolates were widespread across laboratories and ATIs. Lack of confirmation was most common among E. coli and Enterobacter spp. Among nonconfirming isolates, most were resistant only to ertapenem. These findings may suggest that ATIs overcall resistance to ertapenem or that isolate transport and storage conditions affect ertapenem resistance. Further investigation into this lack of confirmation is needed, and CRE case identification in public health surveillance may need to account for this phenomenon.
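The confirmation logic described above reduces, at its core, to a categorical comparison: an isolate confirms as CRE only if reference BMD testing also calls it resistant to at least one carbapenem. A minimal sketch follows, with an illustrative record layout; the surveillance system's actual schema is not given in the abstract.

```python
# Carbapenems named in the case definition above.
CARBAPENEMS = ("doripenem", "ertapenem", "imipenem", "meropenem")

def confirms_cre(bmd_calls: dict) -> bool:
    """bmd_calls maps drug name -> CLSI category ('S', 'I' or 'R').

    Returns True if reference BMD finds resistance to any carbapenem,
    i.e. the clinical laboratory's CRE call is confirmed.
    """
    return any(bmd_calls.get(drug) == "R" for drug in CARBAPENEMS)

# An isolate called ertapenem-resistant by the clinical ATI but fully
# susceptible on reference BMD would count as nonconfirming.
isolate = {"ertapenem": "S", "imipenem": "S", "meropenem": "S"}
print(confirms_cre(isolate))
```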
Psychiatry in the UK has longstanding recruitment problems (1). Evidence suggests the positive effects of clinical attachments on attitudes towards psychiatry are often transient (2). We therefore created the Psychiatry Early Experience Programme (PEEP) where year 1 medical students are paired with psychiatry trainees and shadow them at work. Students will ideally remain in PEEP throughout medical school, providing consistent exposure to psychiatry and a broad experience of its subspecialties.
1. To present PEEP
2. To assess:
a. Students’ baseline attitudes to psychiatry
b. PEEP’s impact on students’ attitudes to psychiatry
A prospective survey-based cohort study of King’s College London medical students.
PEEP started in 2013. In this cohort, all students who signed up were accepted.
Students’ attitudes towards psychiatry were assessed on recruitment using the ATP-30 questionnaire (3), and will be re-assessed annually.
127 students were recruited. Attitudes were positive overall. 73% listed psychiatry in their top three specialities. 95.3% agreed or strongly agreed that ‘psychiatric illness deserves at least as much attention as physical illness.’ 84.3% disagreed or strongly disagreed that ‘at times it is hard to think of psychiatrists as equal to other doctors.’
Baseline attitudes to psychiatry were positive. By March 2015 we aim to collect and analyse data on students’ attitudes after one year in PEEP. Through ongoing analysis of this and future cohorts, we aim to assess the impact of PEEP on improving attitudes to psychiatry and whether this will ultimately improve recruitment.
At Guy's King's and St Thomas’ School of Medicine, a unique initiative is the Psychiatry Early Experience Programme (PEEP), which allows students to shadow psychiatry trainees at work several times a year. The students’ attitudes towards psychiatry and the scheme are regularly assessed and initial results are already available.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Between 2010 and 2019 the international health care organization Partners In Health (PIH) and its sister organization Zanmi Lasante (ZL) mounted a long-term response to the 2010 Haiti earthquake, focused on mental health. Over that time, implementing a Theory of Change developed in 2012, the organization successfully developed a comprehensive, sustained community mental health system in Haiti's Central Plateau and Artibonite departments, directly serving a catchment area of 1.5 million people through multiple diagnosis-specific care pathways. The resulting ZL mental health system delivered 28 184 patient visits and served 6305 discrete patients at ZL facilities between January 2016 and September 2019. The experience of developing a system of mental health services in Haiti that currently provides ongoing care to thousands of people serves as a case study in major challenges involved in global mental health delivery. The essential components of the effort to develop and sustain this community mental health system are summarized.
Psychotropic prescription rates continue to increase in the United States (USA). Few studies have investigated whether social-structural factors may play a role in psychotropic medication use independent of mental illness. Food insecurity is prevalent among people living with HIV in the USA and has been associated with poor mental health. We investigated whether food insecurity was associated with psychotropic medication use independent of the symptoms of depression and anxiety among women living with HIV in the USA.
We used cross-sectional data from the Women's Interagency HIV Study (WIHS), a nationwide cohort study. Food security (FS) was the primary explanatory variable, measured using the Household Food Security Survey Module. First, we used multivariable linear regressions to test whether FS was associated with symptoms of depression (Center for Epidemiologic Studies Depression [CESD] score), generalised anxiety disorder (GAD-7 score) and mental health-related quality of life (MOS-HIV Mental Health Summary score; MHS). Next, we examined associations of FS with the use of any psychotropic medications, including antidepressants, sedatives and antipsychotics, using multivariable logistic regressions adjusting for age, race/ethnicity, income, education and alcohol and substance use. In separate models, we additionally adjusted for symptoms of depression (CESD score) and anxiety (GAD-7 score).
Of the 905 women in the sample, two-thirds were African-American. Lower FS (i.e. worse food insecurity) was associated with greater symptoms of depression and anxiety in a dose–response relationship. For the psychotropic medication outcomes, marginal and low FS were associated with 2.06 (p < 0.001; 95% confidence interval [CI] = 1.36–3.13) and 1.99 (p < 0.01; 95% CI = 1.26–3.15) times higher odds of any psychotropic medication use, respectively, before adjusting for depression and anxiety. The association of very low FS with any psychotropic medication use was not statistically significant. A similar pattern was found for antidepressant and sedative use. After additionally adjusting for CESD and GAD-7 scores, marginal FS remained associated with 1.93 (p < 0.05; 95% CI = 1.16–3.19) times higher odds of any psychotropic medication use. Very low FS, conversely, was significantly associated with lower odds of antidepressant use (adjusted odds ratio = 0.42; p < 0.05; 95% CI = 0.19–0.96).
Marginal FS was associated with higher odds of using psychotropic medications independent of depression and anxiety, while very low FS was associated with lower odds. These complex findings may indicate that people experiencing very low FS face barriers to accessing mental health services, while those experiencing marginal FS who do access services are more likely to be prescribed psychotropic medications for distress arising from social and structural factors.
Meal timing may influence food choices, neurobiology and psychological states. Our exploratory study examined if time-of-day eating patterns were associated with mood disorders among adults.
During 2004–2006 (age 26–36 years) and 2009–2011 (follow-up, age 31–41 years), N = 1304 participants reported 24-h food and beverage intake. Time-of-day eating patterns were derived by principal components analysis. At follow-up, the Composite International Diagnostic Interview measured lifetime mood disorder. Log binomial and adjacent categories log-link regression were used to examine bidirectional associations between eating patterns and mood disorder. Covariates included sex, age, marital status, social support, education, work schedule, body mass index and smoking.
Three patterns were derived at each time-point: Grazing (intake spread across the day), Traditional (highest intakes reflected breakfast, lunch and dinner), and Late (skipped/delayed breakfast with higher evening intakes). Compared to those in the lowest third of the respective pattern at baseline and follow-up, during the 5-year follow-up, those in the highest third of the Late pattern at both time-points had a higher prevalence of mood disorder [prevalence ratio (PR) = 2.04; 95% confidence interval (CI) 1.20–3.48], and those in the highest third of the Traditional pattern at both time-points had a lower prevalence of first onset mood disorder (PR = 0.31; 95% CI 0.11–0.87). Participants who experienced a mood disorder during follow-up had a 1.07 higher relative risk of being in a higher Late pattern score category at follow-up than those without mood disorder (95% CI 1.00–1.14).
Non-traditional eating patterns, particularly skipped or delayed breakfast, may be associated with mood disorders.
Background: Cervical spondylotic myelopathy (CSM) may present with neck and arm pain. This study investigates the change in neck and arm pain post-operatively in CSM. Methods: This ambispective study enrolled 402 patients through the Canadian Spine Outcomes and Research Network. Outcome measures were the visual analogue scales for neck and arm pain (VAS-NP and VAS-AP) and the neck disability index (NDI). The thresholds for minimum clinically important differences (MCIDs) for VAS-NP and VAS-AP were determined to be 2.6 and 4.1, respectively. Results: VAS-NP improved from a mean of 5.6±2.9 to 3.8±2.7 at 12 months (P < 0.001). VAS-AP improved from 5.8±2.9 to 3.5±3.0 at 12 months (P < 0.001). The MCIDs for VAS-NP and VAS-AP were also reached at 12 months. Based on the NDI, patients were grouped into those with mild/no pain (33%) versus moderate/severe pain (67%). At 3 months, a significantly high proportion of patients with moderate/severe pain (45.8%) improved to mild/no pain, whereas 27.2% with mild/no pain worsened to moderate/severe pain (P < 0.001). At 12 months, 17.4% with mild/no pain experienced worsening of their NDI (P < 0.001). Conclusions: This study suggests that neck and arm pain responds to surgical decompression in patients with CSM, reaching the MCIDs for VAS-AP and VAS-NP at 12 months.
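The MCID comparison reported above can be expressed as a simple per-patient check: an improvement is clinically important when the decrease in score meets the scale's threshold (2.6 points for VAS-NP, 4.1 for VAS-AP, as reported in the study). The patient scores below are hypothetical.

```python
# MCID thresholds from the study: 2.6 for neck pain, 4.1 for arm pain.
MCID = {"VAS-NP": 2.6, "VAS-AP": 4.1}

def meets_mcid(scale: str, baseline: float, followup: float) -> bool:
    """True if the improvement (baseline minus follow-up) reaches the MCID."""
    return (baseline - followup) >= MCID[scale]

# Hypothetical patient: neck pain 7.0 -> 3.0, arm pain 6.0 -> 4.0.
print(meets_mcid("VAS-NP", 7.0, 3.0))  # 4.0-point improvement vs 2.6 threshold
print(meets_mcid("VAS-AP", 6.0, 4.0))  # 2.0-point improvement vs 4.1 threshold
```

Note that the MCID is a per-patient criterion; whether a cohort "reaches the MCID" is usually judged by the proportion of patients whose individual change meets the threshold, not by the mean change alone.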
The majority of paediatric Clostridioides difficile infections (CDI) are community-associated (CA), but few data exist regarding associated risk factors. We conducted a case–control study to evaluate CA-CDI risk factors in young children. Participants were enrolled from eight US sites during October 2014–February 2016. Case-patients were defined as children aged 1–5 years with a positive C. difficile specimen collected as an outpatient or ⩽3 days of hospital admission, who had no healthcare facility admission in the prior 12 weeks and no history of CDI. Each case-patient was matched to one control. Caregivers were interviewed regarding relevant exposures. Multivariable conditional logistic regression was performed. Of 68 pairs, 44.1% were female. More case-patients than controls had a comorbidity (33.3% vs. 12.1%; P = 0.01); recent higher-risk outpatient exposures (34.9% vs. 17.7%; P = 0.03); recent antibiotic use (54.4% vs. 19.4%; P < 0.0001); or recent exposure to a household member with diarrhoea (41.3% vs. 21.5%; P = 0.04). In multivariable analysis, antibiotic exposure in the preceding 12 weeks was significantly associated with CA-CDI (adjusted matched odds ratio, 6.25; 95% CI 2.18–17.96). Improved antibiotic prescribing might reduce CA-CDI in this population. Further evaluation of the potential role of outpatient healthcare and household exposures in C. difficile transmission is needed.
Filamentary structures can form within the beam of protons accelerated during the interaction of an intense laser pulse with an ultrathin foil target. Such behaviour is shown to be dependent upon the formation time of quasi-static magnetic field structures throughout the target volume and the extent of the rear surface proton expansion over the same period. This is observed via both numerical and experimental investigations. By controlling the intensity profile of the laser drive, via the use of two temporally separated pulses, both the initial rear surface proton expansion and magnetic field formation time can be varied, resulting in modification to the degree of filamentary structure present within the laser-driven proton beam.
Children reared in impoverished environments are at risk for enduring psychological and physical health problems. Mechanisms by which poverty affects development, however, remain unclear. To explore one potential mechanism of poverty's impact on social–emotional and cognitive development, an experimental examination of a rodent model of scarcity-adversity was conducted and compared to results from a longitudinal study of human infants and families followed from birth (N = 1,292) who faced high levels of poverty-related scarcity-adversity. Cross-species results supported the hypothesis that altered caregiving is one pathway by which poverty adversely impacts development. Rodent mothers assigned to the scarcity-adversity condition exhibited decreased sensitive parenting and increased negative parenting relative to mothers assigned to the control condition. Furthermore, scarcity-adversity reared pups exhibited decreased developmental competence as indicated by disrupted nipple attachment, distress vocalization when in physical contact with an anesthetized mother, and reduced preference for maternal odor with corresponding changes in brain activation. Human results indicated that scarcity-adversity was inversely correlated with sensitive parenting and positively correlated with negative parenting, and that parenting fully mediated the association of poverty-related risk with infant indicators of developmental competence. Findings are discussed from the perspective of the usefulness of bidirectional–translational research to inform interventions for at-risk families.
Objectives: Studies suggest that impairments in some of the same domains of cognition occur in different neuropsychiatric conditions, including those known to share genetic liability. Yet, direct, multi-disorder cognitive comparisons are limited, and it remains unclear whether overlapping deficits are due to comorbidity. We aimed to extend the literature by examining cognition across different neuropsychiatric conditions and addressing comorbidity. Methods: Subjects were 486 youth consecutively referred for neuropsychiatric evaluation and enrolled in the Longitudinal Study of Genetic Influences on Cognition. First, we assessed general ability, reaction time variability (RTV), and aspects of executive functions (EFs) in youth with non-comorbid forms of attention-deficit/hyperactivity disorder (ADHD), mood disorders and autism spectrum disorder (ASD), as well as in youth with psychosis. Second, we determined the impact of comorbid ADHD on cognition in youth with ASD and mood disorders. Results: For EFs (working memory, inhibition, and shifting/flexibility), we observed weaknesses in all diagnostic groups when participants’ own ability was the referent. Decrements were subtle in relation to published normative data. For RTV, weaknesses emerged in youth with ADHD and mood disorders, but trend-level results could not rule out decrements in other conditions. Comorbidity with ADHD did not impact the pattern of weaknesses for youth with ASD or mood disorders but increased the magnitude of the decrement in those with mood disorders. Conclusions: Youth with ADHD, mood disorders, ASD, and psychosis show EF weaknesses that are not due to comorbidity. Whether such cognitive difficulties reflect genetic liability shared among these conditions requires further study. (JINS, 2018, 24, 91–103)
To achieve their conservation goals individuals, communities and organizations need to acquire a diversity of skills, knowledge and information (i.e. capacity). Despite current efforts to build and maintain appropriate levels of conservation capacity, it has been recognized that there will need to be a significant scaling-up of these activities in sub-Saharan Africa. This is because of the rapid increase in the number and extent of environmental problems in the region. We present a range of socio-economic contexts relevant to four key areas of African conservation capacity building: protected area management, community engagement, effective leadership, and professional e-learning. Under these core themes, 39 specific recommendations are presented. These were derived from multi-stakeholder workshop discussions at an international conference held in Nairobi, Kenya, in 2015. At the meeting 185 delegates (practitioners, scientists, community groups and government agencies) represented 105 organizations from 24 African nations and eight non-African nations. The 39 recommendations constituted six broad types of suggested action: (1) the development of new methods, (2) the provision of capacity building resources (e.g. information or data), (3) the communication of ideas or examples of successful initiatives, (4) the implementation of new research or gap analyses, (5) the establishment of new structures within and between organizations, and (6) the development of new partnerships. A number of cross-cutting issues also emerged from the discussions: the need for a greater sense of urgency in developing capacity building activities; the need to develop novel capacity building methodologies; and the need to move away from one-size-fits-all approaches.
Children with cancer are potentially at a high risk of plasma 25-hydroxyvitamin D (25(OH)D) inadequacy, and despite UK vitamin D supplementation guidelines their implementation remains inconsistent. Thus, we aimed to investigate 25(OH)D concentration and factors contributing to 25(OH)D inadequacy in paediatric cancer patients. A prospective cohort study of Scottish children aged <18 years diagnosed with, and treated for, cancer (patients) between August 2010 and January 2014 was performed, with control data from Scottish healthy children (controls). Clinical and nutritional data were collected at defined periods up to 24 months. 25(OH)D status was defined by the Royal College of Paediatrics and Child Health as inadequacy (<50 nmol/l: deficiency (<25 nmol/l), insufficiency (25–50 nmol/l)), sufficiency (51–75 nmol/l) and optimal (>75 nmol/l). In all, eighty-two patients (median age 3·9, interquartile ranges (IQR) 1·9–8·8; 56 % males) and thirty-five controls (median age 6·2, IQR 4·8–9·1; 49 % males) were recruited. 25(OH)D inadequacy was highly prevalent in the controls (63 %; 22/35) and in the patients (64 %; 42/65) at both baseline and during treatment (33–50 %). Non-supplemented children had the highest prevalence of 25(OH)D inadequacy at every stage with 25(OH)D median ranging from 32·0 (IQR 21·0–46·5) to 45·0 (28·0–64·5) nmol/l. Older age at baseline (R −0·46; P<0·001), overnutrition (BMI≥85th centile) at 3 months (P=0·005; relative risk=3·1) and not being supplemented at 6 months (P=0·04; relative risk=4·3) may have contributed to lower plasma 25(OH)D. Paediatric cancer patients are not at a higher risk of 25(OH)D inadequacy than healthy children at diagnosis; however, prevalence of 25(OH)D inadequacy is still high and non-supplemented children have a higher risk. Appropriate monitoring and therapeutic supplementation should be implemented.
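The RCPCH cut-offs quoted above translate directly into a small classifier. Boundary handling at exactly 50 and 75 nmol/l follows the ranges as stated in the text, with values up to 50 counted as insufficiency; the example values are illustrative.

```python
# RCPCH plasma 25(OH)D categories (nmol/l), as given in the text:
# deficiency < 25, insufficiency 25-50, sufficiency 51-75, optimal > 75.
# Deficiency and insufficiency together constitute inadequacy (< 50).
def vitd_status(nmol_per_l: float) -> str:
    if nmol_per_l < 25:
        return "deficiency"
    if nmol_per_l <= 50:
        return "insufficiency"
    if nmol_per_l <= 75:
        return "sufficiency"
    return "optimal"

def is_inadequate(nmol_per_l: float) -> bool:
    """Inadequacy = deficiency or insufficiency."""
    return vitd_status(nmol_per_l) in ("deficiency", "insufficiency")

for value in (21.0, 32.0, 45.0, 64.5, 80.0):
    print(value, vitd_status(value))
```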
The Antarctic Roadmap Challenges (ARC) project identified critical requirements to deliver high priority Antarctic research in the 21st century. The ARC project addressed the challenges of enabling technologies, facilitating access, providing logistics and infrastructure, and capitalizing on international co-operation. Technological requirements include: i) innovative automated in situ observing systems, sensors and interoperable platforms (including power demands), ii) realistic and holistic numerical models, iii) enhanced remote sensing and sensors, iv) expanded sample collection and retrieval technologies, and v) greater cyber-infrastructure to process ‘big data’ collection, transmission and analyses while promoting data accessibility. These technologies must be widely available, performance and reliability must be improved and technologies used elsewhere must be applied to the Antarctic. Considerable Antarctic research is field-based, making access to vital geographical targets essential. Future research will require continent- and ocean-wide environmentally responsible access to coastal and interior Antarctica and the Southern Ocean. Year-round access is indispensable. The cost of future Antarctic science is great but there are opportunities for all to participate commensurate with national resources, expertise and interests. The scope of future Antarctic research will necessitate enhanced and inventive interdisciplinary and international collaborations. The full promise of Antarctic science will only be realized if nations act together.
We describe the performance of the Boolardy Engineering Test Array (BETA), the prototype for the Australian Square Kilometre Array Pathfinder telescope. BETA is the first aperture synthesis radio telescope to use phased array feed technology, giving it the ability to electronically form up to nine dual-polarisation beams. We report the methods developed for forming and measuring the beams, and the adaptations made to traditional calibration and imaging procedures to allow BETA to function as a multi-beam aperture synthesis telescope. We describe the commissioning of the instrument and present details of BETA’s performance: sensitivity, beam characteristics, polarimetric properties, and image quality. We summarise the astronomical science it has produced and draw lessons from operating BETA that will be relevant to the commissioning and operation of the final Australian Square Kilometre Array Pathfinder telescope.
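Electronically forming a beam with a phased array feed, as described above, amounts to taking a weighted sum of the element signals so that a chosen direction adds in phase. The sketch below uses a simplified 1-D array of isotropic, half-wavelength-spaced elements; it illustrates the principle only, not BETA's actual feed geometry or weighting scheme.

```python
import numpy as np

# Simplified 1-D phased array: nine isotropic elements at half-wavelength
# spacing. Observing frequency is an arbitrary illustrative choice.
c = 3e8                      # speed of light, m/s
freq = 1.4e9                 # observing frequency, Hz
wavelength = c / freq
n_elem = 9
positions = np.arange(n_elem) * wavelength / 2

def steering_vector(theta_rad: float) -> np.ndarray:
    """Per-element phase factors for a plane wave from angle theta."""
    delay = positions * np.sin(theta_rad) / c
    return np.exp(2j * np.pi * freq * delay)

def beam_power(weights: np.ndarray, theta_rad: float) -> float:
    """Response of the beamformed (weighted-sum) output to a source at theta."""
    return abs(np.vdot(weights, steering_vector(theta_rad))) ** 2

# Conjugate-match weights steer the beam to 10 degrees: the response
# peaks on axis and falls away off axis.
w = steering_vector(np.radians(10.0)) / n_elem
print(beam_power(w, np.radians(10.0)), beam_power(w, np.radians(40.0)))
```

Forming multiple simultaneous beams, as BETA does, is then just applying several weight vectors to the same element signals.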