This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to COVID-19 with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies ranging from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
Children are important transmitters of infection. Within schools they encounter large numbers of contacts, and infections can spread easily, causing outbreaks. However, not all schools are affected equally. We conducted a retrospective analysis of school outbreaks to identify factors associated with the risk of gastroenteritis, influenza, rash or other outbreaks. Data on reported school outbreaks in England were obtained from Public Health England and linked with data from the Department for Education and the Office for Standards in Education, Children's Services and Skills (Ofsted). Primary and all-through schools were found to be at increased risk of outbreaks, compared with secondary schools (odds ratio (OR) 5.82, 95% confidence interval (CI) 4.50–7.58 and OR 4.66, 95% CI 3.27–6.61, respectively). School size was also significantly associated with the risk of outbreaks, with higher odds associated with larger schools. Attack rates were higher in gastroenteritis and influenza outbreaks, with lower attack rates associated with rashes (relative risk 0.17, 95% CI 0.15–0.20). Deprivation and Ofsted rating were not associated with either outbreak occurrence or the subsequent attack rate. This study identifies primary and all-through schools as key settings for health protection interventions. Public health teams need to work closely with these schools to encourage early identification and reporting of outbreaks.
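The odds ratios above come from models fitted to the linked national datasets; as a minimal illustration of the quantity being estimated, the sketch below computes an unadjusted odds ratio and a Wald 95% confidence interval from a 2×2 table (the counts are invented for illustration and are not taken from the study):

```python
import math

# Hypothetical counts (NOT from the study): schools with and without
# a reported outbreak, by school type.
a, b = 120, 400   # primary schools: with outbreak, without outbreak
c, d = 30, 600    # secondary schools: with outbreak, without outbreak

odds_ratio = (a * d) / (b * c)                 # unadjusted odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The study's published estimates would additionally adjust for covariates such as school size, which requires a regression model rather than a single 2×2 table.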
Objective:
To characterize associations of exposures within and outside the medical workplace with healthcare personnel (HCP) SARS-CoV-2 infection, including the effect of various forms of respiratory protection.
Design:
Case–control study.
Setting:
We collected data from international participants via an online survey.
Participants:
In total, 1,130 HCP (244 cases with laboratory-confirmed COVID-19 and 886 controls who remained healthy throughout the pandemic) from 67 countries who did not meet prespecified exclusion criteria (ie, healthy but not working, missing workplace exposure data, or COVID-19 symptoms without laboratory confirmation) were included in this study.
Methods:
Respondents were queried regarding workplace exposures, respiratory protection, and extra-occupational activities. Odds ratios for HCP infection were calculated using multivariable logistic regression and sensitivity analyses controlling for confounders and known biases.
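The abstract names multivariable logistic regression as the route to the adjusted odds ratios reported below. A minimal sketch of that computation follows; the column names and synthetic data are placeholders for illustration, not the study's variables or records.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder data: one row per respondent, a binary infection outcome,
# binary exposures, and a continuous confounder (all names illustrative).
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "infected": rng.integers(0, 2, n),
    "covid_patient_contact": rng.integers(0, 2, n),
    "large_gathering": rng.integers(0, 2, n),
    "respirator_during_agp": rng.integers(0, 2, n),
    "age": rng.normal(40, 10, n),
})

X = sm.add_constant(df[["covid_patient_contact", "large_gathering",
                        "respirator_during_agp", "age"]])
fit = sm.Logit(df["infected"], X).fit(disp=0)

# Exponentiated coefficients are adjusted odds ratios; exponentiated
# confidence limits give 95% CIs like those quoted in the Results.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```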
Results:
HCP infection was associated with non–aerosol-generating contact with COVID-19 patients (adjusted OR, 1.4; 95% CI, 1.04–1.9; P = .03) and extra-occupational exposures including gatherings of ≥10 people, patronizing restaurants or bars, and public transportation (adjusted OR range, 3.1–16.2). Respirator use during aerosol-generating procedures (AGPs) was associated with lower odds of HCP infection (adjusted OR, 0.4; 95% CI, 0.2–0.8; P = .005), as was exposure to intensive care and dedicated COVID-19 units, negative-pressure rooms, and personal protective equipment (PPE) observers (adjusted OR range, 0.4–0.7).
Conclusions:
COVID-19 transmission to HCP was associated with medical exposures currently considered lower-risk and multiple extra-occupational exposures, and exposures associated with proper use of appropriate PPE were protective. Closer scrutiny of infection control measures surrounding healthcare activities and medical settings considered lower risk, and continued awareness of the risks of public congregation, may reduce the incidence of HCP infection.
Violence and aggression are a major concern in acute inpatient psychiatric wards. Hard outcome data on the impact of service change are scarce. This poster presents the outcomes of service changes designed to improve the acute ward environment and patient experience.
Aims and objectives
To implement changes to the delivery of acute inpatient psychiatric services and to measure the outcome of these changes in objective verifiable form.
Method
Significant changes were introduced to an acute psychiatric inpatient service. These included introducing a dedicated inpatient psychiatrist (“hospitalist”), replacing weekly ward rounds with daily multidisciplinary care and discharge planning meetings, and promoting increased roles for nursing staff in decision-making and patient contact. Outcomes measured included routinely recorded incidents of violence with and without injury, use of restraint for medication, and use of constant nursing observation. The control group was a similar service in the same hospital, subject to the same general policies and admitting demographically comparable patients, that did not undergo the interventions implemented in the trial service during this period. All data were recorded by staff who were unaware of this study or that any analysis of the data would occur.
Results and conclusions
Violent incidents in the intervention ward dropped by 34% per patient (p < 0.02) whilst increasing by 3% in the control ward; restraints decreased by 28% (not statistically significant) whilst increasing by 12% in the control ward; and there was an overall reduction in constant observation. The intervention was highly effective in reducing violent incidents.
Prolongation of the rate-corrected Q–T interval (QTc) is a risk factor for sudden death and may be produced by antipsychotic drugs.
Objective
To determine the frequency and psychopharmacological correlates of baseline prolongation of the QTc in a large pediatric cohort.
Methods
The QTc was measured on electrocardiograms obtained from 811 children and adolescents (404 males and 407 females; mean age, 15.5 ± 2.4 years) consecutively evaluated in the admissions unit of a psychiatric hospital. Each patient with QTc > 440 msec was age- and gender-matched with 5 patients with QTc < 420 msec. The psychiatric diagnoses and psychotropic treatment of patients with prolonged QTc and control subjects were compared in univariate and logistic analyses.
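The abstract does not say which rate-correction formula was used; Bazett's correction is the most common and serves here as an assumed example. The sketch below applies it to illustrative measurements, not patient data from the study.

```python
import math

def qtc_bazett(qt_ms: float, heart_rate_bpm: float) -> float:
    """Bazett's rate-corrected QT: QTc = QT / sqrt(RR), with RR in seconds."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_s)

# Illustrative example: a QT of 400 ms measured at 80 bpm.
print(f"QTc = {qtc_bazett(400, 80):.0f} ms")  # ~462 ms, above the 440 msec study cutoff
```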
Results
QTc duration was > 440 msec (mean 454 ± 10 msec, range 442–481 msec) in 16 patients (1.97%; 95% confidence interval (CI): 1.17%–3.25%). The 80 control subjects had a mean QTc of 391 ± 21 msec. The groups were similar with regard to the proportion of patients on antipsychotics (43.8% vs. 40.8%, p = 0.78) and chlorpromazine equivalents (165.5 ± 109.7 mg vs. 167.6 ± 217.8 mg, p = 0.98). Logistic regression identified schizophrenia as the only psychiatric predictor of baseline QTc prolongation (odds ratio: 6.17, 95% CI: 1.24–30.69, p = 0.042).
Conclusions
In a large cohort of children and adolescents with psychiatric disorders, baseline QTc prolongation was infrequent and, at most, of moderate severity. The findings argue against performing electrocardiograms prior to the initiation of antipsychotics in all patients from this age group.
Aberrant activity of the subcallosal cingulate (SCC) is a common theme across pharmacologic treatment efficacy prediction studies. The functioning of the SCC in psychotherapeutic interventions is relatively understudied, as are functional differences among SCC subdivisions. We conducted resting-state functional connectivity (rsFC) analyses on resting-state functional magnetic resonance imaging (fMRI) data, collected before and after a course of cognitive behavioral therapy (CBT) in patients with major depressive disorder (MDD), using seeds from three SCC subdivisions.
Methods.
Resting-state data were collected from unmedicated patients with current MDD (Hamilton Depression Rating Scale-17 > 16) before and after 14 sessions of CBT monotherapy. Treatment outcome was assessed using the Beck Depression Inventory (BDI). Rostral anterior cingulate (rACC), anterior subcallosal cingulate (aSCC), and Brodmann’s area 25 (BA25) masks were used as seeds in connectivity analyses that assessed baseline rsFC and symptom severity, changes in connectivity related to symptom improvement after CBT, and prediction of treatment outcomes using whole-brain baseline connectivity.
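At its core, seed-based rsFC is a correlation between the average BOLD time series of a seed mask and time series elsewhere in the brain. The toy sketch below shows only that core computation on synthetic arrays; it stands in for, and greatly simplifies, the preprocessing and statistical pipeline the authors actually used.

```python
import numpy as np

rng = np.random.default_rng(1)
n_timepoints, n_seed_voxels, n_regions = 200, 50, 10

# Synthetic BOLD signals for the seed mask's voxels and candidate regions.
seed_voxels = rng.normal(size=(n_timepoints, n_seed_voxels))
targets = rng.normal(size=(n_timepoints, n_regions))

# Average the seed voxels into one time series, then take the Pearson
# correlation with each target region: one rsFC estimate per region.
seed_ts = seed_voxels.mean(axis=1)
rsfc = np.array([np.corrcoef(seed_ts, targets[:, j])[0, 1]
                 for j in range(n_regions)])
print(rsfc.round(3))
```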
Results.
Pretreatment BDI negatively correlated with pretreatment rACC ~ dorsolateral prefrontal cortex and aSCC ~ lateral prefrontal cortex rsFC. In a region-of-interest longitudinal analysis, rsFC between these regions increased post-treatment (p < 0.05, FDR-corrected). In whole-brain analyses, BA25 ~ paracentral lobule and rACC ~ paracentral lobule connectivities decreased post-treatment. Whole-brain baseline rsFC with SCC did not predict clinical improvement.
Conclusions.
rsFC features of rACC and aSCC, but not BA25, correlated inversely with baseline depression severity, and increased following CBT. Subdivisions of SCC involved in top-down emotion regulation may be more involved in cognitive interventions, while BA25 may be more informative for interventions targeting bottom-up processing. Results emphasize the importance of subdividing the SCC in connectivity analyses.
Between 2001 and 2017, the Royal Botanic Garden Edinburgh conducted training and research in Belize built around an annual two-week field course, part of the Edinburgh M.Sc. programme in Biodiversity and Taxonomy of Plants, focused on tropical plant identification, botanical-collecting and tropical fieldwork skills. This long-term collaboration in one country has led to additional benefits, most notably capacity building, acquisition of new country records, completion of M.Sc. thesis projects and publication of the findings in journal articles, and continued cooperation. Detailed summaries are provided for the specimens collected by students during the field course or return visits to Belize for M.Sc. thesis projects. Additionally, 15 species not recorded in the national checklist for Belize are reported. The information in this paper highlights the benefits of collaborations between institutions and countries for periods greater than the typical funding cycles of three to five years.
Shared patient–clinician decision-making is central to choosing between medical treatments. Decision support tools can have an important role to play in these decisions. We developed a decision support tool for deciding between nonsurgical treatment and surgical total knee replacement for patients with severe knee osteoarthritis. The tool aims to provide likely outcomes of alternative treatments based on predictive models using patient-specific characteristics. To make those models relevant to patients with knee osteoarthritis and their clinicians, we involved patients, family members, patient advocates, clinicians, and researchers as stakeholders in creating the models.
Methods:
Stakeholders were recruited through local arthritis research, advocacy, and clinical organizations. After brief methodological education sessions, stakeholders' views were solicited through quarterly patient or clinician stakeholder panel meetings and incorporated into all aspects of the project.
Results:
Stakeholders participated in each aspect of the research, from determining the outcomes of interest to providing input on the design of the user interface displaying outcome predictions, and 86% (12/14) remained engaged throughout the project. Stakeholder engagement ensured that the prediction models that form the basis of the Knee Osteoarthritis Mathematical Equipoise Tool and its user interface were relevant for patient–clinician shared decision-making.
Conclusions:
Methodological research can benefit from stakeholder engagement by ensuring that the perspectives of those most affected by the results inform study design and conduct. While additional planning and investment in maintaining stakeholder knowledge and trust may be needed, they are offset by the valuable insights gained.
The transmission rate of methicillin-resistant Staphylococcus aureus (MRSA) to gloves or gowns of healthcare personnel (HCP) caring for MRSA patients in a non–intensive care unit setting was 5.4%. Contamination rates were higher among HCP performing direct patient care and when patients had detectable MRSA on their body. These findings may inform risk-based contact precautions.
To determine sociodemographic factors associated with occupational, recreational and firearm-related noise exposure.
Methods
This nationally representative, multistage, stratified, cluster cross-sectional study surveyed eligible National Health and Nutrition Examination Survey participants aged 20–69 years (n = 4675) about exposure to occupational and recreational noise and recurrent firearm use; associations were assessed using weighted multivariate logistic regression.
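A full NHANES analysis accounts for the survey's strata and clustering as well as its sampling weights; the simplified sketch below folds in weights only, as a rough illustration of weighted logistic regression (variable names and data are placeholders, not the study's).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Placeholder survey data: binary occupational-noise outcome, covariates,
# and integer sampling weights. Design variables (strata, PSUs) that a
# proper design-based analysis would use are deliberately omitted here.
rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "occ_noise": rng.integers(0, 2, n),
    "male": rng.integers(0, 2, n),
    "insured": rng.integers(0, 2, n),
    "weight": rng.integers(1, 4, n),
})

X = sm.add_constant(df[["male", "insured"]])
fit = sm.GLM(df["occ_noise"], X,
             family=sm.families.Binomial(),
             freq_weights=df["weight"]).fit()
print(np.exp(fit.params))      # weighted adjusted odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals
```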
Results
Thirty-four per cent of participants had exposure to occupational noise and 12 per cent to recreational noise, and 13 per cent repeatedly used firearms. Males were more likely than females to have exposure to all three noise types (adjusted odds ratio range = 2.63–14.09). Hispanics and Asians were less likely to have exposure to the three noise types than Whites. Blacks were less likely than Whites to have occupational and recurrent firearm noise exposure. Those with insurance were 26 per cent less likely to have exposure to occupational noise than those without insurance (adjusted odds ratio = 0.74, 95 per cent confidence interval = 0.60–0.93).
Conclusion
Whites, males and uninsured people are more likely to have exposure to potentially hazardous loud noise.
Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
Methods
The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively-reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally-ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
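For readers unfamiliar with the two estimators named here, the sketch below shows the generic machinery on synthetic data using the `lifelines` package: a Cox proportional hazards fit whose exponentiated coefficient is a hazard ratio, and a product-limit (Kaplan-Meier) estimate of absolute risk. The WMH analysis itself was built on retrospectively reported ages of onset, which this toy example does not attempt to reproduce.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Synthetic data: time to onset of a second disorder, an event indicator,
# and a binary covariate for prior-disorder history (illustrative only).
rng = np.random.default_rng(3)
n = 500
prior = rng.integers(0, 2, n)
time = rng.exponential(scale=np.where(prior == 1, 5.0, 15.0))
event = rng.integers(0, 2, n)
df = pd.DataFrame({"time": time, "event": event, "prior_disorder": prior})

# Cox proportional hazards: exp(coef) is the hazard ratio (HR) for the
# prior-disorder covariate, analogous to the HRs quoted in the Results.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()

# Product-limit (Kaplan-Meier) estimator of absolute risk over time.
km = KaplanMeierFitter()
km.fit(df["time"], event_observed=df["event"])
print(1 - km.survival_function_.iloc[-1])  # cumulative risk at end of follow-up
```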
Results
Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely-related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Conclusions
Survey data from a range of sites confirm that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for primary prevention of secondary disorders.
We studied the association between chlorhexidine gluconate (CHG) concentration on skin and resistant bacterial bioburden. CHG was almost always detected on the skin, and detection of methicillin-resistant Staphylococcus aureus, carbapenem-resistant Enterobacteriaceae, and vancomycin-resistant Enterococcus on skin sites was infrequent. However, we found no correlation between CHG concentration and bacterial bioburden.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
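Heritability in twin designs is typically estimated with variance-component (ACE) models; as a back-of-envelope illustration of what such estimates mean, Falconer's formula derives them from the gap between monozygotic and dizygotic twin-pair correlations. The correlations below are invented for illustration and are not CODATwins results.

```python
# Falconer's approximation: h2 = 2 * (r_MZ - r_DZ); the shared-environment
# share is c2 = 2 * r_DZ - r_MZ, and e2 = 1 - r_MZ covers the rest.
r_mz, r_dz = 0.90, 0.50   # illustrative twin-pair correlations for height

h2 = 2 * (r_mz - r_dz)    # additive genetic share of variance
c2 = 2 * r_dz - r_mz      # shared (common) environment
e2 = 1 - r_mz             # unique environment plus measurement error
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")
```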
Less than half of stool samples from people symptomatic with infectious intestinal disease (IID) will identify a causative organism. A secondary data analysis was undertaken to explore whether symptomology alone could be used to make inferences about causative organisms. Data were utilised from the Second Study of Infectious Intestinal Disease in the Community. A total of 844 cases were analysed. Few symptoms differentiated individual pathogens, but grouping pathogens together showed that viral IID was more likely when symptom onset was in winter (odds ratio (OR) 2.08, 95% confidence interval (CI) 1.16–3.75) or spring (OR 1.92, 95% CI 1.11–3.33), the patient was aged under 5 years (OR 3.63, 95% CI 2.24–6.03) and there was loss of appetite (OR 2.19, 95% CI 1.29–3.72). The odds of bacterial IID were higher with diarrhoea in the absence of vomiting (OR 3.54, 95% CI 2.37–5.32), diarrhoea which persisted for >3 days (OR 2.69, 95% CI 1.82–3.99), bloody diarrhoea (OR 4.17, 95% CI 1.63–11.83) and fever (OR 1.67, 95% CI 1.11–2.53). Symptom profiles could be of value to help guide clinicians and public health professionals in the management of IID, in the absence of microbiological confirmation.
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
Objectives:
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
Methods:
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions allowed detection of mathematical equipoise between the two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments and was pilot tested for usability, responsiveness, and as support for shared decision-making.
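The tool's core logic, as described, is to compare a patient's predicted outcome ranges under the two treatments and flag equipoise when they overlap. The sketch below is a simplified stand-in for that comparison; the coefficients, interval construction, and overlap rule are all placeholders rather than KOMET's fitted models.

```python
import numpy as np

def predict_interval(x: np.ndarray, coefs: np.ndarray, sigma: float):
    """Linear-model point prediction with a crude +/- 1.96*sigma interval."""
    mean = float(coefs[0] + x @ coefs[1:])
    return mean - 1.96 * sigma, mean + 1.96 * sigma

def in_equipoise(interval_a, interval_b) -> bool:
    """Call the treatments in equipoise for this patient if the predicted
    outcome intervals overlap."""
    return interval_a[0] <= interval_b[1] and interval_b[0] <= interval_a[1]

# Placeholder coefficients for a 1-year pain score under TKR vs.
# nonsurgical care (illustrative values; KOMET's models are not shown here).
patient = np.array([65.0, 1.0, 28.0])  # e.g., age, sex, BMI
tkr = predict_interval(patient, np.array([10.0, -0.05, -1.0, 0.2]), sigma=8.0)
nonsurg = predict_interval(patient, np.array([25.0, -0.02, -0.5, 0.3]), sigma=8.0)
print("equipoise:", in_equipoise(tkr, nonsurg))
```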
Results:
The KOMET predictive regression model for knee pain included four patient-specific variables and had an r² of 0.32; the model for physical functioning included six patient-specific variables and had an r² of 0.34. These models were incorporated into prototype KOMET decision support software, pilot tested in clinics, and generally well received.
Conclusions:
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
To identify potential participants for clinical trials, electronic health records (EHRs) are searched at potential sites. As an alternative, we investigated whether medical devices used for real-time diagnostic decisions could instead identify candidates for trial enrollment.
Methods:
To project cohorts for a trial in acute coronary syndromes (ACS), we used electrocardiograph-based algorithms that identify ACS or ST-elevation myocardial infarction (STEMI) and prompt clinicians to offer patients trial enrollment. We searched six hospitals’ electrocardiograph systems for electrocardiograms (ECGs) meeting the planned trial’s enrollment criterion: ECGs with STEMI or > 75% probability of ACS by the acute cardiac ischemia time-insensitive predictive instrument (ACI-TIPI). We revised the ACI-TIPI regression to require only data available directly from the electrocardiograph (the e-ACI-TIPI), using the same development and test data as the original ACI-TIPI (development set, n = 3,453; test set, n = 2,315). We also tested both instruments on data from emergency department electrocardiographs from across the US (n = 8,556). We then used the ACI-TIPI and e-ACI-TIPI to identify potential cohorts for the ACS trial and compared their performance to cohorts identified from EHR data at the same hospitals.
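Evaluating a probability-producing instrument like ACI-TIPI typically comes down to discrimination (ROC curve area) and calibration (agreement between predicted and observed event rates). The sketch below demonstrates both on synthetic predictions; it is not the study's data or evaluation code.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 2000

# Synthetic predicted ACS probabilities and outcomes, drawn so that the
# outcomes match the predictions on average (i.e., well calibrated).
p_pred = rng.uniform(0, 1, n)
y = rng.binomial(1, p_pred)

print("ROC AUC:", round(roc_auc_score(y, p_pred), 3))

# Crude calibration check: mean predicted vs. observed event rate
# within probability deciles.
bins = np.digitize(p_pred, np.linspace(0.1, 0.9, 9))
for b in range(10):
    mask = bins == b
    print(f"decile {b}: predicted {p_pred[mask].mean():.2f}, "
          f"observed {y[mask].mean():.2f}")
```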
Results:
Receiver operating characteristic (ROC) curve areas on the test set were excellent, 0.89 for the ACI-TIPI and 0.84 for the e-ACI-TIPI, as was calibration. On the national electrocardiographic database, ROC areas were 0.78 and 0.69, respectively, with very good calibration. When tested for detection of patients with > 75% ACS probability, both electrocardiograph-based methods identified eligible patients well, and better than did EHRs.
Conclusion:
Using data from medical devices such as electrocardiographs may provide accurate projections of available cohorts for clinical trials.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Design
Prospective cohort study.
Setting
Medical and surgical intensive care units at a tertiary-care academic institution.
Participants
VRE-colonized patients on Contact Precautions and their HCWs.
Methods
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before the HCWs doffed and exited the patient's room.
Results
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and the environment was 2.78 (95% confidence interval [CI], 0.99–7.77), and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Conclusion
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
For livestock production systems to play a positive role in global food security, the balance between their benefits and disbenefits to society must be appropriately managed. Based on the evidence provided by field-scale randomised controlled trials around the world, this debate has traditionally centred on the concept of economic-environmental trade-offs, whose existence is theoretically assured when on-farm resource allocation is perfect. Recent research conducted on commercial farms indicates, however, that the economic-environmental nexus is not nearly as straightforward in the real world, with the environmental performance of enterprises often positively correlated with their economic profitability. Using high-resolution primary data from the North Wyke Farm Platform, an intensively instrumented farm-scale ruminant research facility located in southwest United Kingdom, this paper proposes a novel, information-driven approach for comprehensive assessment of the economic-environmental trade-offs inherent in pasture-based cattle and sheep production systems. The results of a data-mining exercise suggest that a potentially systematic interaction exists between ‘soil health’, ecological surroundings and livestock grazing, whereby a higher soil organic carbon (SOC) stock is associated with better animal performance and lower nutrient losses into watercourses, and a higher stocking density with greater botanical diversity and elevated SOC. We contend that a combination of farming system-wide trials and environmental instrumentation provides an ideal setting for developing scientifically sound and biologically informative metrics of agricultural sustainability, through which agricultural producers could obtain guidance to manage soils, water, pasture and livestock in an economically and environmentally acceptable manner. Priority areas for future farm-scale research to ensure long-term sustainability are also discussed.