In September 2023, the UK Health Security Agency’s (UKHSA) South West Health Protection Team received notification of patients with Pseudomonas aeruginosa perichondritis. All five cases had attended the same cosmetic piercing studio, and a multidisciplinary outbreak control investigation was initiated. An additional five cases who had attended the same studio were identified. Seven of the ten cases had isolates available for Variable Number Tandem Repeat (VNTR) typing at the UKHSA national reference laboratory. Clinical and environmental P. aeruginosa isolates from the patients, the handwash sink, tap water and sites throughout the wall-mounted point-of-use water heater (including its outlet water) were indistinguishable by VNTR typing (11,6,2,2,1,3,6,3,11). No additional cases were identified after control measures, which included replacing the sink and the point-of-use heater, were implemented.
The absence of specific recommendations for controlling P. aeruginosa in Council-adopted ear-piercing byelaws or national guidance means that a cosmetic piercing artist could inadvertently overlook the risks from this bacterial pathogen despite every intention of complying with the law and following industry best-practice advice. Clinicians, Environmental Health Officers and public health professionals should remain alert to single cases of Pseudomonas perichondritis associated with piercings and have a low threshold for notifying local health protection teams.
Archaeological sites in Northwest Africa are rich in human fossils and artefacts that provide proxies for behavioural and evolutionary studies. However, these records are difficult to anchor to a precise chronology, which can prevent robust assessment of the drivers of cultural and behavioural transitions. Past investigations have revealed that numerous volcanic ash (tephra) layers are interbedded within the Palaeolithic sequences and likely originate from large volcanic eruptions in the North Atlantic (e.g. the Azores, Canary Islands, Cape Verde). Critically, these ash layers offer a unique opportunity to provide new relative and absolute dating constraints (via tephrochronology) with which to synchronise key archaeological and palaeoenvironmental records in this region. Here, we provide an overview of the known eruptive histories of the potential source volcanoes capable of producing widespread ashfall over the region during the last ~300,000 years, and discuss the diagnostic glass compositions essential for robust tephra correlations. To investigate the eruption source parameters and weather patterns required for ash dispersal towards NW Africa, we simulate plausible ashfall distributions using the Ash3D model. This work constitutes the first step in developing a more robust tephrostratigraphic framework for distal ash layers in NW Africa and highlights how tephrochronology may be used to reliably synchronise and date key climatic and cultural transitions during the Palaeolithic.
Antimicrobial stewardship programs (ASPs) aim to mitigate antimicrobial resistance (AMR) by optimizing antibiotic use including reducing unnecessary broad-spectrum therapy. This study evaluates the impact of ASP funding and resources on the use of broad-spectrum antibiotics in Ontario hospitals.
Methods:
We conducted a cross-sectional study of antimicrobial use (AMU) across 63 Ontario hospitals from April 2020 to March 2023. The Ontario ASP Landscape Survey provided data on ASP resourcing and antibiotic utilization. The main outcome was the proportion of all antibiotics that were broad-spectrum, defined as: fluoroquinolones; third-generation cephalosporins; beta-lactam/beta-lactamase inhibitors; carbapenems; clindamycin; and parenteral vancomycin. Secondary outcomes included the proportions of individual antibiotic classes listed above and anti-pseudomonal agents. Statistical analysis involved logistic regression to determine the odds ratio (OR) of the association between ASP funding/resourcing and broad-spectrum antibiotic use.
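As an illustration of the hospital-level analysis described above, the following is a minimal sketch of a binomial (logistic) regression of the proportion of antibiotic use that is broad-spectrum on ASP funding status; the file and column names are hypothetical and do not come from the survey itself.

```python
# Minimal sketch of the hospital-level logistic (binomial) regression described above.
# File and column names are hypothetical; the survey data are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("asp_landscape.csv")  # one row per hospital (hypothetical file)

# Outcome: broad-spectrum DDDs out of total antibiotic DDDs, modelled as binomial counts
endog = np.column_stack([df["broad_spectrum_ddd"], df["total_ddd"] - df["broad_spectrum_ddd"]])

# Exposure: designated ASP funding/resources (0/1); further hospital covariates could be added
exog = sm.add_constant(df[["asp_funded"]])

result = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()

# Exponentiated coefficient for asp_funded is the odds ratio for broad-spectrum use
print(np.exp(result.params["asp_funded"]))
print(np.exp(result.conf_int().loc["asp_funded"]))
```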
Results:
Among 63 hospitals, 48 reported designated ASP funding/resources. Median broad-spectrum antibiotic use was 52.5%. ASP funding/resources were not associated with overall broad-spectrum antibiotic use (OR 0.97, 95% CI: 0.75–1.25, P = 0.79). However, funding was associated with lower use of fluoroquinolones (OR 0.67, 95% CI: 0.46–0.96, P = 0.03), clindamycin (OR 0.69, 95% CI: 0.47–1.00, P = 0.05), and anti-pseudomonal agents (OR 0.76, 95% CI: 0.59–0.98, P = 0.03).
Conclusion:
The presence of designated funding and resources for hospital ASPs is linked to reduced use of specific broad-spectrum antibiotics but not overall broad-spectrum antibiotic use. Enhancing ASP resourcing may be an important factor in limiting targeted antibiotic use, thereby increasing the effectiveness of efforts to mitigate AMR.
To evaluate inter-physician variability and predictors of changes in antibiotic prescribing before (2019) and during (2020/2021) the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
We conducted a retrospective cohort analysis of physicians in Ontario, Canada, who prescribed oral antibiotics in the outpatient setting between January 1, 2019 and December 31, 2021, using the IQVIA Xponent data set. The primary outcome was the change in the number of antibiotic prescriptions between the prepandemic and pandemic periods. Secondary outcomes were changes in the selection of broad-spectrum agents and in long-duration (>7 d) antibiotic use. We used multivariable linear regression models to evaluate predictors of change.
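The following is a minimal sketch of a physician-level multivariable linear regression of the kind described above; the data file and variable names are hypothetical, since the IQVIA Xponent data are not reproduced here.

```python
# Hypothetical sketch of the physician-level change-in-prescribing regression.
import pandas as pd
import statsmodels.formula.api as smf

rx = pd.read_csv("physician_prescribing.csv")  # one row per physician (hypothetical file)

# Primary outcome: change in antibiotic prescriptions between periods
rx["change"] = rx["rx_pandemic"] - rx["rx_prepandemic"]

# Multivariable linear regression of the change on physician and patient-mix characteristics
model = smf.ols(
    "change ~ career_stage + specialty + pct_male_patients + pct_low_comorbidity + pct_new_patients",
    data=rx,
).fit()

# Adjusted mean differences (aMD) with 95% confidence intervals
print(model.params)
print(model.conf_int())
```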
Results:
There were 17,288 physicians included in the study, with substantial inter-physician variability in changes in antibiotic prescribing (median change of −43.5 prescriptions per physician; interquartile range −136.5 to −5.0). In the multivariable model, later career stage (adjusted mean difference [aMD] −45.3, 95% confidence interval [CI] −52.9 to −37.8, p < .001), family medicine (aMD −46.0, 95% CI −62.5 to −29.4, p < .001), male patient sex (aMD −52.4, 95% CI −71.1 to −33.7, p < .001), low patient comorbidity (aMD −42.5, 95% CI −50.3 to −34.8, p < .001), and high prescribing to new patients (aMD −216.5, 95% CI −223.5 to −209.5, p < .001) were associated with decreases in antibiotic initiation. Family medicine and high prescribing to new patients were also associated with decreases in the selection of broad-spectrum agents and in prolonged antibiotic use.
Conclusions:
Antibiotic prescribing changed throughout the COVID-19 pandemic, with overall decreases in antibiotic initiation, selection of broad-spectrum agents, and prolonged antibiotic courses, alongside substantial inter-physician variability. These findings present opportunities for community antibiotic stewardship interventions.
Laboratory-based case confirmation is an integral part of measles surveillance programmes; however, logistical constraints can delay the response. Use of rapid diagnostic tests (RDTs) during initial patient contact could enhance surveillance through real-time case confirmation and an accelerated public health response. Here, we evaluate the performance of a novel measles IgM RDT and assess the accuracy of visual interpretation using a representative collection of 125 sera from the Brazilian measles surveillance programme. RDT results were interpreted visually by a panel of six independent observers, by the consensus of three observers and by relative reflectance measurements using an ESEQuant Reader. Compared to the Siemens anti-measles IgM EIA, sensitivity and specificity of the RDT were 94.9% (74/78, 87.4–98.6%) and 95.7% (45/47, 85.5–99.5%) for consensus visual results, and 93.6% (73/78, 85.7–97.9%) and 95.7% (45/47, 85.5–99.5%) for ESEQuant measurement, respectively. Observer agreement, determined by comparison between individuals and visual consensus results, and between individuals and ESEQuant measurements, achieved average kappa scores of 0.97 and 0.93, respectively. The RDT has the sensitivity and specificity required of a field-based test for measles diagnosis, and the high kappa scores indicate that this can be accomplished accurately by visual interpretation alone. Detailed studies are needed to establish its role within the global measles control programme.
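For illustration, the sketch below recomputes the reported sensitivity and specificity with exact (Clopper-Pearson) confidence intervals from the counts in the abstract, and shows how observer agreement could be computed as Cohen's kappa; the example observer readings are hypothetical.

```python
# Sensitivity/specificity from the reported counts, with exact (Clopper-Pearson) CIs,
# and an illustrative Cohen's kappa for observer agreement (readings are hypothetical).
from scipy.stats import beta
from sklearn.metrics import cohen_kappa_score

def proportion_with_exact_ci(k, n, alpha=0.05):
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return k / n, lower, upper

print(proportion_with_exact_ci(74, 78))  # consensus visual sensitivity, 74/78 ≈ 0.949
print(proportion_with_exact_ci(45, 47))  # consensus visual specificity, 45/47 ≈ 0.957

# Agreement between one observer and the consensus reading (0 = negative, 1 = positive)
observer = [1, 0, 1, 1, 0, 1]    # hypothetical readings
consensus = [1, 0, 1, 0, 0, 1]
print(cohen_kappa_score(observer, consensus))
```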
We sought to evaluate the impact of antibiotic selection and duration of therapy on treatment failure in older adults with catheter-associated urinary tract infection (CA-UTI).
Methods:
We conducted a population-based cohort study comparing antibiotic treatment options and duration of therapy for non-hospitalized adults aged 66 and older with presumed CA-UTI (defined as an antibiotic prescription and an organism identified in urine culture in a patient with urinary catheterization documented within the prior 90 d). The primary outcome was treatment failure, a composite of repeat urinary antibiotic prescribing, positive blood culture with the same organism, all-cause hospitalization or mortality, within 60 days. We determined the risk of treatment failure accounting for age, sex, comorbidities, and healthcare exposure using log-binomial regression.
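A minimal sketch of a log-binomial regression of this kind is shown below, using a binomial GLM with a log link so that exponentiated coefficients are risk ratios; the cohort file and variable names are hypothetical, and log-binomial models can fail to converge, in which case modified Poisson regression is a common fallback.

```python
# Hypothetical sketch of a log-binomial regression for treatment failure.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

cohort = pd.read_csv("ca_uti_cohort.csv")  # one row per treated patient (hypothetical file)

# Binomial GLM with a log link: exp(coefficients) are risk ratios rather than odds ratios.
model = smf.glm(
    "treatment_failure ~ antibiotic_class + duration_group + age + sex + comorbidity_score + prior_healthcare_exposure",
    data=cohort,
    family=sm.families.Binomial(link=sm.families.links.Log()),
)
result = model.fit()  # note: log-binomial fits sometimes fail to converge

print(np.exp(result.params))      # risk ratios
print(np.exp(result.conf_int()))  # 95% confidence intervals
```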
Results:
Of 4,436 CA-UTI patients, 2,709 (61.1%) experienced treatment failure. Compared to a reference of TMP-SMX (61.9% failure), 56.3% of patients treated with fluoroquinolones experienced failure (RR 0.91, 95% CI: 0.85–0.98) and 60.9% of patients treated with nitrofurantoin experienced failure (RR 1.02, 95% CI: 0.94–1.10). Compared to 5–7 days of therapy (treatment failure: 59.4%), 1–4 days was associated with 69.5% failure (RR 1.15, 95% CI: 1.05–1.27), and 8–14 days was associated with 62.0% failure (RR 1.05, 95% CI: 0.99–1.11).
Conclusions:
Although most treatment options for CA-UTI have a similar risk of treatment failure, fluoroquinolones and treatment durations of at least 5 days appear to be associated with modestly improved clinical outcomes. From a duration-of-therapy perspective, this study provides reassurance that relatively short courses of 5–7 days may be reasonable for CA-UTI.
The traditional medical education system has produced scientifically grounded and clinically skilled physicians who have served medicine and society. Sweeping changes launched around the turn of the millennium have revolutionized undergraduate and postgraduate medical education across the world (Gutierrez et al. 2016; Shelton et al. 2017; Samarasekera et al. 2018). Training has moved from being time-based to being more outcome-based, shifting away from the apprenticeship model towards a more structured and systematic approach that emphasizes learning and the development of skills.
Medical education has changed considerably from models based mainly on knowledge acquisition and duration of training towards the achievement of predefined learning outcomes (Krackov and Pohl 2011). In such a competency-based approach to education effective feedback has become an integral and important constituent of teaching and learning.
In the learning process, feedback is the sharing of observations, concerns and suggestions with another person. Feedback helps to maximize learning and development by raising an individual’s awareness of their areas of strength and of relative weakness or need, as well as by outlining the actions required to improve performance.
Detailed and prompt feedback coupled with clear opportunities to improve enables individuals to achieve previously agreed milestones such as curriculum outcomes (Krackov and Pohl 2011) or continuing professional development (CPD) objectives.
The duration of immunity after a first severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection, and the extent to which prior immunity prevents reinfection, are uncertain and remain important questions in the context of new variants. In this retrospective population-based matched observational study, we identified individuals with a first polymerase chain reaction (PCR)-positive test for primary SARS-CoV-2 infection between 1 March 2020 and 30 September 2020 (cases). Each case was matched by age, sex, upper-tier local authority of residence and testing route to one individual testing PCR-negative in the same week (controls). After a 90-day pre-follow-up period for cases and controls, any subsequent positive tests up to 31 December 2020 and deaths within 28 days of testing positive were identified; this follow-up encompassed an essentially vaccine-free period. We analysed the results using conditional logistic regression. There were 517 870 individuals in the matched cohort, with 2815 reinfection cases and 12 098 first infections. The protective effect of a prior SARS-CoV-2 PCR-positive episode was 78% (odds ratio (OR) 0.22, 0.21–0.23). Protection rose to 82% (OR 0.18, 0.17–0.19) in a sensitivity analysis that excluded 933 individuals with a first test between March and May and a subsequent positive test between June and September 2020. Amongst individuals testing positive by PCR during follow-up, reinfection cases had 77% lower odds of symptoms at the second episode (adjusted OR 0.23, 0.20–0.26) and 45% lower odds of dying in the 28 days after reinfection (adjusted OR 0.55, 0.42–0.71). Prior SARS-CoV-2 infection offered protection against reinfection in this population. There was some evidence that reinfections increased with the Alpha variant compared with the wild-type SARS-CoV-2 variant, highlighting the importance of continued monitoring as new variants emerge.
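A minimal sketch of a conditional logistic regression on matched sets of this kind is shown below; the file and column names are hypothetical, and the protective effect corresponds to 1 − OR.

```python
# Hypothetical sketch of a conditional logistic regression on matched case-control sets.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("matched_cohort.csv")  # one row per individual (hypothetical file)
# Expected columns: positive_in_followup (0/1: any positive test during follow-up),
#                   prior_positive (0/1: PCR-positive primary episode vs matched negative),
#                   match_id (matched-pair identifier)

model = ConditionalLogit(
    df["positive_in_followup"],
    df[["prior_positive"]],
    groups=df["match_id"],
)
result = model.fit()

# Exponentiated coefficient is the odds ratio; protective effect = 1 - OR
print(np.exp(result.params))
```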
Online peer support platforms have been shown to provide a supportive space that can enhance social connectedness and personal empowerment. Some studies have analysed forum messages, showing that users describe a range of advantages, and some disadvantages to their use. However, the direct examination of users’ experiences of such platforms is rare and may be particularly informative for enhancing their helpfulness. This study aimed to understand users’ experiences of the Support, Hope and Recovery Online Network (SHaRON), an online cognitive behavioural therapy-based peer support platform for adults with mild to moderate anxiety or depression. Platform users (n = 88) completed a survey on their use of different platform features, feelings about using the platform, and overall experience. Responses were analysed descriptively and using thematic analysis. Results indicated that most features were generally well used, with the exception of private messaging. Many participants described feeling well supported and finding the information and resources helpful; the majority of recent users (81%) rated it as helpful overall. However, some participants described feeling uncomfortable about posting messages, and others did not find the platform helpful and gave suggestions for improvements. Around half had not used the platform in the past 3 months, for different reasons including feeling better or forgetting about it. Some described that simply knowing it was there was helpful, even without regular use. The findings highlight what is arguably a broader range of user experiences than observed in previous studies, which may have important implications for the enhancement of SHaRON and other platforms.
Key learning aims
(1) To understand what an online peer support platform is and how this can be used to support users’ mental health.
(2) To learn how users described their experience of the SHaRON platform.
(3) To understand the benefits that online peer support may provide.
(4) To consider what users found helpful and unhelpful, and how this might inform the further development of these platforms.
Supporting Antarctic scientific investigation is the job of the national Antarctic programmes, the government entities charged with delivering their countries’ Antarctic research strategies. This requires sustained investment in people, innovative technologies, Antarctic infrastructure, and vessels with icebreaking capabilities. The recent endorsement of the International Maritime Organization (IMO) Polar Code (2015) means that countries must address challenges related to an ageing icebreaking vessel fleet. Many countries have recently invested in, and begun or completed, builds of new icebreaking polar research vessels. These vessels incorporate innovative design technologies to increase fuel efficiency, reduce noise output, and better protect the Antarctic environment. This paper is a result of a Council of Managers of National Antarctic Programs (COMNAP) project on new vessel builds which began in 2018. It considers the recent vessel builds of Australia’s RSV Nuyina, China’s MV Xue Long 2, France’s L’Astrolabe, Norway’s RV Kronprins Haakon, Peru’s BAP Carrasco, and the United Kingdom’s RRS Sir David Attenborough. The paper provides examples of purposeful consideration of science support requirements and environmental sustainability in vessel designs and operations.
Encephalitis causes high morbidity and mortality. An incidence of 4.3 cases of encephalitis per 100 000 population has been reported in the UK. We performed a retrospective evaluation of the diagnosis and management of adults admitted to hospital with a clinical diagnosis of encephalitis/meningoencephalitis. Clinical, laboratory and radiological data were collated from electronic records. Thirty-six patients were included (median age 55 years; 24 (67%) male). The aetiology was confirmed over nine months in 25 (69%): 16 infections (six viral, seven bacterial, two parasitic and one viral–parasitic co-infection), 7 autoimmune, 1 metabolic and 1 neoplastic. Of 24 patients with fever, 15 (63%) had an infection. The median times to computed tomography, magnetic resonance imaging and electroencephalography (EEG) were 1, 8 and 3 days, respectively. Neuroimaging was abnormal in 25 (69%) and 17 (89%) had abnormal EEGs. Only 19 (53%) received aciclovir treatment. Six (17%) made good recoveries, 16 (44%) had moderate disability, 8 (22%) severe disability and 6 (17%) died. Outcomes were worse for those with an infectious cause. In summary, a diagnosis was made in 69.4% of patients admitted with encephalitis/meningoencephalitis. Autoimmune causes are important to consider at an early stage because of their good response to treatment. Only 53% of patients received aciclovir on admission, and neuroimaging and EEG studies were delayed. This work informed further development of the clinical algorithm for managing these patients.
Interfacility patient movement plays an important role in the dissemination of antimicrobial-resistant organisms throughout healthcare systems. We evaluated how 3 alternative measures of interfacility patient sharing were associated with C. difficile infection incidence in Ontario acute-care facilities.
Design:
The cohort included adult acute-care facility stays of ≥3 days between April 2003 and March 2016. We measured 3 facility-level metrics of patient sharing: general patient importation, incidence-weighted patient importation, and C. difficile case importation. Each of the 3 patient-sharing metrics were examined against the incidence of C. difficile infection in the facility per 1,000 stays, using Poisson regression models.
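A minimal sketch of a facility-level Poisson regression of this kind is shown below, entering the patient-sharing metric on a log2 scale so that the exponentiated coefficient is a risk ratio per doubling; the file, covariates and column names are hypothetical.

```python
# Hypothetical sketch of a facility-level Poisson regression for CDI incidence.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

facilities = pd.read_csv("facility_year.csv")  # one row per facility-year (hypothetical file)

# Entering the importation metric on a log2 scale makes exp(coefficient) the RR per doubling.
facilities["log2_importation"] = np.log2(facilities["case_importation"] + 1)

model = smf.glm(
    "cdi_cases ~ log2_importation + beds + teaching_status",
    data=facilities,
    family=sm.families.Poisson(),
    offset=np.log(facilities["stays"] / 1000),  # incidence per 1,000 stays
)
result = model.fit()
print(np.exp(result.params))  # risk ratios; exp(coef of log2_importation) = RR per doubling
```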
Results:
The analyzed cohort included 6.70 million stays at risk of C. difficile infection across 120 facilities. Over the 13-year period, there were 62,189 new cases of healthcare-associated CDI (incidence, 9.3 per 1,000 stays). After adjustment for facility characteristics, general importation was not strongly associated with C. difficile infection incidence (risk ratio [RR] per doubling, 1.10; 95% confidence interval [CI], 0.97–1.24; proportional change in variance [PCV], −2.0%). Incidence-weighted importation (RR per doubling, 1.18; 95% CI, 1.06–1.30; PCV, −8.4%) and C. difficile case importation (RR per doubling, 1.43; 95% CI, 1.29–1.58; PCV, −30.1%) were both strongly associated with C. difficile infection incidence.
Conclusions:
In this 13-year study of acute-care facilities in Ontario, interfacility variation in C. difficile infection incidence was associated with importation of patients from other high-incidence acute-care facilities or specifically of patients with a recent history of C. difficile infection. Regional infection control strategies should consider the potential impact of importation of patients at high risk of C. difficile shedding from outside facilities.
Clostridium difficile spores play an important role in transmission and can survive in the environment for several months. Optimal methods for measuring environmental C. difficile are unknown. We sought to determine whether increased sample surface area improved detection of C. difficile from environmental samples.
Setting
Samples were collected from 12 patient rooms in a tertiary-care hospital in Toronto, Canada.
Methods
Samples represented small surface-area and large surface-area floor and bedrail pairs from single-bed rooms of patients with low (without prior antibiotics), medium (with prior antibiotics), and high (C. difficile infected) shedding risk. Presence of C. difficile in samples was measured using quantitative polymerase chain reaction (qPCR) with targets on the 16S rRNA and toxin B genes and using enrichment culture.
Results
Of the 48 samples, 64.6% were positive by 16S qPCR (geometric mean, 13.8 spores); 39.6% were positive by toxin B qPCR (geometric mean, 1.9 spores); and 43.8% were positive by enrichment culture. By 16S qPCR, each 10-fold increase in sample surface area yielded 6.6 times (95% CI, 3.2–13) more spores. Floor surfaces yielded 27 times (95% CI, 4.9–181) more spores than bedrails, and rooms of C. difficile–positive patients yielded 11 times (95% CI, 0.55–164) more spores than those of patients without prior antibiotics. Toxin B qPCR and enrichment culture returned analogous findings.
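One way to estimate the surface-area effect reported above is to regress log-transformed spore counts on log10 of the sampled area, so the back-transformed slope is the fold-change in spores per 10-fold increase in area; the sketch below illustrates this under hypothetical data and column names.

```python
# Hypothetical sketch: fold-change in recovered spores per 10-fold increase in surface area.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

samples = pd.read_csv("environmental_samples.csv")  # one row per swabbed sample (hypothetical)

# Log-transform both sides; +1 avoids log(0) for qPCR-negative samples.
samples["log_spores"] = np.log(samples["spores_16s_qpcr"] + 1)
samples["log10_area"] = np.log10(samples["area_cm2"])

fit = smf.ols("log_spores ~ log10_area + surface_type + patient_risk_group", data=samples).fit()

# exp(slope) is the multiplicative change in spore count per 10-fold increase in area.
print(np.exp(fit.params["log10_area"]))
```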
Conclusions
Clostridium difficile spores were identified in most floor and bedrail samples, and increased surface area improved detection. Future research aiming to understand the role of environmental C. difficile in transmission should prefer samples with large surface areas.
Antibiotic use varies widely between hospitals, but the influence of antimicrobial stewardship programs (ASPs) on this variability is not known. We aimed to determine the key structural and strategic aspects of ASPs associated with differences in risk-adjusted antibiotic utilization across facilities.
Design
Observational study of acute-care hospitals in Ontario, Canada
Methods
A survey was sent to hospitals asking about both structural (8 elements) and strategic (32 elements) components of their ASP. Antibiotic use from hospital purchasing data was acquired for January 1 to December 31, 2014. Crude and adjusted defined daily doses per 1,000 patient days, accounting for hospital and aggregate patient characteristics, were calculated across facilities. Rate ratios (RR) of defined daily doses per 1,000 patient days were compared for hospitals with and without each antimicrobial stewardship element of interest.
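As an illustration of the utilization metric used above, the sketch below converts purchasing data into defined daily doses (DDDs) per 1,000 patient days; the purchase figures shown are hypothetical, and DDD values should be taken from the current WHO ATC/DDD index.

```python
# Hypothetical sketch: defined daily doses (DDDs) per 1,000 patient days from purchasing data.

# Grams purchased per antibiotic over the year (hypothetical figures)
purchases_g = {"ceftriaxone": 12_000, "ciprofloxacin": 8_000}

# WHO DDD values in grams (check against the current WHO ATC/DDD index)
who_ddd_g = {"ceftriaxone": 2.0, "ciprofloxacin": 1.0}

patient_days = 85_000  # total patient days for the facility over the same period

total_ddd = sum(purchases_g[drug] / who_ddd_g[drug] for drug in purchases_g)
ddd_per_1000_pd = total_ddd / patient_days * 1000
print(f"{ddd_per_1000_pd:.0f} DDDs per 1,000 patient days")
```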
Results
Of 127 eligible hospitals, 73 (57%) participated in the study. There was a 7-fold range in antibiotic use across these facilities (minimum, 253 defined daily doses per 1,000 patient days; maximum, 1,872 defined daily doses per 1,000 patient days). The presence of designated funding or resources for the ASP (adjusted RR, 0.87; 95% CI, 0.75–0.99), prospective audit and feedback (adjusted RR, 0.80; 95% CI, 0.67–0.96), and intravenous-to-oral conversion policies (adjusted RR, 0.79; 95% CI, 0.64–0.99) were associated with lower risk-adjusted antibiotic use.
Conclusions
Wide variability in antibiotic use across hospitals may be partially explained by both structural and strategic ASP elements. The presence of funding and resources, prospective audit and feedback, and intravenous-to-oral conversion should be considered priority elements of a robust ASP.
To study the antibody response to tetanus toxoid and measles by age following vaccination in children aged 4 months to 6 years in Entebbe, Uganda. Serum samples were obtained from 113 children aged 4–15 months at the Mother-Child Health Clinic (MCHC), Entebbe Hospital, and from 203 of the 206 children aged between 12 and 75 months recruited through the Outpatients Department (OPD). Antibodies to measles were quantified by plaque reduction neutralisation test (PRNT) and with the Siemens IgG EIA. The VaccZyme IgG EIA was used to quantify anti-tetanus antibodies. Sera from 96 of 113 (85.0%) children attending the MCHC contained measles PRNT titres below the protective level (120 mIU/ml). Sera from 24 of 203 (11.8%) children attending the OPD contained PRNT titres <120 mIU/ml. There was no detectable decline in anti-measles antibody concentrations between 1 and 6 years of age. The anti-tetanus antibody titres in all 113 children attending the MCHC and in 189 of 203 (93.1%) children attending the OPD were >0.15 IU/ml by EIA, a level considered protective. The overall concentration of anti-tetanus antibody was sixfold higher in children under 12 months than in the older children, with geometric mean concentrations of 3.15 IU/ml and 0.49 IU/ml, respectively. For each doubling in age between 4 and 64 months, the anti-tetanus antibody concentration declined by 50%. For each doubling of the time since administration of the third DTP vaccination, the anti-tetanus antibody concentration declined by 39%. The low measles antibody prevalence in children presenting at the MCHC is consistent with current measles epidemiology in Uganda, where a significant number of measles cases occur in children under 1 year of age, and earlier vaccination may be indicated. The consistent fall in anti-tetanus antibody titre over time following vaccination supports the need for further vaccine boosters at age 4–5 years, as recommended by the WHO.
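For illustration, the sketch below shows the kind of calculation behind the geometric means and decline-per-doubling estimates above: a geometric mean concentration and a regression of log2 antibody concentration on log2 age, so that 1 − 2^slope is the proportional decline per doubling of age; the data file and column names are hypothetical.

```python
# Hypothetical sketch: geometric mean titres and proportional decline per doubling of age.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

sera = pd.read_csv("tetanus_titres.csv")  # one row per child (hypothetical file)

# Geometric mean concentration (IU/ml)
gmc = np.exp(np.mean(np.log(sera["anti_tetanus_iu_ml"])))
print(f"GMC = {gmc:.2f} IU/ml")

# Regress log2(concentration) on log2(age); a slope of -1 means a 50% decline per doubling of age.
sera["log2_conc"] = np.log2(sera["anti_tetanus_iu_ml"])
sera["log2_age"] = np.log2(sera["age_months"])
fit = smf.ols("log2_conc ~ log2_age", data=sera).fit()

decline_per_doubling = 1 - 2 ** fit.params["log2_age"]
print(f"Decline per doubling of age = {decline_per_doubling:.0%}")
```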
Social injustices, structural and personal crises, and intensifying stress on some citizens appear to be growing preoccupations in contemporary society and social policy. In this context, the concept of vulnerability has come to play a prominent role in academic, governmental and everyday accounts of the human condition. Policy makers and practitioners are now concerned with addressing vulnerability through an expansive range of interventions. As this special issue draws attention to, a vulnerability zeitgeist or ‘spirit of the time’ has been traced in contemporary welfare and disciplinary arrangements (Brown, 2014, 2015), which now informs a range of interventions and approaches to social problems, both in the UK and internationally. As prominent examples, ‘vulnerable’ people are legally entitled to ‘priority need’ in English social housing allocations (Carr and Hunter, 2008), vulnerable victims of crime are seen as requiring special responses in the UK criminal justice system (see Roulstone et al., 2011; Walklate, 2011), ‘vulnerable adults’ have designated ‘protections’ under British law (Dunn et al., 2008; Clough, 2014) and vulnerable migrants and refugees are increasingly prioritised within international immigration processes (Peroni and Timmer, 2013). There is a long tradition in the field of social policy of critiquing the implications of particular concepts as mechanisms of governance, from poverty (Townsend, 1979; Lister, 2004) and social exclusion (Levitas, 1998; Young, 1999) to risk (Beck, 1992; Kemshall, 2002) and resilience (Ecclestone and Lewis, 2014; Wright, 2016). Yet while vulnerability seems to be one of the latest buzzwords gathering political and cultural momentum, critiques and empirical studies of how it is operationalised in different policy and practice contexts are less well elaborated.
Hospital-acquired infections (HAIs) develop rapidly after brief and transient exposures, and ecological exposures are central to their etiology. However, many studies of HAI risk do not correctly account for the timing of outcomes relative to exposures, and they ignore ecological factors. We aimed to describe statistical practice in the most cited HAI literature as it relates to these issues, and to demonstrate how to implement models that can be used to account for them.
METHODS
We conducted a literature search to identify 8 frequently cited articles having primary outcomes that were incident HAIs, were based on individual-level data, and used multivariate statistical methods. Next, using an inpatient cohort of incident Clostridium difficile infection (CDI), we compared 3 valid strategies for assessing risk factors for incident infection: a cohort study with time-fixed exposures, a cohort study with time-varying exposures, and a case-control study with time-varying exposures.
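A minimal sketch of one way to implement the time-varying-exposure cohort analysis is shown below, restructuring the data into counting-process (start, stop) intervals and fitting a time-varying Cox model with the lifelines package; the file and column names are hypothetical, and this is not necessarily the exact model used in the study.

```python
# Hypothetical sketch: incident-CDI risk factors with time-varying exposures.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per patient-interval during which covariate values are constant.
# Columns: patient_id, start_day, stop_day, on_antibiotics, ward_cdi_pressure, cdi_event
intervals = pd.read_csv("cdi_intervals.csv")  # hypothetical file

ctv = CoxTimeVaryingFitter()
ctv.fit(
    intervals,
    id_col="patient_id",
    start_col="start_day",
    stop_col="stop_day",
    event_col="cdi_event",
)
ctv.print_summary()  # hazard ratios for individual and ecological (ward-level) exposures
```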
RESULTS
Of the 8 studies identified in the literature scan, 3 did not adjust for time-at-risk, 6 did not assess the timing of exposures in a time-window prior to outcome ascertainment, 6 did not include ecological covariates, and 6 did not account for the clustering of outcomes in time and space. Our 3 modeling strategies yielded similar risk-factor estimates for CDI risk.
CONCLUSIONS
Several common statistical methods can be used to augment standard regression methods to improve the identification of HAI risk factors.
Infect. Control Hosp. Epidemiol. 2016;37(4):411–419