Objective: To determine the proportion of hospitals that implemented 6 leading practices in their antimicrobial stewardship programs (ASPs). Design: Cross-sectional observational survey.
Advance letters and electronic questionnaires were initiated in February 2020. Primary outcomes were the percentages of hospitals that (1) implemented facility-specific treatment guidelines (FSTG); (2) performed interactive prospective audit and feedback (PAF) either face-to-face or by telephone; (3) optimized diagnostic testing; (4) measured antibiotic utilization; (5) measured Clostridioides difficile infection (CDI); and (6) measured adherence to FSTGs.
Of 948 hospitals invited, 288 (30.4%) completed the questionnaire. Among them, 82 (28.5%) had ≤99 beds, 162 (56.3%) had 100–399 beds, and 44 (15.2%) had ≥400 beds. Also, 230 (79.9%) were healthcare system members. Moreover, 161 hospitals (54.8%) reported implementing FSTGs; 214 (72.4%) performed interactive PAF; 105 (34.9%) implemented procedures to optimize diagnostic testing; 235 (79.8%) measured antibiotic utilization; 258 (88.2%) measured CDI; and 110 (37.1%) measured FSTG adherence. Small hospitals performed less interactive PAF (61.0%; P = .0018). Small and nonsystem hospitals were less likely to optimize diagnostic testing: 25.2% (P = .030) and 21.0% (P = .0077), respectively. Small hospitals were less likely to measure antibiotic utilization (67.8%; P = .0010) and CDI (80.3%; P = .0038). Nonsystem hospitals were less likely to implement FSTGs (34.3%; P < .001).
Significant variation exists in the adoption of ASP leading practices. A minority of hospitals have taken action to optimize diagnostic testing or to measure adherence to FSTGs. Additional efforts are needed to expand adoption of leading practices across all acute-care hospitals, with the greatest need in smaller hospitals.
Healthcare organizations are required to provide workers with respiratory protection (RP) to mitigate hazardous airborne inhalation exposures. This study sought to better identify gaps that exist between RP guidance and clinical practice to understand issues that would benefit from additional research or clarification.
To assess resource allocation and costs associated with US hospitals preparing for the possible spread of the 2014–2015 Ebola virus disease (EVD) epidemic in the United States.
A survey was sent to a stratified national probability sample (n=750) of US general medical/surgical hospitals selected from the American Hospital Association (AHA) list of hospitals. The survey was also sent to all children’s general hospitals listed by the AHA (n=60). The survey assessed EVD preparation supply costs and overtime staff hours. The average national wage was multiplied by labor hours to calculate overtime labor costs. Additional information collected included challenges, benefits, and perceived value of EVD preparedness activities.
The average amount spent by hospitals on combined supply and overtime labor costs was $80,461 (n=133; 95% confidence interval [CI], $56,502–$104,419). Multivariate analysis indicated that small hospitals (mean, $76,167) spent more on staff overtime costs per 100 beds than large hospitals (mean, $15,737; P<.0001). The overall cost for acute-care hospitals in the United States to prepare for possible EVD cases was estimated to be $361,108,968. The leading challenge was difficulty obtaining supplies from vendors due to shortages (83%; 95% CI, 78%–88%) and the greatest benefit was improved knowledge about personal protective equipment (89%; 95% CI, 85%–93%).
The financial impact of EVD preparedness activities was substantial. Overtime cost in smaller hospitals was >3 times that in larger hospitals. Planning for emerging infectious disease identification, triage, and management should be conducted at regional and national levels in the United States to facilitate efficient and appropriate allocation of resources in acute-care facilities.
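The study's costing method, multiplying the average national wage by reported overtime labor hours, can be sketched as follows. The wage figure and hour count below are illustrative assumptions, not values taken from the study:

```python
# Sketch of the overtime labor cost calculation described in the methods:
# cost = average national hourly wage x overtime staff hours.
# AVG_HOURLY_WAGE is an assumed illustrative figure, not the study's value.
AVG_HOURLY_WAGE = 30.00  # USD per hour (illustrative assumption)

def overtime_labor_cost(overtime_hours: float,
                        hourly_wage: float = AVG_HOURLY_WAGE) -> float:
    """Overtime labor cost = wage x hours, per the study's method."""
    return overtime_hours * hourly_wage

# A hypothetical hospital reporting 1,000 overtime staff hours:
cost = overtime_labor_cost(1000)  # 30000.0 USD
```

Supply costs would then be added to this figure to obtain a hospital's combined preparedness cost.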
Quasi-experimental studies evaluate the association between an intervention and an outcome using experiments in which the intervention is not randomly assigned. Quasi-experimental studies are often used to evaluate rapid responses to outbreaks or other patient safety problems requiring prompt, nonrandomized interventions. Quasi-experimental studies can be categorized into 3 major types: interrupted time-series designs, designs with control groups, and designs without control groups. This methods paper highlights key considerations for quasi-experimental studies in healthcare epidemiology and antimicrobial stewardship, including study design and analytic approaches to avoid selection bias and other common pitfalls of quasi-experimental studies.
This white paper identifies knowledge gaps and new challenges in healthcare epidemiology research, assesses the progress made toward addressing research priorities, provides the Society for Healthcare Epidemiology of America (SHEA) Research Committee's recommendations for high-priority research topics, and proposes a road map for making progress toward these goals. It updates the 2010 SHEA Research Committee document, “Charting the Course for the Future of Science in Healthcare Epidemiology: Results of a Survey of the Membership of SHEA,” which called for a national approach to healthcare-associated infections (HAIs) and a prioritized research agenda. This paper highlights recent studies that have advanced our understanding of HAIs, the establishment of the SHEA Research Network as a collaborative infrastructure to address research questions, prevention initiatives at state and national levels, changes in reporting and payment requirements, and new patterns in antimicrobial resistance.
Background: To assess the state of health center integration into community preparedness, we undertook a national study of linkages between health centers and the emergency preparedness and response planning initiatives in their communities. The key objectives of this project were to gain a better understanding of existing linkages in a nationally representative sample of health centers, and identify health center demographic and experience factors that were associated with strong linkages.
Methods: The objectives of the study were to gain a baseline understanding of existing health center linkages to community emergency preparedness and response systems and to identify factors associated with strong linkages. A 60-item questionnaire was mailed to the population of health centers supported by the Health Resources and Services Administration's Bureau of Primary Health Care in February 2005. Results were aggregated, and a chi-square analysis identified factors associated with stronger linkages.
Results: Overall performance on study-defined indicators of strong linkages was low: 34% had completed a hazard vulnerability analysis in collaboration with the community emergency management agency, 30% had their role documented in the community plan, and 24% participated in community-wide exercises. Stronger linkages were associated with experience responding to a disaster and a perception of high risk for experiencing a disaster.
Conclusions: The potential for health centers to participate in an integrated response is not fully realized, and their absence from community-based planning leaves an already vulnerable population at greater risk. Community planners should be encouraged to include health centers in planning and response and centers should receive more targeted resources for community integration. (Disaster Med Public Health Preparedness. 2007;1:96–105)
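The chi-square analysis mentioned in the methods can be sketched for a single 2x2 table. The counts below are made up for illustration and are not the study's data:

```python
# Pearson chi-square statistic for a 2x2 contingency table
# [[a, b], [c, d]], using the standard shortcut formula
# n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).
def chi_square_2x2(a: int, b: int, c: int, d: int) -> float:
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: disaster-response experience (rows) vs strong
# linkage (columns). With df = 1, the .05 critical value is 3.84.
stat = chi_square_2x2(40, 60, 20, 80)
significant = stat > 3.84
```

A statistic above the critical value would indicate an association between the factor and strong linkages, consistent with the findings reported above for disaster experience and perceived risk.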
Since its inception, the Society for Healthcare Epidemiology of America (SHEA) has promoted research into prevention of adverse events in hospitals. In 1995, SHEA made this mission concrete by initiating a collaborative research project with the Joint Commission on Accreditation of Healthcare Organizations (now known as the Joint Commission). In the early 1990s, the Joint Commission was implementing its “Agenda for Change” and associated Indicator Monitoring System. At the time, there were numerous competing measurement systems that used different definitions, all aimed at measuring the quality of patient care, and many had indicators measuring the incidence of hospital-acquired infections. Some of these indicators used administrative data, such as International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM) codes, to measure adverse events.
Bloodstream infection (BSI) rates are used as comparative clinical performance indicators; however, variations in definitions and data-collection approaches make it difficult to compare and interpret rates. To determine the extent to which variation in indicator specifications affected infection rates and hospital performance rankings, we compared absolute rates and relative rankings of hospitals across 5 BSI indicators.
Multicenter observational study. BSI rate specifications varied by data source (clinical data, administrative data, or both), scope (hospital-wide or intensive care unit-specific), and inclusion/exclusion criteria. As appropriate, hospital-specific infection rates and rankings were calculated by processing data from each site according to 2–5 different specifications.
A total of 28 hospitals participated in the EPIC study.
Hospitals submitted deidentified information about all patients with BSIs from January through September 1999.
Median BSI rates for 2 indicators based on intensive care unit surveillance data ranged from 2.23 to 2.91 BSIs per 1000 central-line days. In contrast, median rates for indicators based on administrative data varied from 0.046 to 7.03 BSIs per 100 patients. Hospital-specific rates and rankings varied substantially as different specifications were applied; the rates of 8 of 10 hospitals fell both above and below the mean, depending on the specification. Correlations of hospital rankings among indicator pairs were generally low (rs = 0–0.45), except when both indicators were based on intensive care unit surveillance (rs = 0.83).
Although BSI rates seem to be a logical indicator of clinical performance, the use of various indicator specifications can produce remarkably different judgments of absolute and relative performance for a given hospital. Recent national initiatives continue to mix methods for specifying BSI rates; this practice is likely to limit the usefulness of such information for comparing and improving performance.
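The two denominator conventions contrasted in the results, BSIs per 1000 central-line days versus BSIs per 100 patients, can be sketched as follows. The counts are illustrative, not the study's data; the point is that the same infections yield very different-looking rates under different specifications:

```python
# Two common BSI rate specifications with different denominators.
def bsi_per_1000_line_days(infections: int, central_line_days: int) -> float:
    """ICU surveillance convention: BSIs per 1,000 central-line days."""
    return 1000 * infections / central_line_days

def bsi_per_100_patients(infections: int, patients: int) -> float:
    """Administrative-data convention: BSIs per 100 patients."""
    return 100 * infections / patients

# One hypothetical hospital, the same 12 infections counted two ways:
rate_a = bsi_per_1000_line_days(12, 4800)  # surveillance denominator
rate_b = bsi_per_100_patients(12, 900)     # administrative denominator
```

Because the denominators measure different populations and exposures, the two rates are not directly comparable, which is exactly why mixing specifications undermines cross-hospital performance comparisons.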
To describe the conceptual framework and methodology of the Evaluation of Processes and Indicators in Infection Control (EPIC) study and to present results on central venous catheter (CVC) insertion characteristics and organizational practices for preventing bloodstream infections (BSIs). The goal of the EPIC study was to evaluate relationships among processes of care, organizational characteristics, and the outcome of BSI.
This was a multicenter prospective observational study of variation in hospital practices related to preventing CVC-associated BSIs. Process of care information (eg, barrier use during insertions and experience of the inserting practitioner) was collected for a random sample of approximately 5 CVC insertions per month per hospital during November 1998 to December 1999. Organization demographic and practice information (eg, surveillance activities and ICU nurse staffing levels) was also collected.
Participating units were medical, surgical, or medical-surgical ICUs from 55 hospitals (41 U.S. and 14 international sites).
Process information was obtained for 3,320 CVC insertions with an average of 58.2 (± 16.1) insertions per hospital. Fifty-four hospitals provided policy and practice information.
Staff spent an average of 13 hours per week in study ICU surveillance. Most patients received nontunneled, multiple lumen CVCs, of which fewer than 25% were coated with antimicrobial material. Regarding barriers, most clinicians wore masks (81.5%) and gowns (76.8%); 58.1% used large drapes. Few hospitals (18.1%) used an intravenous team to manage ICU CVCs.
Substantial variation exists in CVC insertion practice and BSI prevention activities. Understanding which practices have the greatest impact on BSI rates can help hospitals better target improvement interventions.
The Project to Monitor Indicators (PMI) will be a collaborative effort between the Society for Healthcare Epidemiology of America (SHEA) and the Joint Commission on Accreditation of Healthcare Organizations (JCAHO). The goal of this collaboration is to create an intellectual infrastructure to support the effective use, development, understanding, and continuous improvement of clinical quality indicators through coordinated study by hospital epidemiologists.
The Joint Commission is in the midst of an extensive effort to develop a set of indicators that reflect the performance of various aspects of clinical practice. These indicators will form the foundation of a national comparative measurement system called the Indicator Measurement System (IMSystem). It is the Joint Commission's plan that hospitals will be required to participate in the IMSystem as part of the accreditation process in the near future.