Objective: To describe the epidemiologic and genomic characteristics of a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) outbreak in a large skilled-nursing facility (SNF), and the strategies that controlled transmission.
Design, setting, and participants:
This cohort study was conducted during March 22–May 4, 2020, among all staff and residents at a 780-bed SNF in San Francisco, California.
Contact tracing and symptom screening guided targeted testing of staff and residents; respiratory specimens were also collected through serial point prevalence surveys (PPSs) in units with confirmed cases. Cases were confirmed by real-time reverse transcription–polymerase chain reaction testing for SARS-CoV-2, and whole-genome sequencing (WGS) was used to characterize viral isolate lineages and relatedness. Infection prevention and control (IPC) interventions included restricting from work any staff who had close contact with a confirmed case; restricting movement between units; implementing surgical face masking facility-wide; and the use of recommended PPE (ie, isolation gown, gloves, N95 respirator and eye protection) for clinical interactions in units with confirmed cases.
Of 725 staff and residents tested through targeted testing and serial PPSs, 21 (3%) were SARS-CoV-2 positive: 16 (76%) staff and 5 (24%) residents. Fifteen cases (71%) were linked to a single unit. Targeted testing identified 17 cases (81%), and PPSs identified 4 cases (19%). Most cases (71%) were identified before IPC interventions could be implemented. WGS was performed on SARS-CoV-2 isolates from 4 staff and 4 residents: 5 were of the Santa Clara County lineage, and the other 3 were of distinct lineages.
Early implementation of targeted testing, serial PPSs, and multimodal IPC interventions limited SARS-CoV-2 transmission within the SNF.
Synchrotron x-rays are a powerful tool to probe real-time changes in the microstructure of materials as they respond to an external stimulus, such as phase transformations that take place in response to a change in temperature. X-ray imaging techniques include radiography and tomography, and have been steadily improved over the last decades so that they can now resolve micrometer-scale or even finer structural changes in bulk specimens over time scales of a second or less. Under certain conditions, these imaging approaches can also give spatially resolved chemical information. In this article, we focus on the liquid to solid transformation of metallic alloys and the temporal and spatial resolution of the accompanying segregation of alloying elements. The solidification of alloys provides an excellent case study for x-ray imaging because it is usually accompanied by the progressive, preferential segregation of one or more of the alloying elements to either the solid or the liquid, and gives rise to surprisingly complex chemical segregation patterns. We describe chemical mapping investigations of binary and quasi-binary alloys using radiography and tomography, and recent developments in x-ray fluorescence imaging that offer the prospect of a more general, multielement mapping technique. Future developments for synchrotron-based chemical mapping are also considered.
Background: Central-line–associated bloodstream infections (CLABSIs) are linked with significant morbidity and mortality. An NHSN laboratory-confirmed bloodstream infection (LCBSI) has specific criteria to ascribe an infection to the central line or not. The criteria used to associate the pathogen with another site are restrictive. Our objective was to better classify CLABSIs using enhanced criteria to gain a comprehensive understanding of misclassification so that appropriate reduction efforts can be targeted. Methods: We conducted a retrospective review of medical records with NHSN-identified CLABSI from July 2017 to December 2018 at 2 geographically proximate hospitals. Trained infectious diseases personnel from tertiary-care academic medical centers, the University of Virginia Health System, a 600-bed medical center in Charlottesville, Virginia, and Virginia Commonwealth University Health System with 865 beds in Richmond, Virginia, reviewed charts. We defined "overcaptured" or O-CLABSI into different categories: O-CLABSI-1 is bacteremia attributable to a primary infectious source; O-CLABSI-2 is bacteremia attributable to neutropenia with gastrointestinal translocation not meeting mucosal barrier injury criteria; O-CLABSI-3 is a positive blood culture attributable to a contaminant; and O-CLABSI-4 is a patient injecting the line, though this was not officially documented. Descriptive analyses were performed using the χ2 and the Fisher exact tests. Results: We found a large number of O-CLABSIs on chart review (79 of 192, 41%). Overall, 56 of 192 (29%) LCBSIs were attributable to a primary infectious source not meeting the NHSN definition. O-CLABSI proportions between the 2 hospitals were statistically different; hospital A identified 34 of 59 (58%) of their NHSN-identified CLABSIs as O-CLABSIs, and hospital B identified 45 of 133 (34%) as O-CLABSIs (P = .0020) (Table 1). When comparing O-CLABSI types, hospital B had a higher percentage of O-CLABSI-1 compared to hospital A: 76% versus 64%.
Hospital A had a higher proportion of O-CLABSI-2: 21% versus 7%. Hospitals A and B had similar proportions of O-CLABSI-3: 15% versus 18%. These values were all statistically significant (P < .0001). Discussion: The results from these 2 geographically proximate systems indicate that O-CLABSIs are common. Attribution can vary significantly between institutions, likely depending on differences in incidence of true CLABSI, patient populations, protocols, and protocol compliance. These findings have implications for interfacility comparisons of publicly reported data. Most importantly, erroneous attribution can result in a missed opportunity to direct patient safety efforts to the root cause of the bacteremia and could lead to inappropriate treatment.
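The hospital-level comparison above (34 of 59 O-CLABSIs at hospital A versus 45 of 133 at hospital B) can be checked with a Pearson χ2 test on the 2×2 contingency table. The following is a minimal stdlib sketch of that calculation, not the authors' actual analysis code; it uses the counts reported in the abstract.

```python
from math import erfc, sqrt

def chi2_2x2(table):
    """Pearson chi-square test (1 df, no continuity correction) on a 2x2 table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = [a + b, c + d]
    cols = [a + c, b + d]
    chi2 = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = rows[i] * cols[j] / n
            chi2 += (obs - expected) ** 2 / expected
    # Survival function of the chi-square distribution with 1 df
    p = erfc(sqrt(chi2 / 2))
    return chi2, p

# Hospital A: 34 O-CLABSI, 25 not; hospital B: 45 O-CLABSI, 88 not
chi2, p = chi2_2x2([[34, 25], [45, 88]])
print(round(p, 3))  # 0.002, matching the reported P = .0020
```

The uncorrected test reproduces the reported P value; applying a Yates continuity correction would give a slightly larger P.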
Disclosures: Michelle Doll, Research Grant from Molnlycke Healthcare
The learning hospital is distinguished by ceaseless evolution of erudition, enhancement, and implementation of clinical best practices. We describe a model for the learning hospital within the framework of a hospital infection prevention program and argue that a critical assessment of safety practices is possible without significant grant funding. We reviewed 121 peer-reviewed manuscripts published by the VCU Hospital Infection Prevention Program over 16 years. Publications included quasi-experimental studies, observational studies, surveys, interrupted time series analyses, and editorials. We summarized the articles based on their infection prevention focus, and we provide a brief summary of the findings. We also summarized the involvement of nonfaculty learners in these manuscripts as well as the contributions of grant funding. Despite the absence of significant grant funding, infection prevention programs can critically assess safety strategies under the learning hospital framework by leveraging a diverse collaboration of motivated nonfaculty learners. This model is a valuable adjunct to traditional grant-funded efforts in infection prevention science and is part of a successful horizontal infection control program.
Head and neck cancer patients receiving radiotherapy can experience a number of toxicities, including weight loss and malnutrition, which can impact upon the quality of treatment. The purpose of this retrospective cohort study is to evaluate weight loss and identify predictive factors for this patient group.
Materials and methods
A total of 40 patients treated with radiotherapy since 2012 at the study centre were selected for analysis. Data were collected from patient records. The association between potential risk factors and weight loss was investigated.
Mean weight loss was 5 kg (6%). In all, 24 patients lost >5% of their starting body weight. Age, T-stage, N-stage, chemotherapy and starting body weight were individually associated with significant differences in weight loss. On multiple linear regression analysis, age and nodal status were predictive.
Younger patients and those with nodal disease were most at risk of weight loss. Other studies have identified the same risk factors along with several other variables. The relative significance of each along with a number of other potential factors is yet to be fully understood. Further research is required to help identify patients most at risk of weight loss; and assess interventions aimed at preventing weight loss and malnutrition.
Introduction: Functionally univentricular hearts palliated with superior or total cavopulmonary connection result in circulations in series. The absence of a pre-pulmonary pump means that cardiac output is more difficult to adjust and control. Continuous monitoring of cardiac output is crucial during cardiac catheter interventions and can provide new insights into the complex physiology of these lesions. Materials and methods: The Icon® cardiac output monitor was used to study the changes in cardiac output during catheter interventions in 15 patients (median age: 6.1 years, range: 4.8–15.3 years; median weight: 18.5 kg, range: 15–63 kg) with cavopulmonary circulations. A total of 19 interventions were undertaken in these patients and the observed changes in cardiac output were recorded and analysed. Results: Cardiac output was increased with creation of stent fenestrations after total cavopulmonary connection (median increase of 22.2%, range: 6.7%–28.6%) and also with drainage of significant pleural effusions (16.7% increase). Cardiac output was decreased with complete or partial occlusion of fenestrations (median decrease of 10.6%, range: 7.1%–13.4%). There was a consistent increase in cardiac output with stenting of obstructive left pulmonary artery lesions (median increase of 7.7%, range: 5%–14.3%, p = 0.007). Conclusions: Icon® provides a novel technique for the continuous, non-invasive monitoring of cardiac output. It provides a further adjunct for monitoring of physiologically complex patients during catheter interventions. These results are consistent with previously reported series involving manipulation of fenestrations. This is the first report identifying an increase in cardiac output with stenting of obstructive pulmonary arterial lesions.
While health warnings are present on cigarette packs around the world, the nature of the warnings varies considerably between countries. In the United States, a small text warning citing the dangers of cigarette smoking is found on the side of all packs. This pilot study sought to determine whether graphic cigarette warning images, like those found in the United Kingdom and Canada, were better at decreasing cravings to smoke than existing text warnings found on cigarette packs in the United States. Twenty-five smokers seeking treatment to quit at a specialty tobacco treatment program were administered the Brief Questionnaire of Smoking Urges (QSU-Brief), a validated measure of craving, prior to and following exposure to cigarette pack warning images. The graphic cigarette warning images reduced cravings to smoke (6.20 point decrease) more than neutral images (3.36 point decrease) and current text warnings used in the United States (5.75 point decrease), although this difference was not statistically significant. Based on these pilot data, a larger study could further examine the effectiveness of graphic warning images and whether such warnings hold an advantage over the currently used text warnings.
We describe the clinical, microbiologic, and molecular features of the first series of qacA/B-containing strains of methicillin-resistant Staphylococcus aureus from infected US patients. All qac-carrying strains were clonally diverse, and qacA strains exhibited increased tolerance to chlorhexidine as measured by minimum inhibitory concentrations, minimum bactericidal concentrations, and postexposure colony counts.
To analyze a decade of hospital staff and student exposures to blood and body fluids (BBF) and to identify risk factors relevant to prevention strategies.
Retrospective review of a 1999–2008 data set of BBF exposures. The data, maintained by occupational health staff, detailed the type of exposure, the setting in which the exposure occurred, and the occupational group of the BBF-exposed personnel.
Washington DC Veterans Affairs Medical Center (VA-DC), an inner-city tertiary care hospital.
All healthcare workers and staff at the VA-DC.
Review of database.
A review of 10 years of data revealed 564 occupational exposures to BBF, of which 66% were caused by needlesticks and 20% were caused by sharp objects. Exposures occurred most often in the acute care setting (which accounted for 39% of exposures) and the operating room (which accounted for 22%). There was a mean of 4.9 exposures per 10,000 acute care patient-days, 0.5 exposures per 10,000 long-term care patient-days, and 0.35 exposures per 10,000 outpatient visits. Housestaff accounted for the highest number of all exposures (196 [35%]). There were, on average, 15.2 exposures per 100 housestaff full-time equivalents. An average of only 1 exposure per year occurred in the hemodialysis center.
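The exposure rates quoted above are simple normalizations: events divided by the denominator (patient-days or visits), scaled to 10,000 units. A small sketch of that arithmetic follows; the raw counts used here are hypothetical, chosen only to reproduce the reported order of magnitude, since the abstract does not give the underlying denominators.

```python
def rate_per_10k(events: int, denominator: float) -> float:
    """Exposure rate per 10,000 denominator units (e.g., patient-days)."""
    return events / denominator * 10_000

# Hypothetical counts (not given in the abstract): ~270 acute-care
# exposures over ~551,000 patient-days yields the reported ~4.9 rate.
print(round(rate_per_10k(270, 551_020), 1))  # 4.9
```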
Occupational exposures to BBF remain common, but rates vary widely by setting and occupational group. Overall rates are steady across a decade, despite the use of various antiexposure devices and provider education programs. Targeting occupational groups and hospital settings that have been shown to have the highest risk rates should become foundational to future preventative strategies.
To our knowledge, no comprehensive, interdisciplinary initiatives have been taken to examine the role of genetic variants on patient-reported quality-of-life outcomes. The overall objective of this paper is to describe the establishment of an international and interdisciplinary consortium, the GENEQOL Consortium, which intends to investigate the genetic disposition of patient-reported quality-of-life outcomes. We have identified five primary patient-reported quality-of-life outcomes as initial targets: negative psychological affect, positive psychological affect, self-rated physical health, pain, and fatigue. The first tangible objective of the GENEQOL Consortium is to develop a list of potential biological pathways, genes and genetic variants involved in these quality-of-life outcomes, by reviewing current genetic knowledge. The second objective is to design a research agenda to investigate and validate those genes and genetic variants of patient-reported quality-of-life outcomes, by creating large datasets. During its first meeting, the Consortium discussed draft summary documents addressing these questions for each patient-reported quality-of-life outcome. A summary of the primary pathways and robust findings of the genetic variants involved is presented here. The research agenda outlines possible research objectives and approaches to examine these and new quality-of-life domains. Intriguing questions arising from this endeavor are discussed. Insight into the genetic versus environmental components of patient-reported quality-of-life outcomes will ultimately allow us to explore new pathways for improving patient care. If we can identify patients who are susceptible to poor quality of life, we will be able to better target specific clinical interventions to enhance their quality of life and treatment outcomes.
Emergency medicine continues to grow as an international specialty. With >30 countries developing emergency medicine training, supporting international physician education is imperative. The proposed Emergency Medicine International (EMI) observational fellowship is a systematic model for the academic and experiential training of future leaders.
This program is the result of interest in academic emergency medicine and the educational responsibility of the institution. A literature review on the international development of emergency medicine was performed and weaknesses were assessed. Based on this review, the goals of EMI are to provide: (1) leadership training; (2) exposure to education training models; and (3) research instruction. The EMI structure consists of four blocks: (1) emergency medicine clinical rotations; (2) emergency medical services (EMS) experience; (3) medical toxicology exposure; and (4) emergency medicine operations/administration. All blocks are tailored to the training background and interests of participants, such as focusing on education methodology (conference organization, simulation) or departmental operations (quality improvement, faculty development). Education in research methodology and evidence-based practice of medicine is crucial and overlaps all blocks.
Assessment of the program includes pre-/post-survey completion by participants and yearly post-fellowship contact to track the development of emergency medicine in their countries.
While different types of organizations can assist in other ways, only academic emergency medicine can help grow and mentor faculty to expand the specialty worldwide.