OBJECTIVES/SPECIFIC AIMS: The objectives of this study are to (1) determine whether IBS-D patients randomized to either rifaximin or a low FODMAP diet show improvement in IBS-related symptoms; and (2) use longitudinal analyses to identify how SIBO status and fecal microbiota features associate with response to either rifaximin or low FODMAP dietary intervention. METHODS/STUDY POPULATION: 42 patients ≥ 18 years of age who meet Rome IV criteria for IBS-D will be randomized to receive either rifaximin or a low FODMAP diet intervention. The primary outcome will be the proportion of responders to intervention, defined as a ≥ 30% reduction in mean daily abdominal pain or bloating by visual analog scale compared with baseline. Exclusion criteria will include: (a) history of microscopic colitis, inflammatory bowel disease, celiac disease, or other organic disease that could explain symptoms; (b) prior gastrointestinal surgery, other than appendectomy or cholecystectomy > 6 months prior to study initiation; (c) prior use of rifaximin or formal dietary interventions for IBS-D; (d) use of antibiotics within the past 3 months; or (e) use of probiotics within 1 month of study entry. Glucose hydrogen breath tests will be performed at the beginning and end of the trial to evaluate for SIBO. Fecal samples will be collected at 0, 2, and 6 weeks to determine changes in fecal microbial composition and structure. RESULTS/ANTICIPATED RESULTS: This study seeks to examine whether longitudinal analyses of small intestinal and colonic microbiota can subtype IBS-D subjects into clinically relevant phenotypes. A total of 18 subjects have been enrolled in the study to date. Clinical variables, hydrogen breath test results, and fecal microbiota data are being collected for ongoing analysis.
DISCUSSION/SIGNIFICANCE OF IMPACT: Results from this study may help move treatment of IBS from a purely symptom-based approach to a more individualized one by stratifying IBS-D patients into distinct clinical phenotypes that are amenable to targeted therapeutic approaches.
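The primary outcome above (a ≥ 30% reduction in mean daily abdominal pain or bloating versus baseline) can be expressed as a simple calculation. The sketch below is illustrative only, with invented scores; it is not the study's analysis code, and the function name and 0–100 mm VAS assumption are mine.

```python
# Illustrative sketch (invented data, not the study's code): classifying a
# responder as >= 30% reduction in mean daily VAS score versus baseline.
from statistics import mean

def is_responder(baseline_scores, treatment_scores, threshold=0.30):
    """Return True if mean daily VAS improves by >= `threshold` vs baseline."""
    base = mean(baseline_scores)
    if base == 0:
        return False  # no baseline symptoms to reduce
    reduction = (base - mean(treatment_scores)) / base
    return reduction >= threshold

# Hypothetical 0-100 mm VAS scores: baseline mean 60, on-treatment mean 40,
# i.e. a 33% reduction, which exceeds the 30% responder threshold.
print(is_responder([55, 60, 65], [38, 40, 42]))  # True
```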
Hospital environmental surfaces are frequently contaminated by microorganisms. However, the causal mechanism of bacterial contamination of the environment as a source of transmission is still debated. This prospective study was performed to characterize the nature of multidrug-resistant organism (MDRO) transmission between the environment and patients using standard microbiological and molecular techniques.
Prospective cohort study at 2 academic medical centers.
A prospective multicenter study to characterize the nature of bacterial transfer events between patients and environmental surfaces in rooms that previously housed patients with 1 of 4 ‘marker’ MDROs: methicillin-resistant Staphylococcus aureus, vancomycin-resistant enterococci, Clostridium difficile, and MDR Acinetobacter baumannii. Environmental and patient microbiological samples were obtained on admission into a freshly disinfected inpatient room. Repeat samples from room surfaces and patients were taken on days 3 and 7 and each week the patient stayed in the same room. The bacterial identity, antibiotic susceptibility, and molecular sequences were compared between organisms found in the environment samples and patient sources.
We enrolled 80 patient–room admissions; 9 of these patients (11.3%) were asymptomatically colonized with MDROs at study entry. Hospital room surfaces were contaminated with MDROs despite terminal disinfection in 44 cases (55%). Bacterial transfer events, to the patient, the environment, or both, occurred in 12 patient encounters (18.5%) in the microbiologically evaluable cohort.
Bacterial transfer events between patients and the environment were observed in 18.5% of patient encounters and occurred early in the admission. This study suggests that research on prevention methods beyond the standard practice of room disinfection at the end of a patient’s stay is needed to better prevent acquisition of MDROs through the environment.
In this prospective study, we monitored 4 epidemiologically important pathogens (EIPs): methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant enterococci (VRE), Clostridium difficile, and multidrug-resistant (MDR) Acinetobacter to assess the effectiveness of 3 enhanced disinfection strategies for terminal room disinfection against standard practice. Our data demonstrated that a 94% decrease in room contamination with EIPs was associated with a 35% decrease in subsequent patient colonization and/or infection.
OBJECTIVES/SPECIFIC AIMS: Rodent models can be used to study neonatal abstinence syndrome (NAS), but the applicability of findings from these models to NAS in humans is not well understood. The objective of this study was to develop a rat model of norbuprenorphine-induced NAS and validate its translational value by comparing blood concentrations in the norbuprenorphine-treated pregnant rat to those previously reported in pregnant women undergoing buprenorphine treatment. METHODS/STUDY POPULATION: Pregnant Long-Evans rats were implanted with 14-day osmotic minipumps containing vehicle, morphine (positive control), or norbuprenorphine (0.3–3 mg/kg/d) on gestation day 9. Within 12 hours of delivery, pups were tested for spontaneous or precipitated opioid withdrawal by injecting them with saline (10 mL/kg, i.p.) or naltrexone (1 or 10 mg/kg, i.p.), respectively, and observing them for well-validated neonatal withdrawal signs. Blood was sampled via indwelling jugular catheters from a subset of norbuprenorphine-treated dams on gestation days 8, 10, 13, 17, and 20. Norbuprenorphine concentrations in whole blood samples were quantified using LC/MS/MS. RESULTS/ANTICIPATED RESULTS: Blood concentrations of norbuprenorphine in rats exposed to 1–3 mg/kg/d of norbuprenorphine were similar to levels previously reported in pregnant women undergoing buprenorphine treatment. Pups born to dams treated with these doses exhibited robust withdrawal signs. Blood concentrations of norbuprenorphine decreased across gestation, which is similar to previous reports in humans. DISCUSSION/SIGNIFICANCE OF IMPACT: These results suggest that dosing dams with 1–3 mg/kg/d norbuprenorphine produces maternal blood concentrations and withdrawal severity similar to those previously reported in humans. This provides evidence that, at these doses, this model is useful for testing hypotheses about norbuprenorphine that are applicable to NAS in humans.
Objectives: This study investigated the relationship between close proximity to detonated blast munitions and cognitive functioning in OEF/OIF/OND Veterans. Methods: A total of 333 participants completed a comprehensive evaluation that included assessment of neuropsychological functions, psychiatric diagnoses and history of military and non-military brain injury. Participants were assigned to a Close-Range Blast Exposure (CBE) or Non-Close-Range Blast Exposure (nonCBE) group based on whether they had reported being exposed to at least one blast within 10 meters. Results: Groups were compared on principal component scores representing the domains of memory, verbal fluency, and complex attention (empirically derived from a battery of standardized cognitive tests), after adjusting for age, education, PTSD diagnosis, sleep quality, substance abuse disorder, and pain. The CBE group showed poorer performance on the memory component. Rates of clinical impairment were significantly higher in the CBE group on select CVLT-II indices. Exploratory analyses examined the effects of concussion and multiple blasts on test performance and revealed that number of lifetime concussions did not contribute to memory performance. However, accumulating blast exposures at distances greater than 10 meters did contribute to poorer performance. Conclusions: Close proximity to detonated blast munitions may impact memory, and Veterans exposed to close-range blast are more likely to demonstrate clinically meaningful deficits. These findings were observed after statistically adjusting for comorbid factors. Results suggest that proximity to blast should be considered when assessing for memory deficits in returning Veterans. Comorbid psychiatric factors may not entirely account for cognitive difficulties. (JINS, 2018, 24, 466–475)
To summarize and discuss logistic and administrative challenges we encountered during the Benefits of Enhanced Terminal Room (BETR) Disinfection Study and lessons learned that are pertinent to future utilization of ultraviolet (UV) disinfection devices in other hospitals.
Multicenter cluster randomized trial
SETTING AND PARTICIPANTS
Nine hospitals in the southeastern United States
All participating hospitals developed systems to implement 4 different strategies for terminal room disinfection. We measured compliance with disinfection strategy, barriers to implementation, and perceptions from nurse managers and environmental services (EVS) supervisors throughout the 28-month trial.
Implementation of enhanced terminal disinfection with UV disinfection devices presents unique challenges, including time pressures from bed control personnel, efficient room identification, negative perceptions from nurse managers, and discharge volume. In the course of the BETR Disinfection Study, we utilized several strategies to overcome these barriers: (1) establishing safety as the priority; (2) improving communication between EVS, bed control, and hospital administration; (3) ensuring availability of necessary resources; and (4) tracking and providing feedback on compliance. Using these strategies, we deployed UV disinfection devices in 16,220 (88%) of 18,411 eligible rooms during our trial (median per hospital, 89%; IQR, 86%–92%).
Implementation of enhanced terminal room disinfection strategies using UV devices requires recognition and mitigation of 2 key barriers: (1) timely and accurate identification of rooms that would benefit from enhanced terminal disinfection and (2) overcoming time constraints to allow EVS cleaning staff sufficient time to properly employ enhanced terminal disinfection methods.
Early career investigators have few opportunities for targeted training in supportive oncology research. To address this need, we developed, implemented, and evaluated an intensive, six-day workshop on methods in supportive oncology research for trainees and junior faculty across multiple disciplines.
A multidisciplinary team of supportive oncology researchers developed a workshop patterned after the clinical trials workshop offered jointly by the American Society of Clinical Oncology and the American Association for Cancer Research. The curriculum included lectures and a mentored experience of writing a research protocol. Each year since 2015, the workshop has accepted and trained 36 early career investigators. Over the course of the workshop, participants present sections of their research protocols daily in small groups led by senior researchers, and have dedicated time to write and revise these sections. Primary outcomes for the workshop included the frequency of completed protocols by the end of the workshop, a pre- and posttest assessing participant knowledge, and follow-up surveys of the participants and their primary mentors.
Over three years, the workshop received 195 applications; 109 early career researchers were competitively selected to participate. All participants (109/109, 100%) completed writing a protocol by the end of their workshop. Participants and their primary mentors reported significant improvements in their research knowledge and skills. Each year, participants rated the workshop highly in terms of satisfaction, value, and likelihood of recommending it to a colleague. One year after the first workshop, most respondents (29/30, 96.7%) had either submitted their protocol or written at least one other protocol.
Significance of results
We developed a workshop on research methods in supportive oncology. More early career investigators applied for the workshop than capacity, and the workshop was fully attended each year. Both the workshop participants and their primary mentors reported improvement in research skills and knowledge.
Ventilator-associated pneumonia (VAP) is a frequent complication of severe burn injury. Comparing the current ventilator-associated event possible-VAP definition with the pre-2013 VAP definition, we identified considerably fewer VAP cases in our burn ICU. The new definition does not capture many VAP cases that would have been reported using the pre-2013 definition.
With improvements in early survival following congenital heart surgery, it has become increasingly important to understand longer-term outcomes; however, routine collection of these data is challenging and remains very limited. We describe the development and initial results of a collaborative programme incorporating standardised longitudinal follow-up into usual care at the Children’s Hospital of Philadelphia (CHOP) and University of Michigan (UM).
We included children undergoing benchmark operations of the Society of Thoracic Surgeons. Considerations regarding personnel, patient/parent engagement, funding, regulatory issues, and annual data collection are described, and initial follow-up rates are reported.
The present analysis included 1737 eligible patients undergoing surgery at CHOP from January 2007 to December 2014 and 887 UM patients from January 2010 to December 2014. Overall, follow-up data of any type were obtained from 90.8% of patients at CHOP (median follow-up 4.3 years, 92.2% survival) and 98.3% at UM (median follow-up 2.8 years, 92.7% survival), with similar rates across operations and institutions. Most patients lost to follow-up at CHOP had undergone surgery before 2010. Standardised questionnaires assessing burden of disease/quality of life were completed by 80.2% (CHOP) and 78.4% (UM) via phone follow-up. In subsequent pilot testing of an automated e-mail system, 53.4% of eligible patients completed the follow-up questionnaire through this system.
Standardised follow-up data can be obtained on the majority of children undergoing benchmark operations. Ongoing efforts to support automated electronic systems and integration with registry data may reduce resource needs, facilitate expansion across centres, and support multi-centre efforts to understand and improve long-term outcomes in this population.
Though the US civilian trauma care system plays a critical role in disaster response, there is currently no systems-based strategy that enables hospital emergency management and local and regional emergency planners to quantify, and potentially prepare for, surges in trauma care demand that accompany mass-casualty disasters.
A proof-of-concept model that estimates the geographic distributions of patients, trauma center resource usage, and mortality rates for varying disaster sizes, in and around the 25 largest US cities, is presented. The model was designed to be scalable, and its inputs can be modified depending on the planning assumptions of different locales and for different types of mass-casualty events.
To demonstrate the model’s potential application to real-life planning scenarios, sample disaster responses for 25 major US cities were investigated using a hybrid of geographic information systems and dynamic simulation-optimization. In each city, a simulated, fast-onset disaster epicenter, such as might occur with a bombing, was located randomly within one mile of its population center. Patients then were assigned and transported, in simulation, via the new model to Level 1, 2, and 3 trauma centers, in and around each city, over a 48-hour period for disaster scenario sizes of 100, 500, 5000, and 10,000 casualties.
Across all 25 cities, total mean mortality rates ranged from 26.3% in the smallest disaster scenario to 41.9% in the largest. Out-of-hospital mortality rates increased (from 21.3% to 38.5%) while in-hospital mortality rates decreased (from 5.0% to 3.4%) as disaster scenario sizes increased. The mean number of trauma centers involved ranged from 3.0 in the smallest disaster scenario to 63.4 in the largest. Cities that were less geographically isolated with more concentrated trauma centers in their surrounding regions had lower total and out-of-hospital mortality rates. The nine US cities listed as being the most likely targets of terrorist attacks involved, on average, more trauma centers and had lower mortality rates compared with the remaining 16 cities.
The disaster response simulation model discussed here may offer insights to emergency planners and health systems in more realistically planning for mass-casualty events. Longer wait and transport times needed to distribute high numbers of patients to distant trauma centers in fast-onset disasters may create predictable increases in mortality and trauma center resource consumption. The results of the modeled scenarios indicate the need for a systems-based approach to trauma care management during disasters, since the local trauma center network was often too small to provide adequate care for the projected patient surge. Simulation of out-of-hospital resources that might be called upon during disasters, as well as guidance in the appropriate execution of mutual aid agreements and prevention of over-response, could be of value to preparedness planners and emergency response leaders. Study assumptions and limitations are discussed.
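The core allocation step of such a model, distributing casualties across a regional trauma-center network as nearby capacity saturates, can be illustrated with a toy sketch. This is not the study's simulation-optimization code; the greedy nearest-center rule, center names, transport times, and bed counts below are all invented for illustration.

```python
# Toy sketch (invented numbers, not the study's model): greedily assigning
# casualties to the nearest trauma center that still has capacity, as a
# simplified stand-in for the simulation-optimization described above.
def assign_casualties(n_casualties, centers):
    """centers: list of dicts with 'name', 'minutes' (transport time), 'beds'."""
    order = sorted(centers, key=lambda c: c["minutes"])  # nearest first
    assignments = {c["name"]: 0 for c in centers}
    unplaced = 0
    for _ in range(n_casualties):
        for c in order:
            if assignments[c["name"]] < c["beds"]:
                assignments[c["name"]] += 1
                break
        else:
            unplaced += 1  # demand exceeds total regional capacity
    return assignments, unplaced

centers = [
    {"name": "Level1-A", "minutes": 12, "beds": 40},
    {"name": "Level1-B", "minutes": 25, "beds": 30},
    {"name": "Level2-C", "minutes": 40, "beds": 20},
]
# 100 casualties against 90 regional beds: all beds fill, 10 go unplaced,
# mirroring the abstract's point that local networks can be too small.
print(assign_casualties(100, centers))
```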
A Geographic Simulation Model for the Treatment of Trauma Patients in Disasters. Prehosp Disaster Med. 2016;31(4):413–421.
Legislative actions and advanced technologies, particularly dissemination of safety-engineered devices, have aided in protecting healthcare personnel from occupational blood and body fluid exposures (BBFE).
To investigate the trends in BBFE among healthcare personnel over 15 years and the impact of safety-engineered devices on the incidence of percutaneous injuries as well as features of injuries associated with these devices.
Retrospective cohort study at University of North Carolina Hospitals, a tertiary care academic facility. Data on BBFE in healthcare personnel were extracted from Occupational Health Service records (2000–2014). Exposures associated with safety-engineered and conventional devices were compared. Generalized linear models were applied to measure the annual incidence rate difference by exposure type over time.
A total of 4,300 BBFE, including 3,318 percutaneous injuries (77%), were reported. The incidence rate for overall BBFE was significantly reduced during 2000–2014 (incidence rate difference, 1.72; P=.0003). The incidence rate for percutaneous injuries was also dramatically reduced during 2001–2006 (incidence rate difference, 1.37; P=.0079) but changed less during 2006–2014. Percutaneous injuries associated with safety-engineered devices accounted for 27% of all BBFE. BBFE were most commonly due to injecting through skin, placing intravenous catheters, and blood drawing.
Our study revealed significant overall reduction in BBFE and percutaneous injuries likely due in part to the impact of safety-engineered devices but also identified that a considerable proportion of percutaneous injuries is now associated with these devices. Additional prevention strategies are needed to further reduce percutaneous injuries and improve design of safety-engineered devices.
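The incidence-rate-difference trend estimation described in this abstract can be illustrated with a simplified sketch. The study used generalized linear models; the version below reduces the idea to an ordinary least-squares slope on annual rates (equivalent to an identity-link Gaussian GLM with a linear year term), and all numbers are invented rather than taken from the study.

```python
# Simplified illustration (invented data, not the study's analysis): the
# annual incidence rate difference as the least-squares slope of yearly rates.
def annual_rate_difference(years, rates):
    """Least-squares slope: change in incidence rate per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(rates) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rates))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Hypothetical exposures per 10,000 worked days for 2000-2004, declining by
# 0.9 per year, so the estimated rate difference is about -0.9/year.
years = [2000, 2001, 2002, 2003, 2004]
rates = [9.0, 8.1, 7.2, 6.3, 5.4]
print(annual_rate_difference(years, rates))
```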
Targeted surveillance has focused on device-associated infections and surgical site infections (SSIs) and is often limited to healthcare-associated infections (HAIs) in high-risk areas. Longitudinal trends in all HAIs, including other types of HAIs, and HAIs outside of intensive care units (ICUs) remain unclear. We examined the incidences of all HAIs using comprehensive hospital-wide surveillance over a 12-year period (2001–2012).
This retrospective observational study was conducted at the University of North Carolina (UNC) Hospitals, a tertiary care academic facility. All HAIs, including 5 major infections with 14 specific infection sites as defined using CDC criteria, were ascertained through comprehensive hospital-wide surveillance. Generalized linear models were used to examine the incidence rate difference by infection type over time.
A total of 16,579 HAIs included 6,397 cases in ICUs and 10,182 cases outside ICUs. The incidence of overall HAIs decreased significantly hospital-wide (−3.4 infections per 1,000 patient days), in ICUs (−8.4 infections per 1,000 patient days), and in non-ICU settings (−1.9 infections per 1,000 patient days). The incidences of bloodstream infection, urinary tract infection, and pneumonia in hospital-wide settings decreased significantly, but the incidences of SSI and lower respiratory tract infection remained unchanged. The incidence of Clostridium difficile infection (CDI) increased remarkably. The outcomes were estimated to include 700 overall HAIs prevented, 40 lives saved, and cost savings in excess of $10 million.
We demonstrated success in reducing overall HAIs over a 12-year period. Our data underscore the necessity for surveillance and infection prevention interventions outside of the ICUs, for non–device-associated HAIs, and for CDI.
Infect Control Hosp Epidemiol 2015;36(10):1139–1147
Burn injuries are a common source of morbidity and mortality in the United States, with an estimated 450,000 burn injuries requiring medical treatment, 40,000 requiring hospitalization, and 3,400 deaths from burns annually. Patients with severe burns are at high risk for local and systemic infections. Furthermore, burn patients are immunosuppressed, as thermal injury results in reduced phagocytic activity and lymphokine production by macrophages. In recent years, multidrug-resistant (MDR) pathogens have become major contributors to morbidity and mortality in burn patients.
Since only limited data are available on the incidence of both device- and nondevice-associated healthcare-associated infections (HAIs) in burn patients, we undertook this retrospective cohort analysis of patients admitted to our burn intensive care unit (ICU) from 2008 to 2012.