We studied the association between chlorhexidine gluconate (CHG) concentration on skin and resistant bacterial bioburden. CHG was almost always detected on the skin, and detection of methicillin-resistant Staphylococcus aureus, carbapenem-resistant Enterobacteriaceae, and vancomycin-resistant Enterococcus on skin sites was infrequent. However, we found no correlation between CHG concentration and bacterial bioburden.
Iron deficiency is common in pregnant and lactating women and is associated with reduced cognitive development of the offspring. Since iron affects lipid metabolism, the availability of fatty acids, particularly the polyunsaturated fatty acids required for early neural development, was investigated in the offspring of female rats fed iron-deficient diets during gestation and lactation. After the dams gave birth, one group of iron-deficient dams was recuperated by feeding an iron-replete diet. Dams and neonates were killed on postnatal days 1, 3 and 10, and the fatty acid composition of brain and stomach contents was assessed by gas chromatography. Changes in the fatty acid profile seen on day 3 became more pronounced by day 10, with a decrease in the proportion of saturated fatty acids and a compensatory increase in monounsaturated fatty acids. Long-chain polyunsaturated fatty acids in the n-6 family were reduced, but there was no change in the n-3 family. The fatty acid profiles of neonatal brain and stomach contents were similar, suggesting that the change in milk composition may be related to the changes in the neonatal brain. When the dams were fed an iron-sufficient diet at birth, the effects of iron deficiency on the fatty acid composition of lipids in both the dams’ milk and the neonates’ brains were reduced. This study showed an interaction between maternal iron status and the fatty acid composition of the offspring’s brain and suggests that these effects can be reduced by iron repletion of the dam’s diet at birth.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
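The heritability estimates discussed above come from classical twin modeling, which contrasts monozygotic and dizygotic twin-pair similarity. As a rough, illustrative sketch (not the CODATwins analysis code), Falconer's formula derives variance components from the two intraclass correlations; the correlation values below are hypothetical:

```python
# Illustrative sketch of classical twin-design variance components
# (Falconer's formula) -- not the CODATwins analysis code.
# r_mz and r_dz are intraclass correlations of monozygotic and
# dizygotic twin pairs for the trait of interest.

def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    """ACE estimates: additive genetic (h2), shared (c2), unique (e2) environment."""
    h2 = 2.0 * (r_mz - r_dz)   # heritability: MZ pairs share twice the additive variance
    c2 = 2.0 * r_dz - r_mz     # shared (common) environment
    e2 = 1.0 - r_mz            # unique environment (plus measurement error)
    return {"h2": h2, "c2": c2, "e2": e2}

# Hypothetical correlations for adult height (illustrative values only):
print(falconer_estimates(r_mz=0.90, r_dz=0.50))
```

Full twin analyses typically fit structural-equation ACE models rather than this closed-form approximation, but the decomposition is the same in spirit.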
We used a survey to characterize contemporary infection prevention and antibiotic stewardship program practices across 64 healthcare facilities, and we compared these findings to those of a similar 2013 survey. Notable findings include decreased frequency of active surveillance for methicillin-resistant Staphylococcus aureus, frequent active surveillance for carbapenem-resistant Enterobacteriaceae, and increased support for antibiotic stewardship programs.
This is a copy of the slides presented at the meeting but not formally written up for the volume.
As in vivo cellular imaging becomes the necessary norm for understanding cancer and other diseases, new non-toxic nanoprobes will be required to replace the high-quality cadmium-based nanoprobes in use today. We are developing less toxic probes based on two types of luminescent ceramic nanoparticles: naturally occurring fluorescent (NOF) mimics and Ln-based ceramic oxide materials. The NOF minerals of interest, which have demonstrated initial luminosity of sufficient brightness for use in cellular studies, include sphalerite, scheelite, manganoan and perovskite nanoparticles. For Ln-based materials, we have shown that Ln-doped zincite also luminesces brightly enough to allow quantification of cellular activity. Once formed, these probes are functionalized so that they can be delivered to desired cellular targets. Probe derivatization has focused on surface capping with functionalized poly(ethyleneglycol) molecules/lipids to yield water-soluble NCs and on polyarginine-based transporters for transmembrane delivery. The probes are being evaluated for their luminescent properties, as well as their non-toxicity and ability to report on cell-signaling events, in various cell lines using multi-spectral confocal microscopy and other techniques. Preliminary interdisciplinary studies have validated the basic approaches for the synthesis of NOF nanoprobes and the bio-delivery and imaging of nanoparticles. Work to optimize the design, delivery, and imaging of these new nanoprobes is expected to achieve the NIH-directed goal of increasing the sensitivity and specificity of molecular probes for imaging. Details of the synthesis, functionalization and biological imaging using these probes will be presented. This work was partially supported by the United States Department of Energy under contract number DE-AC04-94AL85000.
Sandia is a multi-program laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy. This work was also supported by the National Institutes of Health through the NIH Roadmap for Medical Research, Grant #1 R21 EB005365-01. Information on this RFA (Innovation in Molecular Imaging Probes) can be found at http://grants.nih.gov/grants/guide/rfa-files/RFA-RM-04-021.html.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionists who are members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting HOB either alone (22%) or in addition to CLABSIs (35%), and 34% favored reporting CLABSIs alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
Englerophytum and Synsepalum are two closely related genera of trees and shrubs from the African tropics. Previous molecular studies have shown that these genera collectively form a clade within the subfamily Chrysophylloideae (Sapotaceae). However, little is known about the inter-relationships of the taxa within the Englerophytum–Synsepalum clade. In this study, nuclear ribosomal DNA and plastid trnH–psbA sequences were used to estimate the phylogeny within the clade. Results indicate that the clade consists of six major lineages, two composed solely of taxa from the genus Englerophytum and four composed of taxa from the genus Synsepalum. Each lineage can be distinguished by suites of vegetative and floral characters. Leaf venation patterns, calyx fusion, style length and staminodal structure were among the most useful characters for distinguishing clades. Some of the subclades within the Englerophytum–Synsepalum clade were also found to closely fit descriptions of former genera, most of which were described by Aubréville, that have since been placed in synonymy with Englerophytum and Synsepalum. The clade with the type species of Englerophytum also contains the type species of the genera Wildemaniodoxa and Zeyherella, which are confirmed as synonyms.
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions made it possible to detect mathematical equipoise between the two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments, and was pilot tested for usability and responsiveness and as support for shared decision-making.
The KOMET predictive regression model for knee pain had four patient-specific variables and an r² value of 0.32; the model for physical functioning included six patient-specific variables and an r² of 0.34. These models were incorporated into prototype KOMET decision support software, pilot tested in clinics, and generally well received.
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
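The patient-specific "mathematical equipoise" described above amounts to asking whether two per-treatment outcome predictions agree within their uncertainty. The sketch below is a hypothetical simplification, not the KOMET implementation; the overlap rule, threshold, and interval half-width are assumptions for illustration:

```python
# Hypothetical sketch of equipoise detection via overlapping prediction
# intervals -- NOT the KOMET implementation. Two per-treatment regression
# models would each predict a 1-year outcome for the same patient;
# equipoise is declared here when the intervals overlap substantially.

def interval_overlap(lo1: float, hi1: float, lo2: float, hi2: float) -> float:
    """Fraction of the smaller interval covered by the intersection."""
    inter = max(0.0, min(hi1, hi2) - max(lo1, lo2))
    smaller = min(hi1 - lo1, hi2 - lo2)
    return inter / smaller if smaller > 0 else 0.0

def in_equipoise(pred_a: float, pred_b: float,
                 half_width: float, threshold: float = 0.5) -> bool:
    """Assumed rule: equipoise if interval overlap >= threshold."""
    ov = interval_overlap(pred_a - half_width, pred_a + half_width,
                          pred_b - half_width, pred_b + half_width)
    return ov >= threshold

# Illustrative patient: predicted 1-year pain scores of 62 after TKR and
# 55 after nonsurgical care, each with an assumed +/-10 interval.
print(in_equipoise(62.0, 55.0, 10.0))  # True: [52, 72] and [45, 65] overlap
```

In practice the prediction intervals would come from the regression models themselves, and the overlap threshold would be a clinical design choice.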
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding, including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Twenty-seven species and two subspecies of Ficus are reported from one study site in central Africa. Characters for identification are explained. An identification key, illustrations, descriptions and habitats are provided. The species-level diversity of Ficus in tropical forests is discussed.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before doffing and exiting the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and environment was 2.78 (95% confidence interval [CI], 0.99–0.77) and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
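For readers unfamiliar with the odds ratios reported above, a minimal sketch of how an OR and Wald 95% confidence interval are computed from a 2×2 exposure/outcome table follows; the counts are hypothetical, not the study's data:

```python
import math

# Worked sketch of an odds ratio with a Wald 95% CI from a 2x2 table.
# The counts below are hypothetical and for illustration only.

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """a = exposed with outcome, b = exposed without,
       c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for the Wald interval:
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: 20 of 80 HCWs who touched the patient were
# contaminated, vs 8 of 100 who touched only the environment.
or_, lo, hi = odds_ratio_ci(20, 60, 8, 92)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note that an interval whose lower bound crosses 1.0 (as with the 2.78 estimate above) does not reach statistical significance at the 5% level.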
Introduction: Emergency Department Overcrowding (EDOC) is a multifactorial issue that leads to Access Block for patients needing emergency care. EDOC has been identified as a national problem: patients presenting to a Canadian Emergency Department (ED) at a time of overcrowding have higher rates of admission to hospital and increased seven-day mortality. Using the well-accepted input-throughput-output model to study EDOC, current research has focused on throughput as a measure of patient flow, reported as ED length of stay (LOS). Indeed, ED LOS and ED beds occupied by inpatients are two “extremely important” indicators of EDOC identified by a 2005 survey of Canadian ED directors. One proposed solution to improve ED throughput is to utilize a physician at triage (PAT) to rapidly assess newly arriving patients. In 2017, a pilot PAT program was trialed at Kelowna General Hospital (KGH), a tertiary care hospital, as part of a Plan-Do-Study-Act (PDSA) cycle. The aim was to mitigate EDOC by improving ED throughput by the end of 2018, to meet the national targets for ED LOS suggested in the 2013 CAEP position statement. Methods: During fiscal periods 1-6 (April 1 to September 7, 2017), a PAT shift ran daily from 1000 to 2200 over four long weekends. ED LOS, time to inpatient bed, time to physician initial assessment (PIA), number of British Columbia Ambulance Service (BCAS) offload delays, and number of patients who left without being seen (LWBS) were extracted from an administrative database. Results were retrospectively analyzed and compared with data from 1000 to 2200 on non-PAT days during the trial periods. Results: Median ED LOS decreased from 3.8 to 3.4 hours for high-acuity patients (CTAS 1-3), from 2.1 to 1.8 hours for low-acuity patients (CTAS 4-5), and from 9.3 to 8.0 hours for all admitted patients.
During PAT trial weekends, the average time to PIA decreased by 65% (from 73 to 26 minutes for CTAS 2-5), the average number of daily BCAS offload delays decreased by 39% (from 2.3 to 1.4 delays per day), and the proportion of patients who LWBS fell from 2.4% to 1.7%. Conclusion: The implementation of PAT was associated with improvements in all five measures of ED throughput, providing a potential solution for EDOC at KGH. ED LOS was reduced compared with non-PAT control days, meeting the suggested national targets. PAT could improve efficiency, allowing more patients to be seen in the ED, and increase the quality and safety of ED practice. Next, we hope to prospectively evaluate PAT, continuing to analyze these process measures, perform a cost-benefit analysis, and formally assess ED staff and patient perceptions of the program.
Introduction: The administration of “to-go” medications in the Kelowna General Hospital Emergency Department was identified as an issue. Individual patients frequently received multiple “to-go” medication pre-packs, and the variability in “to-go” medication between providers was substantial. Recognizing the patient issues (addiction, dependency and diversion) and system issues (costs, risk), a team-based quality improvement initiative was instituted, utilizing a variety of quality improvement techniques. The aim was to reduce the number of “to-go” medications by half within a year. Methods: The project began in January 2015 and is ongoing. Multiple stakeholders were engaged within the emergency department, including leaders of the physician, nursing and pharmacy teams and an executive sponsor. Using change theory and traditional Plan-Do-Study-Act (PDSA) cycles, an iterative methodology was proposed. The proposed outcome measure was the number of “to-go” medications administered; secondary measures included the number of opioid “to-go” and benzodiazepine “to-go” prescriptions. The balancing measure was the number of narcotic prescriptions written. Physician prescribing practice and nursing practice were reviewed at meetings and huddles. Individualized reports were provided to physicians for self-review. Data were collated at baseline and then reviewed quarterly at meetings and huddles. Run charts were utilized along with raw data and individualized reports. Results: At baseline (January 2015), the number of “to-go” medications was 708. Over the next year, this value fell to 459, a 35% reduction. Two years later (June 2017), it had fallen to 142, an overall reduction of 80% in “to-go” medications. Secondary measures are currently under analysis. Further, no increase in prescribing of narcotics was seen during this time period.
Conclusion: The administration of “to-go” medications from the emergency department has significant individual and societal impact. Frequently, these medications are diverted, that is, sold for profit on the black market. Further, opioid prescribing is under increased scrutiny as the link between opioid prescriptions and addiction/dependency becomes more evident. This quality improvement initiative was successful for a number of reasons. First, we had strong engagement from the full emergency department clinical teams: the issue was identified collaboratively, and teamwork and participation were strong from the outset. Second, we used individual and aggregate data to provide feedback on a regular basis. Third, we had strong support from our executive sponsor(s), who championed the efforts and presented the results locally and, now, throughout the Health Region.
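The reductions reported in this initiative (708 administrations at baseline, 459 one year later, 142 after two years) can be checked with simple percent-change arithmetic:

```python
# Quick arithmetic check of the reported reductions in "to-go" medications.

def pct_reduction(baseline: float, current: float) -> float:
    """Percent reduction from baseline to current."""
    return 100.0 * (baseline - current) / baseline

print(round(pct_reduction(708, 459)))  # 35 (% reduction after one year)
print(round(pct_reduction(708, 142)))  # 80 (% overall reduction)
```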
Crib-biting is a stereotypic behaviour performed by approximately 5% of captive domestic horses. Dietary factors have been strongly associated with the development of oral stereotypies, and risk factors for crib-biting identified in recent epidemiological studies include feeding high-concentrate and/or low-forage diets (Waters et al., 2002). Experimental work has shown that such diets are likely to result in increased gastric acidity (Murray and Eichorn, 1996; Nadeau et al., 2000). We therefore propose that young horses initiate crib-biting in an attempt to produce alkaline saliva to buffer their stomachs when alternative opportunities for mastication are limited. The aim of this study was to determine whether there was an association between crib-biting behaviour and stomach condition in foals.
Foals that had recently started to perform crib-biting were recruited into the study and compared with non-stereotypic foals. The stomachs of 15 crib-biting foals and 9 normal foals were examined using a video endoscope.
Oxidative stress occurs when antioxidant defence mechanisms are overwhelmed by free radicals and may lead to damage to DNA, which has been implicated in processes such as ageing and cancer. The Comet assay allows detection of oxidative DNA damage in individual cells. As horses with recurrent airway obstruction (RAO) have been shown to demonstrate low antioxidant status and oxidative stress, we hypothesised that peripheral blood mononuclear cells (PBMC) of horses with RAO would demonstrate increases in DNA damage following natural allergen challenge.
Six horses (mean age 15 years, range 8-23 years) diagnosed with RAO (in remission) and 6 healthy, breed-matched controls (mean age 9 years, range 5-15 years) were studied. Blood samples were collected 7 days before challenge and both immediately and 3 days after stabling on mouldy hay and straw for 24 h. All animals were kept at grass before and after the challenge period. Bronchoalveolar lavage (BAL) was performed and neutrophil counts were determined.
To analyze whether electronically available comorbid conditions are risk factors for Centers for Disease Control and Prevention (CDC)-defined, hospital-onset Clostridium difficile infection (CDI) after controlling for antibiotic and gastric acid suppression therapy use.
Patients aged ≥18 years admitted to the University of Maryland Medical Center between November 7, 2015, and May 31, 2017.
Comorbid conditions were assessed using the Elixhauser comorbidity index. The Elixhauser comorbidity index and its component comorbid conditions were calculated using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes extracted from electronic medical records. Associations between CDI and potential covariates, including antibiotic use, gastric acid suppression therapy use, and comorbid conditions, were estimated using log-binomial multivariable regression.
After controlling for antibiotic use, age, proton-pump inhibitor use, and histamine-blocker use, the Elixhauser comorbidity index was a significant risk factor for CDI: the risk of CDI increased by a factor of 1.26 (95% CI, 1.19–1.32) for each additional point on the Elixhauser score.
An increase in Elixhauser score is associated with CDI. Our study and other studies have shown that comorbid conditions are important risk factors for CDI. Electronically available comorbid conditions and scores like the Elixhauser index should be considered for risk-adjustment of CDC CDI rates.
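The per-point estimate above has a multiplicative interpretation under the log link used in log-binomial regression: each additional Elixhauser point multiplies the risk by the same factor, so risk compounds across points. A small sketch of this interpretation (illustrative, not the study's model code):

```python
import math

# Sketch of how a log-binomial coefficient translates into risk: with a
# log link, each one-point increase in a covariate multiplies the risk
# by exp(beta). A risk ratio of 1.26 per point implies beta = ln(1.26).
# (This reuses the 1.26 figure from the abstract for illustration only.)

def cumulative_risk_ratio(rr_per_point: float, points: int) -> float:
    """Risk ratio for a patient `points` above the reference score."""
    return rr_per_point ** points

beta = math.log(1.26)                  # model coefficient implied by RR 1.26
print(round(math.exp(beta), 2))        # recovers the per-point risk ratio
print(round(cumulative_risk_ratio(1.26, 5), 2))  # compounded risk for +5 points
```

This compounding is why even a modest per-point risk ratio can translate into a substantially elevated risk for patients with high comorbidity scores.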
We assessed various locations and frequencies of environmental sampling to maximize information and maintain efficiency when sampling for Acinetobacter baumannii. Although sampling sites closer to the patient were more likely to be positive, we found value in sampling all sites and across multiple days to fully capture environmental contamination.