Between 2001 and 2017, the Royal Botanic Garden Edinburgh conducted training and research in Belize built around an annual two-week field course, part of the Edinburgh M.Sc. programme in Biodiversity and Taxonomy of Plants, that focused on tropical plant identification, botanical collecting and tropical fieldwork skills. This long-term collaboration with one country has led to additional benefits, most notably capacity building, acquisition of new country records, completion of M.Sc. thesis projects, publication of the findings in journal articles, and continued cooperation. Detailed summaries are provided for the specimens collected by students during the field course or during return visits to Belize for M.Sc. thesis projects. Additionally, 15 species not recorded in the national checklist for Belize are reported. This paper highlights the benefits of collaborations between institutions and countries sustained for periods longer than the typical funding cycles of three to five years.
The transmission rate of methicillin-resistant Staphylococcus aureus (MRSA) to gloves or gowns of healthcare personnel (HCP) caring for MRSA patients in a non–intensive care unit setting was 5.4%. Contamination rates were higher among HCP performing direct patient care and when patients had detectable MRSA on their body. These findings may inform risk-based contact precautions.
To determine sociodemographic factors associated with occupational, recreational and firearm-related noise exposure.
This nationally representative, multistage, stratified, cluster cross-sectional study surveyed eligible National Health and Nutrition Examination Survey participants aged 20–69 years (n = 4675) about exposure to occupational and recreational noise and recurrent firearm usage; data were analyzed using weighted multivariate logistic regression.
Thirty-four per cent of participants had exposure to occupational noise and 12 per cent to recreational noise, and 13 per cent repeatedly used firearms. Males were more likely than females to have exposure to all three noise types (adjusted odds ratio range = 2.63–14.09). Hispanics and Asians were less likely to have exposure to the three noise types than Whites. Blacks were less likely than Whites to have occupational and recurrent firearm noise exposure. Those with insurance were 26 per cent less likely to have exposure to occupational noise than those without insurance (adjusted odds ratio = 0.74, 95 per cent confidence interval = 0.60–0.93).
Whites, males and uninsured people are more likely to have exposure to potentially hazardous loud noise.
We studied the association between chlorhexidine gluconate (CHG) concentration on skin and resistant bacterial bioburden. CHG was almost always detected on the skin, and detection of methicillin-resistant Staphylococcus aureus, carbapenem-resistant Enterobacteriaceae, and vancomycin-resistant Enterococcus on skin sites was infrequent. However, we found no correlation between CHG concentration and bacterial bioburden.
Iron deficiency is common in pregnant and lactating women and is associated with reduced cognitive development of the offspring. Since iron affects lipid metabolism, the availability of fatty acids, particularly the polyunsaturated fatty acids required for early neural development, was investigated in the offspring of female rats fed iron-deficient diets during gestation and lactation. After the dams gave birth, one group of iron-deficient dams was recuperated by feeding an iron-replete diet. Dams and neonates were killed on postnatal days 1, 3 and 10, and the fatty acid composition of brain and stomach contents was assessed by gas chromatography. Changes in the fatty acid profile seen on day 3 became more pronounced on day 10, with a decrease in the proportion of saturated fatty acids and a compensatory increase in monounsaturated fatty acids. Long-chain polyunsaturated fatty acids in the n-6 family were reduced, but there was no change in the n-3 family. The fatty acid profiles of neonatal brain and stomach contents were similar, suggesting that the change in milk composition may be related to the changes in the neonatal brain. When the dams were fed an iron-sufficient diet from birth, the effects of iron deficiency on the fatty acid composition of lipids in both dam’s milk and neonates’ brains were reduced. This study showed an interaction between maternal iron status and fatty acid composition of the offspring’s brain and suggests that these effects can be reduced by iron repletion of the dam’s diet at birth.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
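Heritability estimates of the kind reported above are typically obtained from structural equation models fit to MZ and DZ twin pairs; as a simplified classical stand-in (not the CODATwins method), Falconer's formula decomposes variance directly from the twin correlations. The correlations below are illustrative values for adult height, not CODATwins estimates.

```python
def falconer_h2(r_mz, r_dz):
    """Classical Falconer estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2.0 * (r_mz - r_dz)

# Illustrative twin correlations for adult height (hypothetical values)
r_mz, r_dz = 0.90, 0.50
h2 = falconer_h2(r_mz, r_dz)   # additive genetic share of variance
c2 = 2.0 * r_dz - r_mz         # shared-environment share
e2 = 1.0 - r_mz                # unique-environment share
print(h2, c2, e2)
```

The three shares sum to one by construction, which is why comparing r_MZ and r_DZ across ages or regions, as CODATwins does at scale, directly tracks shifts in genetic versus environmental variation.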
We used a survey to characterize contemporary infection prevention and antibiotic stewardship program practices across 64 healthcare facilities, and we compared these findings to those of a similar 2013 survey. Notable findings include decreased frequency of active surveillance for methicillin-resistant Staphylococcus aureus, frequent active surveillance for carbapenem-resistant Enterobacteriaceae, and increased support for antibiotic stewardship programs.
To ascertain opinions regarding etiology and preventability of hospital-onset bacteremia and fungemia (HOB) and perspectives on HOB as a potential outcome measure reflecting quality of infection prevention and hospital care.
Hospital epidemiologists and infection preventionist members of the Society for Healthcare Epidemiology of America (SHEA) Research Network.
A web-based, multiple-choice survey was administered via the SHEA Research Network to 133 hospitals.
A total of 89 surveys were completed (67% response rate). Overall, 60% of respondents defined HOB as a positive blood culture on or after hospital day 3. Central-line–associated bloodstream infections (CLABSIs) and intra-abdominal infections were perceived as the most frequent etiologies. Moreover, 61% thought that most HOB events are preventable, and 54% viewed HOB as a measure reflecting a hospital’s quality of care. Also, 29% of respondents’ hospitals already collect HOB data for internal purposes. Given a choice to publicly report CLABSIs and/or HOB, 57% favored reporting HOB either alone (22%) or in addition to CLABSI (35%), and 34% favored CLABSI alone.
Among the majority of SHEA Research Network respondents, HOB is perceived as preventable, reflective of quality of care, and potentially acceptable as a publicly reported quality metric. Further studies on HOB are needed, including validation as a quality measure, assessment of risk adjustment, and formation of evidence-based bundles and toolkits to facilitate measurement and improvement of HOB rates.
Englerophytum and Synsepalum are two closely related genera of trees and shrubs from the African tropics. Previous molecular studies have shown that these genera collectively form a clade within the subfamily Chrysophylloideae (Sapotaceae). However, little is known about the inter-relationships of the taxa within the Englerophytum–Synsepalum clade. In this study, nuclear ribosomal DNA and plastid trnH–psbA sequences were used to estimate the phylogeny within the clade. Results indicate that the clade consists of six major lineages, two composed solely of taxa from the genus Englerophytum and four composed of taxa from the genus Synsepalum. Each lineage can be distinguished by suites of vegetative and floral characters. Leaf venation patterns, calyx fusion, style length and staminodal structure were among the most useful characters for distinguishing clades. Some of the subclades within the Englerophytum–Synsepalum clade were also found to closely fit descriptions of former genera, most of which were described by Aubréville, that have since been placed in synonymy with Englerophytum and Synsepalum. The clade with the type species of Englerophytum also contains the type species of the genera Wildemaniodoxa and Zeyherella, which are confirmed as synonyms.
To enhance enrollment into randomized clinical trials (RCTs), we proposed electronic health record-based clinical decision support for patient–clinician shared decision-making about care and RCT enrollment, based on “mathematical equipoise.”
As an example, we created the Knee Osteoarthritis Mathematical Equipoise Tool (KOMET) to determine the presence of patient-specific equipoise between treatments for the choice between total knee replacement (TKR) and nonsurgical treatment of advanced knee osteoarthritis.
With input from patients and clinicians about important pain and physical function treatment outcomes, we created a database from non-RCT sources of knee osteoarthritis outcomes. We then developed multivariable linear regression models that predict 1-year individual-patient knee pain and physical function outcomes for TKR and for nonsurgical treatment. These predictions allowed detecting mathematical equipoise between these two options for patients eligible for TKR. Decision support software was developed to graphically illustrate, for a given patient, the degree of overlap of pain and functional outcomes between the treatments and was pilot tested for usability, responsiveness, and as support for shared decision-making.
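One simple way to operationalize "mathematical equipoise" (a hypothetical sketch, not the KOMET implementation) is to treat two treatments as being in equipoise for a patient when their predicted-outcome intervals overlap. All coefficients, covariates, and residual SDs below are invented for illustration.

```python
import numpy as np

def predicted_interval(coefs, x, resid_sd, z=1.96):
    """Point prediction ± z * residual SD (approximate prediction interval)."""
    mu = coefs[0] + np.dot(coefs[1:], x)
    return mu - z * resid_sd, mu + z * resid_sd

def equipoise(int_a, int_b):
    """Two treatments are 'in equipoise' when their intervals overlap."""
    return max(int_a[0], int_b[0]) <= min(int_a[1], int_b[1])

# Hypothetical pain-score models for TKR vs nonsurgical treatment:
# intercept plus coefficients for, e.g., age and BMI (illustrative values).
tkr = np.array([20.0, -0.5, 1.2])
nonsurg = np.array([35.0, -0.3, 0.8])
x = np.array([60.0, 25.0])     # one patient's covariates: age 60, BMI 25
a = predicted_interval(tkr, x, resid_sd=8.0)
b = predicted_interval(nonsurg, x, resid_sd=8.0)
print(equipoise(a, b))
```

When the intervals overlap substantially, neither treatment is clearly predicted to be superior for that patient, which is the situation in which RCT enrollment can be offered without disadvantaging the patient.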
The KOMET predictive regression model for knee pain had four patient-specific variables and an r2 value of 0.32; the model for physical functioning included six patient-specific variables and an r2 of 0.34. These models were incorporated into prototype KOMET decision support software, pilot tested in clinics, and generally well received.
Use of predictive models and mathematical equipoise may help discern patient-specific equipoise to support shared decision-making for selecting between alternative treatments and considering enrollment into an RCT.
A robust biomedical informatics infrastructure is essential for academic health centers engaged in translational research. There are no templates for what such an infrastructure encompasses or how it is funded. An informatics workgroup within the Clinical and Translational Science Awards network conducted an analysis to identify the scope, governance, and funding of this infrastructure. After we identified the essential components of an informatics infrastructure, we surveyed informatics leaders at network institutions about the governance and sustainability of the different components. Results from 42 survey respondents showed significant variations in governance and sustainability; however, some trends also emerged. Core informatics components such as electronic data capture systems, electronic health records data repositories, and related tools had mixed models of funding, including fee-for-service, extramural grants, and institutional support. Several key components such as regulatory systems (e.g., electronic Institutional Review Board [IRB] systems, grants, and contracts), security systems, data warehouses, and clinical trials management systems were overwhelmingly supported as institutional infrastructure. The findings highlighted in this report are worth noting for academic health centers and funding agencies involved in planning current and future informatics infrastructure, which provides the foundation for a robust, data-driven clinical and translational research program.
Twenty-seven species and two subspecies of Ficus are reported from one study site in central Africa. Characters for identification are explained. An identification key, illustrations, descriptions and habitats are provided. The species-level diversity of Ficus in tropical forests is discussed.
To determine which healthcare worker (HCW) roles and patient care activities are associated with acquisition of vancomycin-resistant Enterococcus (VRE) on HCW gloves or gowns after patient care, as a surrogate for transmission to other patients.
Prospective cohort study.
Medical and surgical intensive care units at a tertiary-care academic institution.
VRE-colonized patients on Contact Precautions and their HCWs.
Overall, 94 VRE-colonized patients and 469 HCW–patient interactions were observed. Research staff recorded patient care activities and cultured HCW gloves and gowns for VRE before doffing and exiting the patient room.
VRE were isolated from 71 of 469 HCWs’ gloves or gowns (15%) following patient care. Occupational/physical therapists, patient care technicians, nurses, and physicians were more likely than environmental services workers and other HCWs to have contaminated gloves or gowns. Compared to touching the environment alone, the odds ratio (OR) for VRE contamination associated with touching both the patient (or objects in the immediate vicinity of the patient) and environment was 2.78 (95% confidence interval [CI], 0.99–0.77) and the OR associated with touching only the patient (or objects in the immediate vicinity) was 3.65 (95% CI, 1.17–11.41). Independent risk factors for transmission of VRE to HCWs were touching the patient’s skin (OR, 2.18; 95% CI, 1.15–4.13) and transferring the patient into or out of bed (OR, 2.66; 95% CI, 1.15–6.43).
Patient contact is a major risk factor for HCW contamination and subsequent transmission. Interventions should prioritize contact precautions and hand hygiene for HCWs whose activities involve touching the patient.
Introduction: Emergency Department Overcrowding (EDOC) is a multifactorial issue that leads to Access Block for patients needing emergency care. EDOC has been identified as a national problem: patients presenting to a Canadian Emergency Department (ED) at a time of overcrowding have higher rates of admission to hospital and increased seven-day mortality. Using the well-accepted input-throughput-output model to study EDOC, current research has focused on throughput as a measure of patient flow, reported as ED length of stay (LOS). Indeed, ED LOS and ED beds occupied by inpatients are two "extremely important" indicators of EDOC identified by a 2005 survey of Canadian ED directors. One proposed solution to improve ED throughput is to utilize a physician at triage (PAT) to rapidly assess newly arriving patients. In 2017, a pilot PAT program was trialed at Kelowna General Hospital (KGH), a tertiary care hospital, as part of a Plan-Do-Study-Act (PDSA) cycle. The aim was to mitigate EDOC by improving ED throughput by the end of 2018, meeting the national targets for ED LOS suggested in the 2013 CAEP position statement. Methods: During fiscal periods 1-6 (April 1 to September 7, 2017), a PAT shift occurred daily from 1000-2200 over four long weekends. ED LOS, time to inpatient bed, time to physician initial assessment (PIA), number of British Columbia Ambulance Service (BCAS) offload delays, and number of patients who left without being seen (LWBS) were extracted from an administrative database. Results were retrospectively analyzed and compared to data from 1000-2200 on non-PAT days during the trial periods. Results: Median ED LOS decreased from 3.8 to 3.4 hours for high-acuity patients (CTAS 1-3), from 2.1 to 1.8 hours for low-acuity patients (CTAS 4-5), and from 9.3 to 8.0 hours for all admitted patients.
During PAT trial weekends, average time to PIA decreased by 65% (from 73 to 26 minutes for CTAS 2-5), the average number of daily BCAS offload delays decreased by 39% (from 2.3 to 1.4 delays per day), and the proportion of patients who LWBS fell from 2.4% to 1.7%. Conclusion: The implementation of PAT was associated with improvements in all five measures of ED throughput, providing a potential solution for EDOC at KGH. ED LOS was reduced compared to non-PAT control days, successfully meeting the suggested national targets. PAT could improve efficiency, allowing more patients to be seen in the ED, and increase the quality and safety of ED practice. Next, we hope to evaluate PAT prospectively, continuing to analyze these process measures, performing a cost-benefit analysis, and formally assessing ED staff and patient perceptions of the program.
Introduction: The administration of "to-go" medications in the Kelowna General Hospital Emergency Department was identified as an issue. Individual patients frequently received multiple "to-go" medication pre-packs, and the variability in "to-go" medication administration between providers was substantial. Recognizing the patient issues (addiction, dependency and diversion) and system issues (costs, risk), a team-based quality improvement initiative was instituted using a variety of quality improvement techniques. The aim was to reduce the number of "to-go" medications by half within a year. Methods: The project began in January 2015 and is ongoing. Multiple stakeholders were engaged within the emergency department, including leaders of the physician, nursing and pharmacy teams and an executive sponsor. Using change theory and traditional Plan-Do-Study-Act (PDSA) cycles, an iterative methodology was proposed. The primary outcome measure was the number of "to-go" medications administered; secondary measures included the numbers of opioid and benzodiazepine "to-go" prescriptions. The balancing measure was the number of narcotic prescriptions written. Physician prescribing practice and nursing practice were reviewed at meetings and huddles. Individualized reports were provided to physicians for self-review. Data were collated at baseline and then reviewed quarterly at meetings and huddles. Run charts were utilized along with raw data and individualized reports. Results: At baseline (January 2015), the number of "to-go" medications was 708. Over the next year, this value fell to 459, a 35% reduction. Two years later (June 2017), it had fallen to 142, an overall reduction of 80% in "to-go" medications. Secondary measures are currently under analysis. Further, no increase in prescribing of narcotics was seen during this time period.
Conclusion: The administration of "to-go" medications from the emergency department has significant individual and societal impact. Frequently, these medications are diverted, that is, sold for profit on the black market. Further, opioid prescribing is under increased scrutiny as the linkage between opioid prescriptions and addiction/dependency becomes more evident. This quality improvement initiative was successful for a number of reasons. First, we had strong engagement from the full emergency department clinical teams: the issue was identified collaboratively, and teamwork and participation were strong from the outset. Second, we used individual and aggregate data to provide feedback on a regular basis. Third, we had strong support from our executive sponsor(s), who championed the effort and presented the results locally and, now, throughout the Health Region.
To analyze whether electronically available comorbid conditions are risk factors for Centers for Disease Control and Prevention (CDC)-defined, hospital-onset Clostridium difficile infection (CDI) after controlling for antibiotic and gastric acid suppression therapy use.
Patients aged ≥18 years admitted to the University of Maryland Medical Center between November 7, 2015, and May 31, 2017.
Comorbid conditions were assessed using the Elixhauser comorbidity index. The Elixhauser comorbidity index and its component comorbid conditions were calculated using International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) codes extracted from electronic medical records. Associations between CDI and potential covariates, including antibiotic use, gastric acid suppression therapy use, and comorbid conditions, were first assessed bivariately and then estimated using log-binomial multivariable regression.
After controlling for antibiotic use, age, proton-pump inhibitor use, and histamine-blocker use, the Elixhauser comorbidity index was a significant risk factor for CDI. The risk of CDI increased by a factor of 1.26 (95% CI, 1.19–1.32) for each additional point on the Elixhauser score.
An increase in Elixhauser score is associated with CDI. Our study and other studies have shown that comorbid conditions are important risk factors for CDI. Electronically available comorbid conditions and scores like the Elixhauser index should be considered for risk-adjustment of CDC CDI rates.