Background: Nosocomial central-line–associated bloodstream infections (CLABSIs) are an important cause of morbidity and mortality in hospitalized patients. CLABSI surveillance establishes rates for internal and external comparison, identifies risk factors, and allows assessment of interventions. Objectives: To determine the frequency of CLABSIs among adult patients admitted to intensive care units (ICUs) in CNISP hospitals and to evaluate trends over time. Methods: CNISP is a collaborative effort of the Canadian Hospital Epidemiology Committee, the Association of Medical Microbiology and Infectious Disease Canada, and the Public Health Agency of Canada. Since 1995, CNISP has conducted hospital-based sentinel surveillance of healthcare-associated infections. Overall, 55 CNISP hospitals participated in ≥1 year of CLABSI surveillance. Adult ICUs are categorized as mixed ICUs or cardiovascular (CV) surgery ICUs. Data were collected using standardized definitions and collection forms. Line-day denominators for each participating ICU were collected. Negative-binomial regression was used to test for linear trends, with robust standard errors to account for clustering by hospital. We used the Fisher exact test to compare binary variables. Results: Each year, 28–42 adult ICUs participated in surveillance (27–37 mixed, 6–8 CV surgery). In both mixed ICUs and CV-ICUs, rates remained relatively stable between 2011 and 2018 (Fig. 1). In mixed ICUs, CLABSI rates were 1.0 per 1,000 line days in both 2011 and 2018 (test for linear trend, P = .66). In CV-ICUs, CLABSI rates were 1.1 per 1,000 line days in 2011 and 0.8 per 1,000 line days in 2018 (P = .19). Case age and gender distributions were consistent across the surveillance period. The 30-day all-cause mortality rate was 29% in both 2011 and 2018 (annual range, 29%–35%).
Between 2011 and 2018, the percentage of isolated microorganisms that were coagulase-negative staphylococci (CONS) decreased from 31% to 18% (P = .004). The percentage of other gram-positive organisms increased from 32% to 37% (P = .34); Bacillus increased from 0% to 4% of isolates, and methicillin-susceptible Staphylococcus aureus from 2% to 6%. The percentage of gram-negative organisms increased from 21% to 27% (P = .19). Yeast represented 16% of isolates in 2011 and 18% in 2018; however, the percentage of yeasts that were Candida albicans decreased over time (58% of yeasts in 2011 and 30% in 2018; P = .04). Between 2011 and 2018, the most commonly identified microorganisms in each year were CONS (18% in 2018) and Enterococcus spp (18% in 2018). Conclusions: Ongoing CLABSI surveillance has shown stable rates of CLABSI in adult ICUs from 2011 to 2018. The causative microorganisms have changed, with CONS decreasing from 31% to 18%.
Funding: CNISP is funded by the Public Health Agency of Canada.
Disclosures: Allison McGeer reports funds to her for studies, for which she is the principal investigator, from Pfizer and Merck, as well as consulting fees from Sanofi-Pasteur, Sunovion, GSK, Pfizer, and Cidara.
Pollen-mediated gene flow (PMGF) refers to the transfer of genetic information (alleles) from one plant to another compatible plant. With the evolution of herbicide-resistant (HR) weeds, PMGF plays an important role in the transfer of resistance alleles from HR to susceptible weeds; however, little attention has been given to this topic. The objective of this work was to review reproductive biology, PMGF studies, and interspecific hybridization, as well as the potential for herbicide resistance alleles to transfer, in economically important broadleaf weeds including common lambsquarters, giant ragweed, horseweed, kochia, Palmer amaranth, and waterhemp. The PMGF studies involving these species reveal that transfer of herbicide resistance alleles routinely occurs under field conditions and is influenced by several factors, such as reproductive biology, environment, and production practices. Interspecific hybridization studies within Amaranthus and Ambrosia spp. show that herbicide resistance allele transfer is possible between species of the same genus, but at relatively low levels. The widespread occurrence of HR weed populations and high genetic diversity is at least partly due to PMGF, particularly in dioecious species such as Palmer amaranth and waterhemp compared with monoecious species such as common lambsquarters and horseweed. Prolific pollen production in giant ragweed contributes to PMGF. Kochia, a wind-pollinated species, can efficiently disseminate herbicide resistance alleles via both PMGF and tumbleweed seed dispersal, resulting in the widespread occurrence of multiple HR kochia populations. The findings from this review verify that intra- and interspecific gene flow can occur and, even at a low rate, could contribute to the rapid spread of herbicide resistance alleles. More research is needed to determine the role of PMGF in transferring multiple herbicide resistance alleles at the landscape level.
A major challenge in linking conservation science and policy is deciding how, and when, to offer relevant science to decision-makers to have the greatest impact on decisions. This chapter argues it is a question of alignment – of selecting the right knowledge to address the needs of decision-makers, ensuring that knowledge is accessible to them, and articulating it within their decision-making processes. The chapter describes three mechanisms to enhance this alignment: decision support tools; active knowledge exchange mechanisms; and large-scale scientific assessments. For each, we provide examples and draw out guidelines regarding circumstances in which the mechanism is likely to be most effective. No single mechanism is consistently best at aligning evidence with policy and practice. Each has strengths and weaknesses, and can be applied in different circumstances and at different scales. The chapter ends with a call for these mechanisms that link synthesised evidence with policy and practice decisions to be funded sufficiently, alongside environmental research, to enable adherence to core values of salience, legitimacy, credibility and transparency.
Canadian hospitals were made aware of the risk of Mycobacterium chimaera infection associated with heater-cooler units (HCUs) through alerts issued by the US Food and Drug Administration (FDA) and the US Centers for Disease Control and Prevention (CDC). In response, most hospitals conducted retrospective reviews for infections, informed exposed patients, and initiated a requirement for informed consent with HCU use.
Life has been described as information flowing in molecular streams (Dawkins, 1996). Our growing understanding of the impact of horizontal gene transfer on evolutionary dynamics reinforces this fluid-like flow of molecular information (Joyce, 2002). The diversity of nucleic acid sequences, those known and yet to be characterized across Earth's varied environments, along with the vast repertoire of catalytic and structural proteins, presents as more of a dynamic molecular river than a tree of life. These informational biopolymers function as a mutualistic union so universal as to have been termed the Central Dogma (Crick, 1958). It is the distinct folding dynamics (the digital-like base pairing dominating nucleic acids, and the environmentally responsive and diverse range of analog-like interactions dictating protein folding; Goodwin et al., 2012) that provide the basis for the mutualism. The intertwined functioning of these analog and digital forms of information (Goodwin et al., 2012), unified within diverse chemical networks, is heralded as the Darwinian threshold of cellular life (Woese, 2002).
The discovery of prion diseases (Chien et al., 2004; Jablonka and Raz, 2009; Paravastu et al., 2008) introduced the paradigm of protein templates that propagate conformational information, suggesting a new context for Darwinian evolution. When taking both protein and nucleic acid molecular evolution into consideration (Cairns-Smith, 1966; Joyce, 2002), the conceptual framework for chemical evolution can be generalized into three orthogonal dimensions, as shown in Figure 5.1 (Goodwin et al., 2014). The 1st dimension manifests structural order through covalent polymerization reactions and includes chain length, sequence, and linkage chemistry inherent to a dynamic chemical network. The 2nd dimension extends the order in dynamic conformational networks through noncovalent interactions of the polymers. This dimension includes intramolecular and intermolecular forces, from macromolecular folding to supramolecular assembly to multicomponent quaternary structure. Folding in this 2nd dimension certainly depends on the primary polymer sequence, and the folding/assembly diversity yields an additional set of environmentally constrained supramolecular folding codes. For example, double-stranded DNA assemblies are dominated by the rules of complementary base pairing, while the self-propagating conformations of prions are based on additional noncovalent, environmentally dependent interactions.
Hip and knee arthroplasty infections are associated with considerable healthcare costs. The merits of reducing the postoperative surveillance period from 1 year to 90 days have been debated.
To report the first pan-Canadian hip and knee periprosthetic joint infection (PJI) rates and to describe the implications of a shorter (90-day) postoperative surveillance period.
Prospective surveillance for infection following hip and knee arthroplasty was conducted by hospitals participating in the Canadian Nosocomial Infection Surveillance Program (CNISP) using standard surveillance definitions.
Overall hip and knee PJI rates were 1.64 and 1.52 per 100 procedures, respectively. Deep incisional and organ-space hip and knee PJI rates were 0.96 and 0.71, respectively. In total, 93% of hip PJIs and 92% of knee PJIs were identified within 90 days, with a median time to detection of 21 days. However, 11%–16% of deep incisional and organ-space infections were not detected within 90 days. This rate was reduced to 3%–4% at 180 days post procedure. Anaerobic and polymicrobial infections had the shortest median time from procedure to detection (17 and 18 days, respectively) compared with infections due to other microorganisms, including Staphylococcus aureus.
PJI rates were similar to those reported elsewhere, although differences in national surveillance systems limit direct comparisons. Our results suggest that a postoperative surveillance period of 90 days will detect the majority of PJIs; however, up to 16% of deep incisional and organ-space infections may be missed. Extending the surveillance period to 180 days could allow for a better estimate of disease burden.
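The effect of the choice of surveillance window described above can be illustrated with a short sketch: given days from procedure to infection detection, compute the fraction captured by 90-day and 180-day cutoffs. The detection times below are invented for illustration only and are not the CNISP data.

```python
# Illustration of applying a postoperative surveillance-window cutoff
# to hypothetical times from procedure to PJI detection (in days).
import numpy as np

days_to_detection = np.array([10, 17, 18, 21, 25, 40, 60, 85, 120, 200])

def fraction_within(days, window):
    """Share of infections detected on or before the window's last day."""
    return float(np.mean(days <= window))

within_90 = fraction_within(days_to_detection, 90)     # captured by 90 days
within_180 = fraction_within(days_to_detection, 180)   # captured by 180 days
median_days = float(np.median(days_to_detection))      # median time to detection
```

With these invented times, the 90-day window captures 8 of 10 infections and the 180-day window 9 of 10, mirroring how extending the window reduces the share of missed deep incisional and organ-space infections.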
To explore whether surgical teams with greater stability among their members (ie, members have worked together more in the past) experience lower rates of sharps-related percutaneous blood and body fluid exposures (BBFE) during surgical procedures.
A 10-year retrospective cohort study.
A single large academic teaching hospital.
Surgical teams participating in surgical procedures (n=333,073) performed during 2001–2010, together with 2,113 reported percutaneous BBFE, were analyzed.
A social network measure (referred to as the team stability index) was used to quantify the extent to which surgical team members worked together in the previous 6 months. Poisson regression was used to examine the effect of team stability on the risk of BBFE while controlling for procedure characteristics and accounting for procedure duration. Separate regression models were generated for percutaneous BBFE involving suture needles and those involving other surgical devices.
The team stability index was associated with the risk of percutaneous BBFE (adjusted rate ratio, 0.93 [95% CI, 0.88–0.97]). However, the association was stronger for percutaneous BBFE involving devices other than suture needles (adjusted rate ratio, 0.92 [95% CI, 0.85–0.99]) than for exposures involving suture needles (0.96 [0.88–1.04]).
Greater team stability may reduce the risk of percutaneous BBFE during surgical procedures, particularly for exposures involving devices other than suture needles. Additional research should be conducted on the basis of primary data gathered specifically to measure qualities of relationships among surgical team personnel.
To use a unique multicomponent administrative data set assembled at a large academic teaching hospital to examine the risk of percutaneous blood and body fluid (BBF) exposures occurring in operating rooms.
A 10-year retrospective cohort design.
A single large academic teaching hospital.
All surgical procedures (n=333,073) performed in 2001–2010, as well as 2,113 reported BBF exposures, were analyzed.
Crude exposure rates were calculated; Poisson regression was used to analyze risk factors and account for procedure duration. BBF exposures involving suture needles were examined separately from those involving other device types to examine possible differences in risk factors.
The overall rate of reported BBF exposures was 6.3 per 1,000 surgical procedures (2.9 per 1,000 surgical hours). BBF exposure rates increased with estimated patient blood loss (17.7 exposures per 1,000 procedures with 501–1,000 cc blood loss and 26.4 exposures per 1,000 procedures with >1,000 cc blood loss), number of personnel working in the surgical field during the procedure (34.4 exposures per 1,000 procedures having ≥15 personnel ever in the field), and procedure duration (14.3 exposures per 1,000 procedures lasting 4 to <6 hours, 27.1 exposures per 1,000 procedures lasting ≥6 hours). Regression results showed associations were generally stronger for suture needle–related exposures.
Results largely support other studies found in the literature. However, additional research should investigate differences in risk factors for BBF exposures associated with suture needles and those associated with all other device types.
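As a quick arithmetic check, the crude exposure rate reported in the Results can be reproduced from the totals given in the abstract; the implied total of surgical hours is a back-calculation, not a figure stated in the study.

```python
# Figures taken from the abstract above.
exposures = 2113
procedures = 333073

# Crude rate per 1,000 surgical procedures.
rate_per_1000_procedures = 1000 * exposures / procedures
print(round(rate_per_1000_procedures, 1))  # 6.3, matching the reported rate

# The reported 2.9 exposures per 1,000 surgical hours implies roughly
# this many total surgical hours over the study period (back-calculated).
implied_hours = 1000 * exposures / 2.9
```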
This study explored how older Punjabi-speaking South-Asian immigrants (four focus groups; 33 participants) in Surrey, British Columbia, perceive oral health and related problems. Content analysis revealed two umbrella themes: (a) interpretations of mouth conditions and (b) challenges to oral health. The umbrella themes had four sub-themes: damage caused by heat (wai), disturbances caused by caries, coping with dentures, and quality of life. Three challenges were considered: home remedies, Western dentistry, and difficulties accessing dentists. Participants explained oral diseases in terms of a systemic infection (resha), and preferred to decrease imbalances of wai in the mouth with home remedies from India. We conclude that older Punjabi-speaking immigrants interpret oral health and disease in the context of both Western and Ayurvedic traditions, and that they manage dental problems with a mix of traditional remedies supplemented, if possible, by elective oral health care in India, and by emergency dental care in Canada.
Pre-eclampsia is a serious hypertensive condition of pregnancy associated with high maternal and fetal morbidity and mortality. Se intake or status has been linked to the occurrence of pre-eclampsia by our own work and that of others. We hypothesised that a small increase in the Se intake of UK pregnant women of inadequate Se status would protect against the risk of pre-eclampsia, as assessed by biomarkers of pre-eclampsia. In a double-blind, placebo-controlled, pilot trial, we randomised 230 primiparous pregnant women to Se (60 μg/d, as Se-enriched yeast) or placebo treatment from 12 to 14 weeks of gestation until delivery. Whole-blood Se concentration was measured at baseline and 35 weeks, and plasma selenoprotein P (SEPP1) concentration at 35 weeks. The primary outcome measure of the present study was serum soluble vascular endothelial growth factor receptor-1 (sFlt-1), an anti-angiogenic factor linked with the risk of pre-eclampsia. Other serum/plasma components related to the risk of pre-eclampsia were also measured. Between 12 and 35 weeks, whole-blood Se concentration increased significantly in the Se-treated group but decreased significantly in the placebo group. At 35 weeks, significantly higher concentrations of whole-blood Se and plasma SEPP1 were observed in the Se-treated group than in the placebo group. In line with our hypothesis, the concentration of sFlt-1 was significantly lower at 35 weeks in the Se-treated group than in the placebo group in participants in the lowest quartile of Se status at baseline (P = 0·039). None of the secondary outcome measures was significantly affected by treatment. The present finding that Se supplementation has the potential to reduce the risk of pre-eclampsia in pregnant women of low Se status needs to be validated in an adequately powered trial.