The national implementation of competency-based medical education (CBME) has prompted an increased interest in identifying and tracking clinical and educational outcomes for emergency medicine training programs. For the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium, we developed recommendations for measuring outcomes in emergency medicine training in the context of CBME to assist educational leaders and systems designers in program evaluation.
We conducted a three-phase study to generate educational and clinical outcomes for emergency medicine (EM) education in Canada. First, we elicited expert and community perspectives on the best educational and clinical outcomes through a structured consultation process using a targeted online survey. We then qualitatively analyzed these responses to generate a list of suggested outcomes. Last, we presented these outcomes to a diverse assembly of educators, trainees, and clinicians at the CAEP Academic Symposium for feedback and endorsement through a voting process.
Academic Symposium attendees endorsed the measurement and linkage of CBME educational and clinical outcomes. Twenty-five outcomes (15 educational, 10 clinical) were derived from the qualitative analysis of the survey results, and the most important short- and long-term outcomes (both educational and clinical) were identified. These outcomes can be used to help measure the impact of CBME on the practice of emergency medicine in Canada and to ensure that it meets both trainee and patient needs.
The risk of sudden cardiac death (SCD) is doubled when a patient with chronic kidney disease (CKD) stage 5 starts haemodialysis. Low heart rate variability (HRV) has been reported to be independently associated with increased risk of SCD and all cardiac deaths in haemodialysis patients. Long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA; 20:5n-3, EPA and 22:6n-3, DHA) may exert anti-arrhythmic effects on cardiac myocytes. Haemodialysis patients have lower serum LC n-3 PUFA levels compared with populations without CKD. Few studies have investigated the relationship between LC n-3 PUFA and HRV in patients with CKD. This study aimed to characterise the variability of LC n-3 PUFA status in patients who recently commenced haemodialysis, and to investigate relationships between LC n-3 PUFA status and HRV. A cross-sectional study was conducted in adults aged 40–80 years with CKD commencing haemodialysis (within 6–10 weeks) (NRES research ethics committee ref: 14/LO/0186). On 2 separate study days, pre-dialysis blood samples were taken to measure fatty acid composition by GC, and HRV monitors (Actiheart, CamNtech Ltd, UK) were fitted after dialysis had started to monitor parameters of cardiac autonomic function during dialysis, during the night, and for a total of 24 h. Forty-five patients (mean age 58 y, SD 9, 20 females/25 males) completed data collection at least once; 91% presented with hypertension and 39% had type 2 diabetes. Sample mean omega-3 index (O3I; EPA + DHA as a % of fatty acids in erythrocyte membranes) was very low (3.45%, SD 1.25; median 3.26%, IQR 1.32); only 2 individuals had O3I > 5%. Variability in erythrocyte EPA (median 0.66%, IQR 0.42) and DHA (median 2.40%, IQR 1.32) was limited. Most HRV parameters did not significantly correlate with O3I following adjustment for confounders (e.g. age, BMI, β-blocker use). Plasma EPA significantly positively correlated with overall and longer-phase components of HRV and significantly negatively correlated with beat-to-beat variability over 24 h after full adjustment for confounders. This suggests that although higher circulating EPA concentrations were associated with better cardiac responsivity to environmental stimulation over 24 h, they were also associated with poorer parasympathetic tone (the predominant influence on beat-to-beat HRV). No correlations were observed between plasma DHA and HRV. The divergent pattern of relationships between plasma EPA versus DHA and HRV raises the hypothesis that patients commencing haemodialysis may have compromised conversion of EPA to DHA, which may impair vagally mediated regulation of cardiac autonomic function, a potential mechanism for the high risk of SCD.
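For reference, the omega-3 index used above is a simple sum; the sketch below computes it from erythrocyte EPA and DHA percentages (function name and example values are illustrative, and note that medians of components need not sum to the median of the total):

```python
# Minimal sketch of the omega-3 index (O3I) defined above: EPA + DHA as a
# percentage of total erythrocyte membrane fatty acids. Not study code.

def omega3_index(epa_pct: float, dha_pct: float) -> float:
    """Return O3I given EPA and DHA as % of erythrocyte fatty acids."""
    return epa_pct + dha_pct

# Using the sample medians quoted above (EPA 0.66%, DHA 2.40%):
o3i = omega3_index(0.66, 2.40)
print(f"O3I = {o3i:.2f}%, exceeds 5% threshold: {o3i > 5}")  # 3.06%, False
```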
Interesterified (IE) fats are widely used to replace partially hydrogenated fats as hard fats with the functional and sensory properties needed for spreads/margarines, baked goods, and confectionery, while avoiding the health hazards of trans fats. Detailed mechanistic work to determine the metabolic effects of interesterification of commonly consumed hard fats has not yet been done. Earlier studies using less commonly consumed fats have shown either a neutral or a lowering effect on postprandial lipaemia. We investigated postprandial lipaemia, lipoprotein remodelling, and triacylglycerol-rich lipoprotein (TRL) fraction apolipoprotein concentrations following a common IE blend of palm oil/kernel fractions versus its non-IE counterpart, alongside a reference monounsaturated (MUFA) oil. A 3-armed, double-blind, randomized controlled trial (clinicaltrials.gov NCT03191513) in healthy adults (n = 20; 10 men, 10 women) aged 45–75 y assessed effects of single meals (897 kcal, 50 g fat, 16 g protein, 88 g carbohydrate) on postprandial plasma triacylglycerol (TAG) concentrations, lipoprotein profiles, and TRL fraction apolipoprotein B48 and TAG concentrations. Test fats were IE 80:20 palm stearin/palm kernel fat, the equivalent non-IE fat, and a high-MUFA reference oil (rapeseed oil, RO). Blood was collected at baseline and hourly for 8 h. Linear mixed modelling was performed, adjusting for treatment order and baseline values (ver. 24.0; SPSS Inc., Chicago, IL, USA). Total 8 h incremental areas under the curve (iAUC) for plasma TAG concentrations were lower following IE and non-IE compared with RO (mean difference in iAUC: non-IE vs. RO -1.8 mmol/L.h (95% CI -3.3, -0.2); IE vs. RO -2.6 mmol/L.h (95% CI -5.3, 0.0)), but iAUCs for IE and non-IE were not significantly different. There were no differences between IE and non-IE for chylomicron fraction apoB48 concentrations or the TAG:apoB48 ratio. No differences were observed between IE and non-IE for lipoprotein (VLDL, HDL, LDL) particle size or sub-class particle concentrations. However, LDL particle diameters were reduced at 5 and 6 h following IE vs. RO (P < 0.05). XXL- (including chylomicron remnants and VLDL particles), XL-, and L-VLDL particle concentrations (average diameters > 75, 64, and 53.6 nm, respectively) were higher following IE and non-IE vs. RO at 6 h (P < 0.05) and 8 h postprandially (P < 0.005–0.05). In conclusion, both IE and non-IE palmitic acid-rich fats generated a greater preponderance of pro-atherogenic large TRL remnant particles in the late postprandial phase relative to an oleic acid-rich oil. However, the process of interesterification did not modify postprandial TAG response or lipoprotein metabolism.
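As an illustration of the iAUC outcome, the sketch below computes an 8 h incremental area under the curve by the trapezoidal rule from hourly TAG concentrations. The values and the choice of trapezoidal integration are assumptions, since the abstract does not state the iAUC algorithm (and some iAUC definitions truncate negative increments at zero):

```python
import numpy as np

# Illustrative sketch (not the authors' SPSS workflow): an 8 h incremental
# area under the curve (iAUC, mmol/L.h) for plasma TAG, computed by the
# trapezoidal rule on increments above the baseline (t = 0 h) value.

def iauc(times_h, conc):
    t = np.asarray(times_h, dtype=float)
    inc = np.asarray(conc, dtype=float) - conc[0]   # increment above baseline
    return float(np.sum((inc[1:] + inc[:-1]) / 2 * np.diff(t)))

times = np.arange(0, 9)                                  # baseline + 8 hourly samples
tag = [1.1, 1.4, 1.9, 2.4, 2.6, 2.3, 1.9, 1.5, 1.2]      # hypothetical mmol/L
print(f"iAUC = {iauc(times, tag):.2f} mmol/L.h")
```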
Self-reported activity restriction is an established correlate of depression in dementia caregivers (dCGs). It is plausible that the daily distribution of objectively measured activity is also altered in dCGs with depression symptoms; if so, such activity characteristics could provide a passively measurable marker of depression or identify specific times to target preventive interventions. We therefore investigated how levels of activity throughout the day differed in dCGs with and without depression symptoms, and then tested whether any such differences predicted changes in symptoms 6 months later.
Design, setting, participants, and measurements:
We examined 56 dCGs (mean age = 71, standard deviation (SD) = 6.7; 68% female) and used clustering to identify subgroups with distinct depression symptom levels, leveraging baseline Center for Epidemiologic Studies Depression Scale–Revised (CESD-R) and Patient Health Questionnaire-9 (PHQ-9) measures, as well as a PHQ-9 score from 6 months later. Using wrist actigraphy (mean recording length = 12.9 days, minimum = 6 days), we calculated average hourly activity levels and then assessed when activity levels related to depression symptoms and to changes in symptoms 6 months later.
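The abstract names clustering but not the algorithm; below is a hedged sketch using k-means (an assumption) on the three depression measures described above, with synthetic scores standing in for the real data:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 56 dCGs: columns are baseline CESD-R,
# baseline PHQ-9, and 6-month PHQ-9 scores.
rng = np.random.default_rng(0)
scores = rng.normal(loc=[10.0, 5.0, 5.0], scale=[6.0, 3.0, 3.0], size=(56, 3))

X = StandardScaler().fit_transform(scores)        # put scales on equal footing
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))                        # subgroup sizes, cf. 36%/64%
```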
Clustering identified subgroups characterized by: (1) no/minimal symptoms (36%) and (2) depression symptoms (64%). After multiple comparison correction, the group of dCGs with depression symptoms was less active from 8 to 10 AM (Cohen’s d ≤ −0.9). These morning activity levels predicted the degree of symptom change on the PHQ-9 6 months later (per SD unit β = −0.8, 95% confidence interval: −1.6, −0.1, p = 0.03) independent of self-reported activity restriction and other key factors.
These novel findings suggest that morning activity may protect dCGs from depression symptoms. Future studies should test whether helping dCGs get active in the morning influences the other features of depression in this population (i.e. insomnia, intrusive thoughts, and perceived activity restriction).
Surgical site infections (SSIs) are common surgical complications that lead to increased costs. Depending on payer type, however, they do not necessarily translate into deficits for every hospital.
We investigated how SSIs influence the contribution margin in 2 reimbursement systems based on diagnosis-related groups (DRGs).
This preplanned observational health cost analysis was nested within a Swiss multicenter randomized controlled trial on the timing of preoperative antibiotic prophylaxis in general surgery between February 2013 and August 2015. A simulation of cost and income in the National Health Service (NHS) England reimbursement system was conducted.
Of 5,175 patients initially enrolled, 4,556 had complete cost and income data as well as SSI status available for analysis. SSIs occurred in 228 of 4,556 patients (5%). Patients with SSIs were older, more often male, and more often had compulsory insurance; they also had higher BMIs, longer operations, and more frequent ICU admissions. SSIs led to both higher hospital cost and higher income, but the median contribution margin was negative in SSI cases: CHF −2,045 (IQR, −12,800 to 4,848) versus CHF 895 (IQR, −2,190 to 4,158) in non-SSI cases. Higher ASA class and private insurance were associated with higher contribution margins in SSI cases, whereas ICU admission led to greater deficits. Private insurance had a strong increasing effect on contribution margin at the 10th, 50th (median), and 90th percentiles of its distribution, leading to overall positive contribution margins for SSI cases in Switzerland. The NHS England simulation with 3,893 patients revealed similar but less pronounced effects of SSI on contribution margin.
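For readers unfamiliar with the outcome measure, the contribution margin here is case income minus case-related cost; the arithmetic below uses hypothetical figures chosen only to mirror the sign of the reported median:

```python
# Illustrative arithmetic only (hypothetical CHF figures): a case's
# contribution margin is its reimbursement income minus its case-related
# cost, so an SSI that raises cost more than income drives the margin negative.
income = 18_000   # hypothetical DRG reimbursement for one SSI case
cost = 20_045     # hypothetical case-related cost
print(f"contribution margin = CHF {income - cost}")  # -2045, cf. the median above
```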
Depending on payer type, reimbursement systems based on DRGs offer only minor financial incentives for the prevention of SSIs.
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
The primary outcome of interest was 30-day SSI rate.
A total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
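As a sketch of how the reported odds ratios arise, the snippet below fits a logistic regression on synthetic data with the predictors named above; each odds ratio is the exponential of its model coefficient. All data and coefficients here are illustrative, not NSQIP values:

```python
import numpy as np
import statsmodels.api as sm

# Sketch of the risk-factor analysis on synthetic data (the NSQIP records
# are not reproduced here); a logistic model of 30-day SSI on the predictors
# named above, with odds ratios recovered as exp(coefficient).
rng = np.random.default_rng(1)
n = 5893
X = np.column_stack([
    rng.integers(0, 2, n),        # disseminated cancer (0/1)
    rng.integers(0, 2, n),        # ASA score >= 3 (0/1)
    rng.integers(0, 2, n),        # COPD (0/1)
    rng.normal(180, 60, n),       # procedure duration (min)
])
logit_p = -3.0 + 0.51 * X[:, 0] + 0.34 * X[:, 1] + 0.47 * X[:, 2] + 0.002 * X[:, 3]
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)  # simulated SSI

fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(np.exp(fit.params[1:]))     # odds ratios, cf. 1.66, 1.41, 1.60 above
```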
Patients with disseminated cancer are at higher risk of developing SSI. ASA score ≥3, COPD, and longer duration of surgery also predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) when generating risk-adjusted outcomes.
In the context of an austere financial climate, local health care budget holders are increasingly expected to make and enact decisions to decommission (reduce or stop providing) services. However, little is currently known about the experiences of those seeking to decommission. This paper presents the first national study of decommissioning in the English National Health Service, drawing on multiple methods: an interview-based review of the contemporary policy landscape of health care decommissioning; a national online survey of commissioners of health care services responsible for managing and enacting budget allocation decisions locally; and illustrative vignettes provided by those who have led decommissioning activities. Findings are presented and discussed in relation to four themes: national-local relationships; organisational capacity and resources for decommissioning; the extent and nature of decommissioning; and intended outcomes of decommissioning. Whilst local commissioners are unlikely to implement decommissioning decisions ‘successfully’ unless aspects of engagement, local context, and outcomes are addressed, it remains unclear what ‘success’ looks like in a decommissioning process.
We describe the design and performance of the Engineering Development Array, a low-frequency radio telescope comprising 256 dual-polarisation dipole antennas working as a phased array. The Engineering Development Array was conceived, developed, and deployed in just 18 months via re-use of Square Kilometre Array precursor technology and expertise, specifically from the Murchison Widefield Array radio telescope. Using drift scans and a model for the sky brightness temperature at low frequencies, we have derived the Engineering Development Array’s receiver temperature as a function of frequency. The Engineering Development Array is shown to be sky-noise limited over most of the frequency range measured between 60 and 240 MHz. By using the Engineering Development Array in interferometric mode with the Murchison Widefield Array, we used calibrated visibilities to measure the absolute sensitivity of the array. The measured array sensitivity matches very well with a model based on the array layout and measured receiver temperature. The results demonstrate the practicality and feasibility of using Murchison Widefield Array-style precursor technology for Square Kilometre Array-scale stations. The modular architecture of the Engineering Development Array allows upgrades to be rolled out in a staged approach. Future improvements include replacing the second-stage beamformer with a fully digital system and transitioning to RF-over-fibre for the signal output from the first-stage beamformers.
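For context, the standard relations linking receiver temperature to array sensitivity are given below; these are textbook radiometer formulas with conventional symbols, not equations quoted from the paper:

```latex
% SEFD links system temperature and effective area; \Delta S is the
% thermal-noise sensitivity for bandwidth \Delta\nu and integration time \tau.
% "Sky-noise limited" corresponds to T_sky dominating T_rcv.
\[
  \mathrm{SEFD} = \frac{2 k_B T_{\mathrm{sys}}}{A_{\mathrm{eff}}},
  \qquad
  \Delta S = \frac{\mathrm{SEFD}}{\sqrt{2\,\Delta\nu\,\tau}},
  \qquad
  T_{\mathrm{sys}} = T_{\mathrm{rcv}} + T_{\mathrm{sky}} .
\]
```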
Low heart rate variability (HRV) predicts sudden cardiac death. Long-chain (LC) n-3 PUFA (C20–C22) status is positively associated with HRV. This cross-sectional study investigated whether vegans aged 40–70 years (n 23), whose diets are naturally free from EPA (20 : 5n-3) and DHA (22 : 6n-3), have lower HRV compared with omnivores (n 24). Proportions of LC n-3 PUFA in erythrocyte membranes and plasma fatty acids, and concentrations of plasma LC n-3 PUFA-derived lipid mediators, were significantly lower in vegans. Day-time interbeat intervals (IBI), adjusted for physical activity, age, BMI and sex, were significantly shorter in vegans compared with omnivores (mean difference −67 ms; 95 % CI −130, −3·4, P<0·05), but there were no significant differences over 24 h or during sleep. Vegans had higher overall HRV, measured as the 24 h standard deviation of normal-to-normal intervals (SDNN) (mean adjusted difference 27 ms; 95 % CI 1, 52, P=0·039). Conversely, vegans presented with decreased 8 h day-time HRV: mean adjusted difference in SDNN −20 ms; 95 % CI −37, −3, P=0·021, with no differences during nocturnal sleep. Day-time parameters of beat-to-beat HRV (root mean square of successive differences between adjacent normal-to-normal intervals, percentage of adjacent normal-to-normal intervals that differ by >50 ms, and high-frequency power) were similarly lower in vegans, with no differences during sleep. In conclusion, vegans have higher 24 h SDNN, but lower day-time HRV and shorter day-time IBI relative to comparable omnivores. Vegans may have reduced availability of precursors for pro-resolving lipid mediators; it remains to be determined whether there is a direct link with impaired cardiac function in populations with low n-3 status.
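As an illustration of the time-domain HRV measures named above, here is a minimal Python sketch (synthetic interbeat intervals, not study data) computing SDNN, RMSSD, and pNN50:

```python
import numpy as np

# Minimal sketch of the time-domain HRV measures named above, computed
# from a synthetic series of normal-to-normal (NN) intervals in milliseconds.
rng = np.random.default_rng(2)
ibi = 900 + rng.normal(0, 40, size=1000)         # hypothetical NN intervals (ms)

sdnn = np.std(ibi, ddof=1)                       # SDNN: overall variability
diffs = np.diff(ibi)
rmssd = np.sqrt(np.mean(diffs ** 2))             # RMSSD: beat-to-beat variability
pnn50 = 100 * np.mean(np.abs(diffs) > 50)        # pNN50: % successive diffs > 50 ms

print(f"SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms, pNN50 {pnn50:.1f}%")
```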
Efficient natural dispersal of herbicide-resistance alleles via seed and pollen can markedly accelerate the spread of herbicide-resistant weed populations across an agroecoregion. Studies were conducted in western Canada in 2014 and 2015 to investigate pollen- and seed-mediated gene flow in kochia. Pollen-mediated gene flow (PMGF) from glyphosate-resistant (GR) to non-GR kochia was quantified in a field trial (hub-and-spoke design) at Saskatoon, Saskatchewan. Seed-mediated gene flow of acetolactate synthase (ALS) inhibitor-resistant kochia as a function of tumbleweed speed and distance was estimated in cereal stubble fields at Lethbridge, Alberta, and Scott, Saskatchewan. Regression analysis indicated that outcrossing from GR to adjacent non-GR kochia ranged from 5.3 to 7.5%, declining exponentially to 0.1 to 0.4% at a distance of 96 m. However, PMGF was significantly influenced by the prevailing wind direction during pollination (maximum of 11 to 17% outcrossing downwind). Seed dropped by tumbleweeds varied with distance and plant speed, approaching 90% or more (ca. 100,000 seeds or more) at distances of up to 1,000 m and plant speeds of up to 300 cm s−1. This study highlights the efficient proximal (pollen) and distal (seed) gene movement of this important GR weed.
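The exponential decline of outcrossing with distance can be illustrated with a simple curve fit; the data points below are hypothetical values consistent with the range quoted above (~6% adjacent, ~0.25% at 96 m), not the study's raw observations:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative fit of the exponential decline of outcrossing with distance;
# the exact regression model used by the study is not reproduced here.

def decay(d, a, b):
    return a * np.exp(-b * d)

distance = np.array([0.0, 12.0, 24.0, 48.0, 96.0])     # m
outcross = np.array([6.4, 4.5, 2.6, 1.0, 0.25])        # % outcrossing (hypothetical)

(a, b), _ = curve_fit(decay, distance, outcross, p0=(6.0, 0.03))
print(f"outcrossing(d) ~ {a:.1f} * exp(-{b:.3f} d)")   # % at d metres
```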
This article reviews the most likely mechanisms of transmission of the commonly encountered respiratory viruses (influenza, respiratory syncytial virus, parainfluenza, and rhinovirus), herpesviruses, and hepatitis viruses, and presents the prevention and control guidelines currently in use at Strong Memorial Hospital.
In western Canada, more money is spent on herbicides targeting wild oat than on those targeting any other weed species, and wild oat resistance to herbicides is the most widespread resistance issue. A direct-seeded field experiment was conducted from 2010 to 2014 at eight Canadian sites to determine the effects of combinations of crop life cycle, crop species, crop seeding rate, crop usage, and herbicide rate on wild oat management and canola yield. Combining 2× seeding rates of early-cut barley silage with 2× seeding rates of winter cereals and excluding wild oat herbicides for 3 of 5 yr (2011 to 2013) often led to wild oat density, aboveground wild oat biomass, wild oat seed density in the soil, and canola yield similar to those of a repeated canola–wheat rotation under a full wild oat herbicide rate regime. Wild oat was similarly well managed after 3 yr of perennial alfalfa without wild oat herbicides. Forgoing wild oat herbicides in only 2 of 5 yr of exclusively summer annual crop rotations resulted in higher wild oat density, biomass, and seed banks. Management systems that effectively combine diverse and optimal cultural practices against weeds, and that limit herbicide use, reduce selection pressure for weed resistance to herbicides and prolong the utility of threatened herbicide tools.