Phytase has long been used to decrease the inorganic phosphorus (Pi) input in poultry diets. The current study was conducted to investigate the effects of Pi supplementation on laying performance, egg quality and phosphate–calcium metabolism in Hy-Line Brown laying hens fed phytase. Layers (n = 504, 29 weeks old) were randomly assigned to seven treatments with six replicates of 12 birds. The corn–soybean meal-based diet contained 0.12% non-phytate phosphorus (nPP), 3.8% calcium, 2415 IU/kg vitamin D3 and 2000 FTU/kg phytase. Inorganic phosphorus (in the form of mono-dicalcium phosphate) was added to the basal diet to construct seven experimental diets; the final dietary nPP levels were 0.12%, 0.17%, 0.22%, 0.27%, 0.32%, 0.37% and 0.42%. The feeding trial lasted 12 weeks (hens from 29 to 40 weeks of age). Laying performance (housed laying rate, egg weight, egg mass, daily feed intake and feed conversion ratio) was calculated weekly. Egg quality (egg shape index, shell strength, shell thickness, albumen height, yolk colour and Haugh units), serum parameters (calcium, phosphorus, parathyroid hormone, calcitonin and 1,25-dihydroxyvitamin D), tibia quality (breaking strength, and calcium, phosphorus and ash contents), intestinal gene expression (type IIb sodium-dependent phosphate cotransporter, NaPi-IIb) and phosphorus excretion were determined at the end of the trial. No differences were observed in laying performance, egg quality, serum parameters or tibia quality. Hens fed 0.17% nPP had increased (P < 0.01) duodenal NaPi-IIb expression compared to all other treatments. Phosphorus excretion increased linearly with dietary nPP (phosphorus excretion = 1.7916 × nPP + 0.2157; R2 = 0.9609, P = 0.001). In conclusion, corn–soybean meal-based diets containing 0.12% nPP, 3.8% calcium, 2415 IU/kg vitamin D3 and 2000 FTU/kg phytase would meet the requirements for egg production in Hy-Line Brown laying hens (29 to 40 weeks of age).
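The reported linear fit lends itself to a quick prediction of phosphorus excretion at any dietary nPP level; a minimal sketch of the published regression (the function name is illustrative, nPP is expressed as a percentage of the diet, and excretion is in the units used in the study):

```python
def predicted_p_excretion(npp: float) -> float:
    """Phosphorus excretion predicted by the reported fit:
    excretion = 1.7916 * nPP + 0.2157 (R^2 = 0.9609)."""
    return 1.7916 * npp + 0.2157

# Predicted excretion across the experimental nPP range
for npp in (0.12, 0.27, 0.42):
    print(f"nPP {npp:.2f}% -> {predicted_p_excretion(npp):.3f}")
```

At the lowest (0.12%) and highest (0.42%) experimental nPP levels, the fitted line spans roughly a two-fold difference in predicted excretion, which is the basis of the abstract's conclusion that the low-nPP diet reduces phosphorus output.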
Introduction: Patients with poorly controlled diabetes often visit the emergency department (ED) for treatment of hyperglycemia. While previous qualitative studies have examined the patient experience of diabetes as a chronic illness, there are no studies describing patients’ perceptions of ED care for hyperglycemia. The objective of this study was to explore the patient experience of ED hyperglycemia visits and to characterize perceived barriers to adequate glycemic control post-discharge. Methods: This study was conducted at a tertiary care academic centre in London, Ontario. A qualitative constructivist grounded theory methodology was used to understand the experience of adult patient partners who had had an ED hyperglycemia visit. Patient partners, purposively sampled to capture a breadth of age, sex, disease and presentation frequency, were invited to participate in a semi-structured individual interview to probe their experiences. Sampling continued until a theoretical framework representing key experiences and expectations reached sufficiency. Data were collected and analyzed iteratively using a constant comparative approach. Results: 22 patients with type 1 or 2 diabetes were interviewed. Participants sought care in the ED over other options because of concern about having a potentially life-threatening condition, advice from a healthcare provider or family member, or a perceived lack of convenient alternatives to the ED based on time and location. Participants’ care expectations centred on symptom relief, glycemic control, reassurance and education, and referral to specialist diabetes care post-discharge. Finally, perceived system barriers that challenged participants’ glycemic control included the affordability of medical supplies and medications, access to follow-up and, in some cases, the transition from pediatric to adult diabetes care.
Conclusion: Patients with diabetes utilize the ED for a variety of urgent and emergent hyperglycemic concerns. In addition to providing excellent medical treatment, ED healthcare providers should consider patients’ expectations when caring for those presenting with hyperglycemia. Future studies will focus on developing strategies to help patients navigate some of the barriers that exist within our current limited healthcare system, enhance follow-up care, and improve short- and long-term health outcomes.
Introduction: Naloxone is recommended for reversing opioid-associated respiratory depression. There is wide variability in emergency department (ED) practice patterns regarding naloxone use, dosing, and observation time post-administration. This study describes the naloxone practice patterns of ED physicians managing suspected opioid overdose patients. Methods: A retrospective chart review was conducted of adult patients (≥ 18 years) presenting to an academic tertiary care centre (consisting of two EDs with an annual census of 150,000 visits) in 2017 with suspected opioid overdose who were administered naloxone in the ED. Patients were identified electronically and the following information was abstracted from patient charts: demographics, naloxone dosage and infusion initiation, disposition data, indications for naloxone administration, response to therapy, and adverse effects. Variability in initial and total dose was examined. Initial dose was also compared in those with cardiorespiratory compromise (CPR given, respiratory rate < 8, or desaturation below 89%) using independent samples median tests. Data were analyzed using standard descriptive statistics. Results: 113 patients met inclusion criteria. Indications for naloxone administration were: level of consciousness (50.5%), respiratory depression (4.0%), miosis (1.0%), a combination of factors (19.8%), or undocumented (24.8%). Median initial dose was 0.40 mg (IQR: 0.20-0.40 mg). Median total naloxone administered in the ED was 0.48 mg (IQR: 0.35-1.2 mg). The initial dose resulted in a response in 43.1% of patients, with 36.0% of responding patients later experiencing subsequent respiratory depression. 31% of patients received a naloxone infusion. Initial dose in patients with cardiopulmonary compromise differed significantly only when comparing patients who received CPR with those who did not (median 0.40 mg; IQR: 0.20-0.80 mg; P = 0.019). Four patients experienced emesis following naloxone.
Median length of ED stay was 7.0 hours (IQR: 4.0-9.5 hours), and median hospital length of stay was 3.0 days (IQR: 1.0-5.0 days). Median ED observation time prior to discharge was 4.0 hours (IQR: 2.0-8.0 hours). Ultimate disposition home, to the ward, or to the intensive care unit was 47.1%, 42.2%, and 9.8% respectively (1.0% deceased). Conclusion: The dose and usage of naloxone by ED physicians in this study is variable. Further prospective studies are needed to determine the effective naloxone dosing strategy.
Introduction: Acute heart failure (AHF) is a common emergency department (ED) presentation and may be associated with poor outcomes. Conversely, many patients rapidly improve with ED treatment and may not need hospital admission. Because there is little evidence to guide disposition decisions by ED and admitting physicians, we sought to create a risk score for predicting short-term serious outcomes (SSO) in patients with AHF. Methods: We conducted prospective cohort studies at 9 tertiary care hospital EDs from 2007 to 2019, and enrolled adult patients who required treatment for AHF. Each patient was assessed for standardized real-time clinical and laboratory variables, as well as for SSO (defined as death within 30 days or intubation, non-invasive ventilation (NIV), myocardial infarction, coronary bypass surgery, or new hemodialysis after admission). The fully pre-specified logistic regression model with 13 predictors (age, pCO2, and SaO2 were modeled using spline functions with 3 knots and heart rate and creatinine with 5 knots) was fitted to the 10 multiple imputation datasets. Harrell's fast stepdown procedure reduced the number of variables. We calculated the potential impact on sensitivity (95% CI) for SSO and hospital admissions and estimated a sample size of 170 SSOs. Results: The 2,246 patients had mean age 77.4 years, male sex 54.5%, EMS arrival 41.1%, IV NTG 3.1%, ED NIV 5.2%, and admission on initial visit 48.6%. Overall there were 174 (7.8%) SSOs including 70 deaths (3.1%). The final risk scale comprises five variables (points) and had a c-statistic of 0.76 (95% CI: 0.73-0.80): 1. valvular heart disease (1); 2. ED non-invasive ventilation (2); 3. creatinine 150-300 (1), ≥300 (2); 4. troponin 2x-4x URL (1), ≥5x URL (2); 5. walk test failed (2). The probability of SSO ranged from 2.0% for a total score of 0 to 90.2% for a score of 10, showing good calibration. The model was stable over 1,000 bootstrap samples.
Choosing a risk model total point admission threshold of >2 would yield a sensitivity of 80.5% (95% CI 73.9-86.1) for SSO with no change in admissions from current practice (48.6% vs 48.7%). Conclusion: Using a large prospectively collected dataset, we created a concise and sensitive risk scale to assist with admission decisions for patients with AHF in the ED. Implementation of this risk scoring scale should lead to safer and more efficient disposition decisions, with more high-risk patients being admitted and more low-risk patients being discharged.
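The five-item scale described above is a simple additive point score; a minimal sketch of the published point assignments (names and units are illustrative: creatinine is assumed to be in µmol/L, and the abstract does not specify how a troponin between 4x and 5x URL is scored, so the lower band is used here):

```python
def ahf_risk_score(valvular_disease: bool, ed_niv: bool,
                   creatinine: float, troponin_x_url: float,
                   walk_test_failed: bool) -> int:
    """Sum the published points. creatinine assumed in umol/L;
    troponin_x_url is troponin as a multiple of the upper reference limit."""
    score = 0
    if valvular_disease:
        score += 1
    if ed_niv:
        score += 2
    if creatinine >= 300:        # 150-300 scores 1, >=300 scores 2
        score += 2
    elif creatinine >= 150:
        score += 1
    if troponin_x_url >= 5:      # 2x-4x URL scores 1, >=5x URL scores 2
        score += 2
    elif troponin_x_url >= 2:
        score += 1
    if walk_test_failed:
        score += 2
    return score

# With the reported admission threshold of >2, this patient would be admitted
print(ahf_risk_score(True, True, 320, 6, True))
```

With a threshold of >2, as reported, a patient scoring 3 or more would be flagged for admission; the abstract's sensitivity and admission-rate figures apply to that cut-point.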
We report the results from the first 12 months of a 2-year maintenance phase of a study evaluating long-term efficacy and safety of venlafaxine extended-release (XR) in preventing recurrence of depression.
Patients with recurrent unipolar depression (N=1096) were randomly assigned in a 3:1 ratio to 10-week treatment with venlafaxine XR (75 mg/d to 300 mg/d) or fluoxetine (20 mg/d to 60 mg/d). Responders (HAM-D17 total score ≤12 and ≥50% decrease from baseline) entered a 6-month, double-blind, continuation phase on the same medication. Continuation phase responders enrolled into the maintenance treatment period consisting of 2 consecutive 12-month phases. At the start of each maintenance phase, venlafaxine XR responders were randomly assigned to double-blind treatment with venlafaxine XR or placebo; fluoxetine responders continued on fluoxetine for each period. Time to recurrence (HAM-D17 total score >12 and <50% reduction from acute phase baseline at 2 consecutive visits or the last visit prior to discontinuation) was evaluated using Kaplan-Meier methods and compared between groups using log-rank tests.
At the end of the continuation phase, venlafaxine XR responders were randomly assigned to venlafaxine XR (n=164) or placebo (n=172); 129 patients in each group were evaluated for efficacy. The cumulative probability of recurrence through 12 months was 23.1% (95% CI: 15.3, 30.9) for venlafaxine XR and 42.0% (95% CI: 31.8, 52.2) for placebo (P=0.005).
Twelve months of venlafaxine XR maintenance treatment was effective in preventing recurrence in depressed patients who had been successfully treated with venlafaxine XR during acute and continuation therapy.
This study evaluated the efficacy and safety of venlafaxine extended-release (XR) in preventing recurrence of depression.
Outpatients with recurrent unipolar depression (N=1096) were randomly assigned in a 3:1 ratio to 10-week treatment with venlafaxine XR (75 mg/d to 300 mg/d) or fluoxetine (20 mg/d to 60 mg/d). Responders (HAM-D17 ≤12 and ≥50% decrease from baseline) entered a 6-month, double-blind, continuation phase on the same medication. Continuation phase responders enrolled into maintenance treatment consisting of 2 consecutive 12-month phases. At the start of each maintenance phase, venlafaxine XR responders were randomized to double-blind treatment with venlafaxine XR or placebo; fluoxetine responders continued on fluoxetine. Time to recurrence (HAM-D17 >12 and <50% reduction from acute phase baseline at 2 consecutive visits or the last valid visit prior to discontinuation) was evaluated using Kaplan-Meier methods and compared between groups using log-rank tests.
In the second maintenance phase, the cumulative probabilities of recurrence through 12 months in the venlafaxine XR (n=43) and placebo (n=40) groups were 8.0% (95% CI: 0.0, 16.8) and 44.8% (95% CI: 27.6, 62.0), respectively (P<0.001). The probabilities of recurrence over 24 months for patients assigned to venlafaxine XR (n=129) or placebo (n=129) for the first maintenance phase were 28.5% (95% CI 18.3, 37.8) and 47.3% (95% CI 36.4, 58.2), respectively (P=0.005).
An additional 12 months of venlafaxine XR maintenance therapy was effective in preventing recurrence in depressed patients who had responded to venlafaxine XR after acute, continuation, and 12 months' initial maintenance therapy.
The efficacy of venlafaxine extended-release (XR) at doses between 75 mg/d and 300 mg/d has been demonstrated in patients with recurrent major depressive disorder (MDD) over 2.5 years. This analysis evaluated the long-term efficacy of venlafaxine XR ≤225 mg/d, the approved dosage in many countries.
In the primary multicenter, double-blind trial, outpatients with recurrent MDD (N=1096) were randomized to receive 10-week acute-phase treatment with venlafaxine XR (75 mg/d to 300 mg/d) or fluoxetine (20 mg/d to 60 mg/d), followed by a 6-month continuation phase. Subsequently, at the start of 2 consecutive, double-blind, 12-month maintenance phases, venlafaxine XR responders were randomized to receive venlafaxine XR or placebo. Data from the 24 months of maintenance treatment were analyzed for the combined end point of maintenance of response (ie, no recurrence of depression and no dose increase above 225 mg/d), and each component individually. Time to each outcome was evaluated with Kaplan-Meier methods using log-rank tests for venlafaxine XR-placebo comparisons.
The analysis population included 114 patients who had received venlafaxine XR doses less than or equal to 225 mg/d prior to maintenance phase baseline (venlafaxine XR: n=55; placebo: n=59). Probability estimates for maintaining response were 70% for venlafaxine XR and 38% for placebo (P=0.007); for no dose increase, 76% and 58%, respectively (P=0.019); and for no recurrence, 87% and 65%, respectively (P=0.099).
These data confirm that venlafaxine XR is effective in maintaining response at doses ≤225 mg/d for up to 2.5 years in patients with MDD.
Recently, a triple-network model suggested that abnormal interactions between the executive-control network (ECN), default-mode network (DMN) and salience network (SN) are important characteristics of addiction, with the SN playing a critical role in allocating attentional resources toward the ECN and DMN. Although a growing number of studies have reported dysfunctions in these brain networks in Internet gaming disorder (IGD), interactions between these networks, particularly in the context of the triple-network model, have not been investigated in IGD. Thus, we aimed to assess alterations in the inter-network interactions of these large-scale networks in IGD, and to associate the alterations with IGD-related behaviors.
The DMN, ECN and SN were identified using group-level independent component analysis (gICA) in 39 individuals with IGD and 34 age- and gender-matched healthy controls (HCs). We then assessed alterations in SN-ECN and SN-DMN connectivity, as well as in the modulation of the ECN versus the DMN by the SN, using a resource allocation index (RAI) previously developed and validated in nicotine addiction. Associations between these altered network couplings and clinical assessments were also examined.
Compared with HCs, individuals with IGD showed significantly increased SN-DMN connectivity and a decreased RAI in the right hemisphere (rRAI), and the rRAI in IGD was negatively associated with craving scores.
These findings suggest that deficient modulation of the ECN versus the DMN by the SN might provide a mechanistic framework to better understand the neural basis of IGD, as well as novel evidence for the triple-network model in IGD.
Childhoods in urban or rural environments may differentially affect risk for neuropsychiatric disorders. Here, we leveraged the dramatic urbanization and rural-to-urban migration in China since the 1980s to explore the hypothesis that rural or urban childhoods may differentially influence memory processing and neural responses to neutral and aversive stimuli.
We sought to explore the underlying mechanisms of childhood-environment effects on brain function and neuropsychiatric risk.
We examined 420 adult subjects with similar current socioeconomic status, all living in Beijing, China, but with differing rural (n = 227) or urban (n = 193) childhoods. In an episodic memory paradigm scanned on a 3 T GE MRI scanner, subjects viewed blocks of neutral or aversive pictures in encoding and retrieval sessions.
Episodic memory accuracy for neutral stimuli was lower than for aversive stimuli (P < 0.001). However, subjects with rural childhoods apparently performed less accurately on memory for aversive, but not neutral, stimuli (P < 0.01). In subjects with rural childhoods, there was relatively increased engagement of the bilateral striatum at encoding, increased engagement of the bilateral hippocampus at retrieval of neutral and aversive stimuli, and increased engagement of the amygdala at aversive retrieval (P < 0.05 FDR corrected, cluster size > 50).
Rural or urban childhoods appear to be associated with physiological and behavioural differences, particularly in the neural processing of aversive episodic memory in medial temporal and striatal brain regions. The extent to which these effects relate to individual risk for neuropsychiatric or stress-related disorders remains to be explored.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Grape seed procyanidins (GSPs), widely known for their beneficial health properties, fail to bring about the expected improvement in piglets’ growth performance. The effects of dietary supplementation with GSPs on nutrient utilisation may be a critical influencing factor. Hence, the purpose of this study was to investigate the effects of dietary supplementation with GSPs on nutrient utilisation and gut function in weaned piglets. One hundred and twenty crossbred piglets were allocated randomly to four treatment groups, with three replicate pens per treatment and 10 piglets per pen. Each group was given one of four dietary treatments: the basal diet (control group) or the basal diet with 50, 100 or 150 mg/kg GSPs added. The trial lasted 28 days. Faeces were collected from d 12 to 14 and from d 26 to 28 for measuring the coefficient of total tract apparent digestibility (CTTAD) of the nutrients. Blood samples were collected on d 14 and 28 for detecting blood biochemical parameters. Two piglets per pen were slaughtered to collect the pancreas and intestinal digesta for evaluating digestive enzyme activity and the coefficient of ileal apparent digestibility (CIAD) of the nutrients. On d 14 and 28, supplementation with 150 mg/kg GSPs significantly decreased the CTTAD of DM and CP in piglets. On d 14, GSPs supplementation at 150 mg/kg led to a marked decrease in the CIAD of CP and gross energy (GE). On d 28, GSPs supplementation at 150 mg/kg produced a marked decline in the CIAD of DM, GE, CP and ether extract. Grape seed procyanidins supplementation at 100 or 150 mg/kg inhibited the activities of lipase and amylase. In contrast, jejunum mucosa maltase and sucrase activities increased with the inclusion of GSPs at 100 mg/kg in the piglet diet.
Compared with the control group, serum glucose and total protein levels were significantly increased by GSPs supplementation at 100 mg/kg and markedly reduced at 150 mg/kg. Serum diamine oxidase activity and endotoxin levels were decreased by GSPs supplementation in piglet diets. In conclusion, higher concentrations of GSPs in weaned piglet diets attenuated nutrient digestion and inhibited digestive enzyme activity; however, suitable concentrations of GSPs could promote brush-border enzyme activity, enhance serum glucose and total protein concentrations and decrease epithelial permeability.
Diagnosis, treatment, and prevention of vector-borne disease (VBD) in pets is one cornerstone of companion animal practices. Veterinarians are facing new challenges associated with the emergence, reemergence, and rising incidence of VBD, including heartworm disease, Lyme disease, anaplasmosis, and ehrlichiosis. Increases in the observed prevalence of these diseases have been attributed to a multitude of factors, including diagnostic tests with improved sensitivity, expanded annual testing practices, climatologic and ecological changes enhancing vector survival and expansion, emergence or recognition of novel pathogens, and increased movement of pets as travel companions. Veterinarians have the additional responsibility of providing information about zoonotic pathogen transmission from pets, especially to vulnerable human populations: the immunocompromised, children, and the elderly. Hindering efforts to protect pets and people is the dynamic and ever-changing nature of VBD prevalence and distribution. To address this deficit in understanding, the Companion Animal Parasite Council (CAPC) began efforts to annually forecast VBD prevalence in 2011. These forecasts provide veterinarians and pet owners with expected disease prevalence in advance of potential changes. This review summarizes the fidelity of VBD forecasts and illustrates the practical use of CAPC pathogen prevalence maps and forecast data in the practice of veterinary medicine and client education.
Introduction: Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best practices algorithm targeted at emergency medicine physicians. Methods: We chose to adapt existing high-quality clinical practice guidelines (CPG) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee consisting of 21 members from across Canada, including academic, community and remote/rural emergency physicians/nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved. Results: The diagnostic algorithm comprises an updated pretest probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best practice diagnostic algorithm defined risk levels as Low (≤0.5%: no further testing), Moderate (0.6-5%: further testing required) and High (>5%: computed tomography, magnetic resonance imaging or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns.
D-dimer can be used to reduce the probability of AAS in the intermediate-risk group, but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pretest probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms. Conclusion: We have created the first Canadian best practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in emergency departments across Canada.
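The risk-stratified testing logic described above can be summarized in a short sketch (the function name and return strings are illustrative; the category boundaries follow the abstract, and the D-dimer caveat applies only to the moderate band):

```python
def aas_workup(pretest_probability_pct: float) -> str:
    """Map an estimated pretest probability of AAS (in percent) to the
    testing recommendation of the risk-stratified algorithm."""
    if pretest_probability_pct <= 0.5:
        return "low risk: no further testing"
    if pretest_probability_pct <= 5.0:
        # D-dimer may be used here to reduce the probability of AAS,
        # but not in the low- or high-risk groups
        return "moderate risk: further testing required"
    return "high risk: CT, MRI, or transesophageal echocardiography"

print(aas_workup(0.3))   # low-risk band
print(aas_workup(2.0))   # moderate-risk band
print(aas_workup(12.0))  # high-risk band
```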
Introduction: We previously derived (N = 559) and validated (N = 1,100) the 10-item Ottawa Heart Failure Risk Scale (OHFRS) to assist with disposition decisions for patients with acute heart failure (AHF) in the emergency department (ED). In the current study we sought to use a larger dataset to develop a more concise and more accurate risk scale. Methods: We analyzed data from the prior two studies and from a new cohort. For all 3 groups we conducted prospective cohort studies that enrolled patients who required treatment for AHF at 8 tertiary care hospital EDs. Patients were followed for 30 days. The primary outcome was short-term serious outcome (SSO), defined as death within 30 days, intubation or non-invasive ventilation (NIV) after admission, myocardial infarction, or relapse resulting in hospital admission within 14 days. The fully pre-specified logistic regression model with 13 predictors (where age, pCO2, and SaO2 were modeled using spline functions) was fitted to 10 multiple imputation datasets. Harrell's fast stepdown procedure reduced the number of variables. We calculated the potential impact on sensitivity (95% CI) for SSO and hospital admissions, and estimated a sample size of 2,000 patients. Results: The 1,986 patients had mean age 77.3 years, male 54.1%, EMS arrival 41.2%, IV NTG 3.3%, ED NIV 5.4%, and admission on initial visit 49.5%. Overall there were 236 (11.9%) SSOs including 61 deaths (3.1%), meaning that the sensitivity of current admission practice for SSO was only 59.7%. The final HEARTRISK6 scale comprises 6 variables (points; c-statistic 0.68): valvular heart disease (2); antiarrhythmic medication (2); ED non-invasive ventilation (3); creatinine 80–150 (1), ≥150 (3); troponin ≥3x URL (2); walk test failed (1). The probability of SSO ranged from 4.8% for a total score of 0 to 62.4% for a score of 10, showing good calibration.
Choosing a HEARTRISK6 total point admission threshold of ≥3 would yield a sensitivity of 70.8% (95% CI 64.5-76.5) for SSO with a slight decrease in admissions to 47.9%. Choosing a threshold of ≥2 would yield a sensitivity of 84.3% (95% CI 79.0-88.7) but require 66.6% admissions. Conclusion: Using a large prospectively collected dataset, we created a more concise and more sensitive risk scale to assist with admission decisions for patients with AHF in the ED. Implementation of the HEARTRISK6 scale should lead to safer and more efficient disposition decisions, with more high-risk patients being admitted and more low-risk patients being discharged.
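Like any additive point scale, the six HEARTRISK6 items translate directly into a scoring function; a minimal sketch under stated assumptions (names are illustrative and creatinine is assumed to be in µmol/L):

```python
def heartrisk6_score(valvular_disease: bool, antiarrhythmic: bool,
                     ed_niv: bool, creatinine: float,
                     troponin_ge_3x_url: bool, walk_test_failed: bool) -> int:
    """Sum the six published point assignments (creatinine assumed in umol/L)."""
    score = 0
    if valvular_disease:
        score += 2
    if antiarrhythmic:
        score += 2
    if ed_niv:
        score += 3
    if creatinine >= 150:      # 80-150 scores 1, >=150 scores 3
        score += 3
    elif creatinine >= 80:
        score += 1
    if troponin_ge_3x_url:
        score += 2
    if walk_test_failed:
        score += 1
    return score

def admit(score: int, threshold: int = 3) -> bool:
    # Reported trade-off: threshold >=3 gave 70.8% sensitivity with ~47.9%
    # admissions; >=2 gave 84.3% sensitivity but required 66.6% admissions.
    return score >= threshold
```

The `threshold` parameter makes the reported sensitivity/admission-rate trade-off explicit: lowering it from 3 to 2 catches more short-term serious outcomes at the cost of many more admissions.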
Introduction: Endotracheal intubation (ETI) is a lifesaving procedure commonly performed by emergency department (ED) physicians that may lead to patient discomfort or adverse events (e.g., unintended extubation) if sedation is inadequate. No ED-based sedation guidelines currently exist, so individual practice varies widely. This study’s objective was to describe the self-reported post-ETI sedation practice of Canadian adult ED physicians. Methods: An anonymous, cross-sectional, web-based survey featuring 7 common ED scenarios requiring ETI was distributed to adult ED physician members of the Canadian Association of Emergency Physicians (CAEP). Scenarios included post-cardiac arrest, hypercapnic and hypoxic respiratory failure, status epilepticus, polytrauma, traumatic brain injury, and toxicology. Participants indicated their first and second choices of sedative medication following ETI, as well as bolus vs. infusion administration, in each scenario. Data were summarized using descriptive statistics. Results: 207 ED physicians responded to the survey (response rate 16.8%). Emergency medicine training of respondents included CCFP-EM (47.0%), FRCPC (35.8%), and CCFP (13.9%). 51.0% of respondents work primarily in academic/teaching hospitals and 40.4% work in community teaching hospitals. On average, responding physicians report providing care for 4.9 ± 6.8 (mean ± SD) intubated adult patients per month for varying durations (39.2% for 1–2 hours, 27.8% for 2–4 hours, and 22.7% for ≤1 hour). Combining all clinical scenarios, propofol was the most frequently used medication for post-ETI sedation (38.0% of all responses) and was the most frequently used agent except in the post-cardiac arrest, polytrauma, and hypercapnic respiratory failure scenarios. Ketamine was used second most frequently (28.2%), with midazolam third (14.5%).
Post-ETI sedation was provided by >98% of physicians in all situations except the post-cardiac arrest (26.1% indicating no sedation) and toxicology (15.5% indicating no sedation) scenarios. Sedation was provided by infusion in 74.6% of cases and by bolus in 25.4%. Conclusion: Significant practice variability with respect to post-ETI sedation exists amongst Canadian emergency physicians. Future quality improvement studies should examine sedation provided in real clinical scenarios with the goal of establishing best sedation practices to improve patient safety and quality of care.
After five positive randomized controlled trials showed the benefit of mechanical thrombectomy in the management of acute ischemic stroke with emergent large-vessel occlusion, a multi-society meeting was organized during the 17th Congress of the World Federation of Interventional and Therapeutic Neuroradiology in October 2017 in Budapest, Hungary. This multi-society meeting was dedicated to establishing standards of practice in acute ischemic stroke intervention, aiming for consensus on the minimum requirements for centers providing such treatment. In an ideal situation, all patients would be treated at a center offering a full spectrum of neuroendovascular care (a level 1 center). However, for geographical reasons, some patients are unable to reach such a center in a reasonable period of time. With this in mind, the group paid special attention to defining recommendations on the prerequisites for organizing stroke centers that provide mechanical thrombectomy for acute ischemic stroke, but not treatment of other neurovascular diseases (level 2 centers). Finally, some centers will have a stroke unit and offer intravenous thrombolysis, but no endovascular stroke therapy (level 3 centers). Together, these level 1, 2, and 3 centers form a complete stroke system of care. The multi-society group provides recommendations and a framework for the development of mechanical thrombectomy services worldwide.
Odd-chain SFA (OCSFA), particularly tridecanoic acid (n-13 : 0), pentadecanoic acid (n-15 : 0) and heptadecanoic acid (n-17 : 0), are normal components of dairy products, beef and seafood. The ratio of n-15 : 0 to n-17 : 0 in ruminant foods (dairy products and beef) is 2:1, while in seafood and human tissues it is 1:2, and their appearance in plasma is often used as a marker of ruminant fat intake. Human elongases encoded by elongation of very long-chain fatty acid (ELOVL)1, ELOVL3, ELOVL6 and ELOVL7 catalyse biosynthesis of the dominant even-chain SFA; however, there are no reports of elongase function on OCSFA. ELOVL-transfected MCF7 cells were treated with n-13 : 0, n-15 : 0 or n-17 : 0 (80 µM) and the products were analysed. ELOVL6 catalysed elongation of n-13 : 0→n-15 : 0 and n-15 : 0→n-17 : 0, and ELOVL7 had modest activity toward n-15 : 0 (n-15 : 0→n-17 : 0). No elongation activity was detected for n-17 : 0→n-19 : 0. Our data expand ELOVL specificity to OCSFA, providing the first molecular evidence that ELOVL6 is the major elongase acting on the OCSFA n-13 : 0 and n-15 : 0. Studies of food intake relying on OCSFA as a biomarker should consider endogenous human metabolism when using OCSFA ratios to indicate specific food intake.
We are performing systematic observational studies of Galactic interstellar isotopic ratios, including 18O/17O, 12C/13C, 14N/15N and 32S/34S. Our strategy focuses on combining multi-transition observations of large samples of sources at different Galactocentric distances. Our preliminary results show positive Galactic radial gradients in 18O/17O and 12C/13C: in both cases, the ratio increases with Galactocentric distance, which agrees with the inside-out formation scenario of our Galaxy. Observations of other isotopic ratios, such as 14N/15N and 32S/34S, are ongoing.