OBJECTIVES/GOALS: We assessed the relationship between C-peptide preservation and a serum exocrine pancreatic enzyme (trypsin) in a recently concluded clinical trial. We hypothesized that immunomodulatory treatment resulting in improved beta-cell function would be associated with improved trypsin levels in subjects with recent-onset type 1 diabetes (T1D). METHODS/STUDY POPULATION: In a three-arm, randomized, double-masked, placebo-controlled trial, ‘Antithymocyte Globulin (ATG) and pegylated granulocyte colony stimulating factor (GCSF) in New Onset Type 1 Diabetes’, 89 subjects with recent-onset T1D (duration <100 days) were enrolled and randomized to 3 groups: low-dose ATG (2.5 mg/kg IV) followed by pegylated GCSF (6 mg subcutaneously every 2 weeks for 6 doses), low-dose ATG alone, and placebo. We compared longitudinal serum levels of an exocrine enzyme (trypsin) in a subset of responders to therapy (defined as subjects with at least 60% of baseline area under the curve (AUC) C-peptide levels at 96 weeks, n=4) versus placebo ‘responders’ (n=2) and non-responders (n=25), and treated (n=19) versus placebo (n=12) subjects at baseline, 2 weeks, and 6 months after treatment. RESULTS/ANTICIPATED RESULTS: There was no observed difference between treated (n=20) and placebo (n=12) subjects in longitudinal trends in trypsin levels when compared to baseline levels. However, responders to immunotherapy (n=4) had 6-month trypsin levels that were 114% of baseline, whereas placebo ‘responders’ (n=2), placebo subjects (n=10), and non-responders to immunotherapy (n=15) had trypsin levels that were 81-93% of baseline (unpaired t-test, p=0.05). Overall, we found that serum trypsin, a marker of exocrine pancreatic function, had a normal upward trend in new-onset T1D subjects who responded clinically to immunotherapy but declined in subjects who did not respond or who were not treated. These results were bordering on statistical significance but did not reach significance, likely due to the small sample size. DISCUSSION/SIGNIFICANCE: An improvement in trypsin, a marker of exocrine function, after response to immunotherapy in new-onset T1D may be due to a direct impact on exocrine function versus an indirect effect from improved beta cell function. Future studies will be needed to confirm our findings in a larger sample and evaluate the mechanism for improved exocrine function.
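The group comparison above amounts to an unpaired t-test on 6-month trypsin values expressed as a percentage of baseline. A minimal sketch, with group sizes from the abstract but individual values invented for illustration:

```python
# Hypothetical unpaired t-test comparing 6-month trypsin levels (% of baseline)
# in immunotherapy responders vs. all other subjects. Group sizes follow the
# abstract; the individual values are invented for illustration.
from scipy import stats

responders = [109.0, 112.0, 118.0, 117.0]             # n=4, ~114% of baseline
others = [88.0, 93.0, 81.0, 90.0, 85.0, 92.0,         # placebo and
          87.0, 83.0, 91.0, 86.0, 89.0, 84.0]         # non-responders, 81-93%

t_stat, p_value = stats.ttest_ind(responders, others, equal_var=True)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```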
The incidence of infections from extended-spectrum β-lactamase (ESBL)–producing Enterobacterales (ESBL-E) is increasing in the United States. We describe the epidemiology of ESBL-E at 5 Emerging Infections Program (EIP) sites.
Methods
During October–December 2017, we piloted active laboratory- and population-based (New York, New Mexico, Tennessee) or sentinel (Colorado, Georgia) ESBL-E surveillance. An incident case was the first isolation in a 30-day period, from a normally sterile body site or urine of a surveillance-area resident, of Escherichia coli or Klebsiella pneumoniae/oxytoca resistant to ≥1 extended-spectrum cephalosporin and nonresistant to all carbapenems tested at a clinical laboratory. Demographic and clinical data were obtained from medical records. The Centers for Disease Control and Prevention (CDC) performed reference antimicrobial susceptibility testing and whole-genome sequencing on a convenience sample of case isolates.
Results
We identified 884 incident cases. The estimated annual incidence in sites conducting population-based surveillance was 199.7 per 100,000 population. Overall, 800 isolates (96%) were from urine, and 790 (89%) were E. coli. Also, 393 cases (47%) were community-associated. Among 136 isolates (15%) tested at the CDC, 122 (90%) met the surveillance definition phenotype; 114 (93%) of 122 were shown to be ESBL producers by clavulanate testing. In total, 111 (97%) of the confirmed ESBL producers harbored a blaCTX-M gene. Among ESBL-producing E. coli isolates, 52 (54%) were ST131; 44% of these cases were community-associated.
Conclusions
The burden of ESBL-E was high across surveillance sites, with nearly half of cases acquired in the community. EIP has implemented ongoing ESBL-E surveillance to inform prevention efforts, particularly in the community, and to monitor for the emergence of new ESBL-E strains.
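The annualized incidence reported above follows from scaling a single quarter of surveillance to a full year. A minimal sketch of the arithmetic; in the actual study the estimate applied only to the population-based sites, so both the case count and the catchment population used here are illustrative placeholders:

```python
# Annualized incidence per 100,000 from one quarter (Oct-Dec) of surveillance.
# Both inputs are illustrative placeholders, not the study's actual figures.
cases_in_quarter = 884          # placeholder count of incident cases
population = 1_770_000          # placeholder combined catchment population

annual_incidence = cases_in_quarter * 4 / population * 100_000
print(f"{annual_incidence:.1f} per 100,000 per year")  # ~199.8 with these inputs
```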
Sweet acacia [Vachellia farnesiana (L.) Willd.] is a problematic thorny weed species in several parts of Australia. Knowledge of its seed biology could help to formulate weed management decisions for this and similar species. Experiments were conducted to determine the effect of hot water (scarification), alternating temperatures, light, salt stress, and water stress on seed germination of two populations of V. farnesiana and to evaluate the response of its young seedlings (the most sensitive developmental stage) to commonly available postemergence herbicides in Australia. Both populations responded similarly to all the environmental factors and herbicides; therefore, data were pooled over the populations. Seeds immersed in hot water at 90 C for 10 min provided the highest germination (88%), demonstrating physical dormancy in this species. Seeds germinated at a wide range of alternating day/night temperatures from 20/10 C (35%) to 35/25 C (90%), but no seeds germinated at 15/5 C. Germination was not affected by light, suggesting that seeds are nonphotoblastic and can germinate under a plant canopy or when buried in soil. Germination was not affected by sodium chloride (NaCl) concentrations up to 20 mM, and about 50% of seeds could germinate at 160 mM NaCl, suggesting high salt tolerance. Germination was only 13% at −0.2 MPa osmotic potential, and no seeds germinated at −0.4 MPa, suggesting that V. farnesiana seeds may remain ungerminated until moisture conditions become conducive for germination. A number of postemergence herbicides, including 2,4-D + picloram, glufosinate, paraquat, and saflufenacil, provided >85% control of young seedling biomass compared with the nontreated control. Knowledge gained from this study will help to predict the potential spread of V. farnesiana in other areas and help to integrate herbicide use with other management strategies.
Genetic susceptibility to late maturity alpha-amylase (LMA) in wheat (Triticum aestivum L.) results in increased alpha-amylase activity in mature grain when cool conditions occur during late grain maturation. Farmers are forced to sell wheat grain with elevated alpha-amylase at a discount because it has an increased risk of poor end-product quality. This problem can result from either LMA or preharvest sprouting, grain germination on the mother plant when rain occurs before harvest. Whereas preharvest sprouting is a well-understood problem, little is known about the risk LMA poses to North American wheat crops. To examine this, LMA susceptibility was characterized in a panel of 251 North American hard spring wheat lines, representing ten geographical areas. There appears to be substantial LMA susceptibility in North American wheat, since only 27% of the lines showed reproducible LMA resistance following cold-induction experiments. A preliminary genome-wide association study detected six significant marker-trait associations. LMA in North American wheat may result from genetic mechanisms similar to those previously observed in Australian and International Maize and Wheat Improvement Center (CIMMYT) germplasm, since two of the detected QTLs, QLMA.wsu.7B and QLMA.wsu.6B, co-localized with previously reported loci. The Reduced height (Rht) loci also influenced LMA. Elevated alpha-amylase levels were significantly associated with the presence of the wild-type (tall) alleles, Rht-B1a and Rht-D1a, in both cold-treated and untreated samples.
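At its core, a genome-wide association scan of this kind regresses the LMA phenotype on genotype at each marker and flags markers passing a multiple-testing threshold. A minimal sketch with simulated data; only the panel size (251 lines) follows the abstract, and a real analysis would additionally correct for population structure and kinship:

```python
# Minimal per-marker association scan: regress phenotype on allele dosage at
# each marker, then flag markers passing a Bonferroni threshold.
# Data are simulated; only the panel size (251 lines) follows the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_lines, n_markers = 251, 2000
genotypes = rng.integers(0, 3, size=(n_lines, n_markers))  # 0/1/2 allele dosage
phenotype = rng.normal(size=n_lines)                       # e.g., log alpha-amylase
phenotype += 0.5 * genotypes[:, 42]                        # plant one true effect

p_values = np.array([stats.linregress(genotypes[:, j], phenotype).pvalue
                     for j in range(n_markers)])
threshold = 0.05 / n_markers                               # Bonferroni correction
print("significant markers:", np.flatnonzero(p_values < threshold))
```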
Ecosystem modeling, a pillar of the systems ecology paradigm (SEP), addresses questions such as: How much carbon and nitrogen are cycled within ecological sites, landscapes, or indeed the earth system? How are human activities modifying these flows? Modeling, when coupled with field and laboratory studies, represents the essence of the SEP in that models embody accumulated knowledge and generate hypotheses to test understanding of ecosystem processes and behavior. Initially, ecosystem models were primarily used to improve our understanding of how biophysical aspects of ecosystems operate. However, current ecosystem models are widely used to make predictions about how large-scale phenomena such as climate change and management practices impact ecosystem dynamics, and to assess potential effects of these changes on economic activity and policy making. In sum, ecosystem models embedded in the SEP remain our best mechanism to integrate diverse types of knowledge regarding how the earth system functions and to make quantitative predictions that can be confronted with observations of reality. Modeling efforts discussed include the Century and DayCent ecosystem models, the Grassland Ecosystem Model ELM, food web models, the Savanna model, agent-based and coupled-systems modeling, and Bayesian modeling.
Feeding difficulty is a known complication of congenital heart surgery. Despite this, available data regarding risk factors, incidence, associated symptoms, and outcomes remain relatively sparse.
Methods:
In this retrospective chart review, patients aged 0–18 years who underwent congenital heart surgery at a single institution between January and December 2017 were reviewed. Patients with feeding difficulties before surgery, multiple surgeries, and potentially abnormal recurrent laryngeal nerve anatomy were excluded. Data collected included patient demographics, feeding outcomes, post-operative symptoms, flexible nasolaryngoscopy findings, and rates of readmission within a 1-year follow-up period. Multivariable regression analyses were performed to evaluate the risk of an alternative feeding plan at discharge and to model length of stay.
Results:
Three hundred twenty-six patients met the inclusion criteria for this study. Seventy-two (22.09%) were discharged with a feeding tube, and 70 (97.22%) of this subgroup were younger than 12 months at the time of surgery. Variables that increased the risk of being discharged with a feeding tube included patient age, the Society of Thoracic Surgeons–European Association for Cardio-Thoracic Surgery score, procedure group, aspiration, and reflux. Speech-language pathology was the most frequently utilised consulting service for patients discharged with feeding tubes (90.28%), while other services were consulted infrequently. The median length of stay increased from 4 to 10 days for patients who required an enteral feeding tube at discharge.
Discussion:
Multidisciplinary management protocols and interventions should be developed and standardised to improve feeding outcomes following congenital heart surgery.
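The multivariable analysis described in the Methods can be sketched as a logistic regression of feeding-tube status at discharge on candidate risk factors. The variable names and data below are hypothetical placeholders, not the study's dataset:

```python
# Hypothetical sketch of a multivariable logistic regression for the risk of
# discharge with a feeding tube; variable names and data are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 326  # cohort size from the abstract
df = pd.DataFrame({
    "feeding_tube": rng.integers(0, 2, n),   # 1 = discharged with a tube
    "age_months": rng.uniform(0, 216, n),    # placeholder ages, 0-18 years
    "stat_score": rng.integers(1, 6, n),     # placeholder STS-EACTS category
    "aspiration": rng.integers(0, 2, n),
    "reflux": rng.integers(0, 2, n),
})
model = smf.logit("feeding_tube ~ age_months + stat_score + aspiration + reflux",
                  data=df).fit(disp=0)
print(np.exp(model.params))  # coefficients exponentiated to odds ratios
```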
Biospecimen repositories play a vital role in enabling investigation of biologic mechanisms, identification of disease-related biomarkers, advances in diagnostic assays, recognition of microbial evolution, and characterization of new therapeutic targets for intervention. They rely on the complex integration of scientific need, regulatory oversight, quality control in collection, processing and tracking, and linkage to robust phenotype information. The COVID-19 pandemic amplified many of these considerations and illuminated new challenges, all while academic health centers were trying to adapt to unprecedented clinical demands and heightened research constraints not witnessed in over 100 years. The outbreak demanded rapid understanding of SARS-CoV-2 to develop diagnostics and therapeutics, prompting the immediate need for access to high-quality, well-characterized COVID-19-associated biospecimens. We surveyed 60 Clinical and Translational Science Award (CTSA) hubs to better understand the strategies and barriers encountered in biobanking before and in response to the COVID-19 pandemic. Feedback revealed a major shift in biorepository models and in specimen-acquisition and consent processes, from a combination of investigator-initiated and institutional protocols to an enterprise-serving strategy. CTSA hubs were well equipped to leverage established capacities and expertise to quickly respond to the scientific needs of this crisis through support of institutional approaches in biorepository management.
The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
Aims
To quantify the prevalence of depression, anxiety and mental well-being before and during the COVID-19 pandemic, and to identify groups at risk of depression and/or anxiety during the pandemic.
Method
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale.
Results
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, at 24% (95% CI 23–26%) compared with a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
Conclusions
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. This is important for planning current mental health provision and for assessing the long-term impact beyond this pandemic.
Evidence suggests that eating nuts may reduce the risk of CVD. This study was intended to pool the data of all available randomised controlled trials (RCT) to determine whether pistachios confer a beneficial effect on anthropometric indices, inflammatory markers, endothelial dysfunction and blood pressure. Without language restriction, PubMed, Scopus, the Cochrane Library and Web of Science were searched for articles published from the earliest records to June 2019 investigating the effect of pistachio consumption on inflammation, endothelial dysfunction and hypertension. Mean difference (MD) was pooled using a random-effects model. The Cochrane risk-of-bias tool was used to evaluate the quality of the studies. The meta-analysis of thirteen RCT with 563 participants indicated that pistachio consumption significantly decreased systolic blood pressure (SBP) (MD: −2·12 mmHg, 95 % CI −3·65, −0·59, P = 0·007), whereas changes in flow-mediated dilation (MD: 0·94 %, 95 % CI −0·99, 2·86, P = 0·813), diastolic blood pressure (MD: 0·32 mmHg, 95 % CI −1·37, 2·02, P = 0·707), C-reactive protein (MD: 0·00 mg/l, 95 % CI −0·21, 0·23, P = 0·942), TNF-α (MD: −0·09 pg/ml, 95 % CI −0·38, 0·20, P = 0·541), body weight (MD: 0·09 kg, 95 % CI −0·38, 0·69, P = 0·697), BMI (MD: 0·07 kg/m2, 95 % CI −0·16, 0·31, P = 0·553) and waist circumference (MD: 0·77 cm, 95 % CI −0·09, 1·64, P = 0·140) were not statistically significant. This systematic review and meta-analysis suggests that pistachio consumption can reduce SBP; however, further large-scale studies are needed to confirm these results.
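The pooled mean differences above come from an inverse-variance random-effects model. A minimal DerSimonian-Laird sketch with hypothetical per-trial SBP mean differences and standard errors:

```python
# DerSimonian-Laird random-effects pooling of per-trial mean differences.
# The per-trial values are hypothetical; only the method is illustrated.
import numpy as np

md = np.array([-4.5, -0.6, -3.2, 0.4, -2.9])    # per-trial MD in SBP (mmHg)
se = np.array([0.8, 0.7, 1.1, 0.9, 1.0])        # per-trial standard errors

w = 1 / se**2                                   # fixed-effect weights
md_fe = np.sum(w * md) / np.sum(w)
q = np.sum(w * (md - md_fe)**2)                 # Cochran's Q
df = len(md) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)                   # between-trial variance

w_re = 1 / (se**2 + tau2)                       # random-effects weights
md_re = np.sum(w_re * md) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
print(f"pooled MD = {md_re:.2f} mmHg "
      f"(95% CI {md_re - 1.96*se_re:.2f}, {md_re + 1.96*se_re:.2f})")
print(f"I^2 = {max(0.0, (q - df) / q) * 100:.0f}%")
```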
Annual grass weeds reduce profits of wheat farmers in the Pacific Northwest. The very-long-chain fatty acid elongase (VLCFA)-inhibiting herbicides S-metolachlor and dimethenamid-P could expand options for control of annual grasses but are not registered in wheat because of crop injury. We evaluated a safener, fluxofenim, applied to wheat seed for protection of 19 soft white winter wheat varieties from S-metolachlor, dimethenamid-P, and pyroxasulfone herbicides; investigated the response of six varieties (UI Sparrow, LWW 15-72223, UI Magic CL+, Brundage 96, UI Castle CL+, and UI Palouse CL+) to incremental doses of fluxofenim; established the fluxofenim dose required to optimally protect the varieties from VLCFA-inhibiting herbicides; and assessed the impact of fluxofenim dose on glutathione S-transferase (GST) activity in three wheat varieties (UI Sparrow, Brundage 96, and UI Castle CL+). Fluxofenim increased the biomass of four varieties treated with S-metolachlor or dimethenamid-P herbicides and one variety treated with pyroxasulfone. Three varieties showed tolerance to the herbicides regardless of the fluxofenim treatment. Estimated fluxofenim doses resulting in 10% biomass reduction of wheat ranged from 0.55 to 1.23 g ai kg−1 seed. Fluxofenim doses resulting in 90% increased biomass after treatment with S-metolachlor, dimethenamid-P, and pyroxasulfone ranged from 0.07 to 0.55, 0.09 to 0.73, and 0.30 to 1.03 g ai kg−1 seed, respectively. Fluxofenim at 0.36 g ai kg−1 seed increased GST activity in UI Castle CL+, UI Sparrow, and Brundage 96 by 58%, 30%, and 38%, respectively. These results suggest fluxofenim would not damage wheat seedlings up to three times the rate labeled for sorghum, and fluxofenim protects soft white winter wheat varieties from S-metolachlor, dimethenamid-P, or pyroxasulfone injury at the herbicide rates evaluated.
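Doses producing a given level of biomass reduction (e.g., the 10% figures above) are typically estimated from a fitted dose-response curve. Below is a sketch fitting a three-parameter log-logistic model and solving for that dose; the data points are invented for illustration and are not the study's measurements:

```python
# Fit a three-parameter log-logistic dose-response model and estimate the
# dose causing a 10% biomass reduction (ED10). Data points are invented.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(x, b, d, e):
    """Response = d / (1 + (x/e)**b); d = upper limit, e = ED50, b = slope."""
    return d / (1.0 + (x / e) ** b)

dose = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0])          # g ai kg^-1 seed
biomass = np.array([99.0, 97.0, 93.0, 85.0, 60.0, 30.0])   # % of untreated

(b, d, e), _ = curve_fit(log_logistic, dose, biomass, p0=[2.0, 100.0, 1.0])
ed10 = e * (0.1 / 0.9) ** (1.0 / b)   # dose where the response falls to 0.9*d
print(f"ED10 = {ed10:.2f} g ai kg^-1 seed")
```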
Hydrogen lithography has been used to template phosphine-based surface chemistry to fabricate atomic-scale devices, a process we abbreviate as atomic precision advanced manufacturing (APAM). Here, we use mid-infrared variable-angle spectroscopic ellipsometry (IR-VASE) to characterize single-nanometer-thick phosphorus dopant layers (δ-layers) in silicon made using APAM-compatible processes. A large Drude response is directly attributable to the δ-layer and can be used for nondestructive monitoring of the condition of the APAM layer when integrating additional processing steps. The carrier density and mobility extracted from our room-temperature IR-VASE measurements are consistent with cryogenic magneto-transport measurements, showing that APAM δ-layers function at room temperature. Finally, the permittivity extracted from these measurements shows that the doping in the APAM δ-layers is so large that their low-frequency in-plane response is reminiscent of a silicide. However, there is no indication of a plasma resonance, likely due to reduced dimensionality and/or a short scattering lifetime.
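The Drude response referred to above models the free-carrier contribution to the permittivity as ε(ω) = ε_∞ − ω_p²/(ω² + iω/τ), with ω_p² = ne²/(ε₀m*). A small sketch evaluating this model in the mid-infrared; the carrier density and scattering time are illustrative placeholders, not the paper's fitted values:

```python
# Drude-model permittivity for a heavily doped layer in the mid-infrared.
# Carrier density and scattering time are illustrative placeholders.
import numpy as np

e = 1.602e-19          # elementary charge (C)
eps0 = 8.854e-12       # vacuum permittivity (F/m)
m_e = 9.109e-31        # electron mass (kg)

n = 2.0e21 * 1e6       # carrier density: 2e21 cm^-3 -> m^-3 (placeholder)
m_eff = 0.26 * m_e     # conduction-band effective mass in silicon
tau = 1.0e-14          # scattering time (s, placeholder)
eps_inf = 11.7         # high-frequency permittivity of silicon

omega_p2 = n * e**2 / (eps0 * m_eff)            # plasma frequency squared
wavelengths = np.linspace(2e-6, 15e-6, 5)       # 2-15 um (mid-IR)
omega = 2 * np.pi * 3.0e8 / wavelengths         # angular frequency (rad/s)

eps = eps_inf - omega_p2 / (omega**2 + 1j * omega / tau)
for wl, ep in zip(wavelengths, eps):
    print(f"{wl*1e6:5.1f} um: eps = {ep.real:9.1f} + {ep.imag:8.1f}i")
```

With densities of this order the real part of ε is large and negative across the mid-IR, i.e., a metal-like (silicide-like) in-plane response, consistent with the abstract's description.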
The COVID-19 pandemic has created high demand for personal protective equipment, including disposable N95 masks. Given the need for mask reuse, we tested the effects of vaporized hydrogen peroxide (VHP), ultraviolet light (UV), and ethanol decontamination strategies on N95 mask integrity and their ability to eliminate the infectious potential of SARS-CoV-2.
Methods:
Disposable N95 masks, including medical grade (1860, 1870+) and industrial grade (8511) masks, were treated by VHP, UV, and ethanol decontamination. Mask degradation was tested using quantitative respirator fit testing. Pooled clinical samples of SARS-CoV-2 were applied to mask samples, treated, and then either sent immediately for real-time reverse transcriptase–polymerase chain reaction (RT-PCR) or incubated with Vero E6 cells to assess virucidal effect.
Results:
Both ethanol and UV decontamination caused functional degradation to different degrees, whereas VHP treatment showed no significant change after two treatments. We also report a single SARS-CoV-2 virucidal experiment using Vero E6 cell infection, in which only ethanol treatment eliminated detectable SARS-CoV-2 RNA.
Conclusions:
We hope our data will guide further research toward evidence-based decisions for disposable N95 mask reuse and help protect caregivers from SARS-CoV-2 and other pathogens.
Introduction: Blood transfusions continue to be a critical intervention in patients presenting to emergency departments (ED). Improved understanding of the adverse events associated with transfusions has led to new research to inform and delineate transfusion guidelines. The Nova Scotia Guideline for Blood Component Utilization in Adults and Pediatrics was implemented in June 2017 to reflect current best practice in transfusion medicine. The guideline includes a lowering of the hemoglobin threshold from 80 g/L to 70 g/L for transfusion initiation, to be used in conjunction with the patient's hemodynamic assessment before and after transfusions. Our study aims to augment understanding of transfusion guideline adherence and ED physician transfusion practices at the Halifax Infirmary Emergency Department in Nova Scotia. Methods: A retrospective chart review was conducted on one third of all ED visits involving red-cell transfusions for one year prior to and one year following the guideline implementation. A total of 350 charts were reviewed. For each reviewed chart, the primary data abstracted for the initial transfusion (and for a subsequent transfusion, if applicable) included clinical and laboratory data reflective of the transfusion guideline. Based on these data, the transfusion event was classified in one of three ways: indicated based on hemoglobin level, indicated based on the patient's symptomatic presentation, or unable to determine whether the transfusion was indicated based on the charting. Results: In the year before guideline implementation, 31 of 146 total transfusions were initiated at a hemoglobin between 71 and 80 g/L. This proportion dropped by 23.6%, to 22 of 136, in the year following guideline implementation. The proportion of single-unit transfusions increased by 28.0%, from 47 of 146 in the year prior to 56 of 136 in the year after guideline implementation. The proportion of initial transfusions whose indication could not be determined from the charting provided increased by 120%, and for subsequent transfusions this proportion increased by 1500% (P < 0.05). Conclusion: These data suggest that implementing transfusion guidelines effectively reduced the number of transfusions given in the ED setting and increased the number of single-unit transfusions administered. However, the data also suggest the need for better education around transfusion indications and proper documentation clearly outlining the rationale behind the decision to transfuse.
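The relative changes quoted above compare each category's share of total transfusions before and after implementation. A sketch of that arithmetic using the counts from the abstract (small rounding differences from the reported percentages are possible):

```python
# Relative change in the share of transfusions per category, pre vs. post
# guideline implementation. Counts are taken from the abstract.
def relative_change(pre_events, pre_total, post_events, post_total):
    pre, post = pre_events / pre_total, post_events / post_total
    return (post - pre) / pre * 100

# Transfusions initiated at hemoglobin 71-80 g/L
print(f"{relative_change(31, 146, 22, 136):+.1f}%")   # ~ -23.8%
# Single-unit transfusions
print(f"{relative_change(47, 146, 56, 136):+.1f}%")   # ~ +27.9%
```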
Introduction: CTAS is a validated five-level triage score used in EDs across Canada and internationally. Previous work has found moderate interrater reliability between prehospital paramedic and triage nurse application of CTAS during clinical practice. This study is the first assessment of how the distribution of CTAS scores varies with increasing departmental pressure, as measured by the NEDOCS scale, comparing triage allocations made by triage nurses with those made by triage paramedics. Methods: We conducted a retrospective, observational cohort study of Emergency Department Information System (EDIS) data for all patients triaged in the Halifax Infirmary Emergency Department from January 1 to May 30, 2017, and January 1 to May 30, 2018. CTAS score assignments by nursing and paramedic triage staff were compared across increasing levels of ED overcrowding, as determined by the department's NEDOCS score. Results: Nurses were more likely to assign higher-acuity scores in all situations of department crowding; there was a 3% increased probability that a nurse, as compared to a paramedic, would triage a patient as emergent when the ED was not overcrowded (Pearson chi-square(1) = 4.21, p < 0.05, Cramer's v = 0.028, n = 5314), and a 10% increased probability that a nurse, as compared to a paramedic, would triage a patient as emergent when the ED was overcrowded (Pearson chi-square(1) = 623.83, p < 0.001, Cramer's v = 0.11, n = 56 018). Conclusion: Increasing levels of ED overcrowding influence triage nurse CTAS score assignment towards higher acuity to a greater degree than scores assigned by triage paramedics.
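The reported effect sizes pair Pearson's chi-square with Cramér's V, which for a 2×2 table reduces to √(χ²/n). A sketch with a hypothetical contingency table of triage role by acuity assignment:

```python
# Pearson chi-square and Cramer's V for a 2x2 table of triage role
# (nurse vs. paramedic) by acuity (emergent vs. not). Counts are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[3200, 24800],    # nurse:     emergent, non-emergent
                  [2100, 25900]])   # paramedic: emergent, non-emergent

chi2, p, dof, _ = chi2_contingency(table, correction=False)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, V = {cramers_v:.3f}")
```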
Background: In Canada, injuries represent 21% of Emergency Department (ED) visits. Faced with occupational injuries, physicians may feel pressured to provide urgent imaging to facilitate expedited return to work, but there is no body of literature to support this practice. Twenty percent of adult ED injuries involve workers' compensation. Aim Statement: Tacit pressures were felt to impact imaging rates for patients with workplace injuries, and our aim was to determine whether this hypothesis was accurate. We conducted a quality review to assess imaging rates among injuries suffered at work and outside work. A secondary aim was to reduce the harm resulting from non-value-added testing. Measures & Design: Information was collected from the Emergency Department Information System on patients over the age of 16 years with acute injuries, including upper limb, lower limb, neck, back and head injuries. Data included both workplace and non-work-related presentations, Canadian Triage and Acuity Scale (CTAS) levels and age at presentation. Imaging included any of X-ray, CT, MRI, or ultrasound ordered in EDs across the central zone of Nova Scotia from July 1, 2009 to June 30, 2019. A total of 282,860 patient encounters were included for analysis. Comparison was made between patients presenting under the Workers' Compensation Board of Nova Scotia (WCB) and those covered by the Department of Health and Wellness (DOHW). Imaging rates for all injuries were also trended over this ten-year period. Evaluation/Results: In patients between 16 and 65 years, the WCB group underwent more imaging (55.3% of visits) than did the DOHW group (43.1% of visits). In the same cohort, there was an overall decrease of over 10% in mean imaging rates for both WCB and DOHW between the first five-year period (2009-2013) and the second five-year study period (2013-2018). Imaging rates for WCB and DOHW converged with each decade beyond 35 years of age. No comparison was possible beyond 85 years, owing to the absence of WCB presentations. Discussion/Impact: Patients presenting to the ED with workplace injuries are imaged at a higher rate than those covered by the DOHW. Campaigns promoting value-added care may have impacted imaging rates during the ten-year study period, explaining the decline in ED imaging for all injuries. While this 10% decrease in overall imaging is encouraging, these preliminary data indicate the need for further education on resource stewardship, especially for patients presenting to the ED with workplace injuries.
Introduction: Inhaled low-dose methoxyflurane (MEOF) was recently approved in Canada for the short-term relief of moderate to severe acute pain associated with trauma or interventional medical procedures in conscious adult patients. ADVANCE-ED is an ongoing phase IV, prospective, open-label study undertaken to generate real-world evidence to complement the global clinical development program through evaluation of the effectiveness of low-dose MEOF in Canadian emergency departments (EDs). Methods: This multi-centre study is enrolling adult (≥18 yrs) patients with moderate to severe acute pain (NRS0-10 ≥ 4) associated with minor trauma. To address limitations of the pivotal study, this study allows patients who were excluded from the pivotal trials: namely, those with severe (≥7) pain, and those using OTC or stably dosed analgesics for other conditions, including chronic pain. Eligible patients receive a single treatment of up to 2 × 3 mL MEOF (the second 3 mL provided only upon request), self-administered by the patient under medical supervision. Rescue medication is permitted at any time, if required. Results: Here we describe the patient demographics and treatment satisfaction (Global Medication Performance, GMP) at 50% enrolment (n = 49). Mean (SD) patient age is 48.0 (17.1) yrs and 55.1% are female. Mean (SD) pain reported at enrolment is 8.3 (1.5), with 73.4% of patients reporting NRS0-10 ≥ 8. Injuries are overwhelmingly limb trauma (87.8%); the most common type is sprain/strain (40.8%), followed by fracture (32.7%). At 5 minutes post-start of administration (STA) of MEOF, 80.4% of patients reported pain relief; this increased to 91.3% at 15 minutes, and 100% of patients reported pain relief by 30 minutes post-STA. GMP was assessed as “good”, “very good” or “excellent” by ≥80% of patients both at 20 minutes post-STA (83.3%) and at discharge (85.8%). When asked to what extent their expectation of pain relief had been met, 32.7% responded “good”, 26.5% responded “very good” and 22.4% responded “excellent”. Three quarters of enrolled patients (75.5%) did not require rescue medication. The most common (≥5%) treatment-related adverse events were dizziness (n = 14, 28.6%) and euphoric mood (n = 4, 8.2%). No serious adverse events have been reported. Conclusion: Based on the first 50% of patients enrolled in this prospective, open-label study, responses to inhaled low-dose MEOF are within expectation for both effectiveness and tolerability.
Introduction: As the population of Canadian cities grows, public policy planners frequently base predictions of future demand on population trends. We aimed to determine the relationship between demographically defined ED visit rate (EDVR) trends in an academic ED and corresponding population trends in the catchment area. Methods: We used administrative data to conduct a retrospective cohort time series analyzing per capita EDVR trends based on CTAS, age, gender and housing status for the period 2006-2015. These were adjusted for population growth using age-gender standardized rates from 2011 census data. All EDVR and standardized estimates were expressed per 100,000 population. Results: There were 646 731 visits during the study period, increasing by 25.6% from 56 757 in 2006 to 71 289 in 2015, with an annual incremental linear trend of 1893/year (95% CI: 1593-2192). The highest CTAS2 EDVR increase, 521/year (95% CI: 433-608), was among non-homeless patients older than 49. CTAS2 visit rates in all non-homeless patients increased by 335/year (95% CI: 280-391), while homeless patients younger than 30 showed the highest CTAS2 EDVR annual rate increase (1183/year, CI: 1448-2218). From 2008-2015, the annual linear per capita CTAS5 EDVR declined by 121/year (95% CI: 79-163). The population of adults in Halifax increased by 1.2%/year, with a linear trend of 4149/year (95% CI: 4012-4287). The highest increasing linear trend was for those older than 49 (2604/year, 95% CI: 2494-2714), followed by the 30- to 50-year-old group (1223/year, 95% CI: 1138-1309), with the lowest trend for those aged less than 30 (322/year, 95% CI: 170-473). Standardized and non-standardized rate declines (CTAS5) and increases (CTAS2) were statistically similar and were not influenced by population changes. The population older than 49 increased by 38% over the 10-year period, whereas CTAS2 visits increased by 250%. If the CTAS2 EDVR trend continues, the 2027 rate will be double that of 2015, even if the population in the catchment area remains stable. Conclusion: EDVR trends show an increase in CTAS2 visits driven chiefly by older patients. This trend exceeds that suggested by the Canadian Foundation for Healthcare Improvement and is significantly greater than predicted by population demographic changes. Healthcare administrators will need to bear these disparities in mind as they plan future ED capacity.
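The 2027 projection follows from extrapolating a fitted annual linear trend in the per capita visit rate. A minimal sketch with hypothetical yearly rates:

```python
# Fit a linear trend to annual per-capita visit rates and extrapolate.
# The yearly rates below are hypothetical; only the method mirrors the abstract.
import numpy as np

years = np.arange(2006, 2016)
rate_per_100k = np.array([4200, 4480, 4900, 5300, 5800,    # hypothetical CTAS2
                          6300, 6700, 7250, 7700, 8200])   # visit rates

slope, intercept = np.polyfit(years, rate_per_100k, 1)
projected_2027 = slope * 2027 + intercept
print(f"trend = {slope:.0f}/year; projected 2027 rate = {projected_2027:.0f}")
print(f"ratio vs. 2015: {projected_2027 / rate_per_100k[-1]:.2f}x")
```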
Introduction: Patients frequently present to the Emergency Department (ED) with predictable complications associated with radiation and chemotherapy for active cancer. Care alternatives have been proposed to reduce ED visits; however, no systematic review related to ED presentations has been completed. The objective of this scoping review was to examine the effectiveness of interventions designed to reduce ED visits among patients receiving active cancer treatment. Methods: A comprehensive literature search involving nine electronic databases and the grey literature was completed. Inclusion criteria considered studies assessing the impact of any intervention to reduce ED utilization among patients with active cancer. Two reviewers independently assessed relevance and inclusion; disagreements were resolved through third-party adjudication. Dichotomous and continuous outcomes were summarized as risk ratios (RR) or mean differences (MD) with 95% confidence intervals (CI) using a random-effects model, wherever appropriate. Results: From 3303 citations, a total of 25 studies were included. Interventions identified in these studies included routine and symptom-based patient follow-up, oncology outpatient clinics, early symptom detection, comprehensive inpatient management, hospital at home, and patient navigators. Six of eight studies assessing oncology outpatient clinics reported a decrease in the proportion of patients presenting to the ED. A meta-analysis of three of these studies did not demonstrate a reduction in ED utilization (RR 0.78; 95% CI: 0.56 to 1.08; I2 = 77%) when comparing oncology outpatient clinics to standard care; however, a sensitivity analysis removing one study reporting rare events supported a decrease in ED visits (RR 0.86; 95% CI: 0.74 to 0.99; I2 = 47%). Three studies assessing patient follow-up interventions showed no difference in ED utilization (RR 0.69; 95% CI: 0.38 to 1.25; I2 = 86%). Conclusion: A variety of interventions designed to mitigate ED presentations by patients receiving active cancer treatment have been developed and evaluated. Limited evidence suggests that an oncology outpatient clinic may be an effective strategy to reduce ED utilization; however, additional high-quality studies are needed.
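Risk ratios are pooled on the log scale before back-transforming, using the same random-effects machinery as for mean differences. A compact sketch with hypothetical per-study RRs and confidence bounds:

```python
# Random-effects pooling of risk ratios on the log scale (DerSimonian-Laird).
# Per-study RRs and their upper 95% CI bounds are hypothetical.
import numpy as np

rr = np.array([0.70, 0.95, 0.62])
ci_upper = np.array([1.10, 1.30, 0.98])         # upper 95% CI bounds

log_rr = np.log(rr)
se = (np.log(ci_upper) - log_rr) / 1.96         # SE recovered from the CI

w = 1 / se**2                                   # fixed-effect weights
pooled_fe = np.sum(w * log_rr) / np.sum(w)
q = np.sum(w * (log_rr - pooled_fe)**2)         # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(rr) - 1)) / c)        # between-study variance

w_re = 1 / (se**2 + tau2)                       # random-effects weights
pooled = np.sum(w_re * log_rr) / np.sum(w_re)
se_p = np.sqrt(1 / np.sum(w_re))
lo, hi = np.exp(pooled - 1.96 * se_p), np.exp(pooled + 1.96 * se_p)
print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}, {hi:.2f})")
```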
Introduction: Choosing Wisely Nova Scotia (CWNS), an affiliate of Choosing Wisely Canada™ (CWC), aims to address unnecessary care and testing through literature-informed lists developed by various disciplines. CWC has identified unnecessary head CTs among the top five interventions to question in the Emergency Department (ED). Zyluk (2015) identified the Canadian CT Head Rule (CCHR) as the most effective clinical decision rule for adults with minor head injuries. To better understand the current status of CCHR use in Nova Scotia, we conducted a retrospective audit of patient charts at the Charles V. Keating Emergency and Trauma Centre in Halifax, Nova Scotia. Methods: Our mixed-methods design included a literature review, a retrospective chart audit, and a qualitative audit-feedback component with participating physicians. The chart audit applied the guidelines for adherence to the CCHR and reported on the level of compliance within the ED. Analysis of the qualitative data is included here in parallel, to contextualize findings from the audit. Results: In total, 302 charts of patients who presented to the surveyed site were retrospectively reviewed. Of the 37 cases where a head CT was indicated as per the CCHR, a CT was ordered 32 (86.5%) times. Of the 176 cases where a head CT was not indicated, a CT was not ordered 155 (88.1%) times. Therefore, the CCHR was followed in 187 (87.8%) of the total 213 cases where the CCHR should be applied. Conclusion: Our study reveals adherence to the CCHR in 87.8% of cases at this ED. Identifying contextual factors that facilitate or hinder the application of the CCHR in practice is critical for reducing unnecessary CTs. This work has been presented to the physician group to gain physician engagement and to elucidate enablers and barriers to guideline adherence. In light of the frequency of head CTs ordered in EDs, even a small reduction would be impactful.