Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized-piloted data extraction by trained investigators. Phase I (Jan.2012-Aug.2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept.2013-Aug.2016) using usual implementation strategies. In Phase III (Sept.2016-Dec.2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), 15 times (8.1%); time to 1st rhythm analysis was 6min, 3min, 1min; and time to 1st shock was 10min, 10min and 7min. Comparing Phases I and III: time to 1st shock decreased by 3min (95%CI -7; 1), sustained ROSC increased from 29.7% to 33.3% (AD3.6%; 95%CI -10.8; 17.8), and survival to discharge increased from 24.6% to 25.8% (AD1.2%; 95%CI -7.5; 9.9). 
In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD -6min; 95%CI -12; 0) and survival increased from 23.1% to 38.7% (AD 15.6%; 95%CI -4.3; 35.4). Conclusion: The implementation of a medical directive allowing AED use by RNs and RTs improved key outcomes for IHCA victims, particularly following the Theory-Based educational video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
Introduction: Patients with major bleeding (e.g. gastrointestinal bleeding and intracranial hemorrhage [ICH]) are commonly encountered in the Emergency Department (ED). A growing number of patients are on either oral or parenteral anticoagulation (AC), but the impact of AC on outcomes of patients with major bleeding is unknown. With regard to oral anticoagulation (OAC), we particularly sought to analyze differences between patients on Warfarin or Direct Oral Anticoagulants (DOACs). Methods: We analyzed a prospectively collected registry (2011-2016) of patients who presented to the ED with major bleeding at two academic hospitals. “Major bleeding” was defined by the International Society on Thrombosis and Haemostasis criteria. The primary outcome, in-hospital mortality, was analyzed using a multivariable logistic regression model. Secondary outcomes included discharge to long-term care among survivors, total hospital length of stay (LOS) among survivors, and total hospital costs. Results: 1,477 patients with major bleeding were included. AC use was found in 215 patients (14.6%). Among OAC patients (n = 181), 141 (77.9%) had used Warfarin, and 40 (22.1%) had used a DOAC. 484 patients (32.8%) died in-hospital. AC use was associated with higher in-hospital mortality (adjusted odds ratio [OR]: 1.50 [1.17-1.93]). Among survivors to discharge, AC use was associated with higher discharge to long-term care (adjusted OR: 1.73 [1.18-2.57]), prolonged median LOS (19 days vs. 16 days, P = 0.03), and higher mean costs ($69,273 vs. $58,156, P = 0.02). With regard to OAC, a higher proportion of ICH was seen among patients on Warfarin (39.0% vs. 32.5%) as compared to DOACs. No difference in mortality was seen between DOACs and Warfarin (adjusted OR: 0.84 [0.40-1.72]). Patients with major bleeding on Warfarin had longer median LOS (11 days vs. 6 days, P = 0.03) and higher total costs ($51,524 vs. $35,176, P < 0.01) than patients on DOACs.
Conclusion: AC use was associated with higher mortality among ED patients with major bleeding. Among survivors, AC use was associated with increased LOS, costs, and discharge to long-term care. Among OAC patients, no difference in mortality was found between Warfarin and DOACs. Warfarin was associated with prolonged LOS and costs, likely secondary to a higher incidence of ICH, as compared to DOACs.
Introduction: Guidelines recommend serial conventional cardiac troponin (cTn) measurements 6-9 hours apart for non-ST-elevation myocardial infarction (NSTEMI) diagnosis. We sought to develop a pathway based on absolute/relative changes between two serial conventional cardiac troponin I (cTnI) values 3-hours apart for 15-day MACE identification. Methods: This was a prospective cohort study conducted in the two large EDs at The Ottawa Hospital. Adults with NSTEMI symptoms were enrolled over 32 months. Patients with STEMI, hospitalized for unstable angina, or with only one cTnI were excluded. We collected baseline characteristics, Siemens Vista cTnI at 0 and 3-hours after ED presentation, disposition, and ED length of stay (LOS). The adjudicated primary outcome was 15-day MACE (AMI, revascularization, or death due to cardiac ischemia/unknown cause). We analysed cTnI values by 99th percentile cut-off multiples (45, 100 and 250ng/L). Results: 1,683 patients (mean age 64.7 years; 55.3% female; median ED LOS 7 hours; 88 patients with 15-day MACE) were included. 1,346 patients (80.0%) with both cTnI ≤45ng/L, and 58 (3.4%) of the 213 patients with one value ≥100ng/L but both <250ng/L or ≤20% change, did not suffer MACE. Among 124 patients (7.4%) with one value >45ng/L but both <100ng/L based on 3 or 6-hour cTnI, one patient with Δ<10ng/L and 6 of 19 patients with Δ≥20ng/L were diagnosed with NSTEMI (patients with Δ10-19ng/L between the first and second cTnI had a third measurement at 6-hours). Based on these results, we developed the Ottawa Troponin Pathway (OTP), with a 98.9% sensitivity (95%CI 96.7-100%) and 94.6% specificity (95%CI 93.4-95.7%). Conclusion: The OTP, using two conventional cTnI measurements performed 3-hours apart, should lead to better identification of NSTEMI, particularly in those with values >99th percentile cut-off, standardize management, and reduce ED LOS.
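The sensitivity and specificity figures above can be reproduced from the underlying 2x2 counts. A minimal sketch in Python, assuming (the abstract does not state the cell counts explicitly) that 87 of the 88 MACE patients were flagged by the OTP and 1,509 of the remaining 1,595 patients were correctly ruled out, with a Wald interval truncated at 1:

```python
import math

def prop_with_wald_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% CI for a proportion, truncated to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts consistent with the reported percentages:
sens, sens_lo, sens_hi = prop_with_wald_ci(87, 88)      # ~98.9% (96.7-100%)
spec, spec_lo, spec_hi = prop_with_wald_ci(1509, 1595)  # ~94.6% (93.4-95.7%)
```

With these assumed counts, the truncated Wald intervals land on the reported 96.7-100% and 93.4-95.7% bounds, which suggests (but does not prove) that this is the interval method the authors used.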
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04).
There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
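The 4.8% absolute difference in conversion above can be approximated from counts reconstructed from the reported percentages (97.0% of 198 ≈ 192 and 92.2% of 180 ≈ 166 — an assumption, since the abstract gives only percentages). A minimal Wald sketch; the trial's slightly wider reported upper bound (9.9%) suggests the authors used a different interval method, such as Newcombe's:

```python
import math

def risk_difference_wald(x1, n1, x2, n2, z=1.96):
    """Absolute risk difference p1 - p2 with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Counts reconstructed from the reported percentages (hypothetical):
# Drug-Shock 192/198 converted (97.0%), Shock Only 166/180 (92.2%)
diff, lo, hi = risk_difference_wald(192, 198, 166, 180)
```

The Wald lower bound stays just above zero, consistent with the reported P = 0.04.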
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients (mean age 68.4 +/− 14.7 years; 52.4% female), of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%; iLR 0.20 [95%CI 0.091-0.44]); Moderate (probability 1.3%; iLR 0.79 [0.68-0.92]); High (probability 2.6%; iLR 2.2 [1.9-2.6]). A sensitivity analysis for stroke ≤7 days alone yielded similar results: Low iLR 0.17 [95%CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median of 2 months (range 0.5-32) from diagnosis: during chemotherapy in 85%, during radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT, most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
Introduction: Current guideline recommendations for optimal management of non-purulent skin and soft tissue infections (SSTIs) are based on expert consensus. There is currently a lack of evidence to guide emergency physicians on when to select oral versus intravenous antibiotic therapy. The primary objective was to identify risk factors associated with oral antibiotic treatment failure. A secondary objective was to describe the epidemiology of adult emergency department (ED) patients with non-purulent SSTIs. Methods: We performed a health records review of adults (age ≥18 years) with non-purulent SSTIs treated at two tertiary care EDs. Patients were excluded if they had a purulent infection or infected ulcers without surrounding cellulitis. Treatment failure was defined as any of the following after a minimum of 48 hours of oral therapy: (i) hospitalization for SSTI; (ii) change in class of oral antibiotic owing to infection progression; or (iii) change to intravenous therapy owing to infection progression. Multivariable logistic regression was used to identify predictors independently associated with the primary outcome of oral antibiotic treatment failure after a minimum of 48 hours of oral therapy. Results: We enrolled 500 patients (mean age 64 years; 279 (55.8%) male; 126 (25.2%) with diabetes); the hospital admission rate was 29.6%. The majority of patients (70.8%) received at least one intravenous antibiotic dose in the ED. Of the 288 patients who received a minimum of 48 hours of oral antibiotics, there were 85 oral antibiotic treatment failures (29.5%). Tachypnea at triage (odds ratio [OR]=6.31, 95% CI=1.80 to 22.08), chronic ulcers (OR=4.90, 95% CI=1.68 to 14.27), history of MRSA colonization or infection (OR=4.83, 95% CI=1.51 to 15.44), and cellulitis in the past 12 months (OR=2.23, 95% CI=1.01 to 4.96) were independently associated with oral antibiotic treatment failure.
Conclusion: This is the first study to evaluate potential predictors of oral antibiotic treatment failure for non-purulent SSTIs in the ED. We observed a high rate of treatment failure and hospitalization. Tachypnea at triage, chronic ulcers, history of MRSA colonization or infection and cellulitis within the past year were independently associated with oral antibiotic treatment failure. Emergency physicians should consider these risk factors when deciding on oral versus intravenous antimicrobial therapy for non-purulent SSTIs being managed as outpatients.
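The adjusted odds ratios above are Wald estimates from a multivariable logistic regression, so each reported confidence interval should be symmetric on the log scale around the point estimate. A quick consistency check in Python (the 0.02 tolerance is an arbitrary choice for illustration):

```python
import math

def check_log_symmetry(ci_lo, ci_hi, z=1.96):
    """Recover the implied log-scale standard error and geometric midpoint
    of a Wald confidence interval for an odds ratio."""
    se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * z)
    midpoint = math.exp((math.log(ci_lo) + math.log(ci_hi)) / 2)
    return se, midpoint

# Tachypnea at triage: OR 6.31 (95% CI 1.80 to 22.08)
se, mid = check_log_symmetry(1.80, 22.08)
# mid should land close to the reported point estimate of 6.31
```

The geometric midpoint of 1.80 and 22.08 does recover roughly 6.31, so the reported interval is internally consistent with a log-scale Wald construction.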
Introduction: Two published studies reported that natriuretic peptides can aid in risk-stratification of Emergency Department (ED) syncope. We sought to assess the role of N-Terminal pro Brain Natriuretic Peptide (NT pro-BNP) in identifying syncope patients at risk for serious adverse events (SAE) within 30 days of the ED visit, and its value above that of the Canadian Syncope Risk Score (CSRS). Methods: We conducted a multicenter prospective cohort study at 6 large Canadian EDs from Nov 2011 to Feb 2015. We enrolled adults who presented within 24 hours of syncope and excluded those with persistent altered mentation, obvious seizure, and intoxication. We collected patient characteristics, nine CSRS predictors (including troponin), ED management and NT pro-BNP levels. Adjudicated serious adverse events (SAE) included death, cardiac SAE (arrhythmias, myocardial infarction, serious structural heart disease) and non-cardiac SAE (pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We used two-tailed t-tests and logistic regression analysis. Results: Of the 1359 patients enrolled (mean age 57.2 years; 54.7% female; 13.3% hospitalized), 148 (10.9%; 0.7% deaths, 7.9% cardiac SAE including 6.1% arrhythmia) suffered SAE within 30 days. Compared with patients with no SAE (mean 499.8ng/L), mean NT pro-BNP was significantly higher among the 56 patients who suffered SAE after ED disposition (3147ng/L, p=0.001) and among the 35 patients with cardiac SAE after ED disposition (2016.2ng/L, p=0.02). While there was a trend to higher levels among patients who suffered arrhythmia after the ED visit, it was not statistically significant (1776.4ng/L, p=0.07). In a model with the CSRS predictors, the adjusted odds ratio for NT pro-BNP was 8.0 (95%CI 1.8, 35.9) and for troponin was 3.8 (95%CI 1.7, 8.8).
The addition of NT pro-BNP did not significantly improve classification performance (p=0.76): the area under the curve was 0.91 (95%CI 0.88, 0.95) for the CSRS and 0.92 (95%CI 0.88, 0.95) for the CSRS with NT pro-BNP. Conclusion: In this multicenter study, mean NT pro-BNP levels were significantly higher among ED syncope patients who suffered SAE, including cardiac SAE, after ED disposition. Though NT pro-BNP was a significant independent predictor of SAE after ED disposition, it did not improve accuracy in ED syncope risk-stratification when compared to the CSRS. Hence, we do not recommend NT pro-BNP measurement for ED syncope management.
Introduction: The Ottawa SAH Rule was developed to identify patients at high risk for subarachnoid hemorrhage (SAH) who require investigations, and the 6-Hour CT Rule found that computed tomography (CT) was 100% sensitive for SAH within 6 hours of headache onset. Together, they form the Ottawa SAH Strategy. Our objectives were to assess: 1) the safety of the Ottawa SAH Strategy and 2) its impact on: a) CTs, b) LPs, c) ED length of stay, and d) CT angiography (CTA). Methods: We conducted a multicentre prospective before/after study at 6 tertiary-care EDs from January 2010 to December 2016 (implementation July 2013). Consecutive alert, neurologically intact adults with a headache peaking within one hour were included. SAH was defined by subarachnoid blood on head CT (radiologist's final report); xanthochromia in the cerebrospinal fluid (CSF); or >1 x 10^6/L red blood cells in the final tube of CSF with an aneurysm on CTA. Results: We enrolled 3,669 patients, 1,743 before and 1,926 after implementation, including 185 with SAH. The investigation rate before implementation was 89.0% (range 82.9 to 95.6%) versus 88.4% (range 85.2 to 92.3%) after implementation. The proportion who had CT remained stable (88.0% versus 87.4%; p=0.60), while the proportion who had LP decreased from 38.9% to 25.9% (p<0.001), and the proportion investigated with CTA increased from 18.8% to 21.6% (p=0.036). The additional testing rate (i.e. LP or CTA) diminished from 50.1% to 40.8% (p<0.001). The proportion admitted declined from 9.8% to 7.3% (p=0.008), while the mean length of ED stay was stable (6.2 +/− 4.0 to 6.4 +/− 4.1 hours; p=0.45). For the 1,201 patients with CT within 6 hours, there was an absolute decrease in additional testing (i.e. LP or CTA) of 15.0% (46.6% versus 31.6%; p<0.001). The sensitivity of the Ottawa SAH Rule was 100% (95%CI: 98-100%), and that of the 6-Hour CT Rule was 95.3% (95%CI: 88.9-98.3%) for SAH.
Five patients with early CT had SAH with CT reported as normal: 2 had unruptured aneurysms on CTA and a presumed traumatic LP (as determined by the treating neurosurgeon); 1 was missed by the radiologist on the initial interpretation; 1 had a dural vein fistula (i.e. non-aneurysmal); and 1 was profoundly anemic (Hgb 63g/L). Conclusion: The Ottawa SAH Strategy is highly sensitive and can be used routinely when SAH is being considered in alert and neurologically intact headache patients. Its implementation was associated with a decrease in LPs and admissions to hospital.
Introduction: The Canadian Syncope Risk Score (CSRS) was developed to identify patients at risk for serious adverse events (SAE) within 30 days of an Emergency Department (ED) visit for syncope. We sought to validate the score in a new cohort of ED patients. Methods: We conducted a multicenter prospective cohort study at 8 large academic tertiary-care EDs across Canada from March 2014 to Dec 2016. We enrolled adults (age ≥16 years) who presented within 24 hours of syncope, after excluding those with persistent altered mentation, witnessed seizure, intoxication, and major trauma requiring hospitalization. Treating ED physicians collected the nine CSRS predictors at the index visit. Adjudicated SAE included death, arrhythmias and non-arrhythmic SAE (myocardial infarction, serious structural heart disease, pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We assessed the area under the Receiver Operating Characteristic (ROC) curve, score calibration, and classification performance for the various risk categories. Results: Of the 2547 patients enrolled, 146 (5.7%) were lost to follow-up and 111 (4.3%) had a serious condition identified during the index ED visit and were excluded. Among the 2290 patients analyzed, 79 (3.4%; 0.4% death, 1.4% arrhythmia) suffered 30-day serious outcomes after ED disposition. The accuracy of the CSRS remained high, with an area under the ROC curve of 0.87 (95%CI 0.82-0.92), similar to the derivation phase (0.87; 95%CI 0.84-0.89). The score showed excellent calibration at the prespecified risk strata. For the very-low risk category (0.3% SAE, of which 0.2% were arrhythmia and no deaths), the sensitivity was 97.5% and the negative predictive value was 99.7% (95%CI 98.7-99.9). For the very-high risk category (61.5% SAE, of which 26.9% were arrhythmia and 11.5% death), the specificity was 99.4% and the positive predictive value was 61.5% (95%CI 43.0-77.2).
Conclusion: In this multicenter validation study, the CSRS accurately risk-stratified ED patients with syncope for short-term serious outcomes after ED disposition. The score should aid in minimizing investigation and observation of very-low risk patients, and in prioritizing inpatient versus outpatient investigation and follow-up for the remainder. The CSRS is ready for implementation studies examining ED management decisions, patient safety and health care resource utilization.
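The area under the ROC curve reported for the CSRS has a simple probabilistic reading: the chance that a randomly chosen patient who suffered an SAE scores higher than one who did not. A minimal sketch of that equivalence (the Mann-Whitney formulation) using made-up scores, since the patient-level data are not in the abstract:

```python
def auc_from_scores(case_scores, control_scores):
    """AUC as the Mann-Whitney probability that a case outranks a control;
    ties count as half a win."""
    wins = sum(
        1.0 if c > k else 0.5 if c == k else 0.0
        for c in case_scores
        for k in control_scores
    )
    return wins / (len(case_scores) * len(control_scores))

# Toy integer risk scores (hypothetical, not CSRS data):
toy_auc = auc_from_scores([3, 4, 6], [0, 1, 2, 3])
```

For the toy data, 11.5 of the 12 case-control pairs favour the case, giving an AUC of about 0.96; an AUC of 0.87, as reported, means roughly 87% of such pairs are correctly ordered.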
Introduction: Creatine kinase (CK) measurement, despite not being recommended for the diagnosis of non-ST-elevation myocardial infarction (NSTEMI), is still routinely performed in the emergency department (ED) for the workup of NSTEMI. The diagnostic utility of CK among ED patients with suspected NSTEMI is still not well understood. The objectives of this study were to assess the additional value of CK in NSTEMI diagnosis and the correlation between the highest CK/TNI values and ejection fraction (EF) on follow-up echocardiography among patients with suspected NSTEMI. Methods: This was a prospective cohort study conducted at the Civic and General Campuses of The Ottawa Hospital from March 2014 to March 2016. We enrolled adults (≥18 years) for whom troponin (TNI) and CK were ordered for chest pain or non-chest pain symptoms concerning for NSTEMI within the past 24 hours, and excluded those with suspected ST-Elevation Myocardial Infarction (STEMI). The primary outcome was 30-day NSTEMI, adjudicated by two blinded physicians. Demographics, medical history, and ED CK/TNI values were collected. We used descriptive statistics and report test diagnostic characteristics. Results: Of the 1,663 patients enrolled, 84 (5.1%) suffered NSTEMI. The sensitivity and specificity of CK were 30.9% (95%CI 21.1, 40.8) and 91.4% (95%CI 90.0, 92.8), respectively. The sensitivity and specificity of troponin were 96.4% (95%CI 92.4, 100) and 88.1% (95%CI 86.5, 89.7), respectively. Among the 3 (0.2%) patients with an NSTEMI diagnosis missed by TNI, CK measurements did not add value. The mean CK values were not significantly different between those with normal and abnormal EF on follow-up (132.4 U/L and 146.3 U/L, respectively; p=0.44), whereas the mean TNI values were significantly different (0.5 µg/L and 1.3 µg/L, respectively; p=0.046). Conclusion: CK measurements neither provide any additional value in the work-up of NSTEMI in the ED nor correlate with EF on follow-up.
Discontinuing routine CK measurements would reduce overall costs, improve resource utilization, and streamline the management of ED patients with chest pain.
Introduction: Emergency department (ED) patients with non-purulent skin and soft tissue infections (SSTIs) requiring intravenous antibiotics may be managed via outpatient parenteral antibiotic therapy (OPAT). To date, there are no prospective studies describing the performance of an ED-to-OPAT clinic program. Furthermore, no studies have examined physician rationale for intravenous therapy, despite this being a critical first step in the decision to refer to an OPAT program. Methods: We conducted a prospective observational cohort study of adults (age ≥18 years) with non-purulent SSTIs receiving parenteral therapy at two tertiary care EDs. Patients were excluded if they had purulent infections or could not provide consent. The emergency physician completed a form documenting the rationale for intravenous therapy, infection size, and choice of antimicrobial agent, dose and duration. OPAT treatment failure was defined as hospitalization after a minimum of 48 hours of OPAT for: (i) worsening infection; (ii) peripheral intravenous line complications; or (iii) adverse antibiotic events. Patient satisfaction was assessed at a 14-day telephone follow-up. Results: We enrolled a consecutive sample of 153 patients (mean age 60 years; 82 (53.6%) male; 38 (24.8%) with diabetes). A total of 137 patients (89.5%) attended their clinic appointment. Of the 101 patients prescribed cefazolin, 50.5% received 1000 mg and 48.5% received 2000 mg per day. There were low rates of OPAT treatment failure (3.9%). None of the adverse peripheral intravenous line events (9.8%) or adverse antibiotic events (7.2%) required hospitalization. Patients reported a high degree of satisfaction with the timeliness of clinic referral (median score 9 out of 10) and overall care received (median score 10 out of 10).
The top 5 reasons given by physicians for selecting intravenous therapy were: clinical impression of severity (52.9%); failed oral antibiotic therapy (41.8%); diabetes (17.6%); severe pain (7.8%); and peripheral vascular disease (7.8%). Conclusion: This is the first study to identify physician rationale for the use of intravenous antibiotics for SSTIs. There was significant variability in antibiotic prescribing practices by ED physicians. This prospective study demonstrates that an ED-to-OPAT clinic program for non-purulent SSTIs is safe, has a low rate of treatment failures and results in high patient satisfaction.
The ‘Digital Index of North American Archaeology’ (DINAA) project demonstrates how the aggregation and publication of government-held archaeological data can help to document human activity over millennia and at a continental scale. These data can provide a valuable link between specific categories of information available from publications, museum collections and online databases. Integration improves the discovery and retrieval of records of archaeological research currently held by multiple institutions within different information systems. It also aids in the preservation of those data and makes efforts to archive these research results more resilient to political turmoil. While DINAA focuses on North America, its methods have global applicability.
The value of auditory enrichment for psychological well-being has been studied in a variety of species, including birds, cattle, horses and primates. To date the effect of auditory stimulation on the behaviour of dogs housed in rescue shelters is unknown. Rescue shelters provide temporary housing for thousands of stray and abandoned dogs every year. However well these dogs are cared for, it cannot be ignored that being in such a situation is stressful. Research suggests that music may be a useful moderator of stress in humans. The question remains as to whether auditory stimulation has such a beneficial effect in dogs. This study investigated the behaviour of sheltered dogs in response to five types of auditory stimulation to determine whether the dogs’ behaviour was influenced by their auditory environment.
VLBI observations of the nucleus of Centaurus A were made in April 1982 at two frequencies with an array of five Australian radio antennas as part of the Southern Hemisphere VLBI Experiment (SHEVE). Observations were undertaken at 2.29 GHz with all five antennas, while only two were operational at 8.42 GHz. The 2.29 GHz data yielded significant information on the structure of the nuclear jet. At 8.42 GHz a compact unresolved core was detected as well.
Patients with poorly controlled diabetes mellitus may have a sentinel emergency department (ED) visit for a precipitating condition prior to presenting for a hyperglycemic emergency, such as diabetic ketoacidosis (DKA) or hyperosmolar hyperglycemic state (HHS). This study’s objective was to describe the epidemiology and outcomes of patients with a sentinel ED visit prior to their hyperglycemic emergency visit.
This was a 1-year health records review of patients ≥18 years old presenting to one of four tertiary care EDs with a discharge diagnosis of hyperglycemia, DKA, or HHS. Trained research personnel collected data on patient characteristics, management, and disposition, and determined whether patients had come to the ED within the 14 days prior to their hyperglycemia visit. Descriptive statistics were used to summarize the data.
Of 833 visits for hyperglycemia, 142 (17.0%; 95% CI: 14.5% to 19.6%) had a sentinel ED presentation within the preceding 14 days. Mean (SD) age was 50.5 (19.0) years and 54.4% were male; 104 (73.2%) were discharged from this initial visit, and 98/104 (94.2%) were discharged either without their glucose checked or with an elevated blood glucose (>11.0 mmol/L). Of the sentinel visits, 93 (65.5%) were for hyperglycemia and 22 (15.5%) for infection. Upon returning to the ED, 61/142 (43.0%) were admitted for severe hyperglycemia, DKA, or HHS.
In this unique ED-based study, diabetic patients with a sentinel ED visit often returned and required subsequent admission for hyperglycemia. Clinicians should be vigilant in checking blood glucose and provide clear discharge instructions for follow-up and glucose management to prevent further hyperglycemic emergencies from occurring.
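The 95% CI quoted above for the 17.0% sentinel-visit proportion (142/833) is consistent with a simple Wald interval, which can be checked in a few lines. A sketch only; the authors' exact interval method is not stated in the abstract:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Sentinel ED visits among hyperglycemia visits: 142 of 833
p, lo, hi = wald_ci(142, 833)
```

The computed bounds round to 14.5% and 19.6%, matching the reported interval.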
To assess the role of methodological differences on measured trace-element concentrations in ice cores, we developed an experiment to test the effects of acidification strength and time on dust dissolution using snow samples collected in West Antarctica and Alaska. We leached Antarctic samples for 3 months at room temperature using nitric acid at concentrations of 0.1, 1.0 and 10.0% (v/v). At selected intervals (20 min, 24 hours, 5 days, 14 days, 28 days, 56 days, 91 days) we analyzed 23 trace elements using inductively coupled plasma mass spectrometry. Concentrations of lithogenic elements scaled with acid strength and increased by 100–1380% in 3 months. Incongruent elemental dissolution caused significant variability in calculated crustal enrichment factors through time (factor of 1.3 (Pb) to 8.0 (Cs)). Using snow samples collected in Alaska and acidified at 1% (v/v) for 383 days, we found that the increase in lithogenic element concentration with time depends strongly on initial concentration, and varies by element (e.g. Fe linear regression slope = 1.66; r = 0.98). Our results demonstrate that relative trace-element concentrations measured in ice cores depend on the acidification method used.
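The crustal enrichment factors mentioned above are conventionally computed as the sample's element-to-reference-element ratio divided by the same ratio in mean upper crust. A minimal sketch, using aluminium as the reference element and round-number crustal abundances purely for illustration (the abstract does not state which reference element or crustal composition the authors used):

```python
def crustal_enrichment_factor(c_x, c_ref, crust_x, crust_ref):
    """EF = (X/ref)_sample / (X/ref)_crust.
    EF near 1 suggests a crustal (dust) source; EF >> 1 suggests a
    non-crustal (e.g. anthropogenic or volcanic) contribution."""
    return (c_x / c_ref) / (crust_x / crust_ref)

# Hypothetical snow concentrations (pg/g) and illustrative upper-crust
# abundances (ppm): Pb ~17 ppm, Al ~80,400 ppm
ef_pb = crustal_enrichment_factor(c_x=5.0, c_ref=1000.0,
                                  crust_x=17.0, crust_ref=80400.0)
```

Because the leaching experiment shows incongruent dissolution, the numerator and denominator of this ratio grow at different rates with acidification time, which is exactly why the calculated EFs in the study drift by factors of 1.3 to 8.0.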
We use numerical simulations from the Community Coordinated Modeling Center to provide, for the first time, a coherent temporal description of the magnetic reconnection process of two dayside Electron Diffusion Regions (EDRs) identified in Magnetospheric Multiscale Mission data. The model places the MMS spacecraft near the separator line in these most intense and long-lived events. A listing of 31 dayside EDRs identified by the authors is provided to encourage collaboration in analysis of these unique encounters.