Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10 min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized, piloted data extraction by trained investigators. Phase I (Jan. 2012-Aug. 2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept. 2013-Aug. 2016) using usual implementation strategies. In Phase III (Sept. 2016-Dec. 2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted, with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%) and 15 times (8.1%); time to 1st rhythm analysis was 6 min, 3 min and 1 min; and time to 1st shock was 10 min, 10 min and 7 min. Comparing Phases I and III: time to 1st shock decreased by 3 min (95% CI -7 to 1), sustained ROSC increased from 29.7% to 33.3% (AD 3.6%; 95% CI -10.8 to 17.8), and survival to discharge increased from 24.6% to 25.8% (AD 1.2%; 95% CI -7.5 to 9.9). In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD -6 min; 95% CI -12 to 0) and survival increased from 23.1% to 38.7% (AD 15.6%; 95% CI -4.3 to 35.4). Conclusion: The implementation of a medical directive allowing AED use by RNs and RTs successfully improved key outcomes for IHCA victims, particularly following the Theory-Based education video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
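The phase comparisons above rest on absolute differences (AD) in proportions with 95% confidence intervals. As a minimal sketch of that calculation (not the authors' analysis code, and assuming a simple Wald interval, whereas the abstract's intervals may come from another method), the sustained-ROSC comparison could be approximated as follows, with event counts back-calculated from the reported percentages:

```python
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Absolute difference between two proportions with a Wald 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    ad = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return ad, ad - z * se, ad + z * se

# Sustained ROSC, Phase I (58/195 ~ 29.7%) vs Phase III (62/186 ~ 33.3%);
# counts are approximations back-calculated from the reported percentages.
ad, lo, hi = risk_difference_ci(58, 195, 62, 186)
print(f"AD {ad:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```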
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis, excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04). There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
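The primary comparison (97.0% vs 92.2% conversion) was analyzed with chi-squared tests. A hedged sketch of that test, with counts back-calculated from the reported percentages (192/198 and 166/180); note that scipy applies a Yates continuity correction to 2x2 tables by default, so the p-value may differ slightly from the published one:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Conversion to sinus rhythm: Drug-Shock 192/198 (97.0%), Shock Only
# 166/180 (92.2%); counts back-calculated from reported percentages.
table = np.array([[192, 6],
                  [166, 14]])
chi2, p, dof, expected = chi2_contingency(table)  # Yates-corrected by default
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```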
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high-risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients with mean age 68.4 +/− 14.7 years and 52.4% female, of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%; iLR 0.20 [95% CI 0.091-0.44]); Moderate (probability 1.3%; iLR 0.79 [0.68-0.92]); High (probability 2.6%; iLR 2.2 [1.9-2.6]). Sensitivity analysis for stroke ≤7 days alone yielded similar results: Low iLR 0.17 [95% CI 0.056-0.52], Moderate iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4]. Conclusion: The Canadian TIA Score accurately identifies TIA patients' risk of stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment, with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed, ideally in consultation with a stroke specialist, during their index ED visit.
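An interval likelihood ratio (iLR) compares how often a risk stratum occurs among patients with the outcome versus those without it. A minimal sketch of the calculation with a delta-method confidence interval on the log scale; the example counts are hypothetical, not the study's data:

```python
from math import exp, sqrt

def interval_lr(a, b, c, d, z=1.96):
    """Interval likelihood ratio for one risk stratum with a 95% CI.
    a = outcome-positive patients in the stratum, b = all outcome-positive,
    c = outcome-negative patients in the stratum, d = all outcome-negative."""
    lr = (a / b) / (c / d)
    se = sqrt((1 - a / b) / a + (1 - c / d) / c)  # delta-method SE of log(LR)
    return lr, lr * exp(-z * se), lr * exp(z * se)

# Hypothetical counts for a low-risk stratum (illustration only):
print(interval_lr(5, 181, 1000, 7388))
```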
Introduction: The Ottawa SAH Rule was developed to identify patients at high risk for subarachnoid hemorrhage (SAH) who require investigations, and the 6-Hour CT Rule found that computed tomography (CT) was 100% sensitive for SAH within 6 hours of headache onset. Together, they form the Ottawa SAH Strategy. Our objectives were to assess: 1) the safety of the Ottawa SAH Strategy and 2) its impact on: a) CTs, b) LPs, c) ED length of stay, and d) CT angiography (CTA). Methods: We conducted a multicentre prospective before/after study at 6 tertiary-care EDs from January 2010 to December 2016 (implementation July 2013). Consecutive alert, neurologically intact adults with a headache peaking within one hour were included. SAH was defined by subarachnoid blood on head CT (radiologist's final report); xanthochromia in the cerebrospinal fluid (CSF); or >1 × 10⁶/L red blood cells in the final tube of CSF together with an aneurysm on CTA. Results: We enrolled 3,669 patients, 1,743 before and 1,926 after implementation, including 185 with SAH. The investigation rate before implementation was 89.0% (range 82.9 to 95.6%) versus 88.4% (range 85.2 to 92.3%) after implementation. The proportion who had CT remained stable (88.0% versus 87.4%; p=0.60), while the proportion who had LP decreased from 38.9% to 25.9% (p<0.001), and the proportion investigated with CTA increased from 18.8% to 21.6% (p=0.036). The additional testing rate (i.e. LP or CTA) diminished from 50.1% to 40.8% (p<0.001). The proportion admitted declined from 9.8% to 7.3% (p=0.008), while the mean length of ED stay was stable (6.2 +/− 4.0 to 6.4 +/− 4.1 hours; p=0.45). For the 1,201 patients with CT within 6 hours of headache onset, there was an absolute decrease in additional testing (i.e. LP or CTA) of 15.0% (46.6% versus 31.6%; p<0.001). The sensitivity of the Ottawa SAH Rule was 100% (95% CI: 98-100%), and that of the 6-Hour CT Rule was 95.3% (95% CI: 88.9-98.3%) for SAH. Five patients with early CT reported as normal had SAH: 2 with unruptured aneurysms on CTA and presumed traumatic LP (determined by the treating neurosurgeon); 1 missed by the radiologist on the initial interpretation; 1 with a dural vein fistula (i.e. non-aneurysmal); and 1 profoundly anemic (Hgb 63 g/L). Conclusion: The Ottawa SAH Strategy is highly sensitive and can be used routinely when SAH is being considered in alert and neurologically intact headache patients. Its implementation was associated with a decrease in LPs and admissions to hospital.
Introduction: Current guideline recommendations for optimal management of non-purulent skin and soft tissue infections (SSTIs) are based on expert consensus. There is currently a lack of evidence to guide emergency physicians on when to select oral versus intravenous antibiotic therapy. The primary objective was to identify risk factors associated with oral antibiotic treatment failure. A secondary objective was to describe the epidemiology of adult emergency department (ED) patients with non-purulent SSTIs. Methods: We performed a health records review of adults (age ≥18 years) with non-purulent SSTIs treated at two tertiary care EDs. Patients were excluded if they had a purulent infection or infected ulcers without surrounding cellulitis. Treatment failure was defined as any of the following after a minimum of 48 hours of oral therapy: (i) hospitalization for SSTI; (ii) change in class of oral antibiotic owing to infection progression; or (iii) change to intravenous therapy owing to infection progression. Multivariable logistic regression was used to identify predictors independently associated with the primary outcome of oral antibiotic treatment failure after a minimum of 48 hours of oral therapy. Results: We enrolled 500 patients (mean age 64 years; 279 (55.8%) male; 126 (25.2%) with diabetes); the hospital admission rate was 29.6%. The majority of patients (70.8%) received at least one intravenous antibiotic dose in the ED. Of 288 patients who had received a minimum of 48 hours of oral antibiotics, there were 85 oral antibiotic treatment failures (29.5%). Tachypnea at triage (odds ratio [OR]=6.31, 95% CI=1.80 to 22.08), chronic ulcers (OR=4.90, 95% CI=1.68 to 14.27), history of MRSA colonization or infection (OR=4.83, 95% CI=1.51 to 15.44), and cellulitis in the past 12 months (OR=2.23, 95% CI=1.01 to 4.96) were independently associated with oral antibiotic treatment failure. Conclusion: This is the first study to evaluate potential predictors of oral antibiotic treatment failure for non-purulent SSTIs in the ED. We observed a high rate of treatment failure and hospitalization. Tachypnea at triage, chronic ulcers, history of MRSA colonization or infection, and cellulitis within the past year were independently associated with oral antibiotic treatment failure. Emergency physicians should consider these risk factors when deciding on oral versus intravenous antimicrobial therapy for non-purulent SSTIs being managed on an outpatient basis.
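The predictors reported as "independently associated" with failure come from a multivariable logistic regression whose exponentiated coefficients are the odds ratios quoted above. A minimal sketch under assumed names (the data file and column names are hypothetical stand-ins, not the study's variables):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data file and column names; the outcome and all
# predictors are coded 0/1.
df = pd.read_csv("sstis.csv")
predictors = ["tachypnea", "chronic_ulcer", "mrsa_history",
              "prior_cellulitis_12mo"]
X = sm.add_constant(df[predictors])
fit = sm.Logit(df["treatment_failure"], X).fit()

# Exponentiate coefficients and confidence limits to get ORs with 95% CIs.
or_table = np.exp(fit.conf_int())
or_table["OR"] = np.exp(fit.params)
print(or_table)
```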
Introduction: The Canadian Syncope Risk Score (CSRS) was developed to identify patients at risk for serious adverse events (SAE) within 30 days of an emergency department (ED) visit for syncope. We sought to validate the score in a new cohort of ED patients. Methods: We conducted a multicenter prospective cohort study at 8 large academic tertiary-care EDs across Canada from March 2014 to Dec 2016. We enrolled adults (age ≥16 years) who presented within 24 hours of syncope, after excluding those with persistent altered mentation, witnessed seizure, intoxication, and major trauma requiring hospitalization. Treating ED physicians collected the nine CSRS predictors at the index visit. Adjudicated SAE included death, arrhythmias and non-arrhythmic SAE (myocardial infarction, serious structural heart disease, pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We assessed the area under the Receiver Operating Characteristic (ROC) curve, score calibration, and the classification performance for the various risk categories. Results: Of the 2547 patients enrolled, 146 (5.7%) were lost to follow-up and 111 (4.3%) had a serious condition identified during the index ED visit and were excluded. Among the 2290 analyzed, 79 patients (3.4%; 0.4% death, 1.4% arrhythmia) suffered 30-day serious outcomes after ED disposition. The accuracy of the CSRS remained high, with an area under the ROC curve of 0.87 (95% CI 0.82-0.92), similar to the derivation phase (0.87; 95% CI 0.84-0.89). The score showed excellent calibration at the prespecified risk strata. For the very-low-risk category (0.3% SAE, of which 0.2% were arrhythmia, with no deaths) the sensitivity was 97.5% and the negative predictive value was 99.7% (95% CI 98.7-99.9). For the very-high-risk category (61.5% SAE, of which 26.9% were arrhythmia and 11.5% death) the specificity was 99.4% and the positive predictive value was 61.5% (95% CI 43.0-77.2). Conclusion: In this multicenter validation study, the CSRS accurately risk-stratified ED patients with syncope for short-term serious outcomes after ED disposition. The score should aid in minimizing investigation and observation of very-low-risk patients, and in prioritizing inpatient versus outpatient investigation and follow-up for the rest. The CSRS is ready for implementation studies examining ED management decisions, patient safety and health care resource utilization.
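Validation here rests on discrimination (area under the ROC curve) and on threshold-based metrics such as sensitivity and negative predictive value for the very-low-risk stratum. A runnable sketch on simulated data (the score range, cut-off, and outcome rates are illustrative assumptions, not the study data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
score = rng.integers(-3, 11, size=2290)                  # simulated score totals
y = rng.binomial(1, np.clip(0.02 * (score + 4), 0, 1))   # simulated 30-day SAE

print("AUC:", round(roc_auc_score(y, score), 3))

# Treat 'very low risk' (score <= -2, an assumed cut-off) as test-negative.
flagged = score > -2
tp = np.sum(flagged & (y == 1))
fn = np.sum(~flagged & (y == 1))
tn = np.sum(~flagged & (y == 0))
print("sensitivity:", tp / (tp + fn), "NPV:", tn / (tn + fn))
```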
The ‘Digital Index of North American Archaeology’ (DINAA) project demonstrates how the aggregation and publication of government-held archaeological data can help to document human activity over millennia and at a continental scale. These data can provide a valuable link between specific categories of information available from publications, museum collections and online databases. Integration improves the discovery and retrieval of records of archaeological research currently held by multiple institutions within different information systems. It also aids in the preservation of those data and makes efforts to archive these research results more resilient to political turmoil. While DINAA focuses on North America, its methods have global applicability.
VLBI observations of the nucleus of Centaurus A were made in April 1982 at two frequencies with an array of five Australian radio antennas as part of the Southern Hemisphere VLBI Experiment (SHEVE). Observations were undertaken at 2.29 GHz with all five antennas, while only two were operational at 8.42 GHz. The 2.29 GHz data yielded significant information on the structure of the nuclear jet. At 8.42 GHz a compact unresolved core was detected as well.
To assess the role of methodological differences on measured trace-element concentrations in ice cores, we developed an experiment to test the effects of acidification strength and time on dust dissolution using snow samples collected in West Antarctica and Alaska. We leached Antarctic samples for 3 months at room temperature using nitric acid at concentrations of 0.1, 1.0 and 10.0% (v/v). At selected intervals (20 min, 24 hours, 5 days, 14 days, 28 days, 56 days, 91 days) we analyzed 23 trace elements using inductively coupled plasma mass spectrometry. Concentrations of lithogenic elements scaled with acid strength and increased by 100–1380% in 3 months. Incongruent elemental dissolution caused significant variability in calculated crustal enrichment factors through time (factor of 1.3 (Pb) to 8.0 (Cs)). Using snow samples collected in Alaska and acidified at 1% (v/v) for 383 days, we found that the increase in lithogenic element concentration with time depends strongly on initial concentration, and varies by element (e.g. Fe linear regression slope = 1.66; r = 0.98). Our results demonstrate that relative trace-element concentrations measured in ice cores depend on the acidification method used.
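The crustal enrichment factors whose drift is quantified above are conventionally computed by double-normalizing to a reference lithogenic element; a sketch of the usual definition, assuming Al as the reference (Ti and other crustal elements are also common choices):

$$\mathrm{EF}_X \;=\; \frac{\left(C_X / C_{\mathrm{Al}}\right)_{\text{sample}}}{\left(C_X / C_{\mathrm{Al}}\right)_{\text{crust}}}$$

Because the leached fraction of each element grows at a different rate (incongruent dissolution), the numerator changes with acidification time, which is why the calculated EF values varied by factors of 1.3 to 8.0.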
The Joint European X-Ray Telescope, JET-X, is one of the core instruments of the scientific payload of the USSR SPECTRUM-X astrophysics mission due for launch in 1993. The JET-X instrument concept is described and its scientific performance and capability discussed.
Background: It has been hypothesized that [18F]-sodium fluoride (NaF), imaged with positron emission tomography (PET), binds to hydroxyapatite molecules expressed in regions with active calcification. We therefore aimed to validate NaF as a marker of hydroxyapatite expression in high-risk carotid plaque. Methods: Eleven patients (69 ± 5 years, 3 female) scheduled for carotid endarterectomy were prospectively recruited for NaF PET/CT. One patient received a second, contralateral endarterectomy; two patients were excluded (intolerance to contrast media and PET/CT misalignment). The bifurcation of the common carotid artery was used as the reference point; NaF uptake (tissue-to-blood ratio, TBR) was measured at every PET slice extending 2 cm above and below the bifurcation. Excised plaque was stained with Goldner's trichrome, and whole-slide digitized images were used to quantify hydroxyapatite expression. Pathology was co-registered with PET. Results: NaF uptake was related to the extent of hydroxyapatite expression (r=0.45, p<0.001). When bilateral plaques were classified by symptomatology, symptomatic plaque (associated with cerebrovascular events; 3.75±1.1 TBR, n=9) had greater NaF uptake than clinically silent asymptomatic plaque (2.79±0.6 TBR, n=11) (p=0.04). Conclusion: NaF uptake is related to hydroxyapatite expression and is increased in plaque associated with cerebrovascular events. NaF may serve as a novel biomarker of active calcification and plaque vulnerability.
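The tissue-to-blood ratio used to quantify uptake is conventionally the plaque standardized uptake value normalized by the blood-pool value; a sketch of that definition (the exact region-of-interest conventions, e.g. a jugular venous blood pool, are assumptions):

$$\mathrm{TBR} \;=\; \frac{\mathrm{SUV}_{\text{plaque}}}{\mathrm{SUV}_{\text{blood}}}$$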
The terminal lake systems of central Australia are key sites for the reconstruction of late Quaternary paleoenvironments. Paleoshoreline deposits around these lakes reflect repeated lake filling episodes and such landforms have enabled the establishment of a luminescence-based chronology for filling events in previous studies. Here we present a detailed documentation of the morphology and chemistry of soils developed in four well-preserved beach ridges of late Pleistocene and mid-to-late Holocene age at Lake Callabonna to assess changes in dominant pedogenic processes. All soil profiles contain evidence for the incorporation of eolian-derived material, likely via the formation of desert pavements and vesicular horizons, and limited illuviation due to generally shallow wetting fronts. Even though soil properties in the four studied profiles also provide examples of parent material influence or site-specific processes related to the geomorphic setting, there is an overall trend of increasing enrichment of eolian-derived material since at least ~ 33 ka. Compared to the Holocene profiles, the derived average accumulation rates for the late Pleistocene profiles are significantly lower and may suggest that soils record important regional changes in paleoenvironments and dust dynamics related to shifts in the Southern Hemisphere westerlies.
We consider the dynamics of actively entraining turbulent density currents on a conical sloping surface in a rotating fluid. A theoretical plume model is developed to describe both axisymmetric flow and single-stream currents of finite angular extent. An analytical solution is derived for flow dominated by the initial buoyancy flux and with a constant entrainment ratio, which serves as an attractor for solutions with alternative initial conditions where the initial fluxes of mass and momentum are non-negligible. The solutions indicate that the downslope propagation of the current halts at a critical level where there is purely azimuthal flow, and the boundary layer approximation breaks down. Observations from a set of laboratory experiments are consistent with the dynamics predicted by the model, with the flow approaching a critical level. Interpretation in terms of the theory yields an entrainment coefficient $E \propto 1/\Omega$, where $\Omega$ is the rotation rate. We also derive a corresponding theory for density currents from a line source of buoyancy on a planar slope. Our theoretical models provide a framework for designing and interpreting laboratory studies of turbulent entrainment in rotating dense flows on slopes and understanding their implications in geophysical flows.
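For context, entraining plume models of this kind typically close the equations with the Morton-Taylor-Turner assumption that the entrainment velocity $u_e$ normal to the current is proportional to the local downslope speed $U$; combining that standard closure (an assumption here, not stated in the abstract) with the scaling reported above gives

$$u_e = E\,U, \qquad E \propto \frac{1}{\Omega},$$

so that faster rotation (larger $\Omega$) implies weaker dilution of the descending current.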
Determining the appropriate disposition of emergency department (ED) syncope patients is challenging. Previously developed decision tools have poor diagnostic test characteristics and methodological flaws in their derivation that preclude their use. We sought to develop a scale to risk-stratify adult ED syncope patients at risk for serious adverse events (SAEs) within 30 days.
We conducted a medical record review to include syncope patients age ≥ 16 years and excluded patients with ongoing altered mental status, alcohol or illicit drug use, seizure, head injury leading to loss of consciousness, or severe trauma requiring admission. We collected 105 predictor variables (demographics, event characteristics, comorbidities, medications, vital signs, clinical examination findings, emergency medical services and ED electrocardiogram/monitor characteristics, investigations, and disposition variables) and information on the occurrence of predefined SAEs. Univariate and multiple logistic regression analyses were performed.
Among 505 enrolled patient visits, 49 (9.7%) suffered an SAE. Predictors of SAE and their resulting point scores were as follows: age ≥ 75 years (1), shortness of breath (2), lowest ED systolic blood pressure < 80 mm Hg (2), Ottawa Electrocardiographic Criteria present (2), and blood urea nitrogen > 15 mmol/L (3). The final score, calculated by adding the individual scores for each variable (range 0–10), was found to accurately stratify patients into low risk (score < 1, 0% SAE risk), moderate risk (score = 1, 3.7% SAE risk), or high risk (score > 1, ≥ 10% SAE risk).
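Because the scale is a simple additive point score, it is straightforward to compute at the bedside. A minimal sketch (argument names are illustrative paraphrases of the five predictors, not the study's variable definitions):

```python
def syncope_sae_score(age_ge_75, shortness_of_breath, sbp_lt_80,
                      ottawa_ecg_criteria, bun_gt_15):
    """Additive point score (0-10) from the derived scale; inputs are booleans."""
    score = (1 * age_ge_75 + 2 * shortness_of_breath + 2 * sbp_lt_80
             + 2 * ottawa_ecg_criteria + 3 * bun_gt_15)
    if score < 1:
        risk = "low (0% SAE)"
    elif score == 1:
        risk = "moderate (3.7% SAE)"
    else:
        risk = "high (>=10% SAE)"
    return score, risk

# Example: an 80-year-old whose lowest ED systolic BP was below 80 mm Hg.
print(syncope_sae_score(True, False, True, False, False))  # (3, 'high (>=10% SAE)')
```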
We derived a risk scale that accurately predicts SAEs within 30 days in ED syncope patients. If validated, this will be a potentially useful clinical decision tool for emergency physicians, may allow judicious use of health care resources, and may improve patient care and safety.
To examine barriers to initiation and continuation of mental health treatment among individuals with common mental disorders.
Data were from the World Health Organization (WHO) World Mental Health (WMH) surveys. Representative household samples were interviewed face to face in 24 countries. Reasons to initiate and continue treatment were examined in a subsample (n = 63,678) and analyzed at different levels of clinical severity.
Among those with a DSM-IV disorder in the past 12 months, low perceived need was the most common reason for not initiating treatment, and was more common among moderate and mild cases than severe cases. Women and younger people with disorders were more likely to recognize a need for treatment. A desire to handle the problem on one's own was the most common barrier among respondents with a disorder who perceived a need for treatment (63.8%). Attitudinal barriers were much more important than structural barriers to both initiating and continuing treatment. However, attitudinal barriers dominated for mild to moderate cases and structural barriers for severe cases. Perceived ineffectiveness of treatment was the most commonly reported reason for treatment drop-out (39.3%), followed by negative experiences with treatment providers (26.9% of respondents with severe disorders).
Low perceived need and attitudinal barriers are the major barriers to seeking and staying in treatment among individuals with common mental disorders worldwide. Apart from targeting structural barriers, mainly in countries with poor resources, increasing population mental health literacy is an important endeavor worldwide.
Previous community surveys of drop out from mental health treatment have been carried out only in the USA and Canada.
To explore mental health treatment drop out in the World Health Organization World Mental Health Surveys.
Representative face-to-face household surveys were conducted among adults in 24 countries. People who reported mental health treatment in the 12 months before interview (n = 8482) were asked about drop out, defined as stopping treatment before the provider wanted them to.
Overall, drop out was 31.7%: 26.3% in high-income countries, 45.1% in upper-middle-income countries, and 37.6% in low/lower-middle-income countries. Drop out from psychiatrists was 21.3% overall and similar across country income groups (high 20.3%, upper-middle 23.6%, low/lower-middle 23.8%), but the pattern of drop out across other sectors differed by country income group. Drop out was more likely early in treatment, particularly after the second visit.
Drop out needs to be reduced to ensure effective treatment.