Introduction: In-hospital cardiac arrest (IHCA) most commonly occurs in non-monitored areas, where we observed a 10min delay before defibrillation (Phase I). Nurses (RNs) and respiratory therapists (RTs) cannot legally use Automated External Defibrillators (AEDs) during IHCA without a medical directive. We sought to evaluate IHCA outcomes following usual implementation (Phase II) vs. a Theory-Based educational program (Phase III) allowing RNs and RTs to use AEDs during IHCA. Methods: We completed a pragmatic before-after study of consecutive IHCA. We used ICD-10 codes to identify potentially eligible cases and included IHCA cases for which resuscitation was attempted. We obtained consensus on all data definitions before initiation of standardized-piloted data extraction by trained investigators. Phase I (Jan.2012-Aug.2013) consisted of baseline data. We implemented the AED medical directive in Phase II (Sept.2013-Aug.2016) using usual implementation strategies. In Phase III (Sept.2016-Dec.2017) we added an educational video informed by key constructs from a Theory of Planned Behavior survey. We report univariate comparisons of Utstein IHCA outcomes using 95% confidence intervals (CI). Results: There were 753 IHCA for which resuscitation was attempted with the following similar characteristics (Phase I n = 195; II n = 372; III n = 186): median age 68, 60.0% male, 79.3% witnessed, 29.7% non-monitored medical ward, 23.9% cardiac cause, 47.9% initial rhythm of pulseless electrical activity and 27.2% ventricular fibrillation/tachycardia (VF/VT). Comparing Phases I, II and III: an AED was used 0 times (0.0%), 21 times (5.6%), 15 times (8.1%); time to 1st rhythm analysis was 6min, 3min, 1min; and time to 1st shock was 10min, 10min and 7min. Comparing Phases I and III: time to 1st shock decreased by 3min (95%CI -7; 1), sustained ROSC increased from 29.7% to 33.3% (AD3.6%; 95%CI -10.8; 17.8), and survival to discharge increased from 24.6% to 25.8% (AD1.2%; 95%CI -7.5; 9.9). 
In the VF/VT subgroup, time to first shock decreased from 9 to 3 min (AD-6min; 95%CI -12; 0) and survival increased from 23.1% to 38.7% (AD15.6%; 95%CI -4.3; 35.4). Conclusion: The implementation of a medical directive allowing AED use by RNs and RTs improved key outcomes for IHCA victims, particularly following the Theory-Based educational video. The expansion of this project to other hospitals and health care professionals could significantly impact survival for VF/VT patients.
Introduction: Patients with major bleeding (e.g. gastrointestinal bleeding and intracranial hemorrhage [ICH]) are commonly encountered in the Emergency Department (ED). A growing number of patients are on either oral or parenteral anticoagulation (AC), but the impact of AC on outcomes of patients with major bleeding is unknown. With regard to oral anticoagulation (OAC), we particularly sought to analyze differences between patients on Warfarin or Direct Oral Anticoagulants (DOACs). Methods: We analyzed a prospectively collected registry (2011-2016) of patients who presented to the ED with major bleeding at two academic hospitals. “Major bleeding” was defined by the International Society on Thrombosis and Haemostasis criteria. The primary outcome, in-hospital mortality, was analyzed using a multivariable logistic regression model. Secondary outcomes included discharge to long-term care among survivors, total hospital length of stay (LOS) among survivors, and total hospital costs. Results: 1,477 patients with major bleeding were included. AC use was found among 215 total patients (14.6%). Among OAC patients (n = 181), 141 (77.9%) had used Warfarin, and 40 (22.1%) had used a DOAC. 484 patients (32.8%) died in-hospital. AC use was associated with higher in-hospital mortality (adjusted odds ratio [OR]: 1.50 [1.17-1.93]). Among survivors to discharge, AC use was associated with higher discharge to long-term care (adjusted OR: 1.73 [1.18-2.57]), prolonged median LOS (19 days vs. 16 days, P = 0.03), and higher mean costs ($69,273 vs. $58,156, P = 0.02). With regard to OAC, a higher proportion of ICH was seen among patients on Warfarin (39.0% vs. 32.5%), as compared to DOACs. No difference in mortality was seen between DOACs and Warfarin (adjusted OR: 0.84 [0.40-1.72]). Patients with major bleeding on Warfarin had longer median LOS (11 days vs. 6 days, P = 0.03) and higher total costs ($51,524 vs. $35,176, P < 0.01) than patients on DOACs.
Conclusion: AC use was associated with higher mortality among ED patients with major bleeding. Among survivors, AC use was associated with increased LOS, costs, and discharge to long-term care. Among OAC patients, no difference in mortality was found. Warfarin was associated with prolonged LOS and costs, likely secondary to higher incidence of ICH, as compared to DOACs.
BACKGROUND: Intracranial growing teratoma syndrome (IGTS) is a rare phenomenon of paradoxical germ cell tumor (GCT) growth during or following treatment despite normalization of tumor markers. We sought to evaluate the frequency, clinical characteristics and outcome of IGTS in patients in 21 North-American and Australian institutions. METHODS: Patients with IGTS diagnosed from 2000-2017 were retrospectively evaluated. RESULTS: Out of 739 GCT diagnoses, IGTS was identified in 33 patients (4.5%). IGTS occurred in 9/191 (4.7%) mixed-malignant GCTs, 4/22 (18.2%) immature teratomas (ITs), 3/472 (0.6%) germinomas/germinomas with mature teratoma, and in 17 secreting non-biopsied tumours. Median age at GCT diagnosis was 10.9 years (range 1.8-19.4). Male gender (84%) and pineal location (88%) predominated. Of 27 patients with elevated markers, median serum AFP and Beta-HCG were 70 ng/mL (range 9.2-932) and 44 IU/L (range 4.2-493), respectively. IGTS occurred at a median time of 2 months (range 0.5-32) from diagnosis, during chemotherapy in 85%, radiation in 3%, and after treatment completion in 12%. Surgical resection was attempted in all, leading to gross total resection in 76%. Most patients (79%) resumed GCT chemotherapy/radiation after surgery. At a median follow-up of 5.3 years (range 0.3-12), all but 2 patients are alive (1 succumbed to progressive disease, 1 to malignant transformation of GCT). CONCLUSION: IGTS occurred in less than 5% of patients with GCT and most commonly after initiation of chemotherapy. IGTS was more common in patients with IT-only on biopsy than with mixed-malignant GCT. Surgical resection is a principal treatment modality. Survival outcomes for patients who developed IGTS are favourable.
Introduction: The Canadian Syncope Risk Score (CSRS) was developed to identify patients at risk for serious adverse events (SAE) within 30 days of an Emergency Department (ED) visit for syncope. We sought to validate the score in a new cohort of ED patients. Methods: We conducted a multicenter prospective cohort study at 8 large academic tertiary-care EDs across Canada from March 2014 to Dec 2016. We enrolled adults (age ≥16 years) who presented within 24 hours of syncope, after excluding those with persistent altered mentation, witnessed seizure, intoxication, and major trauma requiring hospitalization. Treating ED physicians collected the nine CSRS predictors at the index visit. Adjudicated SAE included death, arrhythmias and non-arrhythmic SAE (myocardial infarction, serious structural heart disease, pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We assessed area under the Receiver Operating Characteristic (ROC) curve, score calibration, and the classification performance for the various risk categories. Results: Of the 2547 patients enrolled, 146 (5.7%) were lost to follow-up and 111 (4.3%) had a serious condition identified during the index ED visit and were excluded. Among the 2290 analyzed, 79 patients (3.4%; 0.4% death, 1.4% arrhythmia) suffered 30-day serious outcomes after ED disposition. The accuracy of the CSRS remained high with area under the ROC curve at 0.87 (95%CI 0.82-0.92), similar to the derivation phase (0.87; 95%CI 0.84-0.89). The score showed excellent calibration at the prespecified risk strata. For the very-low risk category (0.3% SAE of which 0.2% were arrhythmia and no deaths) the sensitivity was 97.5% and negative predictive value was 99.7% (95%CI 98.7-99.9). For the very high-risk category (61.5% SAE of which 26.9% were arrhythmia and 11.5% death) the specificity was 99.4% and positive predictive value was 61.5% (95% CI 43.0-77.2).
Conclusion: In this multicenter validation study, the CSRS accurately risk stratified ED patients with syncope for short-term serious outcomes after ED disposition. The score should aid in minimizing investigation and observation of very-low-risk patients, and in prioritizing inpatient versus outpatient investigation and follow-up for the rest. The CSRS is ready for implementation studies examining ED management decisions, patient safety and health care resource utilization.
The value of auditory enrichment for psychological well-being has been studied in a variety of species, including birds, cattle, horses and primates. To date the effect of auditory stimulation on the behaviour of dogs housed in rescue shelters is unknown. Rescue shelters provide temporary housing for thousands of stray and abandoned dogs every year. However well these dogs are cared for, it cannot be ignored that being in such a situation is stressful. Research suggests that music may be a useful moderator of stress in humans. The question remains as to whether auditory stimulation has such a beneficial effect in dogs. This study investigated the behaviour of sheltered dogs in response to five types of auditory stimulation to determine whether the dogs’ behaviour was influenced by their auditory environment.
VLBI observations of the nucleus of Centaurus A were made in April 1982 at two frequencies with an array of five Australian radio antennas as part of the Southern Hemisphere VLBI Experiment (SHEVE). Observations were undertaken at 2.29 GHz with all five antennas, while only two were operational at 8.42 GHz. The 2.29 GHz data yielded significant information on the structure of the nuclear jet. At 8.42 GHz a compact unresolved core was also detected.
Patients with poorly controlled diabetes mellitus may have a sentinel emergency department (ED) visit for a precipitating condition prior to presenting for a hyperglycemic emergency, such as diabetic ketoacidosis (DKA) or hyperosmolar hyperglycemic state (HHS). This study’s objective was to describe the epidemiology and outcomes of patients with a sentinel ED visit prior to their hyperglycemic emergency visit.
This was a 1-year health records review of patients ≥18 years old presenting to one of four tertiary care EDs with a discharge diagnosis of hyperglycemia, DKA, or HHS. Trained research personnel collected data on patient characteristics, management, and disposition, and determined whether patients came to the ED within the 14 days prior to their hyperglycemia visit. Descriptive statistics were used to summarize the data.
Of 833 visits for hyperglycemia, 142 (17.0%; 95% CI: 14.5% to 19.6%) had a sentinel ED presentation within the preceding 14 days. Mean (SD) age was 50.5 (19.0) years and 54.4% were male; 104 (73.2%) were discharged from this initial visit, and 98/104 (94.2%) were discharged either without their glucose checked or with an elevated blood glucose (>11.0 mmol/L). Of the sentinel visits, 93 (65.5%) were for hyperglycemia and 22 (15.5%) for infection. Upon returning to the ED, 61/142 (43.0%) were admitted for severe hyperglycemia, DKA, or HHS.
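The proportion and 95% confidence interval reported above (142/833; 14.5% to 19.6%) are consistent with a standard normal-approximation (Wald) interval. A minimal sketch of that calculation; the function name is illustrative:

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Point estimate and normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, p - half_width, p + half_width

# Sentinel-visit proportion reported above: 142 of 833 hyperglycemia visits
p, lo, hi = proportion_ci(142, 833)
print(f"{p:.1%} (95% CI: {lo:.1%} to {hi:.1%})")  # 17.0% (95% CI: 14.5% to 19.6%)
```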
In this unique ED-based study, diabetic patients with a sentinel ED visit often returned and required subsequent admission for hyperglycemia. Clinicians should be vigilant in checking blood glucose and provide clear discharge instructions for follow-up and glucose management to prevent further hyperglycemic emergencies from occurring.
Introduction: The Canadian C-Spine Rule (CCR) was validated by emergency physicians and triage nurses to determine the need for radiography in alert and stable Emergency Department trauma patients. It was modified and validated for use by paramedics in 1,949 patients. The prehospital CCR calls for evaluation of active neck rotation if patients have none of 3 high-risk criteria and at least 1 of 4 low-risk criteria. This study evaluated the impact and safety of the implementation of the CCR by paramedics. Methods: This single-centre prospective cohort implementation study took place in Ottawa, Canada. Advanced and primary care paramedics received on-line and in-person training on the CCR, allowing them to use the CCR to evaluate eligible patients and selectively transport them without immobilization. We evaluated all consecutive eligible adult patients (GCS 15, stable vital signs) at risk for neck injury. Paramedics were required to complete a standardized study data form for each eligible patient evaluated. Study staff reviewed paramedic documentation and corresponding hospital records and diagnostic imaging reports. We followed all patients without initial radiologic evaluation for 30 days for referral to our spine service, or subsequent visit with radiologic evaluation. Analyses included sensitivity, specificity, kappa coefficient, t-test, and descriptive statistics with 95% CIs. Results: The 4,034 patients enrolled between Jan. 2011 and Aug. 2015 were: mean age 43 (range 16-99), female 53.3%, motor vehicle collision 51.9%, fall 23.8%, admitted to hospital 7.0%, acute c-spine injury 0.8%, and clinically important c-spine injury 0.3%. For the 11 clinically important injuries, the CCR had sensitivity 91% (95% CI 58-100%) and specificity 67% (95% CI 65-68%). Kappa agreement for interpretation of the CCR between paramedics and study investigators was 0.94 (95% CI 0.92-0.95). Paramedics were comfortable or very comfortable using the CCR in 89.8% of cases.
Mean scene time was 3 min (15.6%) shorter for those not immobilized (17 min vs. 20 min; p=0.0001). A total of 2,569 (63.7%) immobilizations were safely avoided using the CCR. Conclusion: Paramedics could safely and accurately apply the CCR to low-risk trauma patients. This had a significant impact on scene times and the number of prehospital immobilizations.
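The sensitivity and specificity figures above come from a standard 2x2 classification table. A minimal sketch; the cell counts below are illustrative only (chosen so the rounded values match the reported 91% sensitivity and 67% specificity, since the abstract does not give the individual cells):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from 2x2 classification counts."""
    sensitivity = tp / (tp + fn)  # injured patients correctly flagged
    specificity = tn / (tn + fp)  # uninjured patients correctly cleared
    return sensitivity, specificity

# Illustrative cells only: flagging 10 of 11 injuries gives ~91% sensitivity
sens, spec = sens_spec(tp=10, fn=1, tn=2680, fp=1320)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # sensitivity 91%, specificity 67%
```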
Local health departments (LHDs) have little guidance for operationalizing community resilience (CR). We explored how community coalitions responded to 4 CR levers (education, engagement, partnerships, and community self-sufficiency) during the first planning year of the Los Angeles County Community Disaster Resilience (LACCDR) Project.
Sixteen communities were selected and randomly assigned to the experimental CR group or the control preparedness group. Eight CR coalitions met monthly to plan CR-building activities or to receive CR training from a public health nurse. Trained observers documented the coalitions’ understanding and application of CR at each meeting. Qualitative content analysis was used to analyze structured observation reports around the 4 levers.
Analysis of 41 reports suggested that coalitions underwent a process of learning about and applying CR concepts in the planning year. Groups resonated with ideas of education, community self-sufficiency, and engagement, but increasing partnerships was challenging.
LHDs can support coalitions by anticipating the time necessary to understand CR and by facilitating engagement. Understanding the issues that emerge in the early phases of planning and implementing CR-building activities is critical. LHDs can use the experience of the LACCDR Project’s planning year as a guide to navigate challenges and issues that emerge as they operationalize the CR model. (Disaster Med Public Health Preparedness. 2016;10:812–821)
Four working groups and three task groups of IAU Commission 5 deal specifically with information handling; technical aspects of the collection, archiving, storage and dissemination of data; designations and classification of astronomical objects; library services; editorial policies; computer communications; ad hoc methodologies; and various standards, reference frames, etc. Information about Commission 5 working and task groups and their activities may be found at http://nut.inasan.rssi.ru/IAU/.
Dual-energy X-ray absorptiometry (DXA) and the isotope dilution technique have been used as reference methods to validate the estimates of body composition by simple field techniques; however, very few studies have compared these two methods. We compared the estimates of body composition by DXA and the isotope dilution (18O) technique in apparently healthy Indian men and women (aged 19–70 years, n 152, 48 % men) with a wide range of BMI (14–40 kg/m2). Isotopic enrichment was assessed by isotope ratio mass spectrometry. The agreement between the estimates of body composition measured by the two techniques was assessed by the Bland–Altman method. The mean age and BMI were 37 (sd 15) years and 23·3 (sd 5·1) kg/m2, respectively, for men and 37 (sd 14) years and 24·1 (sd 5·8) kg/m2, respectively, for women. The estimates of fat-free mass were higher by about 7 (95 % CI 6, 9) %, those of fat mass were lower by about 21 (95 % CI − 18, − 23) %, and those of body fat percentage (BF%) were lower by about 7·4 (95 % CI − 8·2, − 6·6) % as obtained by DXA compared with the isotope dilution technique. The Bland–Altman analysis showed wide limits of agreement that indicated poor agreement between the methods. The bias in the estimates of BF% was higher at the lower values of BF%. Thus, the two commonly used reference methods showed substantial differences in the estimates of body composition with wide limits of agreement. As the estimates of body composition are method-dependent, the two methods cannot be used interchangeably.
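The Bland–Altman method cited above summarizes agreement between two measurement techniques as the mean difference (bias) and 95% limits of agreement (bias ± 1.96 SD of the paired differences). A minimal sketch with made-up numbers, not the study's data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative BF% pairs only (not data from this study)
dxa      = [18.0, 25.5, 30.1, 22.4, 27.9]
dilution = [24.2, 31.0, 36.5, 29.1, 33.8]
bias, lower_loa, upper_loa = bland_altman(dxa, dilution)
```

A systematic negative bias like this one would indicate that DXA reads consistently lower than the dilution technique, with the limits of agreement showing how far individual paired readings can diverge.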
Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-à-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies.
Determining the appropriate disposition of emergency department (ED) syncope patients is challenging. Previously developed decision tools have poor diagnostic test characteristics and methodological flaws in their derivation that preclude their use. We sought to develop a scale to risk-stratify adult ED syncope patients for serious adverse events (SAEs) within 30 days.
We conducted a medical record review to include syncope patients age ≥ 16 years and excluded patients with ongoing altered mental status, alcohol or illicit drug use, seizure, head injury leading to loss of consciousness, or severe trauma requiring admission. We collected 105 predictor variables (demographics, event characteristics, comorbidities, medications, vital signs, clinical examination findings, emergency medical services and ED electrocardiogram/monitor characteristics, investigations, and disposition variables) and information on the occurrence of predefined SAEs. Univariate and multiple logistic regression analyses were performed.
Among 505 enrolled patient visits, 49 (9.7%) suffered an SAE. Predictors of SAE and their resulting point scores were as follows: age ≥ 75 years (1), shortness of breath (2), lowest ED systolic blood pressure < 80 mm Hg (2), Ottawa Electrocardiographic Criteria present (2), and blood urea nitrogen > 15 mmol/L (3). The final score calculated by addition of the individual scores for each variable (range 0–10) was found to accurately stratify patients into low risk (score < 1, 0% SAE risk), moderate risk (score 1, 3.7% SAE risk), or high risk (score > 1, ≥ 10% SAE risk).
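The additive scoring just described can be written out directly. A minimal sketch under the point values and risk strata reported above; the function and argument names are illustrative, not from the original instrument:

```python
def syncope_risk(age, short_of_breath, lowest_ed_sbp, ottawa_ecg, bun):
    """Sum the five predictor points (range 0-10) and map to a risk stratum."""
    score = (
        (1 if age >= 75 else 0)
        + (2 if short_of_breath else 0)
        + (2 if lowest_ed_sbp < 80 else 0)  # lowest ED systolic BP, mm Hg
        + (2 if ottawa_ecg else 0)          # Ottawa Electrocardiographic Criteria
        + (3 if bun > 15 else 0)            # blood urea nitrogen, mmol/L
    )
    if score < 1:
        stratum = "low (0% SAE risk)"
    elif score == 1:
        stratum = "moderate (3.7% SAE risk)"
    else:
        stratum = "high (>=10% SAE risk)"
    return score, stratum

# An 80-year-old with no other predictors scores 1 point: moderate risk
score, stratum = syncope_risk(80, False, 110, False, 10)
```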
We derived a risk scale that accurately predicts SAEs within 30 days in ED syncope patients. If validated, this will be a potentially useful clinical decision tool for emergency physicians, may allow judicious use of health care resources, and may improve patient care and safety.
Open science is a new concept for the practice of experimental laboratory-based research, such as drug discovery. The authors have recently gained experience in how to run such projects and here describe some straightforward steps others may wish to take towards more openness in their own research programmes. Existing and inexpensive online tools can solve many challenges, while some psychological barriers to the free sharing of all data and ideas are more substantial.
Malaria is a disease that still affects a significant proportion of the global human population. Whilst advances have been made in lowering the numbers of cases and deaths, it is clear that a strategy based solely on disease control year on year, without reducing transmission and ultimately eradicating the parasite, is unsustainable. This article highlights the current mainstay treatments alongside a selection of emerging new clinical molecules from the portfolio of Medicines for Malaria Venture (MMV) and our partners. In each case, the key highlights from each research phase are described to demonstrate how these new potential medicines were discovered. Given the increased focus of the community on eradicating the disease, the strategy for next generation combination medicines that will provide such potential is explained.
Background: More effective psychological treatments for psychosis are required. Case series data and pilot trials suggest metacognitive therapy (MCT) is a promising treatment for anxiety and depression. Other research has found negative metacognitive beliefs and thought-control strategies may be involved in the development and maintenance of hallucinations and delusions. The potential of MCT in treating psychosis has yet to be investigated. Aims: Our aim was to find out whether a small number of MCT sessions would be associated with clinically significant and sustained improvements in delusions, hallucinations, anxiety, depression and subjective recovery in patients with treatment-resistant long-standing psychosis. Method: Three consecutively referred patients, each with a diagnosis of paranoid schizophrenia and continuing symptoms, completed a series of multiple baseline assessments. Each then received between 11 and 13 sessions of MCT and completed regular assessments of progress, during therapy, post-therapy and at 3-month follow-up. Results: Two out of 3 participants achieved clinically significant reductions across a range of symptom-based outcomes at end-of-therapy. Improvement was sustained at 3-month follow-up for one participant. Conclusions: Our study demonstrates the feasibility of using MCT with people with medication-resistant psychosis. MCT was acceptable to the participants and associated with meaningful change. Some modifications may be required for this population, after which a controlled trial may be warranted.
Background: A randomized controlled trial has shown that supervised, facility-based exercise training is effective in improving glycemic control in type 2 diabetes. However, these programs are associated with additional costs. This analysis assessed the cost-effectiveness of such programs.
Methods: Analysis used data from the Diabetes Aerobic and Resistance Exercise (DARE) clinical trial which compared three different exercise programs (resistance, aerobic or a combination of both) of 6 months duration with a control group (no exercise program). Clinical outcomes at 6 months were entered for individual patients into the UKPDS economic model for type 2 diabetes adapted for the Canadian context. From this, expected life-years, quality-adjusted life-years (QALYs) and costs were estimated for all patients within the trial.
Results: The combined exercise program was the most expensive ($40,050) followed by the aerobic program ($39,250), the resistance program ($38,300) and no program ($31,075). QALYs were highest for combined (8.94), followed by aerobic (8.77), resistance (8.73) and no program (8.70). The incremental cost per QALY gained for the combined exercise program was $4,792 compared with aerobic alone, $8,570 compared with resistance alone, and $37,872 compared with no program. The combined exercise program remained cost-effective for all scenarios considered within sensitivity analysis.
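The incremental cost per QALY figures above follow the standard ICER formula: the extra cost of one program over a comparator divided by the extra QALYs it yields. A minimal sketch; note that recomputing from the rounded costs and QALYs quoted here gives values slightly different from the published ICERs, which were presumably derived from unrounded model outputs:

```python
def icer(cost, qaly, ref_cost, ref_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost - ref_cost) / (qaly - ref_qaly)

# Combined exercise program vs. no program, using the rounded trial figures;
# the abstract reports $37,872 from unrounded inputs.
combined_vs_none = icer(40050, 8.94, 31075, 8.70)  # ~$37,396 per QALY
```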
Conclusions: A program providing training in both resistance and aerobic exercise was the most cost-effective of the alternatives compared. Based on previous funding decisions, exercise training for individuals with diabetes can be considered an efficient use of resources.