Lipid-based nutrient supplements (LNS) may be beneficial for malnourished HIV-infected patients starting antiretroviral therapy (ART). We assessed the effect of adding vitamins and minerals to LNS on body composition and handgrip strength during ART initiation. ART-eligible HIV-infected patients with BMI <18·5 kg/m2 were randomised to LNS or LNS with added high-dose vitamins and minerals (LNS-VM) from referral for ART to 6 weeks post-ART and followed up until 12 weeks. Body composition by bioelectrical impedance analysis (BIA), deuterium oxide (D2O) dilution and air displacement plethysmography (ADP), and handgrip strength, were determined at baseline and at 6 and 12 weeks post-ART, and the effects of LNS-VM v. LNS at 6 and 12 weeks were investigated. BIA data were available for 1461 patients, D2O data for 479, ADP data for 498 and handgrip strength data for 1752. Fat mass tended to be lower, and fat-free mass correspondingly higher, by BIA than by ADP or D2O. At 6 weeks post-ART, LNS-VM led to a higher regain of BIA-assessed fat mass (0·4 (95 % CI 0·05, 0·8) kg), but not fat-free mass, and a borderline significant increase in handgrip strength (0·72 (95 % CI −0·03, 1·5) kg). These effects were not sustained at 12 weeks. Similar effects were seen using ADP or D2O, but no differences reached statistical significance. In conclusion, LNS-VM led to a higher regain of fat mass at 6 weeks and a borderline significant beneficial effect on handgrip strength. Further research is needed to determine the appropriate timing and supplement composition to optimise nutritional interventions in malnourished HIV patients.
Introduction: Patients with major bleeding (e.g. gastrointestinal bleeding and intracranial hemorrhage [ICH]) are commonly encountered in the Emergency Department (ED). A growing number of patients are on either oral or parenteral anticoagulation (AC), but the impact of AC on outcomes of patients with major bleeding is unknown. With regard to oral anticoagulation (OAC), we particularly sought to analyze differences between patients on Warfarin or Direct Oral Anticoagulants (DOACs). Methods: We analyzed a prospectively collected registry (2011-2016) of patients who presented to the ED with major bleeding at two academic hospitals. “Major bleeding” was defined by the International Society on Thrombosis and Haemostasis criteria. The primary outcome, in-hospital mortality, was analyzed using a multivariable logistic regression model. Secondary outcomes included discharge to long-term care among survivors, total hospital length of stay (LOS) among survivors, and total hospital costs. Results: 1,477 patients with major bleeding were included. AC use was found in 215 patients (14.6%). Among OAC patients (n = 181), 141 (77.9%) had used Warfarin, and 40 (22.1%) had used a DOAC. 484 patients (32.8%) died in-hospital. AC use was associated with higher in-hospital mortality (adjusted odds ratio [OR]: 1.50 [1.17-1.93]). Among survivors to discharge, AC use was associated with higher discharge to long-term care (adjusted OR: 1.73 [1.18-2.57]), prolonged median LOS (19 days vs. 16 days, P = 0.03), and higher mean costs ($69,273 vs. $58,156, P = 0.02). With regard to OAC, a higher proportion of ICH was seen among patients on Warfarin (39.0% vs. 32.5%), as compared to DOACs. No difference in mortality was seen between DOACs and Warfarin (adjusted OR: 0.84 [0.40-1.72]). Patients with major bleeding on Warfarin had longer median LOS (11 days vs. 6 days, P = 0.03) and higher total costs ($51,524 vs. $35,176, P < 0.01) than patients on DOACs.
Conclusion: AC use was associated with higher mortality among ED patients with major bleeding. Among survivors, AC use was associated with increased LOS, costs, and discharge to long-term care. Among OAC patients, no difference in mortality was found. Warfarin was associated with prolonged LOS and costs, likely secondary to higher incidence of ICH, as compared to DOACs.
Introduction: Individualizing risk for stroke following a transient ischemic attack (TIA) is a topic of intense research, as existing scores are context-dependent or have not been well validated. The Canadian TIA Score stratifies risk of subsequent stroke into low, moderate and high risk. Our objective was to prospectively validate the Canadian TIA Score in a new cohort of emergency department (ED) patients. Methods: We conducted a prospective cohort study in 14 Canadian EDs over 4 years. We enrolled consecutive adult patients with an ED visit for TIA or nondisabling stroke. Treating physicians recorded standardized clinical variables onto data collection forms. Given the ability of prompt emergency carotid endarterectomy (CEA) to prevent stroke (NNT = 3) in high risk patients, our primary outcome was the composite of subsequent stroke or CEA ≤7 days. We conducted telephone follow-up using the validated Questionnaire for Verifying Stroke Free Status at 7 and 90 days. Outcomes were adjudicated by panels of 3 local stroke experts, blinded to the index ED data collection form. Based on prior work, we estimated that a sample size of 5,004 patients, including 93 subsequent strokes, would yield 95% confidence bands of +/− 10% for sensitivity and likelihood ratio (LR). Our analyses assessed interval LRs (iLR) with 95% CIs. Results: We prospectively enrolled 7,569 patients with mean age 68.4 +/− 14.7 years and 52.4% female, of whom 107 (1.4%) had a subsequent stroke and 74 (1.0%) CEA ≤7 days (total outcomes = 181). We enrolled 81.2% of eligible patients; missed patients were similar to those enrolled. The Canadian TIA Score stratified the stroke/CEA ≤7 days risk as: Low (probability <0.2%, iLR 0.20 [95%CI 0.091-0.44]); Moderate (probability 1.3%, iLR 0.79 [0.68-0.92]); High (probability 2.6%, iLR 2.2 [1.9-2.6]). Sensitivity analysis for just stroke ≤7 days yielded similar results: Low iLR 0.17 [95%CI 0.056-0.52], Medium iLR 0.89 [0.75-1.1], High iLR 2.0 [1.6-2.4].
Conclusion: The Canadian TIA Score accurately identifies TIA patients’ risk for stroke/CEA ≤7 days. Patients classified as low risk can be safely discharged following a careful ED assessment with elective follow-up. Patients at moderate risk can undergo additional testing in the ED, have antithrombotic therapy optimized, and be offered early stroke specialist follow-up. Patients at high risk should in most cases be fully investigated and managed ideally in consultation with a stroke specialist during their index ED visit.
Introduction: The Ottawa SAH Rule was developed to identify patients at high risk for subarachnoid hemorrhage (SAH) who require investigations, and the 6-Hour CT Rule found that computed tomography (CT) was 100% sensitive for SAH within 6 hours of headache onset. Together, they form the Ottawa SAH Strategy. Our objectives were to assess: 1) Safety of the Ottawa SAH Strategy and 2) its impact on: a) CTs, b) LPs, c) ED length of stay, and d) CT angiography (CTA). Methods: We conducted a multicentre prospective before/after study at 6 tertiary-care EDs from January 2010 to December 2016 (implementation July 2013). Consecutive alert, neurologically intact adults with a headache peaking within one hour were included. SAH was defined by subarachnoid blood on head CT (radiologist’s final report); xanthochromia in the cerebrospinal fluid (CSF); or >1×10⁶/L red blood cells in the final tube of CSF with an aneurysm on CTA. Results: We enrolled 3,669 patients, 1,743 before and 1,926 after implementation, including 185 with SAH. The investigation rate before implementation was 89.0% (range 82.9 to 95.6%) versus 88.4% (range 85.2 to 92.3%) after implementation. The proportion who had CT remained stable (88.0% versus 87.4%; p=0.60), while the proportion who had LP decreased from 38.9% to 25.9% (p<0.001), and the proportion investigated with CTA increased from 18.8% to 21.6% (p=0.036). The additional testing rate (i.e. LP or CTA) diminished from 50.1% to 40.8% (p<0.001). The proportion admitted declined from 9.8% to 7.3% (p=0.008), while the mean length of ED stay was stable (6.2 +/− 4.0 to 6.4 +/− 4.1 hours; p=0.45). For the 1,201 patients with CT within 6 hours of headache onset, there was an absolute decrease in additional testing (i.e. LP or CTA) of 15.0% (46.6% versus 31.6%; p<0.001). The sensitivity of the Ottawa SAH Rule was 100% (95%CI: 98-100%), and that of the 6-Hour CT Rule was 95.3% (95%CI: 88.9-98.3%) for SAH.
Five patients with early CT had SAH with CT reported as normal: 2 had unruptured aneurysms on CTA and a presumed traumatic LP (determined by the treating neurosurgeon); 1 was missed by the radiologist on the initial interpretation; 1 had a dural vein fistula (i.e. non-aneurysmal); and 1 was profoundly anemic (Hgb 63 g/L). Conclusion: The Ottawa SAH Strategy is highly sensitive and can be used routinely when SAH is being considered in alert and neurologically intact headache patients. Its implementation was associated with a decrease in LPs and admissions to hospital.
Introduction: Two published studies reported natriuretic peptides can aid in risk-stratification of Emergency Department (ED) syncope. We sought to assess the role of N-Terminal pro Brain Natriuretic Peptide (NT pro-BNP) in identifying syncope patients at risk for serious adverse events (SAE) within 30 days of the ED visit, and its value above that of the Canadian Syncope Risk Score (CSRS). Methods: We conducted a multicenter prospective cohort study at 6 large Canadian EDs from Nov 2011 to Feb 2015. We enrolled adults who presented within 24 hours of syncope and excluded those with persistent altered mentation, obvious seizure, and intoxication. We collected patient characteristics, nine CSRS predictors (including troponin), ED management and NT pro-BNP levels. Adjudicated serious adverse events included death, cardiac SAE (arrhythmias, myocardial infarction, serious structural heart disease) and non-cardiac SAE (pulmonary embolism, severe hemorrhage and procedural interventions within 30 days). We used two-tailed t-tests and logistic regression analysis. Results: Of the 1359 patients enrolled (mean age 57.2 years, 54.7% females, 13.3% hospitalized), 148 (10.9%; 0.7% deaths, 7.9% cardiac SAE including 6.1% arrhythmia) suffered SAE within 30 days. Compared with patients with no SAE (mean NT pro-BNP 499.8 ng/L), mean values were significantly higher among the 56 patients who suffered SAE after ED disposition (3147 ng/L, p=0.001) and among the 35 patients with cardiac SAE after ED disposition (2016.2 ng/L, p=0.02). While there was a trend to higher levels among patients who suffered arrhythmia after the ED visit, it was not statistically significant (1776.4 ng/L, p=0.07). In a model with CSRS predictors, the adjusted odds ratio for NT pro-BNP was 8.0 (95%CI 1.8, 35.9) and for troponin was 3.8 (95%CI 1.7, 8.8).
The addition of NT pro-BNP did not significantly improve classification performance (p=0.76): the area under the curve was 0.91 (95%CI 0.88, 0.95) for CSRS and 0.92 (95%CI 0.88, 0.95) for CSRS with NT pro-BNP. Conclusion: In this multicenter study, mean NT pro-BNP levels were significantly higher among ED syncope patients who suffered SAE, including cardiac SAE, after ED disposition. Though NT pro-BNP was a significant independent predictor of SAE after ED disposition, it did not improve accuracy in ED syncope risk-stratification when compared to CSRS. Hence, we do not recommend NT pro-BNP measurement for ED syncope management.
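The AUC comparison above rests on a rank interpretation: the area under the ROC curve equals the probability that a randomly chosen patient who suffered an SAE has a higher score than a randomly chosen patient who did not. A minimal sketch of that statistic, using invented scores rather than study data:

```python
def auc(case_scores, control_scores):
    """Probability that a random case outranks a random control
    (ties count half) -- numerically equal to the area under the ROC curve."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Invented illustration: risk-score values for patients with/without an SAE
cases = [5, 6, 7, 4]
controls = [1, 2, 3, 4]
print(round(auc(cases, controls), 3))  # 0.969
```

Comparing two AUCs (as the abstract does for CSRS with and without NT pro-BNP) then amounts to testing whether adding a predictor changes this discrimination probability.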
VLBI observations of the nucleus of Centaurus A were made in April 1982 at two frequencies with an array of five Australian radio antennas as part of the Southern Hemisphere VLBI Experiment (SHEVE). Observations were undertaken at 2.29 GHz with all five antennas, while only two were operational at 8.42 GHz. The 2.29 GHz data yielded significant information on the structure of the nuclear jet. At 8.42 GHz a compact unresolved core was detected as well.
Introduction: Concern for occult serious conditions leads to variations in ED syncope management [hospitalization, duration of ED/inpatient monitoring including Syncope Observation Units (SOU) for prolonged monitoring]. We sought to develop evidence-based recommendations for duration of ED/post-ED ECG monitoring using the Canadian Syncope Risk Score (CSRS) by assessing the time to serious adverse event (SAE) occurrence. Methods: We enrolled adults with syncope at 6 EDs and collected demographics, time of syncope and ED arrival, CSRS predictors and time of SAE. We stratified patients as per the CSRS (low, medium and high risk as ≤0, 1-3 and ≥4 respectively). 30-day adjudicated SAEs included death, myocardial infarction, arrhythmia, structural heart disease, pulmonary embolism or serious hemorrhage. We categorized arrhythmias, interventions for arrhythmias and death from unknown cause as arrhythmic SAE and the rest as non-arrhythmic SAE. We performed Kaplan-Meier analysis using time of ED registration for primary and time of syncope for secondary analyses. Results: 5,372 patients (mean age 54.3 years, 54% females, and 13.7% hospitalized) were enrolled with 538 (10%) patients suffering SAE (0.3% died due to an unknown cause and 0.5% suffered ventricular arrhythmia). 64.8% of SAEs occurred within 6 hours of ED arrival. The probability for any SAE or arrhythmia was highest within 2-hours of ED arrival for low-risk patients (0.65% and 0.31%; dropped to 0.54% and 0.06% after 2-hours) and within 6-hours for the medium and high-risk patients (any SAE 6.9% and 17.4%; arrhythmia 6.5% and 18.9% respectively) which also dropped after 6-hours (any SAE 0.99% and 2.92%; arrhythmia 0.78% and 3.07% respectively). For any CSRS threshold, the risk of arrhythmia was highest within the first 15-days (for CSRS ≥2 patients 15.6% vs. 0.006%). 
ED monitoring for 2 hours (low-risk) and 6 hours (medium and high-risk), with a CSRS ≥2 cut-off for outpatient 15-day ECG monitoring, will lead to a 52% increase in arrhythmia detection. The majority (82.2%) arrived at the ED within 2 hours of syncope (median time 1.1 hours), and secondary analysis yielded similar results. Conclusion: Our study found that 2 and 6 hours of ED monitoring for low-risk and medium/high-risk CSRS patients respectively, with 15-day outpatient ECG monitoring for CSRS ≥2 patients, will improve arrhythmia detection without the need for hospitalization or observation units.
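The time-to-event probabilities above come from Kaplan-Meier analysis. The product-limit estimator behind that analysis can be sketched in a few lines; the follow-up times below are toy values, not the study cohort:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times (e.g. hours from ED registration);
    events: 1 = SAE observed at that time, 0 = censored."""
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)  # patients still at risk at t
        if d:
            s *= 1 - d / n
            curve.append((t, s))
    return curve

# Toy follow-up: hours to SAE (event = 1) or censoring (event = 0)
times  = [2, 2, 4, 6, 6, 8]
events = [1, 0, 1, 1, 0, 0]
print(kaplan_meier(times, events))
```

The event probability within a window (e.g. "any SAE within 2 hours") is then one minus the survival estimate at the end of that window.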
Introduction: Patients with poorly controlled diabetes mellitus may present repeatedly to the emergency department (ED) for management and treatment of hyperglycemic episodes, including diabetic ketoacidosis and hyperosmolar hyperglycemic state. The objective of this study was to identify risk factors that predict unplanned recurrent ED visits for hyperglycemia in patients with diabetes within 30 days of initial presentation. Methods: We conducted a one-year health records review of patients ≥18 years presenting to one of four tertiary care EDs with a discharge diagnosis of hyperglycemia, diabetic ketoacidosis or hyperosmolar hyperglycemic state. Trained research personnel collected data on patient characteristics and determined if patients had an unplanned recurrent ED visit for hyperglycemia within 30 days of their initial presentation. Multivariable logistic regression models, using generalized estimating equations to account for patients with multiple visits, identified predictor variables independently associated with recurrent ED visits for hyperglycemia within 30 days. Results: There were 833 ED visits for hyperglycemia in the one-year period. 54.6% were male and mean (SD) age was 48.8 (19.5) years. Of these, 156 (18.7%) had a recurrent ED visit for hyperglycemia within 30 days. Factors independently associated with recurrent hyperglycemia visits included a previous hyperglycemia visit in the past month (odds ratio [OR] 3.5, 95% confidence interval [CI] 2.1-5.8), age <25 years (OR 2.6, 95% CI 1.5-4.7), glucose >20 mmol/L (OR 2.2, 95% CI 1.3-3.7), having a family physician (OR 2.2, 95% CI 1.0-4.6), and being on insulin (OR 1.9, 95% CI 1.1-3.1). Having a systolic blood pressure between 90-150 mmHg (OR 0.53, 95% CI 0.30-0.93) and heart rate >110 bpm (OR 0.41, 95% CI 0.23-0.72) were protective factors independently associated with not having a recurrent hyperglycemia visit.
Conclusion: This unique ED-based study reports five risk factors and two protective factors associated with recurrent ED visits for hyperglycemia within 30 days in patients with diabetes. These risk factors should be considered by clinicians when making management, prognostic, and disposition decisions for diabetic patients who present with hyperglycemia.
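The odds ratios above are adjusted estimates from regression models; for intuition, a crude (unadjusted) odds ratio with a Woolf log-based 95% CI can be computed directly from a 2×2 table. The counts below are invented for illustration only:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf (log-based) 95% CI from a 2x2 table:
    a/b = exposed with/without the outcome, c/d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: e.g. "prior hyperglycemia visit" vs "30-day return visit"
or_, lo, hi = odds_ratio_ci(20, 80, 100, 800)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.0 1.17 3.41
```

An interval excluding 1 (as here) corresponds to a statistically significant association at the 5% level; the study's adjusted ORs additionally control for the other predictors.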
There is a need for simple proxies of health status, in order to improve monitoring of chronic disease risk within and between populations, and to assess the efficacy of public health interventions as well as clinical management. This review discusses how, building on recent research findings, body composition outcomes may contribute to this effort. Traditionally, body mass index has been widely used as the primary index of nutritional status in children and adults, but it has several limitations. We propose that combining information on two generic traits, indexing both the ‘metabolic load’ that increases chronic non-communicable disease risk, and the homeostatic ‘metabolic capacity’ that protects against these diseases, offers a new opportunity to improve assessment of disease risk. Importantly, this approach may improve the ability to take into account ethnic variability in chronic disease risk. This approach could be applied using simple measurements readily carried out in the home or community, making it ideal for M-health and E-health monitoring strategies.
Introduction: Suspicion of arrhythmias among syncope patients is the leading cause of emergency department (ED) referrals and hospitalization. However, the risk factors for short-term arrhythmias are not well defined. We sought to develop a risk prediction tool to identify syncope patients at risk for 30-day arrhythmia or death after ED disposition. Methods: This prospective cohort study involved 6 academic EDs that enrolled adult syncope patients. We collected standardized variables at index presentation from history, clinical examination, investigations including ECG, and patients’ disposition. Adjudicated outcomes included death (due to arrhythmia or unknown cause), arrhythmia or procedural intervention to treat arrhythmias within 30 days after ED disposition. Multivariable logistic regression was used to derive the model; bootstrap sampling was used for internal validation and to estimate shrinkage and optimism. Results: 5,010 adult syncope patients (mean age 53.4 years, 54.8% females, and 9.5% hospitalized) were enrolled, with 106 (3.6%) patients suffering arrhythmia or death within 30 days after ED disposition. Of 39 candidate predictors examined, eight were included in the final model: vasovagal predisposition, heart disease, any ED systolic blood pressure <90 or >180 mmHg, troponin (>99%ile), QRS duration >130 msec, QTc interval >480 msec, ED diagnosis of cardiac syncope, and ED diagnosis of vasovagal syncope [optimism-corrected c-statistic: 0.91 (95%CI 0.87-0.93); Hosmer-Lemeshow p=0.08]. The Canadian Syncope Arrhythmia Risk Score yielded estimated risks ranging from 0.2% for a score of -2 to 74.5% for a score of 8. Sensitivity for a threshold score of ≤-1 was 100% (95% CI 96.5-100) and specificity for a score of ≥4 was 97.0% (95% CI 96.5-97.5). Conclusion: The Canadian Syncope Arrhythmia Risk Score can improve acute management of ED patients with syncope by better identification of those at higher-risk for short-term arrhythmia or death.
Once validated, the tool can be used to guide disposition decisions and can also aid in the selection of patients for out-of-hospital cardiac monitoring if discharged home.
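Threshold operating points like the 100% sensitivity at score ≤-1 and 97% specificity at score ≥4 reported above are read off the score distribution in cases and non-cases. A hypothetical sketch of that calculation, with invented scores rather than the derivation cohort:

```python
def sens_spec(scores, outcomes, low_cut, high_cut):
    """Sensitivity of the rule-out threshold (score <= low_cut called
    negative) and specificity of the rule-in threshold (score >= high_cut
    called positive)."""
    pos = [s for s, o in zip(scores, outcomes) if o]
    neg = [s for s, o in zip(scores, outcomes) if not o]
    sensitivity = sum(s > low_cut for s in pos) / len(pos)
    specificity = sum(s < high_cut for s in neg) / len(neg)
    return sensitivity, specificity

# Invented scores and outcomes (1 = arrhythmia or death within 30 days)
scores   = [-2, -1, 0, 1, 3, 4, 5, 6]
outcomes = [ 0,  0, 0, 0, 1, 1, 1, 1]
print(sens_spec(scores, outcomes, low_cut=-1, high_cut=4))  # (1.0, 1.0)
```

High sensitivity at the low cut-off supports safe discharge of low scorers; high specificity at the high cut-off supports targeted monitoring of high scorers.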
Introduction: Short-term risk of arrhythmia or death among emergency department (ED) syncope patients with atrial fibrillation/flutter (AFF) has not been reported in the literature. Our objectives were to assess the incidence and the independent risk of 30-day arrhythmia or death for syncope patients with AFF after ED disposition. Methods: We conducted a prospective study at 6 Canadian academic EDs to include adults with syncope. We collected demographic, clinical and ECG characteristics, while our outcome assessments were completed by medical records review and by telephone follow-up of patients after 30 days. The primary outcome was arrhythmia or death within 30 days after ED disposition; secondary outcomes included non-arrhythmic cardiac and non-cardiac outcomes. We performed descriptive and logistic regression analyses. Results: We enrolled 4,266 patients: mean age 53.4 years, 55.4% females, and 8.5% with AFF. After excluding those with outcomes in the ED, those lost to follow-up and those with other non-sinus rhythms, 3,417 patients in the sinus and 280 patients in the AFF groups were analyzed. The incidence of arrhythmia or death was significantly higher in the AFF group (Relative Risk 5.1; 95% CI 3.1-8.4; p<0.0001), but there were no significant differences in secondary outcomes between the groups. The unadjusted odds ratio for 30-day arrhythmia or death among ED syncope patients with AFF was 5.4 (95% CI 3.2-9.2). After adjusting for important baseline risk factors by multivariable analysis, the odds ratio for arrhythmia or death in patients with AFF was 1.5 (95% CI 0.8-2.7). Conclusion: Syncope patients with AFF are at higher risk of 30-day arrhythmia or death after ED disposition, but the association is attenuated when adjusted for important patient characteristics. Future research should assess long-term outcomes among syncope patients with AFF to guide follow-up after ED discharge.
Four working groups and three task groups of IAU Commission 5 deal specifically with information handling; technical aspects of the collection, archiving, storage and dissemination of data; designations and classification of astronomical objects; library services, editorial policies, computer communications and ad hoc methodologies; and various standards, reference frames, etc. Information about Commission 5 working and task groups and their activities may be found at http://nut.inasan.rssi.ru/IAU/.
Dual-energy X-ray absorptiometry (DXA) and isotope dilution technique have been used as reference methods to validate the estimates of body composition by simple field techniques; however, very few studies have compared these two methods. We compared the estimates of body composition by DXA and isotope dilution (18O) technique in apparently healthy Indian men and women (aged 19–70 years, n 152, 48 % men) with a wide range of BMI (14–40 kg/m2). Isotopic enrichment was assessed by isotope ratio mass spectroscopy. The agreement between the estimates of body composition measured by the two techniques was assessed by the Bland–Altman method. The mean age and BMI were 37 (sd 15) years and 23·3 (sd 5·1) kg/m2, respectively, for men and 37 (sd 14) years and 24·1 (sd 5·8) kg/m2, respectively, for women. The estimates of fat-free mass were higher by about 7 (95 % CI 6, 9) %, those of fat mass were lower by about 21 (95 % CI − 18, − 23) %, and those of body fat percentage (BF%) were lower by about 7·4 (95 % CI − 8·2, − 6·6) % as obtained by DXA compared with the isotope dilution technique. The Bland–Altman analysis showed wide limits of agreement that indicated poor agreement between the methods. The bias in the estimates of BF% was higher at the lower values of BF%. Thus, the two commonly used reference methods showed substantial differences in the estimates of body composition with wide limits of agreement. As the estimates of body composition are method-dependent, the two methods cannot be used interchangeably.
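The limits-of-agreement analysis above follows the Bland-Altman method: the bias is the mean of the paired differences and the 95% limits are bias ± 1.96 SD of those differences. A minimal sketch with illustrative paired body-fat percentage values (not the study data):

```python
import statistics as st

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 SD of the
    paired differences), after Bland & Altman."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = st.mean(diffs)
    sd = st.stdev(diffs)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired BF% estimates (invented values)
dxa      = [20.1, 25.4, 30.2, 18.7, 28.0]   # hypothetical DXA BF%
dilution = [27.0, 32.5, 38.1, 25.9, 35.2]   # hypothetical isotope-dilution BF%
bias, lo, hi = bland_altman(dxa, dilution)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

Wide limits of agreement, as reported in the study, mean that an individual's estimate can differ substantially between methods even when the mean bias is modest.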
Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-à-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies.
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rates ranged from 0·2% to 11·7% among ward and intensive care unit admissions, respectively (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible IFI cases had shorter time to diagnosis (P = 0·02) and initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.