The Twin Family Panel (TwinLife) is a German longitudinal study of monozygotic and dizygotic same-sex twin pairs and their families, designed to investigate the development of social inequalities over the life course. The study covers an observation period from approximately 2014 to 2023. The target population comprises reared-together twins from four age cohorts, born in 2009/2010 (cohort 1), 2003/2004 (cohort 2), 1997/1998 (cohort 3) and between 1990 and 1993 (cohort 4). In the first wave, the study included data on 4097 twin families. Families were recruited in all parts of Germany, so the sample spans the full range of the educational, occupational and income structure. As of 2019, two face-to-face at-home interviews and two telephone interviews had been conducted. Data from the first home and telephone interviews are already available free of charge as a scientific use file from the GESIS data archive. This report provides an overview of the study sample and design, as well as constructs that are unique to TwinLife compared with previous twin studies, such as an assessment of cognitive abilities and information based on the children's medical records and report cards. In addition, major findings based on the data already released are summarized, and future directions of the study are presented and discussed.
In 2016, we reviewed preventive control measures for secondary transmission of Shiga toxin-producing Escherichia coli (STEC) in humans in European Union (EU)/European Economic Area (EEA) countries to inform the revision of the respective Norwegian guidelines, which at that time did not account for the varying pathogenic potential of STEC. We interviewed public health experts from EU/EEA institutes using a semi-structured questionnaire. We revised the Norwegian guidelines using a risk-based approach informed by new scientific evidence on risk factors for haemolytic uraemic syndrome (HUS) and the survey results. All 13 (42%) participating countries tested STEC for Shiga toxin (stx) 1, stx2 and eae (encoding intimin). Five countries differentiated their control measures based on clinical and/or microbiological case characteristics, but only Denmark based its measures on routinely conducted stx subtyping. In all countries but Norway, clearance was obtained with ≤3 negative STEC specimens. After this review, Norway revised its STEC guidelines and now recommends follow-up only of cases infected with highly virulent STEC (determined by microbiological and clinical information); clearance is obtained with three negative specimens. Implementation of the revised Norwegian guidelines will decrease the number of STEC cases needing follow-up and clearance, and will reduce the burden of unnecessary public health measures and the socioeconomic impact on cases. This review of guidelines could assist other countries in adapting their STEC control measures.
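The revised follow-up logic can be sketched as a small, purely hypothetical helper. The function names, the boolean virulence flag, and the "three most recent specimens negative" reading of the clearance rule are illustrative assumptions, not the guideline text:

```python
# Hypothetical sketch of the revised Norwegian follow-up logic: only
# cases with highly virulent STEC (judged from microbiological and
# clinical information) are followed up, and clearance requires three
# negative specimens. All names are illustrative, not from the guideline.

def needs_follow_up(highly_virulent: bool) -> bool:
    """Follow up a case only when the infecting STEC is highly virulent."""
    return highly_virulent

def is_cleared(specimen_results) -> bool:
    """specimen_results: chronological booleans, True = STEC-positive.

    A case is cleared once the three most recent specimens are all
    negative (one possible reading of 'three negative specimens').
    """
    return len(specimen_results) >= 3 and not any(specimen_results[-3:])

cleared = is_cleared([True, False, False, False])
```

A real implementation would of course encode the microbiological and clinical criteria that define "highly virulent" rather than take a pre-computed flag.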
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer's disease (AD). Given this consistent association, there is interest in whether E4 influences the risk of other neurodegenerative diseases, and there is an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson's disease, and (5) vascular cognitive impairment.
Genotypes were defined for their respective APOE allele and MAPT haplotype calls for each participant, and logistic regression analyses were performed to identify the associations with the presentations of neurodegenerative diseases.
Our work confirmed the association of the E4 allele with a dose-dependent increase in the presentation of AD, and an association between the E4 allele alone and MCI; however, the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts; however, following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
Angiostrongylus cantonensis is a pathogenic nematode and the cause of neuroangiostrongyliasis, an eosinophilic meningitis more commonly known as rat lungworm disease. Transmission is thought to be primarily due to ingestion of infective third-stage larvae (L3) in gastropods, on produce, or in contaminated water. The gold standard for determining the effects of physical and chemical treatments on the infectivity of A. cantonensis L3 larvae is to infect rodents with treated larvae and monitor for infection, but animal studies are laborious, expensive and raise ethical concerns. This study demonstrates propidium iodide (PI) to be a reliable marker of parasite death and loss of infective potential without adversely affecting the development and future reproduction of live A. cantonensis larvae. PI staining allows evaluation of the efficacy of test substances in vitro, an improvement upon using lack of motility as an indicator of death. Potential applications of this assay include determining the effectiveness of anthelmintics, vegetable washes, electromagnetic radiation and other treatments intended to kill larvae in the prevention and treatment of neuroangiostrongyliasis.
The introduction of agriculture is known to have profoundly affected the ecological complexion of landscapes. In this study, a rapid transition from C3 to C4 vegetation is inferred from a shift to higher stable carbon (13C/12C) isotope ratios of soils and sediments in the Benoué River Valley and upland Fali Mountains in northern Cameroon. Landscape change is viewed from the perspective of two settlement mounds and adjacent floodplains, as well as a rock terrace agricultural field dating from 1100 cal yr BP to the recent past (<400 cal yr BP). Nitrogen (15N/14N) isotope ratios and soil micromorphology demonstrate variable uses of land adjacent to the mound sites. These results indicate that Early Iron Age settlement practices involved exploitation of C3 plants on soils with low δ15N values, consistent with wetter conditions. Conversely, from the Late Iron Age (>700 cal yr BP) until recent times, high soil and sediment δ13C and δ15N values reflect more C4 biomass and anthropogenic organic matter in open, dry environments. The results suggest that Iron Age settlement practices profoundly changed landscapes in this part of West Africa through land clearance and/or utilization of C4 plants.
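The C3-to-C4 inference from δ13C values is commonly made with a two-endmember mixing model; a minimal sketch follows, assuming typical literature endmember values of about -27 per mil for pure C3 and -12.5 per mil for pure C4 biomass (these are generic assumptions, not values reported in this study):

```python
# Two-endmember mixing model sketch: the fraction of C4-derived carbon
# in soil organic matter, estimated from its delta-13C value. The
# endmember values are typical literature assumptions, not study data.

def c4_fraction(delta13c, c3_end=-27.0, c4_end=-12.5):
    """Return the estimated C4 fraction for a soil delta-13C (per mil)."""
    frac = (delta13c - c3_end) / (c4_end - c3_end)
    return min(1.0, max(0.0, frac))  # clamp to the physical range [0, 1]

# A shift toward -15 per mil implies a largely C4-dominated landscape.
fraction = c4_fraction(-15.0)
```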
Quality Improvement and Patient Safety (QIPS) plays an important role in addressing shortcomings in optimal healthcare delivery. However, there is little published guidance available for emergency department (ED) teams with respect to developing their own QIPS programs. We sought to create recommendations for established and aspiring ED leaders to use as a pathway to better patient care through programmatic QIPS activities, starting internally and working towards interdepartmental collaboration.
An expert panel comprising ten ED clinicians with QIPS and leadership expertise was established. A scoping review was conducted to identify published literature on establishing QIPS programs and frameworks in healthcare. Stakeholder consultations were conducted among Canadian healthcare leaders, and recommendations were drafted by the expert panel based on the accumulated information. These were reviewed and refined at the 2018 CAEP Academic Symposium in Calgary using in-person and technologically supported feedback.
Recommendations include: creating a sense of urgency for improvement; engaging relevant stakeholders and leaders; creating a formal local QIPS committee; securing funding and resources; obtaining local data to guide the work; supporting QIPS training for team members; encouraging interprofessional, cross-departmental, and patient collaborations; using an established QIPS framework to guide the work; developing reward mechanisms and incentive structures; and considering starting small by focusing on a project rather than a program.
A list of 10 recommendations is presented as guiding principles for the establishment and sustainable deployment of QIPS activities in EDs throughout Canada and abroad. ED leaders are encouraged to implement our recommendations in an effort to improve patient care.
From 2007 to 2010, the largest reported Q-fever epidemic occurred in the Netherlands, with 4026 notified laboratory-confirmed cases. During the course of the epidemic, health-seeking behaviour changed and awareness among health professionals increased, and changes in laboratory workflows were implemented. The aim of this study was to analyse how these changes instigated adjustments of the notification criteria and how these adjustments affected the monitoring and interpretation of the epidemic. We reviewed articles on laboratory procedures related to the epidemic and described the changes made to the notification criteria. We compared the output of a regional laboratory with notifications to the regional Public Health Service and the national register of infectious diseases, and we compared the international notification criteria for acute Q-fever. Screening with ELISA IgM phase II and PCR was added to the diagnostic workflow. In the course of the epidemic, serology often revealed a positive IgG/IgM result although cases had not been infected recently. With increasing background seroprevalence, the presence of IgM antibodies can only be suggestive of acute Q-fever and has to be confirmed either by IgG seroconversion or by a positive PCR result. Differences in sero-epidemiology make it unlikely that full harmonisation of notification criteria between countries is feasible.
Introduction: Acute upper gastrointestinal bleeding (UGIB) is a common presentation to emergency departments (ED). Of these patients, 35-45% receive a blood transfusion. Guidelines for blood transfusion in UGIB are well established and recommend a hemoglobin (Hb) level below 70 g/L as the transfusion threshold in a stable patient. There is no consensus on a transfusion threshold for unstable UGIB, and there is limited data regarding physician practices in the ED. The aim of our study was to determine the appropriateness, by expert consensus, of blood transfusions in UGIB in a tertiary care hospital ED. Methods: We retrospectively reviewed patients presenting with UGIB to the University of Alberta Hospital ED in 2016. These patients were then screened for blood transfusions. Data were obtained from the patient records; chart-derived data were verified against records obtained from the blood bank. For each patient, the history, vital signs, Glasgow Blatchford Score (GBS), relevant laboratory values, and record of blood transfusions were collected and organized into a case summary. Each patient summary was presented individually to a panel of three expert clinicians (2 Gastroenterology, 1 Emergency Medicine), who then decided on the appropriateness of each blood transfusion by consensus. Results: Blood transfusions (data available for 395/400) were given to 51% (202/395) of patients presenting with UGIB. Of these, 86% (174/202) were judged to be appropriate. Of the 395 patients, 34% (135/395) had a Hb of <70 g/L; 93% (126/135) of these were transfused, and all of these transfusions were considered appropriate. 18% (70/395) had a Hb between 71 and 80 g/L; 74% (52/70) of these patients were given blood, and 79% (41/52) of the transfusions were considered appropriate. 13% (50/395) of patients had a Hb between 81 and 90 g/L, with 28% (14/50) receiving a transfusion; 36% (5/14) of these were deemed appropriate. 35% (140/395) of patients had a Hb of >90 g/L; 7% (10/140) of these received blood, and 20% (2/10) were considered appropriate.
Conclusion: The panel of expert clinicians judged 86% of the blood transfusions to be appropriate. All transfusions below the recommended threshold of 70 g/L were considered appropriate. In addition, the majority of transfusions above a Hb of 70 g/L were considered appropriate, but 37% were not. Further studies evaluating the feasibility of current guideline recommendations in an ED setting are required. Educational interventions should be created to reduce inappropriate blood transfusions above a Hb of 70 g/L.
Introduction: Upper gastrointestinal bleeding (UGIB) is a common presentation to the emergency department (ED). Early endoscopy within 24 hours has been shown to reduce re-bleeding rates and lower mortality; however, low-risk patients can often be managed through outpatient follow-up. The aim of this study was to compare the timing and appropriateness of endoscopy and proton pump inhibitor (PPI) use in a tertiary care ED setting for low- and high-risk patients determined using the Glasgow Blatchford Score (GBS). Methods: A retrospective chart review was conducted to examine the management of patients presenting with an UGIB in 2016 to the University of Alberta Hospital ED. TANDEM and Emergency Department Information System (EDIS) databases were used to identify patients using specific ICD-10 codes and the CEDIS presenting complaints of vomiting blood or blood in stool/melena. Patients with GBS 0-3 were categorized as low-risk and those with GBS >3 were considered high-risk; appropriateness of and time to endoscopy, patient disposition at 24 hours, and use of PPIs were determined for each group. Results: A total of 400 patients were included, of whom 319 (80%) underwent esophagogastroduodenoscopy (EGD). EGD was performed within 24 hours in 37% of patients (29/78) with GBS 0 to 3 and in 77% (248/322) with GBS greater than 3. Of the remaining high-risk patients, 11% (36/322) underwent EGD after 24 hours and 12% (38/322) did not undergo EGD. The endoscopic diagnoses were peptic ulcer disease (PUD) in 41% of patients (130/319), esophagitis in 18% (56/319), and varices in 14% (45/319). PPIs (data available for 375/400) were administered (mainly intravenously) to 93% (279/300) of high-risk and 79% (59/75) of low-risk patients. Data on patient disposition showed that 60/322 (19%) high-risk patients were discharged from the ED within 24 hours, and only 31/60 (52%) of these underwent EGD before discharge.
Of the 29 low-risk patients undergoing EGD within 24 hours, 9 (31%) were admitted, 17 (59%) were discharged from the ED, and 3 (10%) were kept for observation in the ED for more than 24 hours. Of low-risk patients, 76% (59/78) were discharged from the ED within 24 hours. Conclusion: The majority of patients presenting with UGIB appropriately received endoscopy within 24 hours, although 19% of high-risk patients were discharged from the ED. Earlier discharge for low-risk patients can be improved, as only 76% were discharged from the ED within 24 hours. As expected, PPI use was high in these patients.
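The GBS-based risk stratification described above (GBS 0-3 low-risk, GBS >3 high-risk) can be sketched in a few lines; the function name and labels are illustrative, not from the study:

```python
# Sketch of the Glasgow Blatchford Score (GBS) stratification described
# above: GBS 0-3 -> low-risk (candidate for outpatient follow-up),
# GBS > 3 -> high-risk (early endoscopy within 24 hours recommended).
# The function name and labels are illustrative.

def stratify_ugib_risk(gbs: int) -> str:
    """Classify an upper-GI-bleed patient by Glasgow Blatchford Score."""
    if gbs < 0:
        raise ValueError("GBS cannot be negative")
    return "low-risk" if gbs <= 3 else "high-risk"

labels = [stratify_ugib_risk(g) for g in (0, 2, 3, 4, 7, 12)]
```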
Background: Buprenorphine/naloxone (bup/nal) is a partial opioid agonist/antagonist and recommended first line treatment for opioid use disorder (OUD). Emergency departments (EDs) are a key point of contact with the healthcare system for patients living with OUD. Aim Statement: We implemented a multi-disciplinary quality improvement project to screen patients for OUD, initiate bup/nal for eligible individuals, and provide rapid next business day walk-in referrals to addiction clinics in the community. Measures & Design: From May to September 2018, our team worked with three ED sites and three addiction clinics to pilot the program. Implementation involved alignment with regulatory requirements, physician education, coordination with pharmacy to ensure in-ED medication access, and nurse education. The project is supported by a full-time project manager, data analyst, operations leaders, physician champions, provincial pharmacy, and the Emergency Strategic Clinical Network leadership team. For our pilot, our evaluation objective was to determine the degree to which our initiation and referral pathway was being utilized. We used administrative data to track the number of patients given bup/nal in ED, their demographics and whether they continued to fill bup/nal prescriptions 30 days after their ED visit. Addiction clinics reported both the number of patients referred to them and the number of patients attending their referral. Evaluation/Results: Administrative data shows 568 opioid-related visits to ED pilot sites during the pilot phase. Bup/nal was given to 60 unique patients in the ED during 66 unique visits. There were 32 (53%) male patients and 28 (47%) female patients. Median patient age was 34 (range: 21 to 79). ED visits where bup/nal was given had a median length of stay of 6 hours 57 minutes (IQR: 6 hours 20 minutes) and Canadian Triage Acuity Scores as follows: Level 1 – 1 (2%), Level 2 – 21 (32%), Level 3 – 32 (48%), Level 4 – 11 (17%), Level 5 – 1 (2%). 
51 (77%) of these visits led to discharge, and 24 (47%) of the discharged patients given bup/nal in the ED continued to fill bup/nal prescriptions 30 days after their index ED visit. EDs also referred 37 patients with OUD to the 3 community clinics, and 16 of those individuals (43%) attended their first follow-up appointment. Discussion/Impact: Our pilot project demonstrates that, with dedicated resources and broad institutional support, ED patients with OUD can be appropriately initiated on bup/nal and referred to community care.
In Norway, the incidence of sporadic domestically acquired salmonellosis is low and most frequently due to Salmonella Typhimurium. We investigated the risk factors for sporadic Salmonella infections in Norway to improve control and prevention measures. Surveillance data for all Salmonella infections from 2000 to 2015 were analysed for seasonality and the proportion associated with domestic reservoirs, hedgehogs and wild birds. A prospective case–control study was conducted from 2010 to 2012 by recruiting cases from the Norwegian Surveillance System for Communicable Diseases and controls from the Norwegian Population Registry (389 cases and 1500 controls). Univariable analyses using logistic regression were conducted and a multivariable model was developed using regularised/penalised logistic regression. In univariable analysis, eating snow, dirt or sand, or playing in a sandbox (aOR 4.14; CI 2.15–7.97) was associated with salmonellosis. This was also the only exposure significantly associated with illness in the multivariable model. Since 2004, 34.2% (n = 354) of S. Typhimurium cases had an MLVA profile linked to a domestic reservoir. A seasonal trend with a peak in August for all Salmonella types and in February for S. Typhimurium was observed. Indirect exposure to domestic reservoirs remains a source of salmonellosis in Norway, particularly for children. Information to the public about avoiding environmental exposure should be strengthened, and initiatives to combat salmonellosis in the food chain should be reinforced.
Consumption of fruits and vegetables has been shown to contribute to mental and cognitive health in older adults from Western industrialized countries. However, it is unclear whether this effect replicates in older adults from non-Western developing countries. Thus, the present study examined the contribution of fruit and vegetable consumption to mental and cognitive health in older persons from China, India, Mexico, Russia, South Africa and Ghana.
Representative cross-sectional and cross-national study.
We used data from the WHO Study on Global Ageing and Adult Health (SAGE), sampled in 2007 to 2010. Our final sample size included 28 078 participants.
Fruit and vegetable consumption predicted better cognitive performance in older adults, including improved verbal recall, delayed verbal recall, digit span test performance and verbal fluency; the effect of fruit consumption was much stronger than that of vegetable consumption. Regarding mental health, fruit consumption was significantly associated with better subjective quality of life and fewer depressive symptoms; vegetable consumption, however, was not significantly related to mental health.
Consumption of fruits is associated with both improved cognitive and mental health in older adults from non-Western developing countries, and consumption of vegetables is associated with improved cognitive health only. Increasing fruit and vegetable consumption might be one easy and cost-effective way to improve the overall health and quality of life of older adults in non-Western developing countries.
Measurements in the infrared wavelength domain allow direct assessment of the physical state and energy balance of cool matter in space, enabling the detailed study of the processes that govern the formation and evolution of stars and planetary systems in galaxies over cosmic time. Previous infrared missions revealed a great deal about the obscured Universe, but were hampered by limited sensitivity.
SPICA takes the next step in infrared observational capability by combining a large 2.5-meter diameter telescope, cooled to below 8 K, with instruments employing ultra-sensitive detectors. A combination of passive cooling and mechanical coolers will be used to cool both the telescope and the instruments. With mechanical coolers, the mission lifetime is not limited by the supply of cryogen. With the combination of low telescope background and instruments with state-of-the-art detectors, SPICA provides a huge advance on the capabilities of previous missions.
SPICA instruments offer spectral resolving power ranging from R ~50 through 11 000 in the 17–230 μm domain and R ~28 000 spectroscopy between 12 and 18 μm. SPICA will provide efficient 30–37 μm broad-band mapping, and small-field spectroscopic and polarimetric imaging at 100, 200 and 350 μm. SPICA will provide infrared spectroscopy with an unprecedented sensitivity of ~5 × 10−20 W m−2 (5σ/1 h), over two orders of magnitude improvement over earlier missions. This exceptional performance leap will open entirely new domains in infrared astronomy: galaxy evolution and metal production over cosmic time, dust formation and evolution from very early epochs onwards, and the formation history of planetary systems.
Introduction: Sex-specific diagnostic cutoffs may improve the test characteristics of high-sensitivity troponin assays for the diagnosis of myocardial infarction (MI). Sex-specific cutoffs for ruling in MI improve the sensitivity of the assay for MI among women, and improve the specificity of diagnosis among men. We hypothesized that the use of sex-specific high-sensitivity Troponin T (hsTnT) cutoffs for ruling out MI at the time of emergency department (ED) arrival would improve the classification efficiency of the assay by enabling more patients to have MI ruled out at the time of ED arrival while maintaining diagnostic sensitivity. The objective of this study was to quantify the test characteristics of sex-specific cutoffs of an hsTnT assay for acute myocardial infarction (AMI) when performed at ED arrival in patients with chest pain. Methods: This retrospective study included consecutive ED patients with suspected cardiac chest pain evaluated in four urban EDs, excluding those with ST-elevation AMI, cardiac arrest or abnormal kidney function. The primary outcome was AMI at 7 days. Secondary outcomes included major adverse cardiac events (MACE: all-cause mortality, AMI and revascularization) and the individual MACE components. We quantified test characteristics (sensitivity, negative predictive value, likelihood ratios and proportion of patients ruled out) for multiple combinations of sex-specific rule-out cutoffs. We calculated net reclassification improvement compared to universal rule-out cutoffs of 5 ng/L (the assay's limit of detection) and 6 ng/L (the FDA-approved limit of quantitation for US laboratories). Results: 7130 patients, including 3931 men and 3199 women, were included. The 7-day incidence of AMI was 7.38% among men and 3.78% among women. Universal cutoffs of 5 and 6 ng/L ruled out AMI with 99.7% sensitivity in 33.6% and 42.2% of patients, respectively.
The best-performing combination of sex-specific cutoffs (8 ng/L for men and 6 ng/L for women) ruled out AMI with 98.7% sensitivity in 51.9% of patients. Conclusion: Sex-specific hsTnT cutoffs for ruling out AMI at ED arrival may achieve substantial improvement in classification performance, enabling more patients to be ruled out at ED arrival, while maintaining acceptable diagnostic sensitivity for AMI. Universal and sex-specific rule-out cutoffs differ by only small changes in hsTnT concentration. Therefore, these findings should be confirmed in other datasets.
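How such rule-out characteristics are computed can be illustrated with a short sketch on toy data. The per-sex cutoffs (8 ng/L for men, 6 ng/L for women) follow the abstract, but the data, function and field names are assumptions for illustration, not the study's analysis code:

```python
# Illustrative sketch, not the study's code: computing the sensitivity
# and proportion ruled out for sex-specific hsTnT rule-out cutoffs.
# A patient is "ruled out" if the arrival hsTnT is below that sex's cutoff.

def rule_out_characteristics(hstnt, sex, ami, cutoffs):
    """hstnt: ng/L at arrival; sex: 'M'/'F'; ami: True if 7-day AMI;
    cutoffs: dict mapping sex to its rule-out cutoff (ng/L)."""
    ruled_out = [t < cutoffs[s] for t, s in zip(hstnt, sex)]
    n_ami = sum(ami)
    missed = sum(1 for r, a in zip(ruled_out, ami) if r and a)  # AMIs ruled out
    sensitivity = (n_ami - missed) / n_ami
    proportion_ruled_out = sum(ruled_out) / len(hstnt)
    return sensitivity, proportion_ruled_out

# Toy data with the cutoffs discussed above (8 ng/L men, 6 ng/L women):
sens, prop = rule_out_characteristics(
    hstnt=[3, 7, 10, 4, 7, 25],
    sex=['M', 'M', 'M', 'F', 'F', 'F'],
    ami=[False, False, True, False, True, True],
    cutoffs={'M': 8, 'F': 6},
)
```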
Introduction: Choosing Wisely Canada has identified blood transfusions as a priority area for improving clinical appropriateness. Relevant recommendations include 'Don't transfuse blood if other non-transfusion therapies or observation would be just as effective.' In parallel with this recommendation, the Alberta division of Towards Optimized Practice (TOP) has developed guidelines for the treatment of iron deficiency anemia (IDA) that emphasize the use of non-transfusion therapies (i.e. parenteral or oral iron in appropriate patients). Choosing Wisely also emphasizes strategies to better engage patients in shared decision making. Methods: To better engage patients in shared decision making about their treatment options, both physician and patient handouts were developed using an iterative process. The development of the patient-facing documents began with a synthesis of educational materials currently available to patients with IDA. Clinical leaders from nine different specialties (Emergency Medicine, Family Medicine, Day Medicine, Hematology, and others) were continually engaged in the development of content using a consensus model. A focus group of ESCN patient advisors was assembled to review materials with an emphasis on: (1) Are the patient materials easily understood? (2) Are intended messages resonating while avoiding unintended messaging? (3) What information do patients require that has not been included? Following the focus group, revisions were made to patient materials and a subsequent online survey confirmed that the final version addressed the issues raised. Results: A four-page patient handout/infographic was developed utilizing best practices in information design and in physician and patient engagement. Content includes the causes and symptoms of IDA, progressive treatment options from dietary changes to transfusion, and the four Choosing Wisely questions to discuss with your doctor.
Conclusion: Patient education materials can be developed according to best practices in information design and stakeholder engagement. Patient focus groups demonstrate that such materials are easier to understand, and better equip patients to engage in shared decision making.
Introduction: Patients with chronic kidney disease (CKD) are at high risk of cardiovascular events, and have worse outcomes following acute myocardial infarction (AMI). Cardiac troponin is often elevated in CKD, making the diagnosis of AMI challenging in this population. We sought to quantify test characteristics for AMI of a high-sensitivity troponin T (hsTnT) assay performed at emergency department (ED) arrival in CKD patients with chest pain, and to derive rule-out cutoffs specific to patient subgroups stratified by estimated glomerular filtration rate (eGFR). We also quantified the sensitivity and classification performance of the assay's limit of detection (5 ng/L) and the FDA-approved limit of quantitation (6 ng/L) for ruling out AMI at ED arrival. Methods: Consecutive patients in four urban EDs from the 2013 calendar year with suspected cardiac chest pain who had a Roche Elecsys hsTnT assay performed on arrival were included. This analysis was restricted to patients with an eGFR <60 ml/min/1.73 m2. The primary outcome was 7-day AMI. Secondary outcomes included major adverse cardiac events (death, AMI and revascularization). Test characteristics were calculated and ROC curves were generated for eGFR subgroups. Results: 1416 patients were included. 7-day AMI incidence was 10.1%. 73% of patients had an initial hsTnT concentration greater than the assay's 99th percentile (14 ng/L). Currently accepted cutoffs to rule out MI at ED arrival (5 ng/L and 6 ng/L) had 100% sensitivity for AMI, but no patients with an eGFR less than 30 ml/min/1.73 m2 had hsTnT concentrations below these thresholds. We derived eGFR-adjusted cutoffs to rule out MI with sensitivity >98% at ED arrival, which were able to rule out 6-42% of patients, depending on eGFR category. The proportion of patients able to be accurately ruled in with a single hsTnT assay was substantially lower among patients with an eGFR <30 ml/min/1.73 m2 (6-20% vs 25-43%).
We also derived eGFR-adjusted cutoffs to rule in AMI with specificity >90%, which accurately ruled in up to 18% of patients. Conclusion: Cutoffs achieving acceptable diagnostic performance for AMI using single hsTnT sampling on ED arrival may have limited clinical utility, particularly among patients with very low eGFR. The ideal diagnostic strategy for AMI in patients with CKD likely involves serial high-sensitivity troponin testing with diagnostic thresholds customized to different eGFR categories.
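The derivation step, choosing within each eGFR stratum the highest rule-out cutoff whose sensitivity for 7-day AMI stays at or above a target such as 98%, can be sketched as follows. The data and function name are illustrative assumptions, not the study's code:

```python
# Illustrative cutoff derivation: scan candidate cutoffs from high to
# low and keep the first (highest) one whose rule-out sensitivity for
# 7-day AMI meets the target. A patient is "ruled out" when the arrival
# hsTnT is below the cutoff. Toy data; not the study's analysis.

def derive_cutoff(hstnt, ami, target_sens=0.98):
    """hstnt: ng/L at arrival; ami: True if 7-day AMI occurred."""
    n_ami = sum(ami)
    if n_ami == 0:
        raise ValueError("need at least one AMI case to estimate sensitivity")
    for cutoff in sorted(set(hstnt), reverse=True):
        missed = sum(1 for t, a in zip(hstnt, ami) if a and t < cutoff)
        if (n_ami - missed) / n_ami >= target_sens:
            return cutoff
    return min(hstnt)  # defensive: the lowest cutoff rules out nobody

# Within one toy eGFR stratum:
cutoff = derive_cutoff(
    hstnt=[2, 3, 5, 8, 20, 40],
    ami=[False, False, False, False, True, True],
)
```

In practice this would be repeated per eGFR category, which is what produces the range of rule-out proportions reported above.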
We present a phylogenetic revision of the Sticta filix morphodeme in New Zealand. This non-monophyletic group of early diverging clades in the genus Sticta is characterized by a stalked thallus with a green primary photobiont and the frequent formation of a dendriscocauloid cyanomorph. Traditionally, three species have been distinguished in New Zealand: S. filix (Sw.) Nyl., S. lacera (Hook. f. & Taylor) Müll. Arg. and S. latifrons A. Rich., with two cyanomorphs separated under the names Dendriscocaulon dendriothamnodes Dughi ex D. J. Galloway (traditionally associated with S. latifrons) and D. dendroides (Nyl.) R. Sant. ex H. Magn. (traditionally associated with S. filix). Sticta lacera was not included in the present study due to the lack of authentic material (all specimens originally identified under that name and sequenced clustered with S. filix); S. filix was confirmed as a distinct species whereas S. latifrons s. lat. was shown to represent two unrelated species, S. latifrons s. str. and the reinstated S. menziesii Hook. f. & Taylor. The cyanomorphs of S. filix and S. latifrons are not conspecific with the types of the names D. dendriothamnodes and D. dendroides, respectively; the D. dendriothamnodes cyanomorph belongs to the Australian taxon Sticta stipitata C. Knight ex F. Wilson, which is not present in New Zealand, whereas the D. dendroides cyanomorph corresponds to a previously unrecognized species with unknown chloromorph, recombined here as Sticta dendroides (Nyl.) Moncada, Lücking & de Lange. Thus, instead of three species (S. filix, S. lacera, S. latifrons) with their corresponding cyanomorphs, five species are now distinguished in this guild in New Zealand: S. dendroides (cyanomorph only), S. filix (chloro- and cyanomorph), S. lacera (chloromorph only), S. latifrons (chloro- and cyanomorph) and S. menziesii (chloro- and cyanomorph). A key is presented for identification of the chloromorphs and the dendriscocauloid cyanomorphs of all species. 
Semi-quantitative analysis suggests that species in this guild are good indicators of intact forest ecosystems in New Zealand and that the two newly recognized species, S. dendroides and S. menziesii, appear to perform particularly well in this respect. The use of lichens as bioindicators of environmental health is not yet established in New Zealand, and so, based on our results, we make the case for developing this approach more thoroughly.
The shallow subsurface of Groningen, the Netherlands, is heterogeneous due to its formation in a Holocene tidal coastal setting on a periglacially and glacially inherited landscape with strong lateral variation in subsurface architecture. Soft sediments with low, small-strain shear wave velocities (V_S30 around 200 m s⁻¹) are known to amplify earthquake motions. Knowledge of the architecture and properties of the subsurface, and their combined effect on the propagation of earthquake waves, is imperative for the prediction of the geohazards of ground shaking and liquefaction at the surface. In order to provide information for the seismic hazard and risk analysis, two geological models were constructed. The first is the ‘Geological model for Site response in Groningen’ (GSG model), based on the detailed 3D GeoTOP voxel model containing lithostratigraphy and lithoclass attributes. The GeoTOP model was combined with information from boreholes, cone penetration tests, and regional digital geological and geohydrological models to cover the full range from the surface down to the base of the North Sea Supergroup (base Paleogene) at ~800 m depth. The GSG model consists of a microzonation based on geology and a stack of soil stratigraphy for each of the 140,000 grid cells (100 m × 100 m), to which properties (V_S and parameters relevant for nonlinear soil behaviour) were assigned. The GSG model serves as input to the site response calculations that feed into the Ground Motion Model. The second model is the ‘Geological model for Liquefaction sensitivity in Groningen’ (GLG). Generally, loosely packed sands may be susceptible to liquefaction upon earthquake shaking. In order to delineate zones of loosely packed sand in the first 40 m below the surface, GeoTOP was combined with relative densities inferred from a large cone penetration test database.
The marine Naaldwijk and Eem Formations have the highest proportion of loosely packed sand (31% and 38%, respectively) and thus are considered to be the most vulnerable to liquefaction; other units contain 5–17% loosely packed sand. The GLG model serves as one of the inputs for further research on the liquefaction potential in Groningen, such as the development of region-specific magnitude scaling factors (MSF) and depth–stress reduction relationships (r_d).
The cryosphere is an essential component of the global climate system, both significantly affecting climate processes and being subject, and particularly sensitive, to changes in climate conditions. Numerical models are an important tool for assessing climate-change impacts on the Antarctic ice sheet–ice shelf–ocean system. They not only complement field and satellite remote-sensing investigations but are often the only feasible alternative for addressing some of the important parameters and processes. Over the last few years, our group has made significant progress in developing and applying innovative numerical methods. In this paper, we provide a brief overview of some of the methods employed and the major results obtained for a number of case studies in the Atlantic sector of Antarctica.
Although procedural sedation for cardioversion is a common event in emergency departments (EDs), there is limited evidence surrounding medication choices. We sought to evaluate geographic and temporal variation in sedative choice at multiple Canadian sites, and to estimate the risk of adverse events due to sedative choice.
This is a secondary analysis of one health records review, the Recent Onset Atrial Fibrillation or Flutter-0 study (RAFF-0 [n=420, 2008]), and one prospective cohort study, the Recent Onset Atrial Fibrillation or Flutter-1 study (RAFF-1 [n=565, 2010–2012]), conducted at eight and six Canadian EDs, respectively. Sedative choices within and among EDs were quantified, and the risk of adverse events was examined with adjusted and unadjusted comparisons of sedative regimens.
In RAFF-0 and RAFF-1, the combination of propofol and fentanyl was most popular (63.8% and 52.7%, respectively), followed by propofol alone (27.9% and 37.3%). There were substantially more adverse events in the RAFF-0 data set (13.5%) than in RAFF-1 (3.3%). In both data sets, the combination of propofol and fentanyl was not associated with increased adverse event risk compared to propofol alone.
There is marked variability in procedural sedation medication choice for direct-current cardioversion in Canadian EDs, with increased use of propofol alone as a sedative agent over time. The risk of adverse events from procedural sedation during cardioversion is low but not negligible. We did not identify an increased risk of adverse events with the addition of fentanyl as an adjunctive analgesic to propofol.