Objectives: Nearly half of care home residents with advanced dementia have clinically significant agitation. Little is known about the costs associated with these symptoms toward the end of life. We calculated monetary costs associated with agitation from UK National Health Service, personal social services, and societal perspectives.
Design: Prospective cohort study.
Setting: Thirteen nursing homes in London and the southeast of England.
Participants: Seventy-nine people with advanced dementia (Functional Assessment Staging Tool grade 6e and above) residing in nursing homes, and thirty-five of their informal carers.
Measurements: Data were collected at study entry and monthly for up to 9 months, and extrapolated for expression per annum. Agitation was assessed using the Cohen-Mansfield Agitation Inventory (CMAI). Health and social care costs of residing in care homes, and costs of contacts with health and social care services, were calculated from national unit costs; for a societal perspective, costs of providing informal care were estimated using the Resource Utilization in Dementia (RUD)-Lite scale.
Results: After adjustment, health and social care costs and costs of providing informal care varied significantly by level of agitation as death approached, from £23,000 over a 1-year period with no agitation symptoms (CMAI agitation score 0–10) to £45,000 at the most severe level (CMAI agitation score >100). On average, agitation accounted for 30% of health and social care costs. Informal care costs were substantial, constituting 29% of total costs.
Conclusions: With the increasing prevalence of dementia, costs of care will impact healthcare and social services systems, as well as informal carers. Agitation is a key driver of these costs in people with advanced dementia, presenting complex challenges for symptom management, service planners, and providers.
Subcutaneous adipose tissue (scAT) and peripheral blood mononuclear cells (PBMC) play a significant role in obesity-associated systemic low-grade inflammation. A high-fat diet (HFD) is known to induce inflammatory changes in both scAT and PBMC; however, the time course of the effect of HFD on these systems is still unknown. The aim of the present study was to determine the time course of the effect of HFD on PBMC and scAT. New Zealand white rabbits were fed HFD for 5 or 10 weeks (i.e. HFD-5 and HFD-10) or regular chow (i.e. control (CNT)-5 and CNT-10). Thereafter, metabolic and inflammatory parameters of PBMC and scAT were quantified. HFD induced hyperfattyacidaemia in the HFD-5 and HFD-10 groups, with the development of insulin resistance in HFD-10, while no changes were observed in scAT lipid metabolism or inflammatory status. HFD activated inflammatory pathways in PBMC of the HFD-5 group and induced modified autophagy in those of the HFD-10 group. The rate of fat oxidation in PBMC was directly associated with the expression of inflammatory markers and tended to be inversely associated with autophagosome formation markers in PBMC. HFD affected systemic substrate metabolism, and the metabolic, inflammatory and autophagy pathways in PBMC, in the absence of metabolic and inflammatory changes in scAT. Dietary approaches or interventions to avert HFD-induced changes in PBMC could be essential to prevent metabolic and inflammatory complications of obesity and promote healthier living.
The rocky shores of the north-east Atlantic have long been studied; our focus is from Gibraltar to Norway, plus the Azores and Iceland. Phylogeographic processes shape biogeographic patterns of biodiversity. Long-term and broadscale studies have shown the responses of biota to past climate fluctuations and to more recent anthropogenic climate change. Inter- and intra-specific interactions along sharp local environmental gradients shape distributions and community structure, and hence ecosystem functioning. Shifts in domination from fucoids in shelter to barnacles/mussels in exposure are mediated by grazing by patellid limpets. Further south, fucoids become increasingly rare, with species disappearing or becoming restricted to estuarine refuges, driven by greater desiccation and grazing pressure. Mesoscale processes influence bottom-up nutrient forcing and larval supply, thereby affecting species abundance and distribution, and can be proximate factors setting range edges (e.g., the English Channel, the Iberian Peninsula). Impacts of invasive non-native species are reviewed. Knowledge gaps, such as work on rockpools and host–parasite dynamics, are also outlined.
The efficient and effective movement of research into practice is acknowledged as crucial to improving population health and assuring return on investment in healthcare research. The National Center for Advancing Translational Sciences, which sponsors the Clinical and Translational Science Awards (CTSA), recognizes that dissemination and implementation (D&I) sciences have matured over the last 15 years and are central to its goal of shifting academic health institutions to better align with this reality. In 2016, the CTSA Collaboration and Engagement Domain Task Force chartered a D&I Science Workgroup to explore the role of D&I sciences across the translational research spectrum. This special communication discusses the conceptual distinctions and purposes of dissemination, implementation, and translational sciences. We propose an integrated framework, with real-world examples, for articulating the role of D&I sciences within and across the translational research spectrum. The framework situates D&I sciences as targeted “sub-sciences” of translational science to be used by CTSAs, and others, to identify and investigate coherent strategies for more routinely and proactively accelerating research translation. The framework highlights the importance of D&I thought leaders in extending D&I principles to all research stages.
Oceanic anoxic events (OAEs) are contemporaneous with 11 of the 18 largest Phanerozoic extinction events, but the magnitude and selectivity of their paleoecological impact remain disputed. OAEs are associated with abrupt, rapid warming and increased CO2 flux to the atmosphere; thus, insights from this study may clarify the impact of current anthropogenic climate change on the biosphere. We investigated the influence of the Late Cretaceous Bonarelli event (OAE2; Cenomanian/Turonian stage boundary; ~94 Ma) on generic- and species-level molluscan diversity, extinction rates, and ecological turnover. Cenomanian/Turonian results were compared with changes across all Cretaceous stage boundaries, some of which are coincident with less severe OAEs. We found increased generic turnover, but not species-level turnover, associated with several Cretaceous OAEs. The absence of a species-level pattern may reflect species occurrence data that are too temporally coarse to robustly detect patterns. Five hypotheses of ecological selectivity relating anoxia to survivorship were tested across stage boundaries with respect to faunality, mobility, and diet using generalized linear models. Interestingly, benthic taxa were consistently selected against throughout the Cretaceous regardless of the presence or absence of OAEs. These results suggest that: (1) the Cenomanian/Turonian boundary (OAE2) was associated with a decline in molluscan diversity and an increase in extinction rate that were significantly more severe than Cretaceous background levels; and (2) no differential ecological selectivity was associated with OAE-related diversity declines among the variables tested here.
We read with interest the recent editorial, “The Hennepin Ketamine Study,” by Dr. Samuel Stratton commenting on the research ethics, methodology, and the current public controversy surrounding this study.1 As researchers and investigators of this study, we strongly agree that prospective clinical research in the prehospital environment is necessary to advance the science of Emergency Medical Services (EMS) and emergency medicine. We also agree that accomplishing this is challenging as the prehospital environment often encounters patient populations who cannot provide meaningful informed consent due to their emergent conditions. To ensure that fellow emergency medicine researchers understand the facts of our work so they may plan future studies, and to address some of the questions and concerns in Dr. Stratton’s editorial, the lay press, and in social media,2 we would like to call attention to some inaccuracies in Dr. Stratton’s editorial, and to the lay media stories on which it appears to be based.
Ho JD, Cole JB, Klein LR, Olives TD, Driver BE, Moore JC, Nystrom PC, Arens AM, Simpson NS, Hick JL, Chavez RA, Lynch WL, Miner JR. The Hennepin Ketamine Study investigators’ reply. Prehosp Disaster Med. 2019;34(2):111–113
OBJECTIVES/SPECIFIC AIMS: To establish an effective team of researchers working towards developing and validating prognostic models that use image analyses and other numerical metadata to better understand pediatric undernutrition, and to learn how different approaches can be brought together collaboratively and efficiently. METHODS/STUDY POPULATION: Over the past 18 months we have established a transdisciplinary team spanning three countries and the Schools of Medicine, Engineering, Data Science and Global Health. We first identified two team leaders, specifically a pediatric physician scientist (SS) and a data scientist/engineer (DB). The leaders worked together to recruit team members, with the understanding that different ideas are encouraged and will be used collaboratively to tackle the problem of pediatric undernutrition. The final data analytic and interpretative core team consisted of four data science students, two PhD students, an undergraduate biology major, a recent medical graduate, and a PhD research scientist. Additional collaborative members included faculty from Biomedical Engineering and the School of Medicine (Pediatrics and Pathology), along with international Global Health faculty from Pakistan and Zambia. We learned early on that it was important to understand each member’s motivation for contributing to the project and to align that motivation with the overall goals of the team. This helped us prioritize team member tasks and streamline ideas. We also incorporated weekly (monthly/bimonthly for global partners) meetings with informal oral presentations covering each member’s current progress, thoughts and concerns, and next experimental goals. This method gave team leaders a 360° mechanism of feedback. Overall, we assessed the effectiveness of our team by two mechanisms: 1) ongoing team member feedback, including from team leaders, and 2) progress of the research project.
RESULTS/ANTICIPATED RESULTS: Our feedback has shown that, on initial development of the team, there was hesitance in communication due to the diversity of our members’ backgrounds and differing cultural/social expectations. We used ice-breaking methods such as dedicated time for brief introductions, career directions, and life goals for each team member. We subsequently found that, with the exception of one, all team members found our working environment professional and conducive to productivity. We also learned from our ongoing feedback that, at times, owing to the complexity of different disciplines, some information was lost because of differences in educational backgrounds. We have now employed new methods to relay information more effectively, not just by sharing literature but also by explaining its content. The progress of our research project has varied over the past 4–6 months. There was a steep learning curve for almost every member; for example, none of the data science students had studied anything related to medicine during their education, and they had minimal, if any, exposure to the ethics of medical research. Conversely, team members with medical/biology backgrounds had minimal prior exposure to computational modeling, computer engineering, and the verbiage of communicating mathematical algorithms. While this may have slowed our progress, we learned that by asking questions and engaging every member it was easier to delegate tasks effectively. Once our team reached an overall understanding of each member’s goals, there was steady progress in the project, with new results and new methods of analysis being tested every week. DISCUSSION/SIGNIFICANCE OF IMPACT: We expect that our ongoing collaboration will result in the development of novel modalities to understand and diagnose pediatric undernutrition, and that our approach can be used as a model to tackle several other problems.
As with many team science projects, credit and authorship are challenges for which we are outlining creative strategies, as suggested by the International Committee of Medical Journal Editors (ICMJE) and other literature.
The genetic and environmental contributions of negative valence systems (NVS) to internalizing pathways study (also referred to as the Adolescent and Young Adult Twin Study) was designed to examine varying constructs of the NVS as they relate to the development of internalizing disorders from a genetically informed perspective. The goal of this study was to evaluate genetic and environmental contributions to potential psychiatric endophenotypes that contribute to internalizing psychopathology by studying adolescent and young adult twins longitudinally over a 2-year period. This report details the sample characteristics, study design, and methodology of this study. The first wave of data collection (i.e., time 1) is complete; the 2-year follow-up (i.e., time 2) is currently underway. A total of 430 twin pairs (N = 860 individual twins; 166 monozygotic pairs; 57.2% female) and 422 parents or legal guardians participated at time 1. Twin participants completed self-report surveys and participated in experimental paradigms to assess processes within the NVS. Additionally, parents completed surveys to report on themselves and their twin children. Findings from this study will help clarify the genetic and environmental influences of the NVS and their association with internalizing risk. The goal of this line of research is to develop methods for early internalizing disorder risk detection.
Identifying genetic relationships between complex traits in emerging adulthood can provide useful etiological insights into risk for psychopathology. College-age individuals are under-represented in genomic analyses thus far, and the majority of work has focused on clinical disorders or cognitive abilities rather than normal-range behavioral outcomes.
This study examined a sample of emerging adults 18–22 years of age (N = 5947) to construct an atlas of polygenic risk for 33 traits predicting relevant phenotypic outcomes. Twenty-eight hypotheses were tested based on the previous literature on samples of European ancestry, and the availability of rich assessment data allowed for polygenic predictions across 55 psychological and medical phenotypes.
Polygenic risk for schizophrenia (SZ) in emerging adults predicted anxiety, depression, nicotine use, trauma, and family history of psychological disorders. Polygenic risk for neuroticism predicted anxiety, depression, phobia, panic, neuroticism, and was correlated with polygenic risk for cardiovascular disease.
These results demonstrate the extensive impact of genetic risk for SZ, neuroticism, and major depression on a range of health outcomes in early adulthood. Minimal cross-ancestry replication of these phenomic patterns of polygenic influence underscores the need for more genome-wide association studies of non-European populations.
Neospora caninum is a coccidian intracellular protozoan capable of infecting a wide range of mammals, although severe disease is mostly reported in dogs and cattle. Innate defences triggered by monocytes/macrophages are key in the pathogenesis of neosporosis, as these cells are first-line defenders against intracellular infections. The aim of this study was to characterize infection and innate responses in macrophages infected with N. caninum, using a well-known cell model for studying macrophage functions (human monocyte THP-1 cells). Intracellular invasion by live tachyzoites occurred as early as 4 h (confirmed by immunofluorescence microscopy using N. caninum-specific antibodies). Macrophages infected by N. caninum had increased expression of pro-inflammatory cytokines (TNFα, IL-1β, IL-8, IFNγ). Interestingly, N. caninum induced expression of host-defence peptides (cathelicidins), a defence mechanism never before reported for N. caninum infection in macrophages. The expression of cytokines and cathelicidins in macrophages invaded by N. caninum was mediated by mitogen-activated protein kinase signalling (MEK 1/2). Secretion of such innate factors from N. caninum-infected macrophages reduced parasite internalization and promoted the secretion of pro-inflammatory cytokines in naïve macrophages. We concluded that rapid invasion of macrophages by N. caninum triggered protective innate defence mechanisms against intracellular pathogens.
Introduction: Redirecting low-acuity patients from emergency departments to primary care walk-in clinics has been identified as a priority by many health authorities. Promoting family physicians for the management of ambulatory patients with urgent health concerns reflects the assumption that primary care facilities can offer high-quality and more affordable ambulatory emergency care. However, no performance assessment framework has been developed for ambulatory emergency care and, consequently, the quality of care provided in these alternate settings has never been formally compared. Primary objective: To identify structure, process and outcome indicators for ambulatory emergency care. Methods: We will identify and develop quality indicators (QIs) for ambulatory emergency care using the RAND/UCLA Appropriateness Method (RAM), composed of three steps. First, we will perform a scoping literature review to inventory 1) all previously recommended QIs assessing care provided to ambulatory emergency patients in the ED or primary care settings; 2) all conditions evaluated with the retrieved QIs; and 3) all outcomes measured by the same QIs. Second, a steering committee composed of the research team and international experts in performance assessment in emergency and primary care will be presented with the lists of QI-related conditions and outcomes. They will be asked to identify potential outcome indicators for ambulatory emergency care by generating any relevant combinations of one condition and one outcome (e.g. acute asthma exacerbation/re-consultation). Committee members will be given the latitude to use and pair any conditions or outcomes not included in the lists, as long as they think the resulting indicators are compatible with the study objectives. Using a structured nominal group approach, they will combine their suggestions and refine the list of potential QIs.
This list of potential outcome indicators, composed of “condition/outcome” pairs, will be merged with the list of already published QIs identified during the literature review. Third, as per the RAM standards, we will assemble an international multidisciplinary panel (n=20) of patients, emergency and primary care providers, researchers and decision makers, following recommendations from international emergency and primary care associations and from the Canadian Strategy for Patient-Oriented Research (SPOR) Support Units. Through iterative rounds of ratings using both web-based survey tools and videoconferencing, panelists will independently assess all candidate QIs. They will be asked to rate, on a nine-level scale, to what extent each QI is a relevant and useful measure of ambulatory emergency care quality. From one round to the next, QIs with a median panelist rating of one to three will be excluded. Those with a median score of seven or more will be automatically included in the final list. QIs with a median score of four to six will be retained for further deliberation among the panelists. Rounds of ratings will be conducted until all QIs are classified. Impact: The QIs identified will be used to develop a performance assessment framework for ambulatory emergency care. This will represent an essential step toward testing the assumption that EDs and primary care walk-in clinics provide equivalent care quality to low-acuity patients.
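The round-by-round classification rule described above (exclude at median 1–3, include at median 7–9, deliberate at 4–6) can be sketched in code. This is a minimal illustrative sketch, not part of the study protocol; the function name and the example QIs and ratings are hypothetical.

```python
# Hypothetical sketch of one RAM rating round: each candidate QI is classified
# by the median of its nine-point panelist ratings. Names and data below are
# illustrative only, not drawn from the actual study.
from statistics import median

def classify_round(ratings_by_qi):
    """Classify each QI by its median panelist rating (1-9 scale)."""
    included, excluded, deliberate = [], [], []
    for qi, ratings in ratings_by_qi.items():
        m = median(ratings)
        if m <= 3:          # median 1-3: excluded
            excluded.append(qi)
        elif m >= 7:        # median 7-9: automatically in the final list
            included.append(qi)
        else:               # median 4-6: retained for further deliberation
            deliberate.append(qi)
    return included, excluded, deliberate

# Illustrative round with three hypothetical "condition/outcome" QIs
ratings = {
    "asthma exacerbation / re-consultation": [8, 7, 9, 7, 8],
    "minor laceration / wound infection": [2, 3, 1, 3, 2],
    "low-back pain / imaging rate": [5, 6, 4, 5, 6],
}
inc, exc, delib = classify_round(ratings)
```

In the protocol, rounds of this triage repeat, with the middle group re-rated, until every QI lands in the included or excluded set.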
The Zadko telescope is a 1 m f/4 Cassegrain telescope situated in Western Australia, about 80 km north of Perth. The facility plays a niche role in Australian astronomy, as it is the only metre-class facility in Australia dedicated to automated follow-up imaging of alerts or triggers received from external instruments/detectors spanning the entire electromagnetic spectrum. Furthermore, the location of the facility at a longitude not covered by other metre-class facilities provides an important resource for time-critical projects. This paper reviews the status of the Zadko facility and its science projects since it began robotic operations in March 2010. First, we report on major upgrades to the infrastructure and equipment (2012–2014) that have resulted in significantly improved robotic operations. Second, we review the core science projects, which include automated rapid follow-up of gamma-ray burst (GRB) optical afterglows, imaging of neutrino counterpart candidates from the ANTARES neutrino observatory, photometry of rare (Barbarian) asteroids, and supernova searches in nearby galaxies. Finally, we discuss participation in newly commencing international projects, including optical follow-up of gravitational wave (GW) candidates from the United States and European GW observatory network, and present first tests of very low latency follow-up of fast radio bursts. In the context of these projects, we outline plans for a future upgrade that will optimise the facility for alert-triggered imaging across the radio, optical, high-energy, neutrino, and GW bands.
The Universe is permeated by hot, turbulent, magnetized plasmas. Turbulent plasma is a major constituent of active galactic nuclei, supernova remnants, the intergalactic and interstellar medium, the solar corona, the solar wind and the Earth’s magnetosphere, to mention just a few examples. Energy dissipation of turbulent fluctuations plays a key role in plasma heating and energization, yet we still do not understand the underlying physical mechanisms involved. THOR is a mission designed to answer the questions of how turbulent plasma is heated and particles accelerated, how the dissipated energy is partitioned, and how dissipation operates in different regimes of turbulence. THOR is a single-spacecraft mission with an orbit tuned to maximize data return from regions in near-Earth space – magnetosheath, shock, foreshock and pristine solar wind – featuring different kinds of turbulence. Here we summarize the THOR proposal submitted on 15 January 2015 to the ‘Call for a Medium-size mission opportunity in ESA’s Science Programme for a launch in 2025 (M4)’. THOR has been selected by the European Space Agency (ESA) for the study phase.
The grounded ice in the Totten and Dalton glaciers is an essential component of the buttressing for the marine-based Aurora basin, and hence their stability is important to the future rate of mass loss from East Antarctica. The Totten and Vanderford glaciers are joined by a deep east–west-running subglacial trench between the continental ice sheet and Law Dome, while a shallower trench links the Totten and Dalton glaciers. All three glaciers flow into the ocean close to the Antarctic Circle and experience ocean-driven ice shelf melt rates comparable with those in the Amundsen Sea Embayment. We investigate this combination of trenches and ice shelves with the BISICLES adaptive mesh ice-sheet model and ocean-forcing melt rates derived from two global climate models. We find that ice shelf ablation at a rate comparable with the present day is sufficient to cause widespread grounding line retreat in an east–west direction across the Totten and Dalton glaciers, with projected future warming causing faster retreat. Meanwhile, southward retreat is limited by the shallower ocean-facing slopes between the coast and the bulk of the Aurora subglacial trench. However, the two climate models produce completely different future ice shelf basal melt rates in this region: HadCM3 drives increasing sub-ice shelf melting to ~2150, while ECHAM5 shows little or no increase in sub-ice shelf melting under the two greenhouse gas forcing scenarios.
Background: Planning for neurology training necessitated a reflection on the experience of graduates. We explored the practice characteristics and training experience of recent graduates. Methods: Graduates from 2010–2014 completed a survey. Results: The response rate was 37% of 211 graduates; 56% were female, 91% were adult neurologists, 65% practiced in an outpatient setting, and 63% worked in academics. 85% completed subspecialty training (median 1 year). 36% work 3 days a week or less. 82% took general call (median 1 night weekly). Role preparation was considered very good or excellent by most; however, ratings were poor or fair for 17% in advocacy and 8% in leadership. Training feedback was at least “good” for 87%. Burnout a few times a week or more was noted by 5% (6% during residency, particularly PGY-1 and PGY-5). 64% felt overly burdened by paperwork. Although most felt training was adequate, it was rated poor or fair at preparing for practice management (85%) and personal balance (55%). Most conditions were under-observed in the training environment. Many noted a need for more independent practice development and community neurology. Conclusions: Although our training was found to be very good, identified needs included advocacy training and more training in general neurology in longitudinal outpatient/community settings.
Introduction: Delirium is a dreadful complication in seniors’ acute care. Many studies are available on the incidence of delirium; however, ED-induced delirium is far less studied. We aimed to evaluate the incidence and impact of ED-induced delirium among older, non-delirious, admitted ED patients with prolonged ED stays (≥8 hours). Methods: This prospective INDEED study (phase 1) included patients recruited from 4 Canadian EDs. Inclusion criteria: 1) aged 65 and over; 2) ED stay ≥8 hours; 3) admitted to hospital; 4) non-delirious upon arrival and at the end of the first 8 hours; 5) independent or semi-independent. Eligible patients were assessed by a research assistant after an 8-hour exposure to the ED and evaluated twice a day up to 24 h after ward admission. Patients’ functional and cognitive status were assessed using the validated OARS and TICS-m tools. The Confusion Assessment Method was used to detect incident delirium. Hospital lengths of stay (LOS) were obtained. Univariate and multivariate analyses were conducted to evaluate outcomes. Results: Of the 380 patients prospectively followed, the mean age was 76.5 (±8.9), males represented 50%, and 16.5% were very old seniors (>85 years old). The overall incidence of ED-induced delirium was 8.4%; by site, it was 10%, 13.8%, 5.5% and 13.4%. The mean ED LOS varied from 29 to 48 hours. The mean hospital LOS was increased by 6.1 days in delirious patients compared to non-delirious patients (p<0.05). The increase in mean hospital LOS by site was 6.9, 8.5, 4.3 and 5.2 days for ED-induced delirium patients. Conclusion: ED-induced delirium was recorded in nearly one senior out of ten after a minimum 8-hour exposure to the ED environment. An episode of delirium increases hospital LOS by about a week and could therefore contribute to ED overcrowding.
Introduction: Delirium is a frequent complication among seniors in the emergency department (ED). This condition is often underdiagnosed by ED professionals even though it is associated with functional and cognitive decline, longer hospital length of stay, institutionalization and death. Frailty is increasingly recognized as an independent predictor of adverse events in seniors, and screening for frailty in EDs has recently been recommended. The aim of this study was to assess whether screening seniors for frailty in EDs could help identify those at risk of ED-induced delirium. Methods: This study is part of the Incidence and Impact measurement of Delirium Induced by ED-Stay (INDEED) study, an ongoing multicenter prospective cohort study in 5 Quebec EDs. Patients were recruited after an 8-hour ED exposure and followed up to 24 h after ward admission. Frailty was assessed at ED admission using the Canadian Study of Health and Aging-Clinical Frailty Scale (CSHA-CFS), which classifies seniors from robust (1/7) to severely frail (7/7); seniors with a CSHA-CFS score ≥5/7 were considered frail. Delirium was assessed using the Confusion Assessment Method and the Delirium Index. Results: Of the 380 patients recruited, the mean age was 76.5 (±8.9) and males represented 50%. The mean ED stay was 1.4 days (±0.82). Preliminary data show an incidence of ED-induced delirium of 8.4%. The average frailty score at baseline was 3.5/7; 72 patients were considered frail, while 289 were considered robust. Among the frail seniors, 48.4% (30–66%) had ED-induced delirium vs 17.9% (13.7–22.0%) of the non-frail (p<0.0001). Conclusion: Increased frailty appears to be associated with increased ED-induced delirium. Screening for frailty at emergency triage could help ED professionals identify seniors at higher risk of ED-induced delirium. Further studies are required to confirm the importance of the association between frailty and ED-induced delirium.
Introduction: Delirium is a common medical complication among seniors in the hospital setting. In the emergency department (ED), its prevalence varies between 7 and 14%. Delirium is associated with increased mortality and longer hospital stay. This condition is also associated with functional and cognitive decline in hospitalized seniors and a higher risk of institutionalization up to 2 years after discharge. However, no data are currently available for ED patients. The aim of this study was to evaluate the association between ED-induced delirium and functional and cognitive decline in seniors at 60 days. Methods: This study is part of the Incidence and Impact measurement of Delirium Induced by ED-Stay (INDEED) study, an ongoing multicenter prospective cohort study in 5 Quebec EDs. Patients were recruited after 8 hours in the ED and followed up to 24 h after admission. A 60-day follow-up phone assessment was also conducted. Delirium was measured by the validated Confusion Assessment Method and the Delirium Index. Functional status was measured by the validated OARS, and cognitive status by the validated TICS-m. Functional and cognitive decline were obtained by comparing the baseline and 60-day follow-up scores. Results: 380 seniors were recruited, and 280 had 60-day follow-up data available. ED-induced delirium occurred in 8.4% of seniors. There was a difference in mean functional decline between seniors with and without ED-induced delirium [2.95 (1.23–4.67) vs 1.55 (1.20–1.91); pWilcoxon = 0.05]. The proportion of seniors showing a decline ≥2 points on the OARS was significantly higher in those with ED-induced delirium (65.0% vs 40.2%, p=0.03). Seniors with ED-induced delirium also showed a significant decline in mean TICS scores [3.31 (0.82–5.84) vs −0.01 (−0.71–0.75); pWilcoxon = 0.009]. There was no significant difference in the proportions of seniors showing a decline ≥3 OARS points between those with or without delirium (p=0.06).
Conclusion: ED-induced delirium appears to be associated with poor functional and cognitive outcomes in older patients 60 days after hospital discharge. Further studies are required to confirm the clinical importance of ED-induced delirium as a delayed complication.