Several life-threatening diseases of the kidney have their origins in mutational events that occur during embryonic development. In this study, we investigate the role of the Wolffian duct (WD), the earliest embryonic epithelial progenitor of renal tubules, in the etiology of autosomal dominant polycystic kidney disease (ADPKD). ADPKD is associated with a germline mutation of one of the two Pkd1 alleles. For the disease to occur, a second event that disrupts the expression of the other inherited Pkd1 allele must occur. We postulated that this secondary event can occur in the pronephric WD. Using Cre-Lox recombination, mice with WD-specific deletion of one or both Pkd1 alleles were generated. Homozygous Pkd1-targeted deletion in WD-derived tissues resulted in mice with large cystic kidneys and serologic evidence of renal failure. In contrast, heterozygous deletion of Pkd1 in the WD led to kidneys that were phenotypically indistinguishable from control in the early postnatal period. High-throughput sequencing, however, revealed underlying gene and microRNA (miRNA) changes in these heterozygous mutant kidneys that suggest a strong predisposition toward developing ADPKD. Bioinformatic analysis of this data demonstrated an upregulation of several miRNAs that have been previously associated with PKD; pathway analysis further demonstrated that the differentially expressed genes in the heterozygous mutant kidneys were overrepresented in signaling pathways associated with maintenance and function of the renal tubular epithelium. These results suggest that the WD may be an early epithelial target for the genetic or molecular signals that can lead to cyst formation in ADPKD.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and to examine the effects of different exposures across time, geographical regions and socioeconomic status.
This is a copy of the slides presented at the meeting but not formally written up for the volume.
Reflection high-energy electron diffraction (RHEED) is one of the most robust and widespread techniques used for in-situ monitoring during molecular beam epitaxy (MBE) growth. Thus, all MBE systems have an electron gun allowing additional electron-beam stimulated in-situ characterizations. At WVU we are developing two such techniques: spectral analysis of cathodoluminescence (CL) in wide bandgap semiconductors, and reflection high-energy electron diffraction-total reflection angle x-ray spectroscopy (RHEED-TRAXS) for in-situ composition monitoring and control. A pressing issue remaining for epitaxial growth is real-time compositional control to a high level of accuracy. For many materials, such as multi-element nitrides and oxides with unity sticking coefficients, it would be extremely beneficial to monitor the composition to a fraction of a monolayer. This technique needs to be both element-specific and surface-sensitive. RHEED-TRAXS is a leading contender as such a technique. The electron beam from a RHEED gun impinges on the sample at a small angle of incidence approximately equal to the critical angle for x-ray reflection. This geometry ensures that the measurement is extremely surface sensitive. This technique can be used to obtain both structural information, via RHEED, and chemical information, via x-ray detection. We are currently developing a compact RHEED-TRAXS using state-of-the-art Si P-intrinsic-N (PIN) photodiode technology. We have used this system to investigate Ga and In coverage during the growth of GaN, and have observed Ga bi-layer evolution during growth, Mg destabilization of the Ga wetting layer, and significant In surface segregation. We are also investigating in-situ, real-time composition measurements in complex oxide systems such as YMnO3, with promising initial results. In-situ cathodoluminescence (CL) occurring during RHEED is a strong candidate for determining the growth temperature and alloy composition for wide bandgap semiconductors.
CL is easily detected up to and beyond typical growth temperatures for GaN and InGaN, accurately and reproducibly determining sample temperature during growth. CL measurement at room temperature can also be used as a means to check the quality of the substrate by comparing intensities of the GaN band edge energy peak and defect peaks. We have performed a detailed study of the factors influencing high temperature CL, and find that the reproducibility of CL data and the ability for fast CL scanning provide strong advantages for use in the growth of GaN films. CL could also be observed during growth using a CCD camera. This could be used to see temperature inhomogeneities, and potentially to map alloy composition fluctuations. Using tunable narrow bandpass optical filters, we can obtain a spatial/spectral map of sample CL. We will present CL images of samples at differing temperatures. This work was supported by the AFOSR MURI F49620-03-1-0330 and by ONR Grant N00014-02-1-0974.
Introduction: It is recommended that seniors presenting to the Emergency Department (ED) undergo a comprehensive geriatric screening, which is difficult for most EDs. Patient self-assessment using an electronic tablet could be an interesting solution to this issue. However, the acceptability of self-assessment by older ED patients remains unknown. Assessing acceptability is a fundamental step in evaluating new interventions. The main objective of this project is to compare the acceptability of older patient self-assessment in the ED to that of a standard assessment made by a professional, according to seniors and their caregivers. Methods: Design: This randomized crossover design cohort study took place between May and July 2018. Participants: 1) Patients aged ≥65 years presenting to the ED, 2) their caregiver, when present. Measurements: Patients performed self-assessment of their frailty, cognitive and functional status using an electronic tablet. Acceptability was measured using the Treatment Acceptability and Preferences (TAP) questionnaires. Analyses: Descriptive analyses were performed for sociodemographic variables. Scores were adjusted for confounding variables using multivariate linear regression. Thematic content analysis was performed by two independent analysts for qualitative data collected in the TAP's open-ended question. Results: A total of 67 patients were included in this study. Mean age was 75.5 ± 8.0 years, and 55.2% of participants were women. Adjusted mean TAP scores for RA evaluation and patient self-assessment were 2.36 and 2.20, respectively. We found no difference between the two types of evaluations (p = 0.0831). When patients were stratified by age group, patients aged 85 and over (n = 11) showed a difference between TAP scores: 2.27 for RA evaluation and 1.72 for patient self-assessment (p = 0.0053). Our qualitative data show that this might be attributed to the use of technology, rather than to the self-assessment itself.
Data from 9 caregivers showed a mean TAP score of 2.42 for RA evaluation and 2.44 for self-assessment. However, this relatively small sample size prevented us from performing statistical tests. Conclusion: Our results show that older patients find self-assessment in the ED using an electronic tablet just as acceptable as a standard evaluation by a professional.
X-ray diffraction techniques have been used for the structure characterization of Y-Ba-Cu-O and Tl-Ca-Ba-Cu-O thin films. A powder diffraction analysis of Y-Ba-Cu-O films showed that the films deposited at 650°C on Si are polycrystalline and have an orthorhombic structure similar to that of the YBa2Cu3O7 bulk superconductors. In addition to the conventional powder diffraction technique, both the rocking curve and the grazing incidence diffraction methods were used to characterize a YBa2Cu3O7 film on a (110) SrTiO3 substrate. Results showed that the film was epitaxially grown and aligned with its substrate in true epitaxy. Phase identification and line broadening analyses of Tl-Ca-Ba-Cu-O films showed that the films comprise one or more superconducting phases and probably contain stacking faults.
The decision rules individuals use to judge wrongdoing committed inside corporations and other hierarchical organizations are not well understood. We explore this issue by asking random samples of individuals in Moscow, Tokyo, and Washington, D.C., to respond to four short vignettes describing acts of wrongdoing by people in corporations. The vignettes are experiments that manipulate the actor's mental state, the actor's position in the organization, and whether the actor's decision was influenced by others in the organization. We examine (1) the distribution of responsibility among people in the organization, (2) how individual responsibility affects the attribution of responsibility to the organization itself, and (3) cross-national differences in attributions. We find that both what the actors did (their deeds) and the position they occupied (their roles) significantly influence the responsibility attributed to them. The responsibility attributed to the organizations themselves is a function of the responsibility attributed to the actors inside the organization, but not a function of the independent variables in the experiments. Cross-national differences emerge with respect to the responsibility assigned both to individuals and to the organizations themselves. We discuss implications of these results for past and future work.
Polyphenol oxidase (PPO) in red clover (RC) has been shown to reduce both lipolysis and proteolysis in silo and has been implicated (in vitro) in doing so in the rumen. However, all in vivo comparisons have compared RC with other forages, typically with lower levels of PPO, which introduces confounding factors when attributing the greater protection of dietary nitrogen (N) and C18 polyunsaturated fatty acids (PUFA) on RC silage to PPO. This study compared two RC silages which, when ensiled, had contrasting PPO activities (RC+ and RC−) against a control of perennial ryegrass silage (PRG) to ascertain the effect of PPO activity on dietary N digestibility and PUFA biohydrogenation. Two studies were performed: the first investigated rumen and duodenal flow using six Hereford×Friesian steers prepared with rumen and duodenal cannulae, and the second investigated whole-tract N balance using six Holstein-Friesian non-lactating dairy cows. All diets were offered at a restricted level based on animal live weight, with each experiment consisting of two 3×3 Latin squares using big bale silages ensiled in 2010 and 2011, respectively. For the first experiment, digesta flow at the duodenum was estimated using a dual-phase marker system with ytterbium acetate and chromium ethylenediaminetetraacetic acid as particulate and liquid phase markers, respectively. Total N intake was higher on the RC silages in both experiments and higher on RC− than RC+. Rumen ammonia-N reflected intake, with ammonia-N per unit of N intake lower on RC+ than RC−. Microbial N duodenal flow was comparable across all silage diets, with non-microbial N higher on RC than PRG and no difference between RC+ and RC−, even when reported on a N intake basis. C18 PUFA biohydrogenation was lower on RC silage diets than PRG, but with no difference between RC+ and RC−. The N balance trial showed a greater retention of N on RC+ over RC−; however, this response is likely related to the difference in N intake rather than any PPO-driven protection.
The lack of difference between the RC silages, despite contrasting levels of PPO, may reflect the similar level of protein-bound-phenol complexing determined in each RC silage. Previously this complexing has been associated with PPO's protection mechanism; however, this study has shown that protection is not related to total PPO activity.
This study of loneliness across the adult lifespan examined its associations with sociodemographics, mental health (positive and negative psychological states and traits), subjective cognitive complaints, and physical functioning.
Analysis of cross-sectional data
340 community-dwelling adults in San Diego, California, mean age 62 (SD = 18) years, range 27–101 years, who participated in three community-based studies.
Loneliness measures included the UCLA Loneliness Scale Version 3 (UCLA-3), the 4-item Patient-Reported Outcomes Measurement Information System (PROMIS) Social Isolation Scale, and a single-item measure from the Center for Epidemiologic Studies Depression (CESD) scale. Other measures included the San Diego Wisdom Scale (SD-WISE) and the Medical Outcomes Survey Short Form 36.
Seventy-six percent of subjects had moderate to high levels of loneliness on the UCLA-3, using standardized cut-points. Loneliness was correlated with worse mental health and inversely with positive psychological states/traits. Even moderate severity of loneliness was associated with worse mental and physical functioning. Loneliness severity and age had a complex relationship, with increased loneliness in the late 20s, mid-50s, and late 80s. There were no sex differences in loneliness prevalence, severity, or age relationships. The best-fit multiple regression model accounted for 45% of the variance in UCLA-3 scores, and three factors emerged with small to medium effect sizes: wisdom, living alone, and mental well-being.
The alarmingly high prevalence of loneliness and its association with worse health-related measures underscore major challenges for society. The non-linear age-loneliness severity relationship deserves further study. The strong negative association of wisdom with loneliness highlights the potentially critical role of wisdom as a target for psychosocial/behavioral interventions to reduce loneliness. Building a wiser society may help us develop a more connected, less lonely, and happier society.
To assess variability in antimicrobial use and associations with infection testing in pediatric ventilator-associated events (VAEs).
Descriptive retrospective cohort with nested case-control study.
Pediatric intensive care units (PICUs), cardiac intensive care units (CICUs), and neonatal intensive care units (NICUs) in 6 US hospitals.
Children ≤18 years ventilated for ≥1 calendar day.
We identified patients with pediatric ventilator-associated conditions (VACs), pediatric VACs with antimicrobial use for ≥4 days (AVACs), and possible ventilator-associated pneumonia (PVAP, defined as pediatric AVAC with a positive respiratory diagnostic test) according to previously proposed criteria.
Among 9,025 ventilated children, we identified 192 VAC cases, 43 in CICUs, 70 in PICUs, and 79 in NICUs. AVAC criteria were met in 79 VAC cases (41%) (58% CICU; 51% PICU; and 23% NICU), and varied by hospital (CICU, 20–67%; PICU, 0–70%; and NICU, 0–43%). Type and duration of AVAC antimicrobials varied by ICU type. AVAC cases in CICUs and PICUs received broad-spectrum antimicrobials more often than those in NICUs. Among AVAC cases, 39% had respiratory infection diagnostic testing performed; PVAP was identified in 15 VAC cases. Also, among AVAC cases, 73% had no associated positive respiratory or nonrespiratory diagnostic test.
Antimicrobial use is common in pediatric VAC, with variability in spectrum and duration of antimicrobials within hospitals and across ICU types, while PVAP is uncommon. Prolonged antimicrobial use despite low rates of PVAP or positive laboratory testing for infection suggests that AVAC may provide a lever for antimicrobial stewardship programs to improve utilization.
Among 39 pediatric hospitals, we observed that pediatric S. aureus hospitalizations decreased 36%, from 26.3 to 16.8 infections per 1,000 admissions, between 2009 and 2016, with methicillin-resistant S. aureus (MRSA) infections decreasing by 52% and methicillin-susceptible S. aureus infections decreasing by 17%. Similar decreases were observed in days of therapy of anti-MRSA antibiotics.
A cluster of Salmonella Paratyphi B variant L(+) tartrate(+) infections with indistinguishable pulsed-field gel electrophoresis patterns was detected in October 2015. Interviews initially identified nut butters, kale, kombucha, chia seeds and nutrition bars as common exposures. Epidemiologic, environmental and traceback investigations were conducted. Thirteen ill people infected with the outbreak strain were identified in 10 states with illness onset during 18 July–22 November 2015. Eight of 10 (80%) ill people reported eating Brand A raw sprouted nut butters. Brand A conducted a voluntary recall. Raw sprouted nut butters are a novel outbreak vehicle, though contaminated raw nuts, nut butters and sprouted seeds have all caused outbreaks previously. Firms producing raw sprouted products, including nut butters, should consider a kill step to reduce the risk of contamination. People at greater risk for foodborne illness may wish to consider avoiding raw products containing raw sprouted ingredients.
We consider a polling system with two queues, exhaustive service, no switchover times, and exponential service times with rate µ in each queue. The waiting cost depends on the position of the queue relative to the server: it costs a customer c per time unit to wait in the busy queue (where the server is) and d per time unit in the idle queue (where there is no server). Customers arrive according to a Poisson process with rate λ. We study the control problem of how arrivals should be routed to the two queues in order to minimize the expected waiting costs and characterize individually and socially optimal routeing policies under three scenarios of available information at decision epochs: no, partial, and complete information. In the complete information case, we develop a new iterative algorithm to determine individually optimal policies (which are symmetric Nash equilibria), and show that such policies can be described by a switching curve. We use Markov decision processes to compute the socially optimal policies. We observe numerically that the socially optimal policy is well approximated by a linear switching curve. We prove that the control policy described by this linear switching curve is indeed optimal for the fluid version of the two-queue polling system.
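The control model in this abstract lends itself to a small numerical illustration. The sketch below is ours, not the authors' code: it simulates the two-queue exhaustive-service polling system with Poisson arrivals and exponential service, estimates the long-run waiting cost per unit time under an arbitrary routing policy, and includes a linear switching-curve policy of the general shape the abstract describes. The policy names and parameter values are our assumptions.

```python
import random

def simulate(lmbda, mu, c, d, policy, horizon=20000.0, seed=1):
    """Estimate long-run waiting cost per unit time for a two-queue polling
    system with exhaustive service and zero switchover times.
    `policy(q, server)` returns the index of the queue an arrival joins."""
    random.seed(seed)
    q = [0, 0]       # queue lengths (including any customer in service)
    server = 0       # index of the queue currently being served
    t, cost = 0.0, 0.0
    while t < horizon:
        total = lmbda + (mu if q[server] > 0 else 0.0)
        dt = random.expovariate(total)      # time to next event
        # accrue cost: c per customer at the busy queue, d at the idle one
        cost += dt * (c * q[server] + d * q[1 - server])
        t += dt
        if random.uniform(0.0, 1.0) < lmbda / total:
            q[policy(q, server)] += 1       # arrival, routed by the policy
        else:
            q[server] -= 1                  # service completion
        if q[server] == 0 and q[1 - server] > 0:
            server = 1 - server             # exhaustive: switch only when empty
    return cost / t

def join_busy(q, server):
    # baseline policy: every arrival joins the queue where the server is
    return server

def linear_switch(q, server, a=1.0, b=0.0):
    # linear switching curve (illustrative a, b): join the idle queue
    # once the busy queue is sufficiently longer than the idle one
    return (1 - server) if q[server] > a * q[1 - server] + b else server
```

For a stable example (λ = 0.5, μ = 1.0, c = 2, d = 1), comparing `simulate(0.5, 1.0, 2.0, 1.0, join_busy)` against the switching-curve policy over several seeds gives a rough feel for the cost trade-off between waiting at the busy versus the idle queue.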
BACKGROUND: Meningiomas are the most common primary benign brain tumors in adults. Given the extended life expectancy of most patients with meningiomas, consideration of quality of life (QOL) is important when selecting the optimal management strategy. There is currently a dearth of meningioma-specific QOL tools in the literature. OBJECTIVE: In this systematic review, we analyze the prevailing themes and propose steps toward building a meningioma-specific QOL assessment tool. METHODS: A systematic search was conducted, and only original studies based on adult patients were considered. QOL tools used in the various studies were analyzed for identification of prevailing themes in the qualitative analysis. The quality of the studies was also assessed. RESULTS: Sixteen articles met all inclusion criteria. Fifteen different QOL assessment tools assessed social and physical functioning and psychological and emotional well-being. Patient perceptions and support networks had a major impact on QOL scores. Surgery negatively affected social functioning in younger patients, while radiation therapy had a variable impact. Any intervention appeared to have a greater negative impact on physical functioning compared to observation. CONCLUSION: Younger patients with meningiomas appear to be more vulnerable within the social and physical functioning domains. All of these findings must be interpreted with great caution due to great clinical heterogeneity, limited generalizability, and risk of bias. For meningioma patients, the ideal QOL questionnaire would present outcomes that can be easily measured, presented, and compared across studies. Existing scales can be the foundation upon which a comprehensive, standard, and simple meningioma-specific survey can be prospectively developed and validated.
Background: With advancements in technology, the use of video as a pedagogical method in medical education has gained in popularity and may aid in teaching clinical skills. In the UBC MD program, videos have been used to assist in teaching the neurological exam for several decades, but the currently available videos are outdated and not of contemporary quality. Methods: Drawing upon the cognitive theory of multimedia learning from Mayer and Moreno (2003), which describes methods to maximize learning by minimizing cognitive load, we developed a tool to systematically assess pedagogical videos. We inventoried twelve existing neurology videos and analyzed their use of methods such as weeding (removing extraneous information), signalling (visually highlighting important information), and chunking (grouping similar information together). Results: Generally, older videos had poor audiovisual quality that introduced extraneous load, while more current videos had higher production value, albeit inconsistent with the depth of their content. We therefore produced a new three-part neurological exam video series. We wrote storyboards, filmed with a focus on visually depicting the exam and findings, and edited to elucidate relevant physiological concepts. Conclusions: The end product has been adopted by the UBC MD program and can be shared with other programs that may wish to adopt it.
Introduction: The prevalence and incidence of delirium in older patients admitted to acute and long-term care facilities range between 9.6% and 89%, but little is known about incident delirium in the emergency department (ED). Literature regarding the incidence of delirium in the ED and its potential impacts on hospital length of stay (LOS), functional status and unplanned ED readmissions is scant, and its consequences have yet to be clearly identified in order to orient modern acute medical care. Methods: This study is part of the multicenter prospective cohort INDEED study. Three Canadian EDs completed the two-year prospective study (March-July 2015 and Feb-May 2016). Patients aged ≥65 years, initially free of delirium, with an ED stay ≥8 hours, were followed up to 24 hours after ward admission. Patients were assessed twice daily by research assistants (RA) during their entire ED stay and for up to 24 hours on the hospital ward. The primary outcome of this study was incident delirium in the ED or within 24 hours of ward admission. Functional and cognitive status were assessed using the validated Older Americans' Resources and Services and Telephone Interview for Cognitive Status-modified tools. The Confusion Assessment Method (CAM) was used to detect incident delirium. ED and hospital administrative data were collected. Inter-observer agreement was assessed among RAs. Results: Incident delirium was not different between sites, between phases, or between times from one site to another. All phases confounded, 7% to 11% of patients experienced ED-related incident delirious episodes. Differences were seen in ED LOS between sites in non-delirious patients, but also between some sites for delirious participants (p<0.05). Only one site had a difference in ED LOS between their delirious and non-delirious patients, respectively 52.1 and 40.1 hours (p<0.05). There was also a difference between sites in the time between arrival to the ED and the incidence of delirium (p=0.003).
Kappa statistics were computed to measure inter-rater reliability of the CAM. Based on an alpha of 5%, 138 patients would allow 80% power for an estimated overall incidence proportion of 15% with 5% precision. Other predictive delirium variables, such as cognitive status, environmental factors, functional status, comorbidities, physiological status, and ED and hospital length of stay, were similar between sites and phases. Conclusion: The fact that the incidence of delirium was the same for all sites, despite the differences in ED LOS and different time periods, suggests that many other modifiable and non-modifiable factors along the LOS influenced the incidence of ED-induced delirium. Emergency physicians should concentrate on providing a more senior-friendly environment in the ED.
Introduction: It is documented that physicians and nurses fail to detect delirium in more than half of cases in various clinical settings, which could have serious consequences for seniors and for our health care system. The present study aimed to describe the rate of documented incident delirium by health professionals (HP) in 5 Canadian emergency departments (ED). Methods: This study is part of the multicenter prospective cohort INDEED study. Patients aged ≥65 years, initially free of delirium, with an ED stay ≥8 hours, were followed up to 24 hours after ward admission. Delirium status was assessed twice daily using the Confusion Assessment Method (CAM) by trained research assistants (RA). Patient charts were reviewed to assess detection of delirium by HPs, who had no specific routine for detecting delirious ED patients. Inter-observer agreement was assessed among RAs. Detection by RAs and HPs was compared using univariate analyses. Results: Among the 652 included patients, 66 developed delirium as evaluated with the CAM by the RAs. Among those 66 patients, only 10 cases of delirium (15.2%) were documented in the patients' medical files by HPs; 54 (81.8%) patients with a CAM positive for delirium by the RAs were not recorded by HPs, and 2 had incomplete charts. The delirium index was significantly higher in the HP-reported group than in the HP-not-reported group, respectively 7.1 and 4.5 (p<0.05). Other predictive delirium variables, such as cognitive status, functional status, comorbidities, physiological status, and ED and hospital length of stay, were similar between groups. Conclusion: Health professionals missed 81.8% of potentially delirious ED patients in comparison to routine structured screening for delirium, although they could identify patients with a greater severity of symptoms. Our study points out the need to better identify elders at risk of developing delirium and the need for fast and reliable tools to improve screening for this disorder.
Introduction: The purpose of this study is to determine if the introduction of a pre-arrival and pre-departure Trauma Checklist as a cognitive aid, coupled with an educational session, will improve clinical performance in a simulated environment. The Trauma Checklist was developed in response to a quality assurance review of high-acuity trauma activations. It focuses on pre-arrival preparation and a pre-departure review prior to patient transfer to diagnostic imaging or the operating room. We conducted a pilot randomized controlled trial assessing the impact of the Trauma Checklist on time to critical interventions on a simulated pediatric patient by multidisciplinary teams. Methods: Emergency department teams composed of 2 physicians, 2 nurses and 2 confederate actors were enrolled in our study. In the intervention arm, participants watched a 10-minute educational video modelling the use of the Trauma Checklist prior to their simulation scenario and were provided a copy of the checklist. Teams participated in a standardized simulation scenario caring for a severely injured adolescent patient with hemorrhagic shock, respiratory failure and increased intracranial pressure. Our primary outcome of interest was time to initiation of key clinical interventions, including intubation, first blood product administration, massive transfusion protocol activation, initiation of hyperosmolar therapy and others. Secondary outcome measures included a Trauma Task Performance score and checklist completion scores. Results: We enrolled 14 multidisciplinary teams (n=56 participants) into our study. There was a statistically significant decrease in median time to initiation of hyperosmolar therapy by teams in the intervention arm compared to the control arm (581 seconds [509-680] vs. 884 seconds [588-1144], p=0.03). Differences in time to initiation of other clinical interventions were not statistically significant.
There was a trend to higher Trauma Task Performance scores in the intervention group; however, this did not reach statistical significance (p=0.09). Pre-arrival and pre-departure checklist scores were higher in the intervention group (9.0 [9.0-10.0] vs. 7.0 [6.0-8.0], p=0.17, and 12.0 [11.5-12.0] vs. 7.5 [6.0-8.5], p=0.01, respectively). Conclusion: Teams using the Trauma Checklist did not have decreased time to initiation of key clinical interventions except in initiating hyperosmolar therapy. Teams in the intervention arm had statistically significantly higher pre-departure scores, with trends to higher pre-arrival and Trauma Task Performance scores. Our study was a pilot, and recruitment did not achieve the anticipated sample size; it was thus underpowered. The impact of this checklist should be studied outside tertiary trauma centres, particularly with trainees and community emergency providers, to assess for benefit and further generalizability.
Introduction: Emergency department (ED) stay and its associated conditions (immobility, inadequate hydration and nutrition, lack of stimulation) favor the development of delirium in vulnerable elderly patients. Poorly controlled pain, and paradoxically opioid pain treatment, has also been identified as a trigger for delirium. The aim of this study was to assess the relationship between pain, opioid treatment, and delirium in elderly ED patients. Methods: A multicenter prospective cohort study was conducted in four hospitals across the province of Québec (Canada). Patients aged ≥65 years, waiting for care unit admission between February and May 2016, who were non-delirious upon ED arrival, independent or semi-independent in their activities of daily living, and had an ED stay of at least 8 hours were included. Delirium assessments were made twice a day for the entire ED stay and for the first 24 hours in the hospital ward using the Confusion Assessment Method (CAM). Pain intensity was evaluated using a visual analog scale (0-100) during the initial interview, and all opioid treatments were documented. Results: A total of 338 patients were included; 51% were female, and mean age was 77 years (SD: 8). Forty-one patients (12%) experienced delirium during their hospital stay, occurring within a mean delay of 47 hours (SD: 19) after ED admission. Among patients with pain intensity ≥60, 22% experienced delirium compared to 10.7% of patients with pain <60 (p<0.05). No significant association was found between opioid consumption and delirium (p=0.22). Logistic regression controlling for age, sex, ED stay duration, and opioid intake showed that patients with pain intensity ≥60 are 2.6 times (95% CI: 1.2-5.9) more likely to develop delirium than patients with pain <60. Conclusion: Severe pain, not opioids, is associated with the development of delirium during ED stay. Adequate pain control during the hospital stay may contribute to a decrease in delirium episodes.
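As a back-of-envelope check (our illustration, not part of the study): the unadjusted odds ratio implied by the two proportions reported in the abstract can be computed directly. It differs from the authors' 2.6 estimate, which is adjusted for age, sex, ED stay duration, and opioid intake.

```python
def odds_ratio(p_exposed, p_unexposed):
    """Unadjusted odds ratio from two event proportions."""
    odds_exposed = p_exposed / (1.0 - p_exposed)
    odds_unexposed = p_unexposed / (1.0 - p_unexposed)
    return odds_exposed / odds_unexposed

# 22% delirium with pain intensity >= 60, 10.7% with pain < 60 (from the abstract)
print(round(odds_ratio(0.22, 0.107), 2))  # → 2.35, below the adjusted estimate of 2.6
```

The gap between 2.35 and 2.6 is expected: adjustment for confounders in a logistic model generally shifts the crude estimate.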