The human brain is a complex organ comprising multiple cell types of differing function. Although histological evaluation remains the mainstay approach for evaluating tissue, comprehensive molecular characterization is now possible through advanced -omic approaches. microRNAs (miRNAs) are small (~22 nt) RNA molecules that regulate gene expression and mediate cellular differentiation in normal brain development. miRNAs also make excellent tissue markers owing to their abundance, cell-type and disease-stage specificity, and stability in solid and liquid clinical samples. To advance our knowledge of miRNA-mediated gene regulation in the human brain, we generated comprehensive miRNA expression profiles from 117 fresh normal brain samples through barcoded small RNA sequencing; tissues included neocortex, allocortex, white matter, cerebellum, olfactory bulb, optic nerve, pineal gland and spinal cord. FASTQ sequence files were annotated using state-of-the-art sequence annotation available through the Renwick lab. Following data pre-processing, high-expression analysis of the miRNA profiles showed that miR-9 was the most highly expressed miRNA in neocortex, cerebellum and olfactory bulb, whereas miR-22 was the most highly expressed in cingulate cortex, optic nerve and spinal cord; interestingly, miR-29 was the most highly expressed miRNA in hippocampus. Our analyses showed a trend towards unique miRNA signatures in different anatomical areas of the brain. Our next step is to perform miRNA fluorescence in situ hybridization on formalin-fixed paraffin-embedded tissues using a novel method developed in the Renwick lab. Accurate miRNA characterization of normal tissues will provide a firm basis for understanding miRNA changes in neurological diseases.
This presentation will enable the learner to:
1. Describe the function of miRNAs and their suitability as tissue- and cell-specific signatures
2. Describe the miRNA expression trends observed when profiling various anatomical regions of the central nervous system
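The expression profiling described above ultimately rests on tallying sequencing reads per miRNA. As a minimal illustration of that counting step (not the Renwick lab's actual annotation pipeline, which maps trimmed reads against reference miRNA sequences), the sketch below counts exact read sequences from FASTQ records; the reads shown are toy examples:

```python
from collections import Counter

def count_reads(fastq_lines):
    """Tally identical read sequences from FASTQ records (4 lines per
    record: header, sequence, separator, quality). A real small RNA-seq
    pipeline would demultiplex barcodes, trim adapters and map reads to
    a miRNA reference before counting."""
    seqs = [fastq_lines[i + 1] for i in range(0, len(fastq_lines), 4)]
    return Counter(seqs)

# Toy FASTQ content: two reads of a miR-9-like sequence, one other read.
records = [
    "@read1", "TCTTTGGTTATCTAGCTGTATGA", "+", "IIIIIIIIIIIIIIIIIIIIIII",
    "@read2", "TCTTTGGTTATCTAGCTGTATGA", "+", "IIIIIIIIIIIIIIIIIIIIIII",
    "@read3", "AAGCTGCCAGTTGAAGAACTGT", "+", "IIIIIIIIIIIIIIIIIIIIII",
]
counts = count_reads(records)
top_seq, top_n = counts.most_common(1)[0]
```

Ranking these per-sample tallies is what identifies the most highly expressed miRNA in each tissue.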
A better understanding of the interplay among symptoms, cognition and functioning in first-episode psychosis (FEP) is crucial to promoting functional recovery. Network analysis is a promising data-driven approach to elucidating complex interactions among psychopathological variables in psychosis, but it has not yet been applied in FEP.
This study employed network analysis to examine inter-relationships among a wide array of variables encompassing psychopathology, premorbid and onset characteristics, cognition, subjective quality of life and psychosocial functioning in 323 adult FEP patients in Hong Kong. The graphical Least Absolute Shrinkage and Selection Operator (LASSO) combined with extended Bayesian information criterion (EBIC) model selection was used for network construction. The importance of individual nodes in the generated network was quantified by centrality analyses.
Our results showed that amotivation played the most central role and had the strongest associations with other variables in the network, as indexed by node strength. Amotivation and diminished expression displayed differential relationships with other nodes, supporting the validity of the two-factor negative symptom structure. Psychosocial functioning was most strongly connected with amotivation and was weakly linked to several other variables. Within the cognitive domain, digit span demonstrated the highest centrality and was connected with most of the other cognitive variables. Exploratory analysis revealed no significant gender differences in network structure or global strength.
Our results suggest a pivotal role for amotivation in the psychopathology network of FEP and indicate its critical association with psychosocial functioning. Further research is required to verify the clinical significance of diminished motivation for functional outcome in the early course of psychotic illness.
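Node strength, the centrality index used in this analysis, is simply the sum of the absolute edge weights attached to a node in the estimated network. A minimal sketch with a hypothetical four-node weight matrix (node names and weights are illustrative, not the study's fitted network):

```python
import numpy as np

# Hypothetical symmetric weighted adjacency matrix from a regularized
# partial-correlation network (zero diagonal). Names are illustrative.
nodes = ["amotivation", "dim_expression", "functioning", "digit_span"]
W = np.array([
    [0.0, 0.3, 0.5, 0.2],
    [0.3, 0.0, 0.1, 0.0],
    [0.5, 0.1, 0.0, 0.1],
    [0.2, 0.0, 0.1, 0.0],
])

# Node strength: sum of absolute edge weights incident to each node.
strength = np.abs(W).sum(axis=1)
most_central = nodes[int(np.argmax(strength))]
```

Ranking nodes by this quantity is what identifies the most central variable in the network.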
Major depressive disorder (MDD) represents a leading cause of disability. This study examines the course of disability in patients with chronic, recurrent and remitting MDD compared to healthy controls and identifies predictors of disability in remitting MDD.
We included 914 participants from the Netherlands Study of Depression and Anxiety (NESDA). DSM-IV MDD and WHO DAS II disability were assessed at baseline and at 2, 4 and 6 years. Six-year total and domain-specific disability were analysed and compared in participants with chronic (n = 57), recurrent (n = 120), remitting (n = 127) MDD and in healthy controls (n = 430). Predictors of residual disability were identified using linear regression analysis.
At baseline, most disability was found in chronic MDD, followed by recurrent MDD, remitting MDD and healthy controls. Across diagnostic groups, most disability was found in household activities, interpersonal functioning, participation in society and cognition. A chronic course was associated with chronic disability. Symptom remission was associated with a decrease in disability, but some disability remained. In remitting MDD, higher residual disability was predicted by older age, more severe avoidance symptoms, higher disability at baseline and late symptom remission. Severity of residual disability correlated with the severity of residual depressive symptoms.
Symptomatic remission is a prerequisite for improvements in disability. However, disability persists despite symptom remission. Therefore, treatment of MDD should include an explicit focus on disability, especially on the more complex domains. To this end, treatments should promote behavioural activation and address subthreshold depressive symptoms in patients with remitted MDD.
The objectives of this paper are to: (1) identify contextual factors, such as policy, that impacted the implementation of community-based primary health care (CBPHC) innovations among 12 Canadian research teams and (2) describe strategies used by the teams to address contextual factors influencing implementation of CBPHC innovations. In primary care settings, consideration of contextual factors when implementing change has been recognized as critically important to success. However, contextual factors are rarely recorded, analyzed or considered when implementing change. The lack of consideration of contextual factors has negative implications not only for successfully implementing primary health care (PHC) innovations, but also for their sustainability and scalability. For this evaluation, data collection was conducted using self-administered questionnaires and follow-up telephone interviews with team representatives. We used a combination of directed and conventional content analysis approaches to analyze the questionnaire and interview data. Representatives from all 12 teams completed the questionnaire and 11 teams participated in the interviews; 40 individuals participated in this evaluation. Four themes representing contextual factors that impacted the implementation of CBPHC innovations were identified: (1) diversity of jurisdictions; (2) complexity of interactions and collaborations; (3) policy; and (4) the multifaceted nature of PHC.
The teams used six strategies to address these contextual factors: (1) conducting an environmental scan at the outset; (2) maintaining engagement among partners and stakeholders by encouraging open and inclusive communication; (3) contextualizing the innovation for different settings; (4) anticipating and addressing changes, delays, and the need for additional resources; (5) fostering a culture of research and innovation among partners and stakeholders; and (6) ensuring that information about the innovation is widely available. Implementing CBPHC innovations across jurisdictions is complex and involves navigating multiple contextual factors. The dynamic nature of context should be taken into account when implementing innovations.
We argue that the ways in which we as humans derive well-being from nature – for example by harvesting firewood, selling fish or enjoying natural beauty – feed back into how we behave towards the environment. This feedback is mediated by institutions (rules, regulations) and by individual capacities to act. Understanding these relationships can guide better interventions for sustainably improving well-being and alleviating poverty. However, more attention needs to be paid to how experience-related benefits from nature influence attitudes and actions towards the environment, and how these relationships can be reflected in more environmentally sustainable development projects.
The Comprehensive Framework for Disaster Evaluation Typologies, developed in 2017 (CFDET 2017), aims to unify and facilitate agreement regarding the identification, structure, and relationships between various evaluation typologies found in the disaster setting. A peer-reviewed validation process sought input from international experts in the fields of disaster medicine, disaster/emergency management, humanitarian/development, and evaluation. This paper discusses the validation process, its results, and outcomes.
Previous frameworks, identified in the literature, lack validation and consistent terminology. To gain credibility and utility, this unique framework needed to be validated by international experts in the disaster setting.
A mixed methods approach was designed to validate the framework. An initial iterative process informed an online survey which used a combination of a five-point Likert scale and open-ended questions. Pre-determined consensus thresholds, informed by a targeted literature review, provided the validation criteria.
A sample of 33 experts from 11 countries responded to the validation process. Quantitative measures largely supported the elements and relationships of the framework, and strongly supported its value and usefulness for supporting, promoting, and undertaking evaluations, as well as its usefulness for teaching evaluation in the disaster setting. Qualitative input suggested opportunities to strengthen and enhance the framework. Responses that would help to better understand the barriers to and enablers of undertaking disaster evaluations were limited. A potential for self-selection bias among respondents may be a limitation of this study. The attainment of high consensus thresholds, however, provides confidence in the validity of the results.
For the first time, a framework of this nature has undergone a rigorous validation process by experts in three related disciplines at an international level. The modified framework, CFDET 2018, provides a unifying framework within which existing evaluation typologies can be structured. It gives evaluators confidence to choose an appropriate strategy for their particular evaluation in the disaster setting and facilitates consistency in reporting across the different phases of a disaster to better understand the process, outcomes, and impacts of the efficacy and efficiency of interventions. Future research could create a series of toolkits to support improved disaster evaluation processes and to evaluate the utility of the framework in the real-world setting.
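The pre-determined consensus thresholds used in this validation can be operationalized as a simple proportion-of-agreement check on each five-point Likert item. A sketch under an assumed 75% threshold (the study's actual criteria came from its targeted literature review):

```python
# An element is taken as validated when the proportion of respondents
# rating it agree (4) or strongly agree (5) on a five-point Likert scale
# meets the threshold. The 75% figure is an assumed example, not the
# study's actual criterion.
THRESHOLD = 0.75

def consensus_reached(ratings, threshold=THRESHOLD):
    agree = sum(1 for r in ratings if r >= 4)
    return agree / len(ratings) >= threshold

item_a = [5, 4, 4, 5, 3, 4, 5, 4]   # 7 of 8 agree (87.5%)
item_b = [3, 3, 4, 2, 5, 3]         # 2 of 6 agree (33.3%)
ok_a = consensus_reached(item_a)
ok_b = consensus_reached(item_b)
```

Elements that fail the check would be flagged for modification, as happened in the revision from CFDET 2017 to CFDET 2018.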
Although researchers have described numerous risk factors for salmonellosis and for infection with specific common serotypes, the drivers of Salmonella serotype diversity among human populations remain poorly understood. In this retrospective observational study, we partition records of serotyped non-typhoidal Salmonella isolates from human clinical specimens reported to CDC national surveillance by demographic, geographic and seasonal characteristics and adapt sample-based rarefaction methods from the field of community ecology to study how Salmonella serotype diversity varied within and among these populations in the USA during 1996–2016. We observed substantially higher serotype richness in children <2 years old than in older children and adults and steadily increasing richness with age among older adults. Whereas seasonal and regional variation in serotype diversity was highest among infants and young children, variation by specimen source was highest in adults. Our findings suggest that the risk for infection from uncommon serotypes is associated with host and environmental factors, particularly among infants, young children and older adults. These populations may have a higher proportion of illness acquired through environmental transmission pathways than published source attribution models estimate.
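Sample-based rarefaction, as adapted in this study, estimates the serotype richness expected at a common sampling depth so that populations with different isolate counts can be compared fairly. A minimal Monte Carlo sketch (a production analysis would use the analytical hypergeometric formula; the serotype counts below are illustrative, not surveillance data):

```python
import random

def rarefied_richness(sample_counts, m, n_iter=200, seed=0):
    """Expected number of unique serotypes among m isolates drawn
    without replacement, averaged over random subsamples."""
    rng = random.Random(seed)
    # Expand {serotype: count} into a flat list of isolates.
    isolates = [s for s, c in sample_counts.items() for _ in range(c)]
    total = 0
    for _ in range(n_iter):
        total += len(set(rng.sample(isolates, m)))
    return total / n_iter

# Hypothetical serotype tallies for one demographic stratum.
counts = {"Enteritidis": 40, "Typhimurium": 30, "Newport": 5, "Javiana": 2}
r10 = rarefied_richness(counts, 10)   # expected richness at depth 10
```

Comparing such rarefied richness values across age groups, regions and seasons is what reveals the diversity patterns reported above.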
Introduction: When a patient is incapable of making medical decisions for themselves, choices are made according to the patient's previously expressed wishes, values, and beliefs by a substitute decision maker (SDM). While interventions to engage patients in their own advance care planning exist, little is known about public readiness to act as an SDM on behalf of a loved one. This mixed-methods survey aimed to describe attitudes, enablers and barriers related to preparedness to act as an SDM, and support for a population-level curriculum on the role of an SDM in end-of-life and resuscitative care. Methods: From November 2017 to June 2018, a mixed-methods street intercept survey was conducted in Ottawa, Canada. Descriptive statistics and logistic regression analysis were used to assess predictors of preparedness to be an SDM and to understand support for a high school curriculum. Responses to open-ended questions were analyzed using inductive thematic analysis. Results: The 430 respondents were mostly female (56.5%), with a mean age of 33.9 years. Although 73.0% of respondents felt prepared to be an SDM, 41.0% of those who reported preparedness had never had a meaningful conversation with loved ones about their wishes in critical illness. The only predictors of SDM preparedness were the belief that one would be a future SDM (OR 2.36, 95% CI 1.34-4.17), and age 50-64 compared to age 16-17 (OR 7.46, 95% CI 1.25-44.51). Thematic enablers of preparedness included an understanding of the patient's wishes, the role of the SDM and strong familial relationships. Barriers included cultural norms, family conflict, and the need for time in high-stakes decisions. Most respondents (71.9%) believed that 16-year-olds should learn about SDMs. They noted age appropriateness, potential developmental and societal benefit, and improved decision making, while cautioning the need for a nuanced approach respectful of different maturity levels, cultures and individual experiences.
Conclusion: This study reveals a concerning gap between perceived preparedness and actions taken in preparation to be an SDM for loved ones suffering critical illness. The results also highlight the potential role for high school education to address this gap. Future studies should further explore the themes identified to inform development of resources and curricula for improved health literacy in resuscitation and end-of-life care.
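The odds ratios with confidence intervals reported in such logistic regression analyses come directly from the fitted coefficients: exponentiate the coefficient and the endpoints of its Wald interval. A sketch with an illustrative coefficient and standard error (not the study's fitted values):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% Wald confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative coefficient for a binary predictor (assumed values).
or_, lo, hi = odds_ratio_ci(beta=0.86, se=0.29)
```

A confidence interval that excludes 1 (as here) is what marks a predictor as statistically significant.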
Introduction: Oxygen is commonly administered to prehospital patients presenting with acute myocardial infarction (AMI). We conducted a systematic review to determine if oxygen administration in AMI impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central, clinicaltrials.gov and ISRCTN for relevant randomized controlled trials and observational studies comparing oxygen administration with no oxygen administration. The outcomes of interest were: mortality (≤30 days, in-hospital, and intermediate 2-11 months), infarct size, and major adverse cardiac events (MACE). Risk-of-bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. A meta-analysis was performed using RevMan 5 software. Results: Our search yielded 1192 citations, of which 48 studies were reviewed as full texts and 8 studies were included in the analysis. All evidence was considered low or very low quality. Five studies reported on mortality, providing low quality evidence of no benefit or harm from supplemental oxygen administration. Similarly, no benefit or harm was found for MACE or infarct size (very low quality). Normoxia was defined as oxygen saturation measured via pulse oximetry of ≥90% in one recent study and ≥94% in another. Conclusion: We found low and very low quality evidence that the administration of supplemental oxygen to normoxic patients experiencing AMI provides no clear harm or benefit for mortality or MACE. The evidence on infarct size was inconsistent and warrants further prospective examination.
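The pooled effect estimates behind a review like this are typically fixed-effect inverse-variance averages of per-study log risk ratios (RevMan also offers Mantel-Haenszel weighting and random-effects models). A minimal sketch with hypothetical trial counts, not the review's actual data:

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effect inverse-variance pooling of log risk ratios.

    Each study is (events_tx, n_tx, events_ctrl, n_ctrl). A full
    meta-analysis would also assess heterogeneity; this is a sketch.
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2   # variance of log RR
        w = 1 / var                              # inverse-variance weight
        num += w * log_rr
        den += w
    log_pooled = num / den
    se = math.sqrt(1 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se),
            math.exp(log_pooled + 1.96 * se))

# Hypothetical trials (oxygen vs. no oxygen): events and group sizes.
rr, lo, hi = pooled_risk_ratio([(10, 200, 9, 198), (25, 500, 26, 495)])
```

A pooled confidence interval straddling 1, as in this toy example, corresponds to the "no benefit or harm" finding reported above.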
Introduction: Opioids are routinely administered for analgesia to prehospital patients experiencing chest discomfort from acute myocardial infarction (AMI). We conducted a systematic review to determine if opioid administration impacts patient outcomes. Methods: We conducted a systematic search using MeSH terms and keywords in Medline, Embase, the Cochrane Database of Systematic Reviews, Cochrane Central and clinicaltrials.gov for relevant randomized controlled trials and observational studies comparing opioid administration in AMI patients from 1990 to 2017. The outcomes of interest were: all-cause short-term mortality (≤30 days), major adverse cardiac events (MACE), platelet activity and aggregation, immediate adverse events, infarct size, and analgesia. Included studies were hand searched for additional citations. Risk-of-bias assessments were performed and GRADE methodology was employed to assess quality and overall confidence in the effect estimate. Results: Our search yielded 3001 citations, of which 19 studies were reviewed as full texts and 9 studies were included in the analysis. The studies predominantly reported on morphine as the opioid. Five studies reported on mortality (≤30 days), seven on MACE, four on platelet activity and aggregation, two on immediate adverse events, two on infarct size and none on analgesic effect. We found low quality evidence suggesting no benefit or harm in terms of mortality or MACE. However, low quality evidence indicates that opioids increase infarct size. Low quality evidence also shows reduced serum levels of active P2Y12 inhibitor metabolites (e.g., clopidogrel and ticagrelor) and increased platelet reactivity in the first several hours post-administration, alongside an increase in vomiting.
Conclusion: We found low and very low quality evidence that the administration of opioids in STEMI may be adversely associated with vomiting and with surrogate outcomes including increased infarct size, reduced serum P2Y12 active metabolite levels, and increased platelet activity. We found no clear benefit or harm for patient-oriented clinical outcomes, including mortality.
Introduction: The Canadian Association of Emergency Physicians (CAEP) Atrial Fibrillation (AF) Guidelines prioritize early cardioversion and discharge home in the management of rapid AF; however, not all patients can be safely cardioverted in the emergency department (ED). Given limited ED-based evidence on rate control, we sought to better understand the burden of disease in AF patients not managed by rhythm control and to identify opportunities for improved care. Methods: We conducted a health records review of consecutive AF patient visits at two Canadian academic hospital EDs over a 12-month period. We included all patients ≥18 years with AF on electrocardiogram, a heart rate ≥100 beats per minute (bpm), and who did not receive cardioversion. Outcomes included: (1) incidence of patients managed by rate control; (2) specific rate control management practices, including choice of agent, route of administration, dosing, and timing; (3) adverse events; (4) compliance with the CAEP AF Guidelines; and (5) disposition and outcomes. Results: Of 972 rapid AF patient visits, 307 were excluded and 665 were included (mean age 77.2 years; 51.6% female). Of those included, 43.0% were given rate control medications, the most common being metoprolol (72.0%). Admission to hospital occurred in 61.4% of visits, and 77.9% of AF cases were secondary to another medical condition. Of those given rate control medications, 9.1% suffered adverse events and only 55.6% had a final ED heart rate ≤100 bpm. Inappropriate use of rate control medications was found in 44.8% of cases, specifically inappropriate choice of agent (4.5%), inappropriate route of administration (26.9%), overdosing (2.4%), underdosing (5.2%), and inadequate timing (5.6%).
Conclusion: We demonstrated that for rapid AF patients not receiving cardioversion, most cases were secondary to a medical cause and of those receiving rate control, there were a concerning number of adverse events related to inappropriate choice of agent, route of administration, dosage, and timing. Moving forward, better awareness of the CAEP AF Guidelines by ED physicians will ensure safer use of rate control agents for rapid AF patients.
Innovation Concept: Emergency medicine (EM) programs have restructured their training using a Competence by Design (CBD) model. This model emphasizes entrustable professional activities (EPAs) that residents must fulfill before advancing in their training. The first EPA (EPA 1) for the transition to discipline (TTD) stage involves managing the unstable patient. Data from the University of Toronto (U of T) program suggest that residents lack sufficient exposure to these patient presentations during TTD, creating a disconnect between anticipated clinical exposure and the expectation that residents achieve competence in EPA 1. Methods: To close this gap, U of T EM faculty specifically targeted EPA 1 while designing the TTD curriculum, using Kern's six-step approach to curriculum development in medical education: problem identification, needs assessment, goals and objectives, educational strategies, implementation and evaluation. To maximize the feasibility of the new curriculum, existing sessions were mapped against EPAs and required training activities to identify overlap where possible. Residents were scheduled on EM rotations with weekly academic days that included this novel curriculum. Curriculum, Tool or Material: Didactic lectures, procedural workshops and simulation were closely integrated in TTD to address EPA 1. Lectures introduced approaches to cardinal presentations. An interactive workshop introduced ACLS and PALS algorithms and defibrillator use. Three simulation sessions focused on ACLS, shock, airway, trauma and the altered patient. A final simulation session allowed spaced repetition and integration of these topics. After the completion of TTD, residents participated in a six-scenario simulation OSCE directly assessing EPA 1. Conclusion: The curriculum was evaluated using a multifaceted approach including surveys, self-assessments, faculty feedback and OSCE performance.
Overall, the curriculum achieved its goal of addressing EPA 1. It was well received by faculty and residents. Residents rated the sessions highly, and self-reported improved confidence in assessing unstable patients and adhering to ACLS algorithms. The simulation OSCE demonstrated the expected resident competency in EPA 1. One limitation identified was the lack of a pediatric simulation session, which has since been incorporated into the curriculum. Moving forward, this innovative curriculum will undergo continuous cycles of evaluation and improvement, with the goal of applying a similar design to other stages of CBD.
Determining infectious cross-transmission events in healthcare settings involves manual surveillance of case clusters by infection control personnel, followed by strain typing of clinical/environmental isolates suspected in said clusters. Recent advances in genomic sequencing and cloud computing now allow for the rapid molecular typing of infecting isolates.
To facilitate rapid recognition of transmission clusters, we aimed to assess infection control surveillance using whole-genome sequencing (WGS) of microbial pathogens to identify cross-transmission events for epidemiologic review.
Clinical isolates of Staphylococcus aureus, Enterococcus faecium, Pseudomonas aeruginosa, and Klebsiella pneumoniae were obtained prospectively at an academic medical center, from September 1, 2016, to September 30, 2017. Isolate genomes were sequenced, followed by single-nucleotide variant analysis; a cloud-computing platform was used for whole-genome sequence analysis and cluster identification.
Most strains of the 4 studied pathogens were unrelated, and 34 potential transmission clusters were present. The characteristics of the potential clusters were complex and likely not identifiable by traditional surveillance alone. Notably, only 1 cluster had been suspected by routine manual surveillance.
Our work supports the assertion that integration of genomic and clinical epidemiologic data can augment infection control surveillance, both for the identification of cross-transmission events and for the inclusion of missed and exclusion of misidentified outbreaks (ie, false alarms). The integration of clinical data is essential to prioritize suspect clusters for investigation, and for existing infections, a timely review of both the clinical and WGS results holds promise to reduce healthcare-associated infections (HAIs). A richer understanding of cross-transmission events within healthcare settings will require the expansion of current surveillance approaches.
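The cluster identification step described above reduces, at its core, to computing pairwise SNV distances between isolate genomes and grouping isolates that fall under a distance threshold. A toy sketch (real pipelines align reads to a reference genome and apply species-specific SNV cutoffs; the sequences and threshold here are illustrative):

```python
def snv_distance(a, b):
    """Number of single-nucleotide differences between two aligned
    core genomes (here, short toy strings of equal length)."""
    return sum(x != y for x, y in zip(a, b))

def cluster(isolates, threshold):
    """Single-linkage grouping: an isolate joins the first existing
    cluster containing any member within `threshold` SNVs, otherwise
    it starts a new putative transmission cluster."""
    clusters = []
    for name, seq in isolates.items():
        for c in clusters:
            if any(snv_distance(seq, isolates[m]) <= threshold for m in c):
                c.append(name)
                break
        else:
            clusters.append([name])
    return clusters

genomes = {
    "isoA": "ACGTACGTAC",
    "isoB": "ACGTACGTAT",   # 1 SNV from isoA: likely related
    "isoC": "TTTTACGGAC",   # many SNVs from both: unrelated
}
groups = cluster(genomes, threshold=2)
```

Clusters produced this way are then cross-referenced with clinical epidemiologic data (ward, dates, shared procedures) before being flagged for infection control review.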
Upper respiratory tract infections (URTIs) account for substantial attendances at emergency departments (EDs). There is a need to elucidate the determinants of antibiotic prescribing in time-strapped EDs, which remain popular choices for primary care despite highly accessible primary care clinics. Semi-structured in-depth interviews were conducted with purposively sampled physicians (n = 9) in an adult ED in Singapore. All interviews were analysed using thematic analysis and further interpreted using the Social Ecological Model to explain prescribing determinants. Themes included: (1) reliance on clinical knowledge and judgement, (2) patient-related factors, (3) patient–physician relationship factors, (4) perceived practice norms, (5) policies and treatment guidelines and (6) patient education and awareness. The physicians relied strongly on their clinical knowledge and judgement in managing URTI cases and seldom interfered with their peers' clinical decisions. Despite departmental norms of not prescribing antibiotics for URTIs, physicians would prescribe antibiotics when faced with uncertainty in patients' diagnoses, when treating immunocompromised or older patients with comorbidities, and for patients demanding antibiotics, especially under time constraints. Participants preferred antibiotic prescribing guidelines based on local epidemiology, but viewed hospital policies on prescribing as a hindrance to clinical judgement. They highlighted the need for more public education and awareness on the appropriate use of antibiotics and the management of URTIs. Organisational practice norms strongly influenced physicians' antibiotic prescribing decisions, which could be swayed by time pressures and patient demands. Clinical decision support tools, hospital guidelines and patient education targeted at the individual, interpersonal and community levels could reduce unnecessary antibiotic use.
Thermal analysts have exploited the sensitivity of carbonate mineral decomposition to furnace atmosphere as a diagnostic tool for identifying and quantifying these minerals in mixtures and solid solutions (1-3). However, thermal analysis techniques alone cannot reveal information about the reaction products after each thermal event. In-situ high-temperature x-ray diffraction (XRD) is one technique that can identify these products. Using this technique, Kissinger et al. (4) identified the reaction products of the thermal decomposition of reagent-grade FeCO3 (siderite) and MgCO3 (magnesite). However, the thermal behavior of analytical reagent-grade carbonates differs from that of natural minerals (1). Milodowski and Morgan (5) used in-situ XRD to investigate the thermal behavior of the dolomite-ankerite series.
The synchrotron x-ray source provides a unique opportunity to observe many “in-situ” processes. The formation of the short-lived intermediate species Ta2C during the combustion synthesis of TaC has been observed and reported by monitoring the Bragg diffraction peaks of the reactants and products. Similarly, the synthesis of the ferroelectric material BaTiO3 and its subsequent phase transformation from cubic to tetragonal have also been investigated. These experiments would not have been possible without the high incident x-ray flux available at a synchrotron source.
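Identifying a phase from its Bragg peaks starts by converting peak positions to lattice d-spacings via Bragg's law, n·λ = 2d·sin θ. A small sketch (the default Cu Kα wavelength is a common laboratory value used for illustration; a synchrotron beamline would substitute its own calibrated wavelength):

```python
import math

def d_spacing(two_theta_deg, wavelength=1.5406, order=1):
    """Bragg's law, n*lambda = 2*d*sin(theta): recover the lattice
    d-spacing (in the wavelength's units, here angstroms) from a
    diffraction peak position given in 2-theta degrees."""
    theta = math.radians(two_theta_deg / 2)
    return order * wavelength / (2 * math.sin(theta))

# A peak at 2-theta = 30 degrees corresponds to d of about 2.98 angstroms
# with Cu K-alpha radiation.
d = d_spacing(30.0)
```

Matching the resulting set of d-spacings against reference patterns is how transient phases such as Ta2C are recognized as peaks appear and vanish during a reaction.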
OBJECTIVES/SPECIFIC AIMS: Approximately 86 million people in the US have prediabetes, but only a fraction of them receive proven effective therapies to prevent diabetes. Further, the effectiveness of these therapies varies with individual risk of progression to diabetes. We estimated the value of targeting individuals at highest diabetes risk for treatment, compared to treating all individuals meeting inclusion criteria for the Diabetes Prevention Program (DPP). METHODS/STUDY POPULATION: Using a micro-simulation model, we estimated total lifetime costs and quality-adjusted life expectancy (QALE) for individuals receiving: (1) a lifestyle intervention involving an intensive program focused on healthy diet and exercise, (2) metformin administration, or (3) no intervention. The model combines several components. First, a Cox proportional hazards model predicted onset of diabetes from baseline characteristics for each pre-diabetic individual and yielded a probability distribution for each alternative. We derived this risk model from the Diabetes Prevention Program (DPP) clinical trial data and the follow-up study DPP-OS. The Michigan Diabetes Research Center Model for Diabetes then estimated costs and outcomes for individuals after diabetes diagnosis using standard-of-care diabetes treatment. Based on individual costs and QALE, we evaluated the net monetary benefit (NMB) of the two interventions at population and individual levels, stratified by risk quintiles for diabetes onset at 3 years. RESULTS/ANTICIPATED RESULTS: Compared to usual care, lifestyle modification conferred positive benefits for all eligible individuals. Metformin's NMB was negative for the lowest population risk quintile. By avoiding use among individuals who would not benefit, targeted administration of metformin conferred a benefit of $500-$800 per person, depending on duration of treatment effect.
When treating only 20% of the population (e.g., due to capacity constraints), targeting conferred an NMB of $14,000-$18,000 per person for lifestyle modification and $16,000-$20,000 for metformin. DISCUSSION/SIGNIFICANCE OF IMPACT: Metformin confers value only among higher-risk individuals, so targeting its use is worthwhile. While lifestyle modification confers value for all eligible individuals, prioritizing the intervention to high-risk patients when capacity is constrained substantially increases societal benefits.
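The micro-simulation's first component, predicting diabetes onset from a Cox model, can be sketched by scaling a baseline hazard with the model's linear predictor and drawing individual event times. All parameter values below are assumptions for illustration, not fitted DPP estimates:

```python
import math
import random

def simulate_onset_times(x_beta, n=10000, base_rate=0.05, seed=1):
    """Draw diabetes onset times (years) from an exponential baseline
    hazard scaled by a Cox linear predictor: h = base_rate * exp(x_beta).
    base_rate and x_beta are illustrative assumptions."""
    rng = random.Random(seed)
    rate = base_rate * math.exp(x_beta)
    return [rng.expovariate(rate) for _ in range(n)]

def risk_at(times, horizon=3.0):
    """Proportion of simulated individuals with onset before the horizon."""
    return sum(t <= horizon for t in times) / len(times)

low = risk_at(simulate_onset_times(x_beta=0.0))    # baseline risk profile
high = risk_at(simulate_onset_times(x_beta=1.0))   # higher-risk profile
```

Stratifying simulated individuals by such 3-year risk estimates is what allows net monetary benefit to be compared across risk quintiles.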