Clinical trial processes are unnecessarily inefficient and costly, slowing the translation of medical discoveries into treatments for people living with disease. To reduce redundancies and inefficiencies, a group of clinical trial experts developed a framework for clinical trial site readiness based on existing trial site qualifications from sponsors. The site readiness practices are encompassed within six domains: research team, infrastructure, study management, data collection and management, quality oversight, and ethics and safety. Implementation of this framework for clinical trial sites would reduce inefficiencies in trial conduct and help prepare new sites to enter the clinical trials enterprise, with the potential to improve the reach of clinical trials to underserved communities. Moreover, the framework holds benefits for trial sponsors, contract research organizations, trade associations, trial participants, and the public. For novice sites considering future trials, we provide a framework for site preparation and the engagement of stakeholders. For experienced sites, the framework can be used to assess current practices and inform and engage sponsors, staff, and participants. Details in the supplementary materials provide easy access to key regulatory documents and resources. Invited perspective articles provide greater depth from systems, DEIA (diversity, equity, inclusion, and accessibility), and decentralized trials perspectives.
Resident education in emergency medicine (EM) relies upon a variety of teaching platforms and media, including real-life clinical scenarios, simulation, academic day (lectures, small group sessions), journal clubs, and teaching learners. However, the coronavirus disease 2019 (COVID-19) pandemic has disrupted teaching and learning, forcing programs to adapt to ensure residents can progress in their training.1 Suddenly, academic days cannot be held in person, emergency department (ED) volumes are dynamically changing, and the role of residents in ED procedures has been questioned. Furthermore, medical student rotations through the ED have been cancelled, decreasing resident exposure to undergraduate teaching. These changes to resident education threaten resident wellness and will have downstream effects on training and on personal and professional development. In response, programs must develop strategies to ensure that residents continue receiving high-quality training in a safe learning environment. In this review, we will cover recommended strategies put forth by two large EM programs in Ontario (Table 1).
Emergency medicine (EM) training programs incorporate simulation for teaching as well as formative and summative assessment. The development of a simulation curriculum for Canadian postgraduate EM programs is underway and would be facilitated by a standardized, user-friendly, nationally endorsed simulation template. We convened a nationally representative group of simulation educators to participate in a three-phase process to develop and refine a simulation case template for Canadian EM educators. Participants provided feedback via free-text comments and focus groups, which were analyzed to inform modification of the template. We anticipate that this template will facilitate the sharing of cases across sites and the development of standardized cases for simulation-based assessment.
The national implementation of competency-based medical education (CBME) has prompted an increased interest in identifying and tracking clinical and educational outcomes for emergency medicine training programs. For the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium, we developed recommendations for measuring outcomes in emergency medicine training in the context of CBME to assist educational leaders and systems designers in program evaluation.
We conducted a three-phase study to generate educational and clinical outcomes for emergency medicine (EM) education in Canada. First, we elicited expert and community perspectives on the best educational and clinical outcomes through a structured consultation process using a targeted online survey. We then qualitatively analyzed these responses to generate a list of suggested outcomes. Last, we presented these outcomes to a diverse assembly of educators, trainees, and clinicians at the CAEP Academic Symposium for feedback and endorsement through a voting process.
Academic Symposium attendees endorsed the measurement and linkage of CBME educational and clinical outcomes. Twenty-five outcomes (15 educational, 10 clinical) were derived from the qualitative analysis of the survey results, and the most important short- and long-term outcomes (both educational and clinical) were identified. These outcomes can be used to help measure the impact of CBME on the practice of emergency medicine in Canada to ensure that it meets both trainee and patient needs.
To address the increasing demand for the use of simulation for assessment, our objective was to review the literature pertaining to simulation-based assessment and develop a set of consensus-based expert-informed recommendations on the use of simulation-based assessment as presented at the 2019 Canadian Association of Emergency Physicians (CAEP) Academic Symposium on Education.
A panel of Emergency Medicine (EM) physicians from across Canada, with leadership roles in simulation and/or assessment, was formed to develop the recommendations. An initial scoping literature review was conducted to extract principles of simulation-based assessment. These principles were refined via thematic analysis, and then used to derive a set of recommendations for the use of simulation-based assessment, organized by the Consensus Framework for Good Assessment. This was reviewed and revised via a national stakeholder survey, and then the recommendations were presented and revised at the consensus conference to generate a final set of recommendations on the use of simulation-based assessment in EM.
We developed a set of recommendations for simulation-based assessment, using consensus-based expert-informed methods, across the domains of validity, reproducibility, feasibility, educational and catalytic effects, acceptability, and programmatic assessment. While the precise role of simulation-based assessment will be a subject of continued debate, we propose that these recommendations be used to assist educators and program leaders as they incorporate simulation-based assessment into their programs of assessment.
Simulation plays an integral role in the Canadian healthcare system with applications in quality improvement, systems development, and medical education. High-quality, simulation-based research will ensure its effective use. This study sought to summarize simulation-based research activity and its facilitators and barriers, as well as establish priorities for simulation-based research in Canadian emergency medicine (EM).
Simulation-leads from Canadian departments or divisions of EM associated with a general FRCP-EM training program surveyed and documented active EM simulation-based research at their institutions and identified the perceived facilitators and barriers. Priorities for simulation-based research were generated by simulation-leads via a second survey; these were grouped into themes and finally endorsed by consensus during an in-person meeting of simulation leads. Priority themes were also reviewed by senior simulation educators.
Twenty simulation-leads representing all 14 invited institutions participated in the study between February and May 2018. Sixty-two active simulation-based research projects were identified (median per institution = 4.5, IQR 4), as well as six common facilitators and five barriers. Forty-nine priorities for simulation-based research were reported and summarized into eight themes: simulation in competency-based medical education, simulation for inter-professional learning, simulation for summative assessment, simulation for continuing professional development, national curricular development, best practices in simulation-based education, simulation-based education outcomes, and simulation as an investigative methodology.
This study summarized simulation-based research activity in EM in Canada, identified its perceived facilitators and barriers, and built national consensus on priority research themes. This represents the first step in the development of a simulation-based research agenda specific to Canadian EM.
There is increasing evidence to support integration of simulation into medical training; however, no national emergency medicine (EM) simulation curriculum exists. Using Delphi methodology, we aimed to identify and establish content validity for adult EM curricular content best suited for simulation-based training, to inform national postgraduate EM training.
A national panel of experts in EM simulation iteratively rated potential curricular topics, on a 4-point scale, to determine those best suited for simulation-based training. After each round, responses were analyzed. Topics scoring <2/4 were removed and remaining topics were resent to the panel for further ratings until consensus was achieved, defined as Cronbach α ≥ 0.95. At the conclusion of the Delphi process, topics rated ≥ 3.5/4 were considered “core” curricular topics, while those rated 3.0–3.5 were considered “extended” curricular topics.
Forty-five experts from 13 Canadian centres participated. Two hundred eighty potential curricular topics, in 29 domains, were generated from a systematic literature review, relevant educational documents and Delphi panellists. Three rounds of surveys were completed before consensus was achieved, with response rates ranging from 93-100%. Twenty-eight topics, in eight domains, reached consensus as “core” curricular topics. Thirty-five additional topics, in 14 domains, reached consensus as “extended” curricular topics.
Delphi methodology allowed for achievement of expert consensus and content validation of EM curricular content best suited for simulation-based training. These results provide a foundation for improved integration of simulation into postgraduate EM training and can be used to inform a national simulation curriculum to supplement clinical training and optimize learning.
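The decision rules in the Delphi process above (drop topics scoring below 2/4 between rounds, iterate until Cronbach α ≥ 0.95, then label topics rated ≥ 3.5 "core" and 3.0–3.5 "extended") can be sketched in a few lines. This is an illustrative reconstruction only, not the study's analysis code; the rating matrix and function names are invented for the example.

```python
# Illustrative sketch of the Delphi scoring rules described in the abstract.
# Panellists are rows, candidate topics are columns, ratings are on the
# study's 4-point scale. All data and names here are hypothetical.

def cronbach_alpha(ratings):
    """Cronbach's alpha for a panellist-by-topic rating matrix."""
    k = len(ratings[0])                      # number of topics (items)
    cols = list(zip(*ratings))

    def var(xs):                             # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(var(c) for c in cols)
    total_var = var([sum(row) for row in ratings])
    return k / (k - 1) * (1 - item_var / total_var)

def classify(mean_rating):
    # Thresholds from the abstract: >= 3.5 "core", 3.0-3.5 "extended",
    # < 2 removed between rounds; the rest are dropped at the end.
    if mean_rating >= 3.5:
        return "core"
    if mean_rating >= 3.0:
        return "extended"
    return "dropped" if mean_rating >= 2 else "removed"
```

Perfectly consistent panellists (ratings that differ only by a constant offset per rater) drive alpha to 1.0, which is why a high alpha serves as the stopping rule for consensus.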
While carrying out a scoping review of earthquake response, we found that there is no universal standardized approach for assessing the quality of disaster evidence, much of which is variable or not peer reviewed. With the lack of a framework to ascertain the value and validity of this literature, there is a danger that valuable insights may be lost. We propose a theoretical framework that may, with further validation, address this gap.
Existing frameworks – quality of reporting of meta-analyses (QUOROM), meta-analysis of observational studies in epidemiology (MOOSE), the Cochrane assessment of bias, Critical Appraisal Skills Programme (CASP) checklists, strengthening the reporting of observational studies in epidemiology (STROBE), and consensus guidelines on reports of field interventions in disasters and emergencies (CONFIDE) – were analyzed to identify key domains of quality. Supporting statements, based on these existing frameworks, were developed for each domain to form an overall theoretical framework of quality. This was piloted on a data set of publications from a separate scoping review.
Four domains of quality were identified: robustness, generalizability, added value, and ethics, with 11 scored supporting statements. Although 73 of 111 papers (66%) scored below 70%, a sizeable portion (34%) scored higher.
Our theoretical framework presents, for debate and further validation, a method of assessing the quality of non-traditional studies and thus supporting the best available evidence approach to disaster response. (Disaster Med Public Health Preparedness. 2019;13:147–151)
A key task of the team leader in a medical emergency is effective information gathering. Studying information gathering patterns is readily accomplished with the use of gaze-tracking glasses. This technology was used to generate hypotheses about the relationship between performance scores and expert-hypothesized visual areas of interest in residents across scenarios in simulated medical resuscitation examinations.
Emergency medicine residents wore gaze-tracking glasses during two simulation-based examinations (n = 29 and n = 13, respectively). Blinded experts assessed video-recorded performances using a simulation performance assessment tool that has validity evidence in this context. The relationships between gaze patterns and performance scores were analyzed and potential hypotheses generated. Four scenarios were assessed in this study: diabetic ketoacidosis, bradycardia secondary to beta-blocker overdose, ruptured abdominal aortic aneurysm, and metabolic acidosis caused by antifreeze ingestion.
Specific gaze patterns were correlated with objective performance. High performers were more likely to fixate on task-relevant stimuli and appropriately ignore task-irrelevant stimuli compared with lower performers. For example, shorter latency to fixation on the vital signs in a case of diabetic ketoacidosis was positively correlated with performance (r = 0.70, p < 0.05). Conversely, total time spent fixating on lab values in a case of ruptured abdominal aortic aneurysm was negatively correlated with performance (r = −0.50, p < 0.05).
There are differences between the visual patterns of high- and low-performing residents. These findings may allow for better characterization of expertise development in resuscitation medicine and provide a framework for future study of visual behaviours in resuscitation cases.
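The gaze-performance relationships reported above are Pearson correlations between a per-resident gaze metric and a performance score. A minimal sketch of that calculation follows; the latency and score values are invented stand-ins for the study's data, not actual results.

```python
# Pearson correlation between a gaze metric and a performance score,
# as used in the study's analysis. All numbers below are hypothetical.

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example: raw latency to first fixation on the vital signs
# paired with a blinded performance rating. Faster fixation pairing with
# higher scores yields a strongly negative r on the raw latency values,
# i.e. "shorter latency, better performance."
latency_s = [4, 7, 9, 12, 20]      # seconds until first fixation
score     = [92, 85, 80, 74, 60]   # blinded performance rating
```

Note that "shorter latency positively correlated with performance" corresponds to a negative coefficient on raw latency; the sign depends on whether latency or its inverse is entered into the correlation.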
The scholarly dissemination of innovative medical education practices helps broaden the reach of this type of work, allowing scholarship to have an impact beyond a single institution. There is little guidance in the literature for those seeking to publish program evaluation studies and innovation papers. This study aims to derive a set of evidence-based features of high-quality reports on innovations in emergency medicine (EM) education.
We conducted a scoping review and thematic analysis to determine quality markers for medical education innovation reports, with a focus on EM. A search of MEDLINE, EMBASE, ERIC, and Google Scholar was augmented by a hand search of relevant publication guidelines, guidelines for authors, and website submission portals from medical education and EM journals. Study investigators reviewed the selected articles, and a thematic analysis was conducted.
Our search strategy identified 14 relevant articles from which 34 quality markers were extracted. These markers were grouped into seven important themes: goals and need for innovation, preparation, innovation development, innovation implementation, evaluation of innovation, evidence of reflective practice, and reporting and dissemination. In addition, multiple outlets for the publication of EM education innovations were identified and compiled.
The publication and dissemination of innovations are critical for the EM education community and the training of health professionals. We anticipate that our list of innovation report quality markers will be used by EM education innovators to support the dissemination of novel educational practices.
Reviews help scholars consolidate evidence and guide their educational practice. However, few papers describe how to effectively publish review papers. We completed a scoping review to develop a set of quality indicators that will assist junior authors to publish reviews and integrative scholarship.
MEDLINE, Embase, ERIC, and Google Scholar were searched for English-language articles published between 2012 and January 2016 using the terms “review,” “medical education,” “how to publish,” and “emergency medicine.” Titles and abstracts were reviewed by two authors and included if they focused on how to publish a review or outlined reporting guidelines for reviews. The articles were reviewed in parallel for calibration, and disagreements were resolved by consensus.
A full-text review of the 25 articles was conducted, and 196 recommendations were extracted from 13 articles. A hand search of the included articles’ reference lists and expert recommendations found an additional eight articles. These recommendations were thematically analysed into a list of seven themes and 32 items. Additionally, seven evaluation tools and reporting guidelines were found to guide researchers in optimizing their reviews for publication.
In emergency medicine education, review articles can help synthesize educational research so that educators can engage in evidence-based scholarly teaching. We hope that this work will act as an introduction to those interested in engaging in integrative scholarship by providing them with a guide to key quality markers and important checklists for improving their research.
Education scholarship can be conducted using a variety of methods, from quantitative experiments to qualitative studies. Qualitative methods are less commonly used in emergency medicine (EM) education research but are well-suited to explore complex educational problems and generate hypotheses. We aimed to review the literature to provide resources to guide educators who wish to conduct qualitative research in EM education.
We conducted a scoping review to outline: 1) a list of journals that regularly publish qualitative educational papers; 2) an aggregate set of quality markers for qualitative educational research and scholarship; and 3) a list of quality checklists for qualitative educational research and scholarship.
We found nine journals that have published more than one qualitative educational research paper in EM. From the literature, we identified 39 quality markers that were grouped into 10 themes: Initial Grounding Work (preparation, background); Goals, Problem Statement, or Question; Methods (general considerations); Sampling Techniques; Data Collection Techniques; Data Interpretation and Theory Generation; Measures to Optimize Rigour and Trustworthiness; Relevance to the Field; Evidence of Reflective Practice; Dissemination and Reporting. Lastly, five quality checklists were found for guiding educators in reporting their qualitative work.
Many problems that EM educators face are well-suited to exploration using qualitative methods. The results of our scoping review provide publication venues, quality indicators, and checklists that may be useful to EM educators embarking on qualitative projects.
A key skill for successful clinician educators is the effective dissemination of scholarly innovations and research. Although there are many ways to disseminate scholarship, the most accepted and rewarded form of educational scholarship is publication in peer-reviewed journals.
This paper provides direction for emergency medicine (EM) educators interested in publishing their scholarship via traditional peer-reviewed avenues. It builds upon four literature reviews that aggregated recommendations for writing and publishing high-quality quantitative and qualitative research, innovations, and reviews. Based on the findings from these literature reviews, the recommendations were prioritized for importance and relevance to novice clinician educators by a broad community of medical educators.
The top items from the expert vetting process were presented to the 2016 Canadian Association of Emergency Physicians (CAEP) Academic Symposium Consensus Conference on Education Scholarship. This community of EM educators identified the highest yield recommendations for junior medical education scholars. This manuscript elaborates upon the top recommendations identified through this consensus-building process.
Multibeam bathymetry and 3.5-kHz sub-bottom profiler data collected from the US icebreaker Healy in 2003 provide convincing evidence for grounded ice on the Chukchi Borderland off the northern Alaskan margin, Arctic Ocean. The data show parallel, glacially induced seafloor scours, or grooves, and intervening ridges that reach widths of 1000 m (rim to rim) and as much as 40 m of relief. Following previous authors, we refer to these features as “megascale glacial lineations (MSGLs).” Additional support for ice grounding comes from stratigraphic unconformities interpreted to have been caused by ice-induced erosion. Most likely, the observed seafloor features represent evidence for massive ice-shelf grounding. The general ESE/WNW orientation of the MSGLs, together with sediment evidently bulldozed off the Chukchi Plateau and mapped on its western (Siberian) side, suggests ice flow from the Canada Basin side of the Chukchi Borderland. Two separate generations of glacially derived MSGLs are identified on the Chukchi Borderland from the Healy geophysical data. The deepest and oldest extensive MSGLs appear to be draped by sediments less than 5 m thick, whereas no sediment drape can be distinguished, within the resolution of the sub-bottom profiles, on the younger generation.
Sea ice microalgae actively contribute to the pool of dissolved organic matter (DOM) available for bacterial metabolism, but this link has historically relied on bulk correlations between chlorophyll a (a surrogate for algal biomass) and bacterial abundance. We incubated microbes from both the bottom (congelation layer) and surface brine region of Antarctic fast ice for nine days. Algal-derived DOM was manipulated by varying the duration of irradiance, restricting photosynthesis with 3-(3,4-dichlorophenyl)-1,1-dimethylurea (DCMU) or incubating in the dark. The bacterial response to changes in DOM availability was examined by performing cell counts, quantifying bacterial metabolic activity and examining community composition with denaturing gradient gel electrophoresis. The percentage of metabolically active bacteria was relatively low in the surface brine microcosm (10–20% of the bacterial community), the treatment with DCMU indirectly restricted bacterial growth and there was some evidence for changes in community structure. Metabolic activity was higher (35–69%) in the bottom ice microcosm, and while there was no variation in community structure, bacterial growth was restricted in the treatment with DCMU compared to the light/dark treatment. These results are considered preliminary, but provide a useful illustration of sea ice microbial dynamics beyond the use of ‘snapshot’ biomass correlations.