Objective: Alteration of the colonic microbiota following antimicrobial exposure allows colonization by antimicrobial-resistant organisms (AROs). Ingestion of a probiotic, such as Lactobacillus rhamnosus GG (LGG), could prevent colonization or infection with AROs by promoting healthy colonic microbiota. The purpose of this trial was to determine the effect of LGG administration on ARO colonization in hospitalized patients receiving antibiotics.
Design: Prospective, double-blinded, randomized controlled trial of LGG versus placebo among patients receiving broad-spectrum antibiotics.
Setting: Tertiary care center.
Patients: In total, 88 inpatients receiving broad-spectrum antibiotics were enrolled.
Methods: Patients were randomized to receive 1 capsule containing 1 × 10^10 cells of LGG twice daily (n = 44) or placebo (n = 44), stratified by ward type. Stool or rectal-swab specimens were collected for culture at enrollment, during admission, and at discharge. Using selective media, specimens were cultured for Clostridioides difficile, vancomycin-resistant Enterococcus spp (VRE), and antibiotic-resistant gram-negative bacteria. The primary outcome was any ARO acquisition. Secondary outcomes included loss of any ARO if colonized at enrollment, and acquisition or loss of individual AROs.
Results: ARO colonization prevalence at study enrollment was similar between groups (LGG 39% vs placebo 39%). We detected no difference in acquisition of any ARO (LGG 30% vs placebo 33%; OR, 1.19; 95% CI, 0.38–3.75) or of any individual ARO. There was likewise no difference in loss of any ARO (LGG 18% vs placebo 24%; OR, 1.44; 95% CI, 0.27–7.68) or of any individual ARO.
Conclusions: LGG administration neither prevented acquisition of AROs nor accelerated loss of ARO colonization.
Because knowledge of fungal diversity is very incomplete, it is possible that anthropogenic impacts are driving species to extinction before they have been discovered. Fungal inventories remain far from complete and do not reflect the full diversity of this large group. Whilst molecular advancements are leading to an increased rate of species discovery, there is still much to be done to understand the diversity of fungi, identify rare species, and establish conservation goals. Citizen science via social media could play an increasingly important role in mycological research, and its continued development should be supported and encouraged. The involvement of non-professionals in data collection helps increase public awareness, as well as extending the scope and efficiency of fungal surveys. Future academic mycological research could benefit from social media interaction and engagement with the amateur mycological community, which may accelerate the achievement of more effective conservation goals.
Multiple guidelines recommend discontinuation of prophylactic antibiotics <24 hours after surgery. In a multicenter, retrospective cohort of 2,954 mastectomy patients with or without immediate breast reconstruction, we found that utilization of prophylactic postdischarge antibiotics varied dramatically at the surgeon level among general surgeons and was virtually universal among plastic surgeons.
This paper discusses the evidence for periodic human activity in the Cairngorm Mountains of Scotland from the late 9th millennium to the early 4th millennium cal BC. While contemporary paradigms for Mesolithic Europe acknowledge the significance of upland environments, the archaeological record for these areas is not yet as robust as that for the lowland zone. Results of excavation at Chest of Dee, along the headwaters of the River Dee, are set into a wider context with previously published excavations in the area. A variety of site types evidences a sophisticated relationship between people and a dynamic landscape through a period of changing climate. Archaeological benefits of the project include the ability to examine novel aspects of the archaeology leading to a more comprehensive understanding of Mesolithic lifeways. It also offers important lessons in site survival, archaeological investigation, and the management of the upland zone.
Cadaveric and older radiographic studies suggest that concurrent cervical spine fractures are rare in gunshot wounds (GSWs) to the head. Despite this knowledge, patients with craniofacial GSWs often arrive with spinal motion restriction (SMR) in place. This study quantifies the incidence of cervical spine injuries in GSWs to the head, identified using computed tomography (CT). Fracture frequency was hypothesized to be lower in self-inflicted (SI) injuries.
Isolated craniofacial GSWs were queried from this Level I trauma center registry from 2013–2017 and the US National Trauma Data Bank (NTDB) from 2012–2016 (head or face Abbreviated Injury Scale [AIS] >2). Datasets included age, gender, SI status, cervical spine injury, spinal surgery, and mortality. For this hospital's data, prehospital factors, SMR, and CTs performed were assessed. Statistical evaluation was done with Stata software, with P < .05 considered significant.
Two hundred forty-one patients from this hospital (mean age 39 years; 85% male; 66% SI) and 5,849 from the NTDB (mean age 38 years; 84% male; 53% SI) were included. For both cohorts, SI patients were older (P < .01) and had increased mortality (P < .01). Overall, cervical spine fractures occurred in 3.7%, with 5.4% of these requiring spinal surgery (0.2% of all patients). The frequency of fracture was five-fold greater in non-SI injuries (P < .05). Locally, SMR was present in 121 patients (50.2%) prior to arrival, with six collars (2.5%) placed in the trauma bay. Frequency of SMR was similar regardless of SI status (49.0% versus 51.0%; P = not significant) but was less frequent in hypotensive patients and those receiving cardiopulmonary resuscitation (CPR). The presence of SMR was associated with increased use of CT of the cervical spine (80.0% versus 33.0%; P < .01).
Cervical spine fractures were identified in less than four percent of isolated GSWs to the head and face, more frequently in non-SI cases. Prehospital SMR should be avoided in cases consistent with SI injury, and for all others, SMR should be discontinued once CT imaging is completed with negative results.
Objective: Despite recommendations to discontinue prophylactic antibiotics after incision closure or <24 hours after surgery, prophylactic antibiotics are continued after discharge by some clinicians. The objective of this study was to determine the prevalence of and factors associated with postdischarge prophylactic antibiotic use after spinal fusion.
Design: Multicenter retrospective cohort study.
Patients: This study included patients aged ≥18 years undergoing spinal fusion or refusion between July 2011 and June 2015 at 3 sites. Patients with an infection during the surgical admission were excluded.
Methods: Prophylactic antibiotics were identified at discharge. Factors associated with postdischarge prophylactic antibiotic use were identified using hierarchical generalized linear models.
Results: In total, 8,652 spinal fusion admissions were included. Antibiotics were prescribed at discharge in 289 admissions (3.3%). The most commonly prescribed antibiotics were trimethoprim/sulfamethoxazole (22.1%), cephalexin (18.8%), and ciprofloxacin (17.1%). Adjusted for study site, significant factors associated with prophylactic discharge antibiotics included American Society of Anesthesiologists (ASA) class ≥3 (odds ratio [OR], 1.31; 95% CI, 1.00–1.70), lymphoma (OR, 2.57; 95% CI, 1.11–5.98), solid tumor (OR, 3.63; 95% CI, 1.62–8.14), morbid obesity (OR, 1.64; 95% CI, 1.09–2.47), paralysis (OR, 2.38; 95% CI, 1.30–4.37), hematoma/seroma (OR, 2.93; 95% CI, 1.17–7.33), thoracic surgery (OR, 1.39; 95% CI, 1.01–1.93), longer length of stay, and intraoperative antibiotics.
Conclusions: Postdischarge prophylactic antibiotics were uncommon after spinal fusion. Patient and perioperative factors were associated with continuation of prophylactic antibiotics after hospital discharge.
Introduction: Compared to other areas in Alberta Health Services (AHS), internal data show that emergency departments (EDs) and urgent care centres (UCCs) experience a high rate of workforce violence. As such, reducing violence in AHS EDs and UCCs is a key priority. This project explored staff's lived experience with patient violence with the goal of better understanding its impact and what strategies and resources could be put in place. Methods: To obtain a representative sample, we recruited staff from EDs and a UCC (n = 6 sites) situated in urban and rural settings across Alberta. As the interviews had the potential to be upsetting, we conducted in-person interviews in a private space. Interviews were conducted with over 60 staff members including RNs, LPNs, unit clerks, physicians, and protective services. Data collection and analysis occurred simultaneously and iteratively until saturation was reached. The analysis involved data reduction, category development, and synthesis. Key phrases and statements were first highlighted. Preliminary labels were then assigned, and the data were organized into meaningful clusters. Finally, we identified common themes of participants' lived experience. Triangulation of sources, independent and team analysis, and frequent debriefing sessions were used to enhance the trustworthiness of the data. Results: Participants frequently noted the worry they carry with them when coming into work, but also said there was a high threshold of acceptance dominating ED culture. A recurring feature of this experience was the limited resources (e.g., no peace officers, scope of security staff) available to staff to respond when patients behave violently or are threatening. Education such as non-violent crisis intervention training, although helpful, was insufficient to make staff feel safe.
Participants voiced the need for more protective services, the addition of physical barriers like locking doors and glass partitions, more investment in addictions and mental health services (e.g., increased access to psychiatrists or addictions counsellors), and a greater shared understanding of AHS’ zero tolerance policy. Conclusion: ED and UCC staff describe being regularly exposed to violence from patients and visitors. Many of these incidents go unreported and unresolved, leaving the workforce feeling worried and unsupported. Beyond education, the ED and UCC workforce need additional resources to support them in feeling safe coming to work.
Introduction: Emergency Departments (EDs) are at high risk of workforce-directed violence (WDV). To address ED violence in Alberta Health Services (AHS), we conducted key informant interviews to identify successful strategies that could be adopted in AHS EDs. Methods: The project team identified potential participants through their ED network; additional contacts were identified through snowball sampling. We emailed 197 individuals from Alberta (123), Canada (46), and abroad (28). The interview guide was developed and reviewed in partnership with ED managers and Workplace Health and Safety. We conducted semi-structured phone interviews with 26 representatives from urban and rural EDs or similar settings from Canada, the United States, and Australia. This interview process received an ARECCI score of 2. Two researchers conducted a content analysis of the interview notes; rural and urban sites were analyzed separately. We extracted strategies, their impact, and implementation barriers and facilitators. Strategies identified were categorized into emergent themes. We aggregated similar strategies and highlighted key or unique findings. Results: Interview results showed that there is no single solution to address ED violence. Sites with effective violence prevention strategies used a comprehensive approach where multiple strategies were used to address the issue. For example, through a violence prevention working group, one site implemented weekly violence simulations, a peer mentorship support team, security rounding, and more. This multifaceted approach had positive results: a decrease in code whites, staff feeling more supported, and the site no longer being on union “concerned” lists. Another promising strategy included addressing the culture of violence by increasing reporting, clarifying policies (i.e., zero tolerance), and establishing flagging or alert systems for visitors with violent histories. 
Physician involvement and support were highly valued in responding to violence (e.g., support when refusing care, presence on the code white response team, flagging). Conclusion: Overall, one strategy is not enough to successfully address WDV in EDs. Strategies need to be comprehensive and context-specific, especially when considering urban and rural sites with different resources available. We note that few strategies were formally evaluated, and we recommend that future work focus on developing comprehensive metrics to evaluate the strategies and define success.
Introduction: Vaginal bleeding in early pregnancy is a common emergency department (ED) presentation, and many of these episodes result in poor obstetrical outcomes. These outcomes have been extensively studied, but there have been few evaluations of which presenting variables predict them. This study aimed to identify predictors of less than optimal obstetrical outcomes for women who present to the ED with early pregnancy bleeding. Methods: A regional centre health records review included pregnant females who presented to the ED with vaginal bleeding at <20 weeks gestation. This study investigated differences in presenting features between groups with subsequent optimal outcomes (OO; defined as a full-term live birth at >37 weeks) and less than optimal outcomes (LOO; defined as a miscarriage, stillbirth, or pre-term live birth). Predictor variables included: maternal age, gestational age at presentation, number of return ED visits, socioeconomic status (SES), gravida-para-abortus status, Rh status, Hgb level, and presence of cramping. Rates and results of point of care ultrasound (PoCUS) and ultrasound (US) by radiology were also considered. Results: Records for 422 patients from Jan 2017 to Nov 2018 were screened, and 180 patients were included. Overall, 58.3% of study participants had a LOO. The only strong predictor of outcome was seeing an intrauterine pregnancy (IUP) with fetal heart beat (FHB) on US: OO rate 74.3% (95% CI 59.8–88.7; p < 0.01). Cramping (with bleeding) trended towards a higher rate of LOO (62.7%, 95% CI 54.2–71.1; p = 0.07). SES was not a reliable predictor of LOO, with similar clinical outcome rates above and below the poverty line (57.5% [95% CI 46.7–68.3] vs 59% [95% CI 49.3–68.6] LOO). For anemic patients, the non-live birth rate was 100%, but the number with this variable was small (n = 5).
Return visits (58.3%, 95% CI 42.2–74.4), previous abortion (58.8%, 95% CI 49.7–67.8), no living children (60.2%, 95% CI 50.7–69.6), and past pregnancy (55.9%, 95% CI 46.6–65.1) were not associated with higher rates of LOO. Conclusion: Identification of a live IUP, anemia, and cramping have potential as predictors of obstetrical outcome in early pregnancy bleeding. This information may provide better guidance for clinical practice and investigations in the emergency department, and the predictive value of these variables supports more appropriate counseling of this patient population.
Introduction: Distal radial fractures (DRF) remain the most commonly encountered fracture in the Emergency Department (ED). The initial management of displaced DRFs by Emergency Physicians (EPs) places considerable demand on ED resources. We wished to determine the adequacy of reduction, both initially and at follow-up. These data update previously presented high-level findings. Methods: We performed a mixed-methods study including patients who underwent procedural sedation and manipulation by an EP for a DRF. Radiological images performed at initial assessment, post-reduction, and clinic follow-up were reviewed by a panel of orthopedic surgeons and radiologists blinded to outcomes, and assessed for evidence of displacement. Demographic data were pooled from patient records and included in statistical analysis. Results: Seventy patients were included and had follow-up completed. Initial reduction was deemed adequate in 37 patients (53%; 95% CI 41.32 to 64.10%). At clinic follow-up assessment, 26 reductions remained adequate, a slippage rate of 30% (95% CI 17.37 to 45.90%). Overall, 7 patients (10%; 95% CI 4.65 to 19.51%) required revision of the initial reduction in the operating room. Agreement on adequacy of reduction on post-reduction radiographs between radiologists and orthopedic surgeons was 38.6% (95% CI -38.3 to -7.4; Kappa -0.229). The statistical strength of this agreement is worse than would be expected by chance alone. There was no association found between age, sex, or time of initial presentation and final outcomes. Conclusion: Although blinded review by specialists determined only half of initial EP DRF reductions to be radiographically adequate, only 10% actually required further intervention. Agreement between specialists on adequacy was poor. The majority of DRFs reduced by EPs do not require further surgical intervention.
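The chance-corrected agreement statistic used above can be reproduced with a short calculation. The sketch below computes Cohen's kappa from a 2×2 inter-rater table: observed agreement p_o minus chance agreement p_e, scaled by 1 − p_e. The counts are illustrative only (chosen to roughly match the reported 38.6% observed agreement, not the study's raw data); a negative kappa, as reported here, means agreement fell below chance.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square inter-rater contingency table.

    table[i][j] = number of cases rater A scored category i
    and rater B scored category j.
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    # Observed agreement: proportion of cases on the diagonal.
    p_o = sum(table[i][i] for i in range(k)) / n
    # Expected (chance) agreement from the marginal totals.
    row_tot = [sum(row) for row in table]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(row_tot[i] * col_tot[i] for i in range(k)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Illustrative counts only: two raters label each of 70 reductions
# adequate/inadequate and agree on 27 films (27/70 = 38.6%).
table = [[15, 25],   # rater A adequate:   B adequate / B inadequate
         [18, 12]]   # rater A inadequate: B adequate / B inadequate
print(round(cohens_kappa(table), 3))   # negative: worse than chance
```

Note that a 38.6% raw agreement can correspond to different kappa values depending on the marginals, which is why kappa, not raw agreement, is the appropriate summary here.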
Introduction: Determining fluid status prior to resuscitation provides a more accurate guide for appropriate fluid administration in the setting of undifferentiated hypotension. Emergency Department (ED) point of care ultrasound (PoCUS) has been proposed as a potential non-invasive, rapid, repeatable investigation to ascertain inferior vena cava (IVC) characteristics. Our goal was to determine the feasibility of using PoCUS to measure IVC size and collapsibility. Methods: This was a planned secondary analysis of data from a prospective multicentre international study investigating PoCUS in ED patients with undifferentiated hypotension. We prospectively collected data on IVC size and collapsibility using a standard data collection form in 6 centres. The primary outcome was the proportion of patients with a clinically useful (determinate) scan, defined as a clearly visible intrahepatic IVC, measurable for size and collapse. Descriptive statistics are provided. Results: A total of 138 scans were attempted on 138 patients; 45.7% were women and the median age was 58 years. Overall, 129 scans (93.5%; 95% CI 87.9 to 96.7%) were determinate. For IVC size, 131 scans (94.9%; 95% CI 89.7 to 97.7%) were determinate, and 131 (94.9%; 89.7 to 97.7%) were determinate for collapsibility. Conclusion: In this analysis of 138 ED patients with undifferentiated hypotension, the vast majority of PoCUS scans to investigate IVC characteristics were determinate. Future work should include analysis of the value of IVC size and collapsibility in determining fluid status in this group.
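The determinate-scan proportions above are point estimates with binomial confidence intervals. As a sketch of how such an interval can be computed, the snippet below applies the Wilson score method to 129 determinate scans out of 138. The abstract's slightly wider 87.9–96.7% interval suggests a different (likely exact Clopper–Pearson) method was used, so the bounds differ marginally.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(129, 138)
print(f"{129/138:.1%} (95% CI {lo:.1%} to {hi:.1%})")
# → 93.5% (95% CI 88.1% to 96.5%)
```

The Wilson interval is preferred over the naive Wald interval for proportions near 0 or 1, as here, because it never produces bounds outside [0, 1].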
Introduction: Crowding is associated with poor patient outcomes in emergency departments (EDs). Measures of crowding are often complex and resource-intensive to score and use in real time. We evaluated single, easily obtained variables to establish the presence of crowding compared to more complex crowding scores. Methods: Serial observations of patient flow were recorded in a tertiary Canadian ED. Single variables were evaluated including the total number of patients in the ED (census), in beds, in the waiting room, in the treatment area waiting to be assessed, and total inpatient admissions. These were compared with crowding scores (NEDOCS, EDWIN, ICMED, and three regional hospital modifications of NEDOCS) as predictors of crowding. Predictive validity was compared to the reference standard of physician perception of crowding, using receiver operating characteristic (ROC) curve analysis. Results: In total, 144 of 169 potential events were recorded over 2 weeks. Crowding was present in 63.9% of the events. ED census (total number of patients in the ED) was strongly correlated with crowding (AUC = 0.82; 95% CI 0.76 to 0.89), and its performance was similar to that of NEDOCS (AUC = 0.80; 95% CI 0.76 to 0.90) and a more complex local modification of NEDOCS, the S-SAT (AUC = 0.83; 95% CI 0.74 to 0.89). Conclusion: The single indicator, ED census, was as predictive of the presence of crowding as more complex crowding scores. A two-stage approach to crowding intervention is proposed that first identifies crowding with a real-time ED census statistic, followed by investigation of precipitating and modifiable factors. Real-time signalling may permit more standardized and effective approaches to managing ED flow.
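The AUC values above have a useful rank interpretation: the AUC of a single-variable predictor such as ED census equals the probability that a randomly chosen crowded observation has a higher census than a randomly chosen uncrowded one (the Mann-Whitney statistic). A minimal sketch, using made-up census values rather than study data:

```python
def auc_rank(scores_pos, scores_neg):
    """ROC AUC via the rank (Mann-Whitney) interpretation:
    P(score_pos > score_neg), counting ties as 1/2."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical ED census counts at observation times judged
# crowded / not crowded by the physician (illustrative only).
crowded     = [48, 52, 41, 55, 44]
not_crowded = [30, 38, 41, 25]
print(auc_rank(crowded, not_crowded))   # 1.0 = perfect separation
```

An AUC of 0.82, as reported for census, means a crowded observation out-ranks an uncrowded one 82% of the time — no threshold or complex score required.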
Introduction: Patients presenting to the emergency department (ED) with hypotension have a high mortality rate and require careful yet rapid resuscitation. The use of cardiac point of care ultrasound (PoCUS) in the ED has progressed beyond the basic indications of detecting pericardial fluid and cardiac activity in cardiac arrest. We examined whether finding left ventricular dysfunction (LVD) on emergency physician performed PoCUS reliably predicts the presence of cardiogenic shock in hypotensive ED patients. Methods: We prospectively collected PoCUS findings performed in 135 ED patients with undifferentiated hypotension as part of an international study. Patients with clearly identified etiologies for hypotension were excluded, along with other specific presumptive diagnoses. LVD was defined as identification of a generally hypodynamic LV in the setting of shock. PoCUS findings were collected using a standardized protocol and data collection form. All scans were performed by PoCUS-trained emergency physicians. Final shock type was defined as cardiogenic or non-cardiogenic by independent, blinded specialist chart review. Results: All 135 patients had complete follow-up. Median age was 56 years; 53% of patients were male. Disease prevalence for cardiogenic shock was 12%, and the mortality rate was 24%. The presence of LVD on PoCUS had a sensitivity of 62.50% (95% CI 35.43% to 84.80%), specificity of 94.12% (88.26% to 97.60%), positive LR of 10.62 (4.71 to 23.95), negative LR of 0.40 (0.21 to 0.75), and accuracy of 90.37% (84.10% to 94.77%) for detecting cardiogenic shock. Conclusion: Detecting left ventricular dysfunction on PoCUS in the ED may be useful in confirming the underlying shock type as cardiogenic in otherwise undifferentiated hypotensive patients.
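The reported diagnostic metrics all follow from a single 2×2 table. The counts below are reconstructed from the stated prevalence (12% of 135 ≈ 16 cardiogenic patients) and the sensitivity/specificity — an inference, not published raw data — but with them the standard formulas reproduce the abstract's figures:

```python
# Reconstructed (not published) 2x2 counts for LVD on PoCUS
# vs. final adjudicated cardiogenic shock:
tp, fn = 10, 6    # 16 cardiogenic patients, sensitivity 10/16 = 62.5%
fp, tn = 7, 112   # 119 non-cardiogenic, specificity 112/119 = 94.1%

sens = tp / (tp + fn)
spec = tn / (tn + fp)
lr_pos = sens / (1 - spec)        # likelihood ratio of a positive scan
lr_neg = (1 - sens) / spec        # likelihood ratio of a negative scan
accuracy = (tp + tn) / (tp + fn + fp + tn)

print(f"sensitivity {sens:.2%}, specificity {spec:.2%}")
print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}, accuracy {accuracy:.2%}")
```

The high LR+ (~10.6) is what makes a positive scan clinically useful: it shifts the pre-test odds of cardiogenic shock upward by an order of magnitude, while the LR− of ~0.40 only modestly lowers suspicion after a negative scan.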
Foraging strategies in gentoo penguins (Pygoscelis papua) have been well studied (e.g. Croxall et al. 1988, Robinson & Hindell 1996, Lescroël et al. 2004, Takahashi et al. 2008, Xavier et al. 2017). The general consensus is that this largest member of the three pygoscelid penguins displays both nearshore benthic and pelagic foraging tactics to consume combinations of crustaceans and fish. In a recent study, Carpenter-Kling et al. (2017) reported that gentoos at sub-Antarctic Marion Island displayed a novel foraging strategy that consisted of alternating typical lengthy foraging trips with much shorter nearshore afternoon trips. They suggest the latter foraging behaviour may be a response to suboptimal feeding conditions caused by local environmental change. This novel discovery reinforces the fact that, despite considerable study, not all foraging tactics in penguins have been documented. In this paper, we describe what we believe to be yet another undocumented foraging tactic employed by gentoos.
Objective: To assess potential transmission of antibiotic-resistant organisms (AROs) using surrogate markers and bacterial cultures.
Setting: A 1,260-bed tertiary-care academic medical center.
Participants: The study included 25 patients (17 of whom were on contact precautions for AROs) and 77 healthcare personnel (HCP).
Methods: Fluorescent powder (FP) and MS2 bacteriophage were applied in patient rooms. HCP visits to each room were observed for 2–4 hours; hand hygiene (HH) compliance was recorded. Surfaces inside and outside the room and HCP skin and clothing were assessed for fluorescence, and swabs were collected for MS2 detection by polymerase chain reaction (PCR) and selective bacterial cultures.
Results: Transfer of FP was observed for 20 rooms (80%) and 26 HCP (34%). Transfer of MS2 was detected for 10 rooms (40%) and 15 HCP (19%). Bacterial cultures were positive for 1 room and 8 HCP (10%). Interactions with patients on contact precautions resulted in fewer FP detections than interactions with patients not on precautions (P < .001); MS2 detections did not differ by patient isolation status. FP detections did not differ by HCP type, but MS2 was recovered more frequently from physicians than from nurses (P = .03). Overall, HH compliance was better among HCP caring for patients on contact precautions than among HCP caring for patients not on precautions (P = .003), among nurses than among other nonphysician HCP at room entry (P = .002), and among nurses than among physicians at room exit (P = .03). Moreover, HCP who performed HH prior to assessment had fewer fluorescence detections (P = .008).
Conclusions: Contact precautions were associated with greater HCP HH compliance and reduced detection of FP and MS2.
We evaluated the impact of an electronic health record–based 72-hour antimicrobial time-out (ATO) on antimicrobial utilization. We observed that 6 hours after the ATO, 21% of empiric antimicrobials were discontinued or de-escalated. There was a significant reduction in the duration of antimicrobial therapy but no impact on overall antimicrobial usage metrics.
Background: Chest tube insertion is a time- and safety-critical procedure with a significant complication rate (up to 30%). Industry routinely uses Lean and ergonomic methodology to improve systems. This process improvement study used best-evidence review, small-group consensus, process mapping, and prototyping to design a Lean and ergonomically mindful equipment solution. Aim Statement: By simplifying and reorganising chest tube equipment, we aim to provide users with adequate equipment, reduce equipment waste, and reduce effort wasted locating equipment. Measures & Design: The study was conducted between March 2018 and November 2018. An initial list of process steps was produced from the best available evidence. This list was then augmented by multispecialty team consensus (3 emergency physicians, 1 thoracic surgeon, 1 medical student, 2 EM nurses). Necessary equipment was identified. Next, two prototyping phases were conducted using a task trainer and a realistic interprofessional team (1 EM physician, 1 ER nurse, 1 medical student) to refine the equipment list and packaging. A final equipment storage system was produced and evaluated by an interprofessional team during cadaver training using a survey and Likert scales. Evaluation/Results: There were 47 equipment items in the pre-intervention ED chest tube tray. After prototyping, 21 items were removed and nine critical items were added. The nine items missing from the original design had been stored in four different locations in the department. Six physicians and seven RNs participated in cadaver testing and completed an evaluation survey of the new layout. Participants preferred the new storage design (Likert median 5, IQR 1) over the current storage design (median 1, IQR 1). Discussion/Impact: The results suggest that the Lean equipment storage is preferred by ED staff compared to the current set-up, may reduce time spent finding missing equipment, and will reduce waste.
Future simulation work will quantitatively assess compliance with safety-critical steps, user stress, wasted user time, and cost.
Introduction: Improving public access and training for epinephrine auto-injectors (EAIs) can reduce time to initial treatment in anaphylaxis. Effective use of EAIs by the public requires bystanders to respond in a timely and proficient manner. We wished to examine optimal methods for assessing effective training and skill retention for public use of EAIs, including the use of microskills lists. Methods: In this prospective, stratified randomized study, 154 participants at 15 sites receiving installation of public EAIs were randomized to one of three experimental education interventions: A) didactic poster (POS) teaching; B) poster with video teaching (VID); and C) poster, video, and simulation training (SIM). Participants were tested by participation in a standardized simulated anaphylaxis scenario at 0 months, immediately following training, and again at follow-up at 3 months. Participants' responses were video recorded and assessed by two blinded raters using microskills checklists. The microskills lists were derived from the best available evidence and interprofessional process mapping using a skills trainer. Interobserver reliability was assessed for each item in a 14-step microskills checklist composed of 3-point and 5-point Likert scale questions around EpiPen use, expressed as Kappa values. Results: Overall, there was poor agreement between the two raters. Remaining composed versus panicked had the highest level of agreement (K = 0.7; substantial agreement), although this did not reach statistical significance (p = 0.06). Calling for EMS support had the second-highest level of agreement (K = 0.6; moderate agreement, p = 0.01). The remaining items had very low to moderate agreement, with Kappa values ranging from -0.103 to 0.48.
Conclusion: Although microskills checklists have been shown to identify areas where learners and interprofessional teams require deliberate practice, these results support previously published evidence that the use of microskills checklists to assess skills has poor reproducibility. Performance will be further assessed in this study using global rating scales, which have shown higher levels of agreement in other studies.
Introduction: Chest tube insertion, a critical procedure with a published complication rate of up to 30%, is a required competency for emergency physicians. Microskills training has been shown to identify steps that require deliberate practice. Objectives were: 1. Develop a chest tube insertion microskills checklist to facilitate interprofessional education (IPE); 2. Compare the microskills checklist with the published best available evidence; 3. Develop an educational video based on the process map; 4. Evaluate the video with an interprofessional team prior to cadaver training as a proof of concept. Methods: The study was conducted between March 2018 and November 2018. An initial list of process steps was produced from the best available evidence. This list was then augmented by multispecialty team consensus (3 emergency physicians, 1 thoracic surgeon, 1 medical student, 2 EM nurses). Two prototyping phases were conducted using a task trainer and a realistic interprofessional team (1 EM physician, 1 ER nurse, 1 medical student). A final microskills list was produced and compared to the procedural steps described in consensus publications. An educational video was produced and evaluated by an interprofessional team prior to cadaver training using a survey and Likert scales as a proof of concept. Participants were 7 EM RNs and 6 ATLS-trained physicians. Participants were asked to fill out a nine-question survey using a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree). Results: The final process map contained 54 interdisciplinary steps, compared to ATLS, which describes 14 main steps, and peer-reviewed articles, which describe 9 main steps. The microskills checklist described in more detail the steps that relate to team interaction and the operational environment. Physicians who viewed the training video reported being able to apply what they learned, with an average rating of 4.67 (median 5, mode 5, IQR 0.75).
Conclusion: The development of the process maps and microskills checklists provides interprofessional teams with more information about chest tube insertion than the instructions described in commonly available courses and procedural steps derived by consensus.
The pore structure of vapour-deposited ASW (amorphous solid water) is poorly understood, despite its importance to fundamental processes such as grain chemistry, cooling of star-forming regions, and planet formation. We studied structural changes of vapour-deposited D2O on intra-molecular to 30 nm length scales at temperatures ranging from 18 to 180 K and observed enhanced mobility from 100 to 150 K. An Arrhenius-type model describes the loss of surface area and porosity with a common set of kinetic parameters. The low activation energy (428 K) is commensurate with van der Waals forces between nm-scale substructures in the ice. Our findings imply that water porosity will always change with time, even at low temperatures.
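The Arrhenius-type kinetics referred to above can be written out explicitly. As a generic sketch (the authors' exact functional form may differ), a first-order loss of surface area or porosity $S$ with a thermally activated rate constant, quoting the activation energy in temperature units as the abstract does, reads:

```latex
\frac{\mathrm{d}S}{\mathrm{d}t} = -k(T)\,S,
\qquad
k(T) = \nu \exp\!\left(-\frac{E_a}{k_B T}\right),
\qquad
\frac{E_a}{k_B} \approx 428~\mathrm{K}
```

Here $\nu$ is an attempt-frequency prefactor (an assumed symbol, not given in the abstract). Because $E_a/k_B$ is so low, $k(T)$ remains nonzero even at deposition temperatures of a few tens of kelvin, which is why the porosity is expected to evolve, however slowly, at all temperatures.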