The Water Quality Act of 1987 ushered in a new era of clean water policy in the US. The Act stands today as the longest-lived example of national water quality policy. It included a then-revolutionary funding model for wastewater infrastructure, the Clean Water State Revolving Fund, which gave states much greater authority to allocate clean water infrastructure resources. States differ significantly in their ability to provide adequate resources for the program, as well as in their ability (or willingness) to meet the wishes of Congress to serve environmental needs and communities. This book examines the patterns of state program resource distribution using case studies and analysis of state and national program data. It will be of value to researchers from a range of disciplines, including water, environmental and infrastructure policy, federalism/intergovernmental relations, intergovernmental administration, and natural resource management, as well as to policy makers and policy advocates.
Antibiotics are among the most commonly prescribed medications, and there is evidence to guide the optimal use of these agents for most situations encountered in clinical medicine, including for both treatment and prophylaxis. Nevertheless, clinicians routinely prescribe antibiotics in ways that diverge from this evidence, such as prescribing them when not indicated, for durations longer than necessary, or selecting broad-spectrum antibiotics when a narrower-spectrum agent would suffice.1,2 This overuse of antibiotics contributes to the public health crisis of antibiotic resistance while exposing patients to potential antibiotic-related harms.
Dysfunction in major stress response systems during the acute aftermath of trauma may contribute to risk for developing posttraumatic stress disorder (PTSD). The current study investigated how PTSD diagnosis and symptom severity, depressive symptoms, and childhood trauma uniquely relate to diurnal neuroendocrine secretion (cortisol and alpha-amylase rhythms) in women who recently experienced interpersonal trauma compared to non-traumatized controls (NTCs).
Using a longitudinal design, we examined diurnal cortisol and alpha-amylase rhythms in 98 young women (n = 57 exposed to recent interpersonal trauma, n = 41 NTCs). Participants provided saliva samples and completed symptom measures at baseline and 1-, 3-, and 6-month follow-up.
Multilevel models (MLMs) revealed lower waking cortisol predicted the development of PTSD in trauma survivors and distinguished at-risk women from NTCs. Women with greater childhood trauma exposure exhibited flatter diurnal cortisol slopes. Among trauma-exposed individuals, lower waking cortisol levels were associated with higher concurrent PTSD symptom severity. Regarding alpha-amylase, MLMs revealed women with greater childhood trauma exposure exhibited higher waking alpha-amylase and slower diurnal alpha-amylase increase.
Results suggest lower waking cortisol in the acute aftermath of trauma may be implicated in PTSD onset and maintenance. Findings also suggest childhood trauma may predict a different pattern of dysfunction in stress response systems following subsequent trauma exposure than the stress system dynamics associated with PTSD risk; childhood trauma appears to be associated with flattened diurnal cortisol and alpha-amylase slopes, as well as higher waking alpha-amylase.
Pain is poorly identified in dementia because progressive cognitive impairment brings a partial or complete loss of the ability to communicate. If left untreated, pain can lead to behavioral disturbances (e.g., agitation/aggression), delirium, inappropriate pharmacotherapy (e.g., psychotropics), hospitalizations, and caregiver distress. Prevalence data on pain in dementia subtypes remain limited in the literature.
This study aims to investigate the prevalence and intensity of pain in various dementia subtypes in aged care residents living with dementia (RLWD), using a technology-driven pain assessment tool.
A 1-year retrospective cross-sectional study was conducted on the presence and intensity of pain in referrals to Dementia Support Australia from residential aged care homes (RACHs), using PainChek®. PainChek® is a pain assessment tool that uses artificial intelligence algorithms (e.g., automated facial recognition and analysis) to identify facial expressions indicative of pain in conjunction with other digital checklists of pain behaviors such as vocalization and movement cues. Presence and intensity of pain were identified using PainChek® categories (scores): no pain (0-6), mild pain (7-11), moderate pain (12-15) and severe pain (16-42).
During the study period (01/11/2017-31/10/2018), a sample of 479 referrals (age: 81.9 ± 8.3 years; 55.5% female) from 370 RACHs with Alzheimer’s disease (AD; 40.9%), vascular dementia (VaD; 12.7%), mixed dementia (MD; 5.9%), dementia with Lewy bodies (DLB; 2.9%), and frontotemporal dementia (FTD; 2.3%) was examined. Pain was prevalent in two-thirds (65.6%) of the referrals, with almost half (48.4%) of these categorized as experiencing moderate-severe pain. Referrals with MD and DLB (78.6% each) had the highest prevalence of pain, followed by AD (64.3%) > VaD (62.3%) > FTD (54.6%). Prevalence of severe pain was as follows: MD (17.9%) > AD (12.3%) > VaD (11.5%) > FTD (9.1%) > DLB (7.1%).
To date, this is the largest study to present data on pain prevalence and intensity in major dementia subtypes in the RACH setting. Moderate-severe pain is highly prevalent in RLWD, and its prevalence appears to differ by dementia subtype. This may reflect the impact of the neuropathological etiology of those subtypes on the neurobiology of pain.
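The PainChek® cut-offs quoted in the Methods map a total score to a pain category. As a small illustration (the band boundaries are taken from the text; the function name is ours):

```python
def painchek_category(score: int) -> str:
    """Map a PainChek total score (0-42) to its pain category, using the
    cut-offs quoted in the text: no pain 0-6, mild 7-11, moderate 12-15,
    severe 16-42."""
    if not 0 <= score <= 42:
        raise ValueError("PainChek total scores range from 0 to 42")
    if score <= 6:
        return "no pain"
    if score <= 11:
        return "mild"
    if score <= 15:
        return "moderate"
    return "severe"
```

With these bands, a referral scoring 12 or more falls into the moderate-severe range reported for almost half of the painful referrals.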
People living with dementia (PLWD) in residential aged care homes (RACHs) are frequently prescribed psychotropic medications due to the high prevalence of neuropsychiatric symptoms, also known as behavioral and psychological symptoms of dementia (BPSD). However, the gold standard for managing BPSD is psychosocial/non-pharmacological therapy.
This study aims to describe and evaluate services and neuropsychiatric outcomes associated with the provision of psychosocial person-centered care interventions delivered by national multidisciplinary dementia-specific behavior support programs.
A 2-year retrospective pre-post study with a single-arm analysis was conducted on BPSD referrals received from Australian RACHs to the two Dementia Support Australia (DSA) programs, the Dementia Behavior Management Advisory Service (DBMAS) and the Severe Behavior Response Teams (SBRT). Neuropsychiatric outcomes were measured using the Neuropsychiatric Inventory (NPI) total scores and total distress scores. The questionnaire version “NPI-Q” was administered for DBMAS referrals whereas the nursing home version “NPI-NH” was administered for SBRT referrals. Linear mixed effects models were used for analysis, with time, baseline score, age, sex, and case length as predictors. Clinical significance was measured using Cohen’s effect size (d; ≥0.3), the mean change score (MCS; 3 points for the NPI-Q and 4 points for the NPI-NH) and the mean percent change (MPC; ≥30%) in NPI parameters.
A total of 5,914 referrals (55.9% female, age 82.3 ± 8.6 y) from 1,996 RACHs were eligible for analysis. The most common types of dementia were Alzheimer’s disease (37.4%) and vascular dementia (11.7%). The average case length in DSA programs was 57.2 ± 26.3 days. The NPI scores were significantly reduced as a result of DSA programs, independent of covariates. There were significant reductions in total NPI scores as a result of the DBMAS (61.4%) and SBRT (74.3%) programs. For NPI distress scores, there were 66.5% and 69.1% reductions from baseline for the DBMAS and SBRT programs, respectively. All metrics (d, MCS, MPC) were above the threshold set for determining a clinically significant effect.
Multimodal psychosocial interventions delivered by DSA programs are clinically effective as demonstrated by positive referral outcomes, such as improved BPSD and related caregiver distress.
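The three clinical-significance metrics named in the Methods (Cohen's d, mean change score, mean percent change) can be sketched for paired pre/post NPI scores. Note the hedge: Cohen's d has several variants, and this sketch computes it on the change scores; the study's exact formulation is not stated in the abstract, and the scores below are invented:

```python
import statistics

def clinical_significance(pre, post, d_cut=0.3, mcs_cut=3.0, mpc_cut=30.0):
    """Paired pre/post change metrics: Cohen's effect size d (computed here
    on the change scores), mean change score (MCS) and mean percent change
    (MPC). Default thresholds are the NPI-Q values quoted in the text."""
    changes = [p - q for p, q in zip(pre, post)]  # reduction from baseline
    mcs = statistics.mean(changes)
    d = mcs / statistics.stdev(changes)
    mpc = 100.0 * mcs / statistics.mean(pre)
    return {
        "d": d,
        "MCS": mcs,
        "MPC": mpc,
        "clinically_significant": d >= d_cut and mcs >= mcs_cut and mpc >= mpc_cut,
    }

# Hypothetical NPI-Q totals for four referrals, before and after support
result = clinical_significance([20, 24, 18, 30], [8, 10, 9, 12])
```

All three thresholds are checked jointly, mirroring the abstract's statement that every metric exceeded its cut-off.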
This article explores the use of specially trained canines to detect the location of human burials in nonmodern archaeological contexts. It discusses the history of the discipline, training and field methods, the importance of developing a working relationship with descendant communities, project examples, an assessment of canine detection effectiveness, and ways to select a canine detection team. The article highlights how the application of canine detection training and protocols to the archaeological record makes it possible to locate potential precontact Native American burial areas without ground disturbance. In some cases, probable burial areas located by canines can be confidentially mapped to ensure avoidance during upcoming construction projects. For a variety of reasons, many Native American communities have been wary of embracing this new method to locate ancestral burials. Today, however, canine detection is widely accepted by many tribal groups in California to locate ancestral burials that might be impacted by construction. Although additional controlled studies and rigorous field laboratory experiments are needed to understand the range of variation in efficacy fully, available results in both North America and Europe demonstrate that specially trained canines can often accurately locate human burials that are more than a thousand years old to within a few meters.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that now sits within ICD-11. This unification of the IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature, and it marks the first time that the clinical nomenclature and the administrative nomenclature for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component grew from 29 codes in ICD-9 and 73 codes in ICD-10 to the 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added a further 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than ISNPCHD originally thought acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article therefore presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Quantitative plant biology is an interdisciplinary field that builds on a long history of biomathematics and biophysics. Today, thanks to high spatiotemporal resolution tools and computational modelling, it sets a new standard in plant science. Acquired data, whether molecular, geometric or mechanical, are quantified, statistically assessed and integrated at multiple scales and across fields. They feed testable predictions that, in turn, guide further experimental tests. Quantitative features such as variability, noise, robustness, delays or feedback loops are included to account for the inner dynamics of plants and their interactions with the environment. Here, we present the main features of this ongoing revolution, through new questions around signalling networks, tissue topology, shape plasticity, biomechanics, bioenergetics, ecology and engineering. In the end, quantitative plant biology allows us to question and better understand our interactions with plants. In turn, this field opens the door to transdisciplinary projects with the society, notably through citizen science.
The analysis presented here was motivated by an objective of describing the interactions between the physical and biological processes governing the responses of tidal wetlands to rising sea level and the ensuing equilibrium elevation. We define equilibrium here as meaning that the elevation of the vegetated surface relative to mean sea level (MSL) remains within the vertical range of tolerance of the vegetation on decadal time scales or longer. The equilibrium is dynamic, and constantly responding to short-term changes in hydrodynamics, sediment supply, and primary productivity. For equilibrium to occur, the magnitude of vertical accretion must be great enough to compensate for change in the rate of sea-level rise (SLR). SLR is defined here as meaning the local rate relative to a benchmark, typically a gauge. Equilibrium is not a given, and SLR can exceed the capacity of a wetland to accrete vertically.
Delineating the proximal urethra can be critical for radiotherapy planning but is challenging on computerised tomography (CT) imaging.
Materials and methods:
We trialled a novel non-invasive technique for visualisation of the proximal urethra, using a rapid-sequence magnetic resonance imaging (MRI) protocol to capture urinary flow in patients voiding during the simulation scan.
Of the seven patients enrolled, four were able to void during the MRI scan. For these four patients, direct visualisation of urinary flow through the proximal urethra was achieved. The average volume of the proximal urethra contoured on voiding MRI was significantly higher than the proximal urethra contoured on CT, 4·07 and 1·60 cc, respectively (p = 0·02). The proximal urethra location also differed; the Dice coefficient average was 0·28 (range 0–0·62).
In this small, proof-of-concept prospective clinical trial, the volume and location of the proximal urethra differed significantly when contoured on a voiding MRI scan compared to that determined by a conventional CT simulation. The shape of the proximal urethra on voiding MRI may be more anatomically correct compared to the proximal urethra shape determined with a semi-rigid catheter in place.
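The Dice coefficient reported above is the standard overlap measure between two contoured volumes, 2|A∩B| / (|A| + |B|). A minimal sketch on binary masks (toy arrays, not the study's contours):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks,
    2|A∩B| / (|A| + |B|): 1.0 is perfect overlap, 0.0 is none."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # convention: two empty masks agree perfectly
    return float(2.0 * np.logical_and(a, b).sum() / denom)
```

An average of 0·28 therefore indicates that the CT- and MRI-derived urethra contours shared well under half of their combined volume.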
Conservation measures providing food-rich habitats through agri-environment schemes (AES) have the potential to affect the demography and local abundance of species limited by food availability. The European Turtle Dove Streptopelia turtur is one of Europe’s fastest declining birds, with breeding season dietary changes coincident with a reduction in reproductive output suggesting food limitation during breeding. In this study we provided seed-rich habitats at six intervention sites over a 4-year period and tested for impacts of the intervention on breeding success, ranging behaviour and the local abundance of territorial turtle doves. Nesting success and chick biometrics were unrelated to the local availability of seed-rich habitat or to the proximity of intervention plots. Nestling weight was higher close to human habitation consistent with an influence of anthropogenic supplementary food provision. Small home ranges were associated with a high proportion of non-farmed habitats, while large home ranges were more likely to contain seed-rich habitat suggesting that breeding doves were willing to travel further to utilize such habitat where available. Extensively managed grassland and intervention plot fields were selected by foraging turtle doves. A slower temporal decline in the abundance of breeding males on intervention sites probably reflects enhanced habitat suitability during territory settlement. Refining techniques to deliver sources of sown, natural, and supplementary seed that are plentiful, accessible, and parasite-free is likely to be crucial for the conservation of turtle doves.
Congenital heart disease (CHD) is the most common birth defect for infants born in the United States, with approximately 36,000 affected infants born annually. While mortality rates for children with CHD have significantly declined, there is a growing population of individuals with CHD living into adulthood prompting the need to optimise long-term development and quality of life. For infants with CHD, pre- and post-surgery, there is an increased risk of developmental challenges and feeding difficulties. Feeding challenges carry profound implications for the quality of life for individuals with CHD and their families as they impact short- and long-term neurodevelopment related to growth and nutrition, sensory regulation, and social-emotional bonding with parents and other caregivers. Oral feeding challenges in children with CHD are often the result of medical complications, delayed transition to oral feeding, reduced stamina, oral feeding refusal, developmental delay, and consequences of the overwhelming intensive care unit (ICU) environment. This article aims to characterise the disruptions in feeding development for infants with CHD and describe neurodevelopmental factors that may contribute to short- and long-term oral feeding difficulties.
To estimate the impact of California’s antimicrobial stewardship program (ASP) mandate on methicillin-resistant Staphylococcus aureus (MRSA) and Clostridioides difficile infection (CDI) rates in acute-care hospitals.
Centers for Medicare and Medicaid Services (CMS)–certified acute-care hospitals in the United States.
2013–2017 data from the CMS Hospital Compare, Provider of Service File and Medicare Cost Reports.
Difference-in-difference model with hospital fixed effects to compare California with all other states before and after the ASP mandate. The outcomes were standardized infection ratios (SIRs) for MRSA and CDI. We analyzed the following time-variant covariates: medical school affiliation, bed count, quality accreditation, number of changes in ownership, compliance with CMS requirements, percentage of intensive care unit beds, average length of stay, patient safety index, and 30-day readmission rate.
In 2013, California hospitals had an average MRSA SIR of 0.79 versus 0.94 in other states, and an average CDI SIR of 1.01 versus 0.77 in other states. California hospitals had increases (P < .05) of 23%, 30%, and 20% in their MRSA SIRs in 2015, 2016, and 2017, respectively. California hospitals were associated with a 20% (P < .001) decrease in the CDI SIR only in 2017.
The mandate was associated with a decrease in CDI SIR and an increase in MRSA SIR.
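The study's model includes hospital fixed effects and covariates, but the underlying logic is the canonical two-group, two-period difference-in-differences estimator. A stripped-down sketch with made-up SIR values (not the study's data):

```python
from statistics import mean

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Canonical 2x2 difference-in-differences: the treated group's
    pre-to-post change in mean outcome minus the control group's change,
    which nets out shared time trends."""
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical MRSA SIRs: California hospitals vs other-state hospitals
estimate = diff_in_diff(
    treated_pre=[0.79, 0.81], treated_post=[0.95, 0.97],
    control_pre=[0.94, 0.96], control_post=[0.93, 0.95],
)
```

Here the treated group rises by 0.16 while the control group falls by 0.01, so the estimated mandate effect is +0.17, i.e., a relative increase, matching the direction of the MRSA finding.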
Depression is common in people living with HIV (PLWH) and can contribute to neurocognitive dysfunction. Depressive symptoms in PLWH are often measured by assessing only cognitive/affective symptoms. Latinx adults, however, often express depressive symptoms in a somatic/functional manner, which is not typically captured in assessments of depression among PLWH. Given the disproportionate burden of HIV that Latinx adults face, examining whether variations in expressed depressive symptoms differentially predict neurocognitive outcomes between Latinx and non-Hispanic white PLWH is essential.
This cross-sectional study included 140 PLWH (71% Latinx; 72% male; mean (M) age = 47.1 ± 8.5 years; M education = 12.6 ± 2.9 years) who completed a comprehensive neurocognitive battery, Wechsler Test of Adult Reading (WTAR), and Beck Depression Inventory-II (BDI-II). Neurocognitive performance was measured using demographically adjusted T-scores. BDI-II domain scores were computed for the Fast-Screen (cognitive/affective items) score (BDI-FS) and non-FS score (BDI-NFS; somatic/functional items).
Linear regressions revealed that the BDI-NFS significantly predicted global neurocognitive function and processing speed in the Latinx group (p < .05), such that higher somatic/functional symptoms predicted worse performance. In the non-Hispanic white group, cognitive/affective symptoms significantly predicted processing speed (p = .02), with more symptoms predicting better performance. Interaction terms between ethnicity and each BDI-II sub-score indicated that Latinx participants with higher cognitive/affective symptoms performed worse on executive functioning.
Depressive symptoms differentially predict neurocognitive performance in Latinx and non-Hispanic white PLWH. These differences should be considered when conducting research and intervention among the increasingly culturally and ethnically diverse population of PLWH.
This paper reviews current knowledge of the structure, genesis, cytochemistry and putative functions of the haplosporosomes of haplosporidians (Urosporidium, Haplosporidium, Bonamia, Minchinia) and paramyxids (Paramyxa, Paramyxoides, Marteilia, Marteilioides, Paramarteilia), and the sporoplasmosomes of myxozoans (Myxozoa – Malacosporea, Myxosporea). In all 3 groups, these bodies occur in plasmodial trophic stages, disappear at the onset of sporogony, and reappear in the spore. Some haplosporidian haplosporosomes lack the internal membrane regarded as characteristic of these bodies and that phylum. Haplosporidian haplosporogenesis is through the Golgi (spherulosome in the spore), either to form haplosporosomes at the trans-Golgi network, or for the Golgi to produce formative bodies from which membranous vesicles bud, thus acquiring the external membrane. The former method also forms sporoplasmosomes in malacosporeans, while the latter is the common method of haplosporogenesis in paramyxids. Sporoplasmogenesis in myxosporeans is largely unknown. The haplosporosomes of Haplosporidium nelsoni and sporoplasmosomes of malacosporeans are similar in arraying themselves beneath the plasmodial plasma membrane with their internal membranes pointing to the exterior, possibly to secrete their contents to lyse host cells or repel haemocytes. It is concluded that these bodies are probably multifunctional within and between groups, their internal membranes separating different functional compartments, and their origin may be from common ancestors in the Neoproterozoic.
Few studies have examined burnout in psychosocial oncology clinicians. The aim of this systematic review was to summarize what is known about the prevalence and severity of burnout in psychosocial clinicians who work in oncology settings and the factors that are believed to contribute or protect against it.
Articles on burnout (including compassion fatigue and secondary trauma) in psychosocial oncology clinicians were identified by searching PubMed/MEDLINE, EMBASE, PsycINFO, the Cumulative Index to Nursing and Allied Health Literature, and the Web of Science Core Collection.
Thirty-eight articles were reviewed at the full-text level, and of those, nine met study inclusion criteria. All were published between 2004 and 2018 and included data from 678 psychosocial clinicians. Quality assessment revealed relatively low risk of bias and high methodological quality. Study composition and sample size varied greatly, and the majority of clinicians were aged between 40 and 59 years. Across studies, 10 different measures were used to assess burnout, secondary traumatic stress, and compassion fatigue, in addition to factors that might impact burnout, including work engagement, meaning, and moral distress. When compared with other medical professionals, psychosocial oncology clinicians endorsed lower levels of burnout.
Significance of results
This systematic review suggests that psychosocial clinicians are not at increased risk of burnout compared with other health care professionals working in oncology or in mental health. Although the data are quite limited, several factors appear to be associated with less burnout in psychosocial clinicians, including exposure to patient recovery, discussing traumas, less moral distress, and finding meaning in their work. More research using standardized measures of burnout with larger samples of clinicians is needed to examine both prevalence rates and how the experience of burnout changes over time. By virtue of their training, psychosocial clinicians are well placed to support each other and their nursing and medical colleagues.
In this paper, we investigate the impingement of a two-dimensional (2-D) vortex pair translating downwards onto a horizontal wall with a wavy surface. A principal purpose is to compare the vortex dynamics with the complementary case of a wavy vortex pair (deformed by the long-wavelength Crow instability) impinging onto a flat surface. The simpler case of a 2-D vortex pair descending onto a flat horizontal ground plane leads to the well known ‘rebound’ effect, wherein the primary vortex pair approaches the wall but subsequently advects vertically upwards, due to the induced velocity of secondary vorticity. In contrast, a wavy vortex pair descending onto a flat plane leads to ‘rebounding’ vorticity in the form of vortex rings. A descending 2-D vortex pair, impinging on a wavy wall, also generates ‘rebounding’ vortex rings. In this case, we observe that the vortex pair interacts first with the ‘hills’ of the wavy wall before the ‘valleys’. The resulting secondary vorticity rolls up into a concentrated vortex tube, ultimately forming a vortex loop along each valley. Each vortex loop pinches off to form a vortex ring, which advects upwards. Surprisingly, these rebounding vortex rings evolve without the strong axial flows fundamental to the wavy vortex case. The present research is relevant to wing tip trailing vortices interacting with a non-uniform ground plane. A non-flat wall is shown to accelerate the decay of the primary vortex pair. Such a passive, ground-based method to diminish the wake vortex hazard close to the ground is consistent with Stephan et al. (J. Aircraft, vol. 50 (4), 2013a, pp. 1250–1260; CEAS Aeronaut. J., vol. 5 (2), 2013b, pp. 109–125).
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL whose symptom onset was <48 hours earlier. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics, including mean age (66.3 vs 63.4 years), duration of AAFL (30.1 vs 24.5 hours), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group, 21.2% of patients converted with the infusion.
There were no statistically significant differences in time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), or stroke within 14 days (0 vs 0). Premature discontinuation of the infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%), but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one-fifth of patients, much lower than the rate seen for acute AF.
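The trial reports the conversion comparison as an absolute difference with a 95% CI. The exact interval method is not stated in the abstract, but a simple Wald interval on the reported counts (33/33 vs 40/43) reproduces essentially the quoted −0.6% to 14.6%:

```python
import math

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Absolute risk difference (p1 - p2) between two groups with a
    Wald-style 95% confidence interval."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Drug-Shock converted 33/33; Shock Only converted 40/43
diff, lo, hi = risk_difference_ci(33, 33, 40, 43)
```

This yields a difference of about 7.0% with an interval of roughly −0.6% to 14.6%, an interval crossing zero, consistent with the non-significant P value.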
Introduction: Cases of anaphylaxis in children are often not appropriately managed by caregivers. We aimed to develop and test the effectiveness of an education tool to help pediatric patients and their families better understand anaphylaxis and its management, and to improve adherence to current knowledge and treatment guidelines. Methods: The GEAR (Guidelines and Educational programs based on an Anaphylaxis Registry) initiative recruits children with food-induced anaphylaxis who have visited the ED at the Montreal Children's Hospital and The Children's Clinic in Montreal, Quebec. Patients and parents together were asked to complete six questions on the triggers, recognition, and management of anaphylaxis at the time of presentation to the allergy clinic. Participants were then automatically shown a 5-minute animated video addressing the main knowledge gaps related to the causes and management of anaphylaxis. At the end of the video, participants were redirected to the same six questions and asked to respond again. To test long-term knowledge retention, the questionnaire will be presented again in one year's time. A paired t-test was used to compare the baseline and follow-up scores, based on the percentage of correct answers on the questionnaire. Results: From June to November 2019, 95 pediatric patients with diagnosed food-induced anaphylaxis were recruited. The median patient age was 4.5 years (interquartile range (IQR): 1.6-7.4) and half were male (51.6%). The mean baseline questionnaire score was 0.77 (77.0%, standard deviation (SD): 0.16) and the mean follow-up score was 0.83 (83.0%, SD: 0.17). There was a significant difference between the follow-up and baseline scores (difference: 0.06, 95% CI: 0.04, 0.09). There were no associations of baseline questionnaire scores or change in scores with age or sex.
Conclusion: Our video teaching method was successful in educating patients and their families to better understand anaphylaxis. The next step is to acquire long-term follow-up scores to determine retention of knowledge.
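The paired comparison used above (each participant's baseline score against their follow-up score) can be sketched as follows. The scores below are illustrative, not the study's data, and a proper paired t-test would use the t distribution with n − 1 degrees of freedom rather than the normal critical value used here:

```python
import math
import statistics

def paired_change(baseline, follow_up, crit=1.96):
    """Mean within-pair change (follow-up minus baseline) with an
    approximate 95% confidence interval based on the standard error
    of the paired differences."""
    diffs = [f - b for b, f in zip(baseline, follow_up)]
    m = statistics.mean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(len(diffs))
    return m, m - crit * se, m + crit * se

# Hypothetical proportion-correct scores for four participants
m, lo, hi = paired_change([0.70, 0.80, 0.75, 0.77], [0.80, 0.85, 0.80, 0.87])
```

An interval that excludes zero, as in the study's 0.04 to 0.09, indicates a significant pre/post improvement.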