We describe the scientific goals and survey design of the First Large Absorption Survey in H i (FLASH), a wide-field survey for 21-cm line absorption in neutral atomic hydrogen (H i) at intermediate cosmological redshifts. FLASH will be carried out with the Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope and is planned to cover the sky south of $\delta \approx +40\,\deg$ at frequencies between 711.5 and 999.5 MHz. At redshifts between $z = 0.4$ and $1.0$ (look-back times of 4–8 Gyr), the H i content of the Universe has been poorly explored due to the difficulty of carrying out radio surveys for faint 21-cm line emission and, at ultraviolet wavelengths, space-borne searches for damped Lyman-$\alpha$ absorption in quasar spectra. The ASKAP wide field of view and large spectral bandwidth, in combination with a radio-quiet site, will enable a search for absorption lines in the radio spectra of bright continuum sources over 80% of the sky. This survey is expected to detect at least several hundred intervening 21-cm absorbers and will produce an H i-absorption-selected catalogue of galaxies rich in cool, star-forming gas, some of which may be concealed from optical surveys. Likewise, at least several hundred associated 21-cm absorbers are expected to be detected within the host galaxies of radio sources at $0.4 < z < 1.0$, providing valuable kinematical information for models of gas accretion and jet-driven feedback in radio-loud active galactic nuclei. FLASH will also detect OH 18-cm absorbers in diffuse molecular gas, megamaser OH emission, radio recombination lines, and stacked H i emission.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, individual weed plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest pose a risk of escaping both early-season management and HWSC.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool but clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits. 
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
Introduction: Discharge communication in the pediatric emergency department (ED) is an important aspect of a successful transition home for patients and families. The content, process, and pattern of discharge communication in a pediatric ED encounter have yet to be comprehensively explored. The objective of this study was to identify and characterize elements and patterns of discharge communication occurring during pediatric ED visits between health care providers (HCPs) and families. Methods: We analyzed real-time video observations (N = 53) of children (0-18) presenting to two Canadian pediatric EDs with fever or minor head injury. We used a revised version of an existing coding scheme, PEDICSv2, to code all encounters. PEDICSv2 includes 32 elements capturing discharge communication. Inter-rater reliability was established with a second coder. Descriptive statistics reflecting the rates of delivery of each communication content element were reported to assess repetition at four stages of the visit (introduction/planning, actions/interventions, diagnosis/home management plan, and summary/conclusion). Communication content was analyzed to depict the behaviors of individual HCPs and the total communication delivered to the patient and caregiver by the healthcare team. Results: Results show that 55.6% of families were asked to repeat their main concern by multiple HCPs during their ED visit. However, only 14.8% of families had their comprehension of delivered discharge information assessed by more than one HCP. When involved in care, physicians were the HCPs most likely to perform a comprehension assessment. Most of the communication delivered by nursing staff consisted of elements from the introduction/planning and action/intervention stages of the visit. Conclusion: Findings indicate that most repetition occurs while eliciting a main concern during the introduction and planning stage of a pediatric ED encounter.
In contrast, communication elements focusing on understanding the home management plan are less likely to be repeated by multiple HCPs. Future work focusing on structuring team workflow to minimize repetition during the introduction and planning stage may allow for clearer discharge teaching and more frequent comprehension assessment.
Treatment-resistant schizophrenia (TRS) is one of the most disabling of psychiatric disorders, affecting about one-third of patients. First-line treatments include both atypical and typical antipsychotics. The original atypical, clozapine, is a final option, and although it has been shown to be the only effective treatment for TRS, many patients do not respond well to clozapine. Clozapine use is related to adverse events, most notably agranulocytosis, a potentially fatal blood disorder which affects about 1% of those prescribed clozapine and requires regular blood monitoring. This acts as a barrier to prescription, and TRS patients face a long delay in access, of five or more years from first antipsychotic prescription. Better tools to predict treatment resistance and to identify risk of adverse events would allow faster and safer access to clozapine for patients who are likely to benefit from it. The CRESTAR project (www.crestar-project.eu) is a European Framework 7 collaborative project that aims to develop tools to predict i) treatment response, particularly for patients who are less likely to respond to usual antipsychotics, indicating treatment with clozapine as early as possible, ii) patients who are at high or low risk of adverse events and side effects, and iii) extreme TRS patients, so that they can be stratified in clinical trials for novel treatments. CRESTAR has addressed these questions by examining genome-wide association data, genome sequence, epigenetic biomarkers, and epidemiological data in European patient cohorts characterized for treatment response and adverse drug reactions, using data from clozapine therapeutic drug monitoring and linked national population medical and pharmacy databases, to identify predictive factors. In parallel, CRESTAR will perform health economic research on potential benefits, and ethics and patient-centred research with stakeholders.
Adult forms of members of the Callodistomidae always parasitize the gallbladder of freshwater fishes and occur in Africa and America. This study provides a description of a new South American species belonging to Prosthenhystera from the gallbladder of a characid fish (Bryconamericus ikaa), and ribosomal gene sequences (28S rDNA and ITS1-5.8S-ITS2) are used to demonstrate molecular differences between the new species and congeners as well as to explore interrelationships among congeners. Additionally, the first cytological analysis is conducted for a member of the family to determine chromosome number and arrangement. Prosthenhystera gattii n. sp. most closely resembles Prosthenhystera caballeroi in morphology, but the vitellarium is more extensive, reaching anterior to the caecal bifurcation in the new species, and the uterus is confined to the hindbody in P. gattii n. sp., whereas it extends to the level of the pharynx in P. caballeroi. Also, the testes, cirrus sac, seminal receptacle and the ratio of body length to width are larger in P. gattii n. sp. Independent Bayesian inference analyses of 28S rDNA and ITS1-5.8S-ITS2 sequence fragments produced phylograms showing that P. gattii n. sp. is more similar to Prosthenhystera obesa + Prosthenhystera oonastica than to P. caballeroi + two unidentified species of Prosthenhystera, but with poor posterior probability support for the node in the ITS1-5.8S-ITS2-based phylogram. Further, the genetic distance between P. oonastica and P. gattii n. sp. is the largest among Prosthenhystera spp. Cytological analysis revealed ten metacentric chromosomes, which is fewer than the 12–18 chromosomes present in species from the closely related Gorgoderidae.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean (Glycine max L. Merr.) sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.), and forage radish (Raphanus sativus L.) interseeded into corn grain (Z. mays L.) was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were the primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance than the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
Introduction: Effective communication to develop a shared understanding of patient/caregiver (P/C) expectations is critical during emergency department (ED) encounters. However, there is limited research examining the use of communication tools for P/C expectations to improve communication in the ED. The objective of this study was to examine satisfaction with a patient expectations questionnaire, known as the PrEPP tool, and its impact on communication and management of patients in the ED. Methods: The PrEPP tool collected P/C expectations over three phases of the study. In phase 1, the PrEPP tool was distributed to all P/Cs (CTAS score of 2 to 5) in four EDs in Nova Scotia. In phase 2, the PrEPP tool was refined to a 5-item questionnaire. In phase 3, the PrEPP tool was re-implemented over a six-month period. Follow-up surveys were distributed to P/Cs via email (phases 1 and 3) and to HCPs on iPads in the ED (phase 3) to determine the impact of the tool on communication and management of patients. Entries were compiled in a REDCap database, and descriptive statistics were used to analyze responses related to satisfaction. Results: In Phase 1, 11,418 PrEPP tools and 147 surveys (29% response rate) were collected from January to June 2016. The majority of P/Cs found the PrEPP questionnaire easy to complete (95.9%) and felt HCPs met their expectations (87.1%).
In Phase 3, 951 P/C surveys (31.1% response rate) and 128 HCP surveys were collected. Of P/C respondents, 45.9% felt PrEPP helped to communicate expectations, while 49.7% said that they would like to use it on future ED visits. The majority of P/C respondents (75.4%) indicated their expectations were met during their visit to the ED. Of those whose expectations were not met, 69% felt their expectations were not discussed. The majority of HCP respondents (90.4%) indicated they used the PrEPP tool at least sometimes. Also, 78.4% said it influenced patient communication, and 42% indicated the tool influenced management of patients at least sometimes. Conclusion: Obtaining expectations early in the patient encounter may provide opportunities for improved communication in the ED. P/Cs found the PrEPP tool easy to use to communicate their expectations, and HCPs felt it influenced communication and management of patients in the ED. Further qualitative thematic analysis is needed to explore how the PrEPP tool impacted ED visits.
Introduction: Effective communication to develop a shared understanding of patient expectations is critical to a positive encounter in the emergency department (ED). However, there is limited research examining patient/caregiver (P/C) expectations in the ED and what factors lead to P/C presentation. This study aims to address this gap by answering the following questions: 1) What are common P/C-reported factors affecting ED presentation? 2) What are common P/C expectations of an ED visit? 3) How do P/C expectations vary based on ED site or factors affecting presentation in the ED? Methods: The Preparing Emergency Patients and Providers (PrEPP) tool was designed to collect P/C expectations, worries, perceived causes of symptoms, and factors affecting presentation from a convenience sample of patient visits to the ED. The PrEPP tool was provided to all P/Cs with CTAS 2-5 when they registered at one of four EDs in the Halifax area from January to June 2016. Completed tools were collected in a REDCap database, where qualitative data were coded into categories (i.e., presenting illness, injury). Descriptive and chi-squared statistical analyses were performed. Results: In total, 11,418 PrEPP tools were collected, representing 12% of the total ED visits to the four ED sites during the study period. The main factors affecting ED presentation were: self-referral (68%), family/friends (20%), telehealth (8%), being unable to see their GP (7%), GP referral (6%), or a walk-in clinic (5%). P/Cs' main causes of worry were: presenting illness (19%), injury (15%), or pain (14%). The main expectations for the ED visit were to get a physician's opinion (73%), an x-ray (40%), or a blood test (20%). Most P/Cs indicated they did not expect medication during (63%), or after (66%), their ED visit.
There were significant differences in P/C expectations between adult and pediatric EDs (χ2 = 720.949, df = 14, P < 0.001) and between P/Cs unable or able to access primary care prior to ED presentation (χ2 = 38.980, df = 1, P < 0.001). The rate of expecting a physician's opinion at the pediatric ED was higher than at the adult ED (77.6% vs 70.9%), while lower for expecting CT/MRIs (4.6% vs 11.4%). P/Cs who were unable to access primary care prior to ED presentation expected services that were available in primary care at a higher rate than those who accessed primary care (58.5% vs 36.7%). Conclusion: Our findings identify some of the factors that influence P/Cs' decisions to present to the ED and their expectations of the ED visit.
It has recently been shown that the abundance of cold neutral gas may follow a similar evolution to the star formation history. This is physically motivated, since stars form out of this component of the neutral gas, and, if this is the case, it would resolve the long-standing issue that there is a clear disparity between the total abundance of neutral gas and star-forming activity over the history of the Universe. Radio-band 21-cm absorption traces the cold gas, and comparison with the Lyman-α absorption, which traces all of the gas, provides a measure of the cold gas fraction, or the spin temperature, Tspin. The recent study has shown that the spin temperature (degenerate with the ratio of the absorber/emitter extent) appears to be anti-correlated with the star formation density, ψ*, with 1/Tspin undergoing a similar steep evolution to ψ* over redshifts of 0 ≲ z ≲ 3, whereas the total neutral hydrogen exhibits little evolution. Above z ∼ 3, where ψ* shows a steep decline with redshift, there are insufficient 21-cm data to determine whether 1/Tspin continues to follow ψ*. Knowing this is paramount in ascertaining whether the cold neutral gas does trace the star formation over the Universe’s history. We explore the feasibility of resolving this with 21-cm observations of the largest contemporary sample of reliable damped Lyman-α absorption systems and conclude that, while today’s largest radio interferometers can reach the required sensitivity at z ≲ 3.5, the Square Kilometre Array is required to probe higher redshifts.
Seven half-day regional listening sessions were held between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide-resistance management. The objective of the listening sessions was to connect with stakeholders and hear their challenges and recommendations for addressing herbicide resistance. The coordinating team hired Strategic Conservation Solutions, LLC, to facilitate all the sessions. They and the coordinating team used in-person meetings, teleconferences, and email to communicate and coordinate the activities leading up to each regional listening session. The agenda was the same across all sessions and included small-group discussions followed by reporting to the full group for discussion. The planning process was the same across all the sessions, although the selection of venue, time of day, and stakeholder participants differed to accommodate the differences among regions. The listening-session format required a great deal of work and flexibility on the part of the coordinating team and regional coordinators. Overall, the participant evaluations from the sessions were positive, with participants expressing appreciation that they were asked for their thoughts on the subject of herbicide resistance. This paper details the methods and processes used to conduct these regional listening sessions and provides an assessment of the strengths and limitations of those processes.
Herbicide resistance is ‘wicked’ in nature; therefore, results of the many educational efforts to encourage diversification of weed control practices in the United States have been mixed. It is clear that we do not sufficiently understand the totality of the grassroots obstacles, concerns, challenges, and specific solutions needed for varied crop production systems. Weed management issues and solutions vary with such variables as management styles, regions, cropping systems, and available or affordable technologies. Therefore, to help the weed science community better understand the needs and ideas of those directly dealing with herbicide resistance, seven half-day regional listening sessions were held across the United States between December 2016 and April 2017 with groups of diverse stakeholders on the issues and potential solutions for herbicide resistance management. The major goals of the sessions were to gain an understanding of stakeholders and their goals and concerns related to herbicide resistance management, to become familiar with regional differences, and to identify decision maker needs to address herbicide resistance. The messages shared by listening-session participants could be summarized by six themes: we need new herbicides; there is no need for more regulation; there is a need for more education, especially for others who were not present; diversity is hard; the agricultural economy makes it difficult to make changes; and we are aware of herbicide resistance but are managing it. The authors concluded that more work is needed to bring a community-wide, interdisciplinary approach to understanding the complexity of managing weeds within the context of the whole farm operation and for communicating the need to address herbicide resistance.
Introduction: TREKK is a national knowledge mobilization network of clinicians, researchers, and parents aimed at improving emergency care for children by increasing collaboration between general and pediatric emergency departments (EDs). This study aimed to determine patterns of knowledge sharing within the network and identify connections, barriers, and opportunities related to obtaining pediatric information and training. Methods: Social network analysis (SNA) uses network theory to understand patterns of interaction. Two SNAs were conducted in 2014 and 2015 using an online network survey distributed to 37 general EDs. Data were analyzed using UCINET and NetDraw to identify connections, knowledge sharing, and knowledge brokers within the network. Building on these results, we then conducted 22 semi-structured follow-up interviews (2016) with healthcare professionals (HCPs) at general EDs across Canada, purposefully sampled to include individuals from connected and disconnected sites, as identified in the SNA. Interviews were analyzed by two reviewers using content and thematic analysis. Results: SNA data were analyzed for 135 participants across the network. Results from 2014 showed that the network was divided along provincial lines, with most individuals connecting with colleagues within their own institution. Results from 2015 showed more inter-site interconnectivity and a reduction in isolated sites over time, from 17 to 3. Interview participants included physicians (59%) and nurses (41%) from 18 general EDs in urban (68%) and rural/remote (32%) Canada. HCPs sought information both formally and informally, by using guidelines, talking to colleagues, and attending pediatric-related training sessions. Network structure and processes were felt to increase connections, support practice change, and promote standards of care.
Participants identified personal, organizational and system-level barriers to information and skill acquisition, including resources and personal costs, geography, dissemination, and time. Providing easy access to information at the point of care was promoted through enhancing content visibility and by embedding resources into local systems. There remains a need to share successful methods of local dissemination and implementation across the network, and to leverage local professional champions such as clinical nurse liaisons. Conclusion: This study highlights the power of a network to increase connections between HCPs working in general and pediatric EDs. Findings reinforce the critical role of ongoing network evaluation to improve the design and delivery of knowledge mobilization initiatives.
Introduction: The World Health Organization recommends emergency care training for laypeople in low-resource settings, but the effects of these programs on patient outcomes and community health have not been systematically reviewed. Our objective was to identify the individual and community health effects of educating laypeople to deliver emergency care in low-resource settings. Methods: We conducted a systematic review to address this question: in low-resource populations (P), does emergency care education for laypeople (I) confer any measurable effect on patient morbidity and mortality, or community capacity and resilience for emergency health conditions (O), in comparison with no training or other education (C)? We searched 12 electronic databases and grey literature for quantitative studies. We conducted duplicate and independent title and abstract screening, methodological and outcomes extraction, and study quality assessment using the Effective Public Health Practice Tool. We developed a narrative summary of findings (PROSPERO: CRD42014009685). Results: We reviewed 16,017 abstracts and 372 full-text papers; 38 met inclusion criteria. Most topically relevant papers were excluded because they assessed educational outcomes. Cardiopulmonary resuscitation training (6 papers) improved cardiac arrest survival and enhanced capacity to respond to cardiac arrest in rural Norway, Denmark, and commercial aircraft operations. A public education campaign in remote Denmark improved absolute cardiac arrest survival by 5.4% (95% CI 2-12). Lay trauma training (12 papers) reduced absolute injury mortality and improved community capacity in Iraq, Cambodia, Iran, and Indigenous New Zealand communities. A trauma care program in Iraq and Cambodia reduced absolute mortality by 25% (95% CI 17.2-33). Education for mothers on paediatric fevers in Ethiopia was associated with a 40% relative reduction in under-5 mortality (95% CI 29.2-50.6).
Similar training improved access to care for paediatric malnutrition, malaria, pneumonia, and gastrointestinal disease in Nigeria, Kenya, Senegal, Burkina Faso, Mali, and India (13 papers). Overdose education and naloxone distribution were associated with reductions in opioid overdose deaths (3 papers), including in Massachusetts, where high-uptake communities for overdose education had significantly lower overdose fatality rates than no-uptake communities (rate ratio 0.54, 95% CI 0.39-0.76). Community education improved measures of access to emergency care for remote Indigenous populations in Canada, Alaska, and Nepal (3 papers) and adolescent mental health capacity in Australia (1 paper). Studies were of low or medium quality. Conclusion: In addition to established interventions for injury and cardiac arrest, emergency care training can improve community capacity in underserviced populations and save lives in opioid overdose, paediatric infectious disease, and malnutrition.
Introduction: Optimal discharge communication between healthcare providers and parents who present to the emergency department (ED) with their children is not well understood. Current research regarding discharge communication is equivocal and predominantly focused on evaluating different delivery formats or strategies, with little attention given to communication behaviours or the context in which the communication occurs. The objective of this study was to characterize the process and structure of discharge communication in a pediatric ED context. Methods: Real-time video observation and follow-up surveys were used in two academic pediatric EDs in Canada. Parents who presented with their child to the ED with one of six illness presentations and a Canadian Triage and Acuity Scale score of 3-5 were eligible to participate. All ED physicians, learners, and staff members were also eligible. Provider-parent communication was analyzed using the Roter Interaction Analysis System (RIAS) to code each utterance. Parent health literacy and anxiety were measured upon admission to the ED. Parent recall of important discharge information and satisfaction with communication were assessed within 72 hours of discharge. Results: A total of 107 ED patient visits were video recorded, and a total of 70,000 utterances were coded across six illness presentations: abdominal pain (n=23), asthma (n=7), bronchiolitis (n=4), diarrhea/vomiting (n=20), fever (n=27), and minor head injury (n=26). The average length of stay for participants was 3 hours, with an average of three provider interactions per visit. Interactions ranged from less than one minute up to 29 minutes, with an average of six minutes per interaction. The majority of visits were first episodes for the presenting illness (63.2%). Physician utterances most commonly involved giving medical information (22.9%), whereas nurses most commonly gave orientation instructions (20.9%).
Learners were most likely to employ active listening techniques (14.2%). Communication that provided post-discharge instructions for parents comprised 8.5% of all utterances. Overall, providers infrequently assessed parental understanding of information (2.0%). Only 26% of parents recalled receiving important discharge information deemed relevant to their child's disposition. Yet, parent satisfaction with the amount of information communicated during the ED visit was generally high (89.6% agreed or strongly agreed). Conclusion: This is the first study of ED discharge communication to be conducted in a pediatric setting using video observation methods. Provider-parent communication was predominantly characterized by giving medical information, with little time devoted to preparing families to care for their child at home. Greater assessment of parent comprehension of discharge communication is needed to ensure that parents understand important instructions and know when to seek further care.
On 27 April 2015, Washington health authorities identified Escherichia coli O157:H7 infections associated with dairy education school field trips held in a barn 20–24 April. Investigation objectives were to determine the magnitude of the outbreak, identify the source of infection, prevent secondary illness transmission and develop recommendations to prevent future outbreaks. Case-finding, hypothesis-generating interviews, environmental site visits and a case–control study were conducted. Parents and children were interviewed regarding event activities. Odds ratios (OR) and 95% confidence intervals (CI) were computed. Environmental testing was conducted in the barn; isolates were compared to patient isolates using pulsed-field gel electrophoresis (PFGE). Sixty people were ill, 11 (18%) were hospitalised and six (10%) developed haemolytic uraemic syndrome. Ill people ranged in age from <1 year to 47 years (median: 7), and 20 (33%) were female. Twenty-seven case-patients and 88 controls were enrolled in the case–control study. Among first-grade students, handwashing (i.e. soap and water, or hand sanitiser) before lunch was protective (adjusted OR 0.13; 95% CI 0.02–0.88, P = 0.04). Barn samples yielded E. coli O157:H7 with PFGE patterns indistinguishable from patient isolates. This investigation provided epidemiological, laboratory and environmental evidence for a large outbreak of E. coli O157:H7 infections from exposure to a contaminated barn. The investigation highlights the often overlooked risk of infection through exposure to animal environments as well as the importance of handwashing for disease prevention. Increased education and encouragement of infection prevention measures, such as handwashing, can prevent illness.
Introduction: Understanding factors that influence laboratory test ordering in emergency departments (EDs) can help to improve current laboratory test ordering practices. The aim of this study is to compare patterns and influences in laboratory test ordering between emergency physicians and nurses at two ED sites, Halifax Infirmary (HI) and Dartmouth General (DG). Methods: A mixed-methods approach involving administrative data and telephone interviews was employed. Data from 211,279 patients at HI and DG EDs were analyzed. Chi-square analysis and binary logistic regression were used to determine significant factors influencing whether a test was ordered, as well as significant factors predicting the likelihood of a nurse or a physician ordering a test. All significant associations had a p-value of <0.0001. Interviews were conducted (n=25) with doctors and nurses to explore areas of potential influence in a clinician's decision-making process and to discuss what makes decision making difficult or inconsistent in the ED. These interviews were analyzed according to the Theoretical Domains Framework. The interviews were coded by two individuals using a consensus methodology to ensure coding accuracy. Results: Overall, laboratory tests were more likely to be ordered at DG than at HI (OR=1.52, 95% CI: [1.48, 1.55]). Laboratory tests were more likely to be ordered by nurses at DG than at HI (OR=1.58, 95% CI: [1.54, 1.62]). Laboratory tests were more likely to be ordered if the ED was not busy, or if the patient was over 65, had a high acuity, had a long stay in the ED, required consults, or was admitted to hospital. Doctors were more likely to order a laboratory test for patients over 65 or those requiring consults or hospital admission, whereas nurses were more likely to order laboratory tests for patients with high acuity or long stays in the ED.
Data from the interviews suggested differing influences on decision making between nurses and doctors, especially in the areas of social influence and knowledge. Conclusion: Currently, there is limited research that investigates the behaviour of both emergency physicians and nurses. By determining the barriers most amenable to behaviour change in emergency physicians and nurses, findings from this work may be used to update practice guidelines, ensuring more consistency and efficiency in laboratory test ordering in the ED.
Introduction: Effective communication to develop a shared understanding of patient expectations is critical in establishing a positive medical encounter in the emergency department (ED). However, there is limited research examining patient/caregiver expectations in the ED and their impact on beliefs, attitudes and behaviours during and after an ED visit. The objective of this study is to examine patient/caregiver expectations and satisfaction with care in the ED using a patient expectation questionnaire and a follow-up survey. Methods: As part of a larger 3-phase study on patient/caregiver expectations in adult and pediatric EDs, a 7-item, paper-based questionnaire was distributed to all patients and/or caregivers who presented to one of four EDs in Nova Scotia with a Canadian Triage and Acuity Scale (CTAS) score of 2 to 5. A follow-up survey was distributed to all willing participants via email to determine their satisfaction with care received in the ED. Descriptive statistics were used to analyze responses. Results: Phase 1 was conducted from January to September 2016. In total, 24,788 expectation questionnaires were distributed to ED patients/caregivers, 11,571 were collected (47% response rate), and 509 patients were contacted for a follow-up survey. Preliminary analysis of 4,533 questionnaires shows the majority of patients (67.1%) made the decision by themselves to present to the ED, while others were advised by a family member or friend (22%). Respondents were most worried about an injury (17.8%) followed by illness (15.6%) and expected to talk to a physician (69.9%) and receive an x-ray (39.3%). The majority of physicians (53.3%) reported the expectation tool helped in caring for the patient, and 87.5% felt they met patient expectations. There were 147 patient/caregiver responses to the follow-up survey (29% response rate), and 87.1% of responders reported that ED clinicians met their expectations.
Conclusion: Patients/caregivers have a variety of concerns, questions, and expectations when presenting to the ED. Obtaining expectations early in the patient encounter may provide opportunities for improved communication between clinicians and patients while enhancing satisfaction with care received. Further analysis is needed to determine the impact of the expectation questionnaire on productivity in the ED.