BASF Corporation has developed p-hydroxyphenylpyruvate dioxygenase (HPPD) inhibitor-resistant cotton and soybean that will allow growers to use isoxaflutole in future weed management programs. In 2019 and 2020, a multi-state research project was conducted in non-crop settings to examine weed control following isoxaflutole applied preemergence alone and with several tank-mix partners at high and low labeled rates. At 28 DAT, Palmer amaranth was controlled ≥95% at 6 of 7 locations with isoxaflutole plus the high rate of diuron or fluridone. These same combinations provided the greatest control 42 DAT at 4 of 7 locations. Where large crabgrass was present, isoxaflutole plus the high rate of diuron, fluridone, pendimethalin, or S-metolachlor, or isoxaflutole plus the low rate of fluometuron, controlled large crabgrass ≥95% in 2 of 3 locations 28 DAT. In 2 of 3 locations, isoxaflutole plus the high rate of pendimethalin or S-metolachlor improved large crabgrass control 42 DAT compared with isoxaflutole alone. At 21 DAT, morningglory was controlled ≥95% at all locations with isoxaflutole plus the high rate of diuron and at 3 of 4 locations with isoxaflutole plus the high rate of fluometuron. At 42 DAT at all locations, isoxaflutole plus diuron or fluridone and isoxaflutole plus the high rate of fluometuron improved morningglory control compared with isoxaflutole alone. These results suggest that isoxaflutole applied preemergence, alone or in tank mixture, is efficacious on a broad spectrum of annual weeds in cotton, and that extended weed control may be achieved when isoxaflutole is tank-mixed with soil-residual herbicides.
The discovery of wake bistability has generated an upsurge in experimental investigations into the wakes of simplified vehicle geometries. Particular focus has centred on the probabilistic switching between two asymmetrical bistable wake states of a square-back Ahmed body; however, the majority of this research has been undertaken in wind tunnels with turbulence intensities of less than $1\,\%$, considerably lower than typical atmospheric levels. To better simulate bistability under on-road conditions, in which turbulence intensities can easily reach levels of $10\,\%$ or more, this experimental study investigates the effects of free-stream turbulence on the bistability characteristics of the square-back Ahmed body. Through passive generation and quantification of the free-stream turbulent conditions, a monotonic correlation was found between the switching rate and free-stream turbulence intensity.
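The monotonic correlation between switching rate and free-stream turbulence intensity reported above is the kind of relationship a rank-based statistic captures. As a minimal sketch, Spearman's rank correlation can be computed from scratch; the data below are illustrative, not measurements from the study, and the helper names are my own:

```python
# Spearman rank correlation: the Pearson correlation of the ranks.
# It detects monotonic (not necessarily linear) association.

def ranks(xs):
    """Average 1-based ranks, with ties sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Illustrative: switching rate rising monotonically with turbulence intensity
ti = [0.5, 2.0, 4.5, 7.0, 10.0]        # free-stream turbulence intensity, %
rate = [0.02, 0.05, 0.11, 0.18, 0.30]  # wake switching rate (arbitrary units)
print(spearman_rho(ti, rate))  # perfectly monotonic -> 1.0
```

A perfectly monotonic relationship gives ρ = 1 regardless of the functional form, which is why rank statistics suit this kind of finding.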
The intravenous (IV) route is one of the main parenteral routes for drug administration. Rapid onset of action, precise titration, patient-specific dosing, and bypass of first-pass liver metabolism are a few of its advantages, while hypersensitivity reactions, adverse effects, infection risk, and a higher overall cost are some of its most debated downsides. Unlike other areas of medicine, psychiatry has significantly under-utilized the IV route.
This systematic review analyzed the evidence for the effectiveness and safety of IV medication used in the management of acute disturbance.
APA PsycINFO, MEDLINE, and EMBASE databases were searched for eligible studies. Studies were included if they used IV medication to treat acute disturbance, were published in English, and included participants aged >18 years. The quality of the included studies was assessed using the National Institutes of Health quality checklist.
Seventeen studies were deemed eligible. Because primary outcome measures varied significantly between studies, data analysis was limited to narrative synthesis. Findings showed strong evidence for the efficacy and safety of dexmedetomidine, droperidol, midazolam, and olanzapine. These medications displayed a short time to sedation, a reduction in agitation levels, or a large percentage of patients adequately sedated, with a low number of adverse events. Results did not provide enough evidence for the use of IV ketamine, haloperidol, diazepam, lorazepam, or promethazine.
This review supports dexmedetomidine, droperidol, midazolam, and olanzapine as safe and efficacious options for managing acute disturbance via the intravenous route, particularly in special clinical settings where trained staff, optimal monitoring, resuscitation equipment and ventilators are all at hand.
Healthcare personnel with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection were interviewed to describe activities and practices in and outside the workplace. Among 2,625 healthcare personnel, workplace-related factors that may increase infection risk were more common among nursing-home personnel than hospital personnel, whereas selected factors outside the workplace were more common among hospital personnel.
Herbicide resistance is an increasing issue in many weed species, including rigid ryegrass (Lolium rigidum Gaudin), a major weed of winter cropping systems in southern Australia. Recently, this weed has also been found in summer crops in the southeastern region of Australia. Effective control of this herbicide-resistant weed across southeastern Australia requires alternative management strategies. These strategies can be informed by analyses of the interaction of germinable seeds with their regional environments and by identifying the differences between populations of varying herbicide-resistance levels. In this study, we explore how various environmental factors differentially affect the seed germination and seedling emergence of three L. rigidum populations, including one glyphosate-resistant population (GR), one glyphosate-susceptible population (GS), and one population of unknown resistance status (CC04). Germination was greater than 90% for all populations at each temperature regime, except 15/5 C. Populations germinated at a lower rate under 15/5 C, ranging from 74% to 87% germination. Salt stress had a similar effect on the germination of all populations, with 0% germination occurring at 250 mM salt stress. Population GS had greater tolerance to osmotic stress, with 65% germination at −0.4 MPa compared with 47% and 43% germination for CC04 and GR, respectively; however, germination was inhibited at −0.8 and −1.6 MPa for all populations. All populations had lower germination when placed in complete darkness as opposed to alternating light/dark. Germination in darkness was lower for CC04 (69%) than GR (83%) and GS (83%). Seedling emergence declined with increasing burial depth, with the lowest emergence occurring at 8 cm (37%) when averaged over the populations. These results indicate that L. 
rigidum can survive under a range of environmental variables and that the extent of survival differs based on population; however, there was no difference based on herbicide-resistance status.
Regular physical activity is a safe and effective therapy for adults with CHD and is recommended by European Society of Cardiology guidelines. The COVID-19 pandemic poses enormous challenges to healthcare teams and patients in ensuring guideline compliance. We explored the implications of COVID-19 for physical activity levels in adult CHD patients.
Materials and methods:
A data-based questionnaire was distributed to adult CHD patients at a regional tertiary centre from October to November 2020.
Prior to the COVID-19 pandemic, 96 (79.3%) of 125 respondents reported participating in regular physical activity, with 66 (52.8%) meeting target levels (moderate physical activity for at least 150 minutes per week). Commonest motivations for physical activity were general fitness (53.6%), weight loss (36.0%), and mental health benefits (30.4%). During the pandemic, the proportion that met target levels significantly decreased from 52.8% to 40.8% (p = 0.03). The commonest reason was fear of COVID-19 (28.0%), followed by loss of motivation (23.2%) and gym/fitness centre closure (15.2%).
The COVID-19 pandemic has negatively impacted exercise levels of adult CHD patients. Most do not meet recommended physical activity levels, mainly attributable to fear of COVID-19. Even before the pandemic, only half of respondents met physical activity guidelines. Availability of online classes can positively impact exercise levels and so could enhance guideline compliance. This insight into the health perceptions and behaviours of adult CHD patients may help develop quality improvement initiatives to improve physical activity levels in this population.
To assess the extent of a healthcare-associated outbreak of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) and to evaluate the effectiveness of infection control measures, including universal masking.
Outbreak investigation including 4 large-scale point-prevalence surveys.
Integrated VA healthcare system with 2 facilities and 330 beds.
Index patient and 250 exposed patients and staff.
We identified exposed patients and staff and classified them as probable and confirmed cases based on symptoms and testing. We performed a field investigation and an assessment of patient and staff interactions to develop probable transmission routes. Infection prevention interventions included droplet and contact precautions, employee quarantine, and universal masking with medical and cloth face masks. We conducted 4 point-prevalence surveys of patient and staff subsets using real-time reverse-transcriptase polymerase chain reaction for SARS-CoV-2.
Among 250 potentially exposed patients and staff, 14 confirmed cases of coronavirus disease 2019 (COVID-19) were identified. Patient roommates and staff with prolonged patient contact were most likely to be infected. The last potential date of transmission from staff to patient was day 22, the day universal masking was implemented. Subsequent point-prevalence surveys in 126 patients and 234 staff identified 0 patient cases and 5 staff cases of COVID-19, without evidence of healthcare-associated transmission.
Universal masking with medical face masks was effective in preventing further spread of SARS-CoV-2 in our facility in conjunction with other traditional infection prevention measures.
Motivated by the desire to understand complex transient behaviour in fluid flows, we study the dynamics of an air bubble driven by the steady motion of a suspending viscous fluid within a Hele-Shaw channel with a centred depth perturbation. Using both experiments and numerical simulations of a depth-averaged model, we investigate the evolution of an initially centred bubble of prescribed volume as a function of flow rate and initial shape. The experiments exhibit a rich variety of organised transient dynamics, involving bubble breakup as well as aggregation and coalescence of interacting neighbouring bubbles. The long-term outcome is either a single bubble or multiple separating bubbles, positioned along the channel in order of increasing velocity. Up to moderate flow rates, the life and fate of the bubble are reproducible and can be categorised by a small number of characteristic behaviours that occur in simply connected regions of the parameter plane. Increasing the flow rate leads to less reproducible time evolutions with increasing sensitivity to initial conditions and perturbations in the channel. Time-dependent numerical simulations that allow for breakup and coalescence are found to reproduce most of the dynamical behaviour observed experimentally, including enhanced sensitivity at high flow rate. An unusual feature of this system is that the set of steady and periodic solutions can change during temporal evolution because both the number of bubbles and their size distribution evolve due to breakup and coalescence events. Calculation of stable and unstable solutions in the single- and two-bubble cases reveals that the transient dynamics is orchestrated by weakly unstable solutions of the system that can appear and disappear as the number of bubbles changes.
COVID-19 altered research in Clinical and Translational Science Award (CTSA) hubs in an unprecedented manner, leading to adjustments for COVID-19 research.
CTSA members volunteered to conduct a review of the impact of the COVID-19 pandemic on the CTSA network, with assistance from the NIH survey team, in October 2020. The survey questions included the involvement of CTSAs in decision-making concerning the prioritization of COVID-19 studies. Descriptive and statistical analyses were conducted on the survey data.
Sixty of the 64 CTSAs completed the survey. Most CTSAs lacked preparedness but responded promptly to the pandemic. Early disruption of research triggered enhanced CTSA engagement, creation of dedicated research areas, and triage for prioritization of COVID-19 studies. CTSAs involved in decision-making were 16.75 times more likely to create dedicated diagnostic laboratories (95% confidence interval [CI] = 2.17–129.39; P < 0.01). Likewise, institutions with internal funding were 3.88 times more likely to establish dedicated COVID-19 research (95% CI = 1.12–13.40; P < 0.05). CTSAs were instrumental in securing funds and facilitating the establishment of laboratory and clinical spaces for COVID-19 research. Workflow was modified to support contracting and IRB review at most institutions with CTSAs. To mitigate the chaos generated by competing clinical trials, central feasibility committees were often formed for orderly review and prioritization.
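Odds ratios with confidence intervals like those above are conventionally computed from a 2×2 table with a Wald interval on the log odds ratio. A minimal Python sketch, using hypothetical counts since the survey's cell counts are not given in the abstract:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Wald 95% CI on the log scale.
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: CTSAs involved vs not involved in decision-making,
# cross-tabulated against creating a dedicated diagnostic laboratory
print(odds_ratio_ci(20, 10, 4, 30))  # OR = (20*30)/(10*4) = 15.0
```

Small cell counts produce the very wide intervals seen in the survey results (e.g. 2.17–129.39), since the standard error of log(OR) grows as any cell shrinks.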
The lessons learned from the COVID-19 pandemic emphasize the pivotal role of CTSAs in prioritizing studies and establishing the necessary research infrastructure, and the importance of prompt and flexible research leadership with decision-making capacity to manage future pandemics.
Rock debris covers ~30% of glacier ablation areas in the Central Himalaya and modifies the impact of atmospheric conditions on mass balance. The thermal properties of supraglacial debris are diurnally variable but remain poorly constrained for monsoon-influenced glaciers over the timescale of the ablation season. We measured vertical debris profile temperatures at 12 sites on four glaciers in the Everest region with debris thickness ranging from 0.08 to 2.8 m. Typically, the length of the ice ablation season beneath supraglacial debris was 160 days (15 May to 22 October)—a month longer than the monsoon season. Debris temperature gradients were approximately linear (r2 > 0.83), measured as −40°C m–1 where debris was up to 0.1 m thick, −20°C m–1 for debris 0.1–0.5 m thick, and −4°C m–1 for debris greater than 0.5 m thick. Our results demonstrate that the influence of supraglacial debris on the temperature of the underlying ice surface, and therefore melt, is stable at a seasonal timescale and can be estimated from near-surface temperature. These results have the potential to greatly improve the representation of ablation in calculations of debris-covered glacier mass balance and projections of their response to climate change.
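Near-linear debris temperature gradients like those above (r2 > 0.83) can be recovered from a vertical thermistor profile with ordinary least squares. A sketch with an illustrative profile for 0.5-m-thick debris, not the measured data:

```python
def linear_fit(x, y):
    """Ordinary least squares y = a + b*x; returns (intercept, slope, r2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot if ss_tot else 1.0
    return a, b, r2

# Illustrative thermistor profile: warm debris surface, ice interface at 0 °C.
# For 0.5 m of debris this yields a gradient of about -20 °C per metre.
depth = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]   # depth below debris surface, m
temp = [10.0, 8.0, 6.0, 4.0, 2.0, 0.0]   # temperature, °C
a, b, r2 = linear_fit(depth, temp)
print(a, b, r2)
```

The slope is the debris temperature gradient, and extrapolating the fitted line to the surface (the intercept) is what lets melt beneath debris be estimated from near-surface temperature alone.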
Background: With the emergence of antibiotic-resistant threats and the need for appropriate antibiotic use, laboratory microbiology information is important to guide clinical decision making in nursing homes, where access to such data can be limited. Susceptibility data are necessary to inform antibiotic selection and to monitor changes in resistance patterns over time. To contribute to existing data describing antibiotic resistance among nursing home residents, we summarized antibiotic susceptibility data from organisms commonly isolated from urine cultures collected as part of the CDC multistate Emerging Infections Program (EIP) nursing home prevalence survey. Methods: In 2017, urine culture and antibiotic susceptibility data for selected organisms were retrospectively collected from nursing home residents’ medical records by trained EIP staff. Urine culture results reported as negative (no growth) or contaminated were excluded. Susceptibility results were recorded as susceptible, non-susceptible (resistant or intermediate), or not tested. The pooled mean percentage tested and percentage non-susceptible were calculated for selected antibiotic agents and classes using available data. Susceptibility data were analyzed for organisms with ≥20 isolates. The definition of multidrug resistance (MDR) was based on the CDC and European Centre for Disease Prevention and Control’s interim standard definitions. Data were analyzed using SAS v9.4 software. Results: Among 161 participating nursing homes and 15,276 residents, 300 residents (2.0%) had documentation of a urine culture at the time of the survey, and 229 (76.3%) were positive. Escherichia coli, Proteus mirabilis, Klebsiella spp, and Enterococcus spp represented 73.0% of all urine isolates (N = 278). There were 215 (77.3%) isolates with reported susceptibility data (Fig. 1). Of these, data were analyzed for 187 (87.0%) (Fig. 2). All isolates tested for carbapenems were susceptible. 
Fluoroquinolone non-susceptibility was most prevalent among E. coli (42.9%) and P. mirabilis (55.9%). Among Klebsiella spp, the highest percentages of non-susceptibility were observed for extended-spectrum cephalosporins and folate pathway inhibitors (25.0% each). Glycopeptide non-susceptibility was 10.0% for Enterococcus spp. The percentage of isolates classified as MDR ranged from 10.1% for E. coli to 14.7% for P. mirabilis. Conclusions: Substantial levels of non-susceptibility were observed for nursing home residents’ urine isolates, with 10% to 56% reported as non-susceptible to the antibiotics assessed. Non-susceptibility was highest for fluoroquinolones, an antibiotic class commonly used in nursing homes, and ≥ 10% of selected isolates were MDR. Our findings reinforce the importance of nursing homes using susceptibility data from laboratory service providers to guide antibiotic prescribing and to monitor levels of resistance.
Background: Antibiotics are among the most commonly prescribed drugs in nursing homes; urinary tract infections (UTIs) are a frequent indication. Although there is no gold standard for the diagnosis of UTIs, various criteria have been developed to inform and standardize nursing home prescribing decisions, with the goal of reducing unnecessary antibiotic prescribing. Using different published criteria designed to guide decisions on initiating treatment of UTIs (ie, symptomatic, catheter-associated, and uncomplicated cystitis), our objective was to assess the appropriateness of antibiotic prescribing among nursing home residents. Methods: In 2017, the CDC Emerging Infections Program (EIP) performed a prevalence survey of healthcare-associated infections and antibiotic use in 161 nursing homes from 10 states: California, Colorado, Connecticut, Georgia, Maryland, Minnesota, New Mexico, New York, Oregon, and Tennessee. EIP staff reviewed resident medical records to collect demographic and clinical information, as well as infection signs, symptoms, and diagnostic testing documented on the day an antibiotic was initiated and during the 6 days prior. We applied 4 criteria to determine whether initiation of treatment for UTI was supported: (1) the Loeb minimum clinical criteria (Loeb); (2) the Suspected UTI Situation, Background, Assessment, and Recommendation tool (UTI SBAR tool); (3) an adaptation of the Infectious Diseases Society of America UTI treatment guidelines for nursing home residents (Crnich & Drinka); and (4) diagnostic criteria for uncomplicated cystitis (cystitis consensus) (Fig. 1). We calculated the percentage of residents for whom initiating UTI treatment was appropriate by these criteria. Results: Of 248 residents for whom UTI treatment was initiated in the nursing home, the median age was 79 years [IQR, 19], 63% were female, and 35% were admitted for postacute care. 
There was substantial variability in the percentage of residents with antibiotic initiation classified as appropriate by each of the criteria, ranging from 8% for the cystitis consensus to 27% for Loeb, 33% for the UTI SBAR tool, and 51% for Crnich and Drinka (Fig. 2). Conclusions: Appropriate initiation of UTI treatment among nursing home residents remained low regardless of the criteria used. At best, only half of antibiotic treatment met published prescribing criteria. Although insufficient documentation of infection signs, symptoms, and testing may have contributed to the low percentages observed, adequate documentation in the medical record to support prescribing should be standard practice, as outlined in the CDC Core Elements of Antibiotic Stewardship for nursing homes. Standardized UTI prescribing criteria should be incorporated into nursing home stewardship activities to improve the assessment and documentation of symptomatic UTI and to reduce inappropriate antibiotic use.
Bayesian analysis of radiocarbon (14C) dates in North American archaeology is increasing, especially among archaeologists working in deeper time. However, historical archaeologists have been slow to embrace these new techniques, and there have been only a few examples of the incorporation of calendar dates as informative priors in Bayesian models in such work in the United States. To illustrate the value of Bayesian approaches to sites with both substantial earlier Native American occupations as well as a historic era European presence, we present the results of our Bayesian analysis of 14C dates from the earlier Guale village and the Mission period contexts from the Sapelo Shell Ring Complex (9MC23) in southern Georgia. Jefferies and Moore have explored the Spanish Mission period deposits at this site to better understand the Native American interactions with the Spanish during the 16th and 17th centuries along the Georgia Coast. Given the results of our Bayesian modeling, we can say with some degree of confidence that the deposits thus far excavated and sampled contain important information dating to the 17th-century mission on Sapelo Island. In addition, our modeling of new dates suggests the range of the pre-Mission era Guale village. Based on these new dates, we can now say with some degree of certainty which of the deposits sampled likely contain information that dates to one of the critical periods of Mission period research, the AD 1660–1684 period that ushered in the close of mission efforts on the Georgia Coast.
To describe the pattern of transmission of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) during 2 nosocomial outbreaks of coronavirus disease 2019 (COVID-19), with regard to the possibility of airborne transmission.
Contact investigations with active case finding were used to assess the pattern of spread from 2 COVID-19 index patients.
A community hospital and university medical center in the United States, in February and March 2020, early in the COVID-19 pandemic.
Two index patients and 421 exposed healthcare workers.
Exposed healthcare workers (HCWs) were identified by analyzing the electronic medical record (EMR) and conducting active case finding in combination with structured interviews. HCWs were tested for COVID-19 by obtaining oropharyngeal/nasopharyngeal specimens, with RT-PCR testing used to detect SARS-CoV-2.
Two separate index patients were admitted in February and March 2020, without initial suspicion for COVID-19 and without contact or droplet precautions in place; both patients underwent several aerosol-generating procedures in this context. In total, 421 HCWs were exposed, and the case contact investigations identified 8 secondary infections in HCWs. In all 8 cases, the HCWs had close contact with the index patients without sufficient personal protective equipment. Importantly, despite multiple aerosol-generating procedures, there was no evidence of airborne transmission.
These observations suggest that, at least in a healthcare setting, most SARS-CoV-2 transmission is likely to take place during close contact with infected patients through respiratory droplets, rather than by long-distance airborne transmission.
Older adults’ mental health problems are a growing public health concern, especially because their rate of mental health service use is particularly low. Mental health service utilisation models have been developed over decades, yet key assumptions from these models focus primarily on factors that facilitate or inhibit access to the treatment system without taking into consideration the dynamics of how individuals respond to their mental health problems and engage in service utilisation. More recently, dynamic models like the Network Episode Model (NEM-II) have been developed to challenge the underlying rational-choice assumption of traditional utilisation models. Given the multifaceted and complex nature of older adults’ mental health problems, the objective of this study was to examine whether the NEM-II is a helpful and appropriate model for understanding the dynamic process by which older adults navigate the mental health system, including factors that advanced and delayed help-seeking. Our qualitative analyses of 15 interviews with older adults revealed that their backgrounds, social supports and treatment systems influence, and are influenced by, their illness careers. Factors that delayed help-seeking included a lack of support, ‘inappropriate’ referrals/advice from treatment professionals and poor mental health literacy. This research suggests the NEM-II is a helpful and appropriate theory for understanding older adults’ pathways to treatment, and has implications for enhancing older adults’ access to psychological services.
Over the years, the practice of medicine has evolved from authority-based to experience-based to evidence-based with the introduction of the scientific process, clinical trials, and outcomes-based data analysis (Tebala GD. Int J Med Sci. 2018;15(12):1397-1405). The time required to perform the necessary randomized controlled trials, a systematic literature review, and a meta-analysis of these trials, and then to create, accept, promulgate, and educate practicing clinicians to use the resulting evidence-based clinical guidelines, is typically measured in years. When the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) pandemic commenced in Wuhan, China at the end of 2019, there were few available clinical guidelines to deploy, let alone adapt and adopt to treat the surge of coronavirus disease 2019 (COVID-19) patients. The aim of this study is first to explain how clinical guidelines, to which bedside clinicians have grown accustomed, can be created in the midst of a pandemic, with an evolving scientific understanding of the pathophysiology of the hypercoagulable state. The second is to adapt and adopt current venous thromboembolism diagnostic and treatment guidelines, while relying on the limited available observational reporting of COVID-19 patients, to create a comprehensive clinical guideline to treat COVID-19 patients.
The evolution of resistance to multiple herbicides in Palmer amaranth is a major challenge for its management. In this study, a Palmer amaranth population from Hutchinson, Kansas (HMR), was characterized for resistance to inhibitors of photosystem II (PSII) (e.g., atrazine), acetolactate synthase (ALS) (e.g., chlorsulfuron), and EPSP synthase (EPSPS) (e.g., glyphosate), and the mechanisms of this resistance were investigated. About 100 HMR plants were treated with field-recommended doses (1×) of atrazine, chlorsulfuron, and glyphosate separately, along with a Hutchinson multiple-herbicide (atrazine, chlorsulfuron, and glyphosate)–susceptible (HMS) Palmer amaranth population as a control. The mechanism of resistance to these herbicides was investigated by sequencing or amplifying the psbA, ALS, and EPSPS genes, the molecular targets of atrazine, chlorsulfuron, and glyphosate, respectively. Fifty-two percent of plants survived a 1× (2,240 g ai ha−1) atrazine application with no known psbA gene mutation, indicating the predominance of a non–target site resistance mechanism to this herbicide. Forty-two percent of plants survived a 1× (18 g ai ha−1) dose of chlorsulfuron with proline197serine, proline197threonine, proline197alanine, proline197asparagine, or tryptophan574leucine mutations in the ALS gene. About 40% of the plants survived a 1× (840 g ae ha−1) dose of glyphosate with no known mutations in the EPSPS gene. Quantitative PCR results revealed increased EPSPS copy number (50 to 140) as the mechanism of glyphosate resistance in the survivors. The most important finding of this study was the evolution of resistance to at least two sites of action (SOAs) (~50% of plants) and to all three herbicides due to target site as well as non–target site mechanisms. The high incidence of individual plants with resistance to multiple SOAs poses a challenge for effective management of this weed.
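Relative gene copy number from quantitative PCR, as used above for EPSPS, is conventionally estimated with the 2^-ΔΔCt method. A Python sketch with hypothetical Ct values (the study's raw Ct data are not given in the abstract):

```python
def relative_copy_number(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Relative gene copy number by the 2^-ddCt method.
    dCt  = Ct(target) - Ct(reference gene), per sample
    ddCt = dCt(test sample) - dCt(calibrator sample)
    Each PCR cycle doubles product, hence the base of 2."""
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return 2 ** (-ddct)

# Hypothetical Ct values: a resistant plant whose target gene crosses
# threshold 6 cycles earlier (relative to the reference gene) than a
# susceptible calibrator plant implies ~2^6 = 64 relative copies.
print(relative_copy_number(18.0, 22.0, 24.0, 22.0))  # -> 64.0
```

The exponential relationship is why modest Ct shifts of 5–7 cycles correspond to the 50- to 140-fold copy-number increases reported for glyphosate-resistant survivors.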
Introduction: Affecting roughly 1 in 5 pregnancies, early pregnancy loss is a common experience for reproductive-aged women. In Canada, most women do not establish care with an obstetrical provider until the second trimester of pregnancy. Consequently, pregnant patients experiencing symptoms of early pregnancy loss frequently access care in the emergency department (ED). The objective of this study was to describe the resource utilization and outcomes of women presenting to two Ontario EDs for early pregnancy loss or threatened early pregnancy loss. Methods: This was a retrospective cohort study of pregnant (≤20 weeks), adult (≥18 years) women in two EDs (one community hospital with 110,000 annual ED visits; one academic hospital with 65,000 annual ED visits) between January 2010 and December 2017. Patients were identified by diagnostic codes indicating early pregnancy loss or threatened early pregnancy loss. Results: A total of 16,091 patients were included, with a mean (SD) age of 32.8 (5.6) years. Patients had a total of 22,410 ED visits for early pregnancy complications, accounting for 1.6% of the EDs’ combined visits during the study period. Threatened abortion (n = 11,265, 50.3%) was the most common ED diagnosis, followed by spontaneous abortion (n = 5,652, 25.2%), ectopic pregnancy (n = 3,242, 14.5%), missed abortion (n = 1,541, 6.9%), and other diagnoses (n = 710, 3.2%). In total, 8,000 (44.8%) patients had a radiologist-interpreted ultrasound performed during the initial ED visit. Median (IQR) ED length of stay was 3.4 (2.3 to 5.1) hours. There were 4,561 (25.6%) return ED visits within 30 days, of which 2,317 (50.8%) occurred within 24 hours of the index visit, and 481 (10.6%) were for a scheduled, next-day ultrasound. The total number of hospital admissions was 1,793 (8.0%), and the majority were for ectopic pregnancy (n = 1,052, 58.7%). Of admitted patients, 1,320 (73.6%) underwent surgical interventions related to early pregnancy. 
There were 474 (10.4%) patients admitted to hospital during return ED visits. Conclusion: Pregnant patients experiencing symptoms of early pregnancy loss in the ED frequently had radiologist-interpreted ultrasounds and low rates of hospital admission, yet had high rates of return ED visits. This study highlights the heavy reliance on Ontario EDs to care for patients experiencing complications of early pregnancy.
Introduction: The emergency department (ED) is often the first point of health care contact for patients with mild traumatic brain injury (MTBI). Spontaneous resolution occurs in most patients within 7 days, yet 15-30% will develop post-concussion syndrome (PCS). Given the paucity of effective management strategies to prevent PCS and emerging evidence supporting exercise, the objective of this study was to evaluate the impact of prescribed early light exercise compared to standard discharge instructions for acute MTBI patients in the ED. Methods: This was a randomized controlled trial conducted in three Canadian EDs. Consecutive, adult (18-64 years) ED patients with a MTBI sustained within the preceding 48 hours were eligible for enrollment. The intervention group received discharge instructions prescribing 30 minutes of daily light exercise (e.g., walking), and the control group was given standard MTBI instructions advising gradual return to exercise following symptom resolution. Participants documented their daily physical activities and completed follow-up questionnaires at 7, 14, and 30 days. The primary outcome was the proportion of patients with PCS at 30 days, defined as the presence of ≥ 3 symptoms on the Rivermead Post-concussion Symptoms Questionnaire (RPQ) at 30 days. Results: 367 patients were enrolled (control n = 184; intervention n = 183). Median age was 32 years and 201 (57.6%) were female. There was no difference in the proportion of patients with PCS at 30 days (control 13.4 vs intervention 14.6; Δ1.2, 95% CI: -6.2 to 8.5). There were no differences in median change of RPQ scores (control 14 vs intervention 13; Δ1, 95% CI: -1 to 4), median number of return health care provider visits (control 1 vs intervention 1; Δ0, 95% CI: 0 to 0), or median number of missed school or work days (control 2 vs intervention 2; Δ0, 95% CI: 0 to 1) at 30 days. 
There was a nonsignificant difference in unplanned return ED visits within 30 days (control 9.9% vs intervention 5.6%; Δ4.3, 95% CI: -1.4 to 10.3). Participants in the control group reported fewer minutes of light exercise at 7 days (30 vs 35; Δ5, 95% CI: 2 to 15). Conclusion: To our knowledge, this is the first randomized trial of prescribed early light exercise for adults with acute MTBI. There were no differences in recovery or health care utilization outcomes. These results suggest that prescribed early light exercise should be encouraged as tolerated at ED discharge following MTBI, but that exercise prescription alone is not sufficient to prevent PCS.
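The between-group comparisons above report absolute risk differences with 95% confidence intervals. A minimal sketch of how such an interval can be computed, assuming a Wald (normal-approximation) method and event counts back-calculated from the reported percentages (both are assumptions; the abstract states neither the exact method nor the counts):

```python
from math import sqrt

def risk_difference_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for the difference between two proportions, in percentage points."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return 100 * diff, 100 * (diff - z * se), 100 * (diff + z * se)

# PCS at 30 days: control 13.4% of 184 (~25 patients), intervention 14.6% of 183
# (~27 patients); these counts are approximations, not taken from the abstract.
delta, lo, hi = risk_difference_ci(25, 184, 27, 183)
```

With these assumed counts the interval lands near the reported Δ1.2 (95% CI: -6.2 to 8.5); small discrepancies are expected from rounding and from whatever exact method the authors used.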
Introduction: eCTAS is a real-time electronic triage decision-support tool designed to improve patient safety and quality of care by standardizing the application of the Canadian Triage and Acuity Scale (CTAS). The tool dynamically calculates a recommended CTAS score based on the presenting complaint, vital signs and selected clinical modifiers. The primary objective was to assess consistency of CTAS score distributions across 35 emergency departments (EDs) for 16 presenting complaints pre- and post-eCTAS implementation. Methods: This retrospective cohort study used population-based administrative data from January 2016 to December 2018 from all hospital EDs in Ontario that had implemented eCTAS with at least 9 months of data. Following a 3-month stabilization period, we compared data for 6 months post-eCTAS implementation to the same 6-month period the previous year (pre-implementation) to account for potential seasonal variation, patient volume and case-mix. We included triage encounters of adult (≥18 years) patients if they had one of 16 pre-specified, high-volume presenting complaints. A paired-samples t-test was used to determine consistency by estimating the absolute difference in CTAS distribution for each presenting complaint, by each hospital, pre- and post-eCTAS implementation, compared to the overall average of the 35 EDs. Results: There were 183,231 triage encounters in the pre-eCTAS cohort and 179,983 in the post-eCTAS cohort from 35 EDs across the province. Triage scores were more consistent with the overall average after eCTAS implementation in 6 (37.5%) presenting complaints: chest pain (cardiac features) (p < 0.001), extremity weakness/symptoms of cerebrovascular accident (p < 0.001), fever (p < 0.001), shortness of breath (p < 0.001), syncope (p = 0.02), and hyperglycemia (p = 0.03).
Triage consistency was similar pre- and post-eCTAS implementation for the presenting complaints of altered level of consciousness, anxiety/situational crisis, confusion, depression/suicidal/deliberate self-harm, general weakness, head injury, palpitations, seizure, substance misuse/intoxication, and vertigo. Conclusion: A standardized, electronic approach to performing triage assessments increased consistency in CTAS scores across many, but not all, high-volume CEDIS presenting complaints. Consistency does not imply accuracy, as there are no accepted benchmarks for triage accuracy. Improvements in consistency were greatest for sentinel presenting complaints with a minimum allowable CTAS score.
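The paired-samples t-test described in the Methods can be sketched as follows. The data here are simulated, and the deviation measure (each ED's absolute difference from the 35-ED average CTAS distribution) is a simplification of the authors' actual procedure:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)

# Simulated data for one presenting complaint: each ED's absolute deviation
# (in percentage points) from the overall 35-ED average CTAS distribution.
pre_dev = rng.uniform(2.0, 10.0, size=35)            # pre-eCTAS deviations
post_dev = pre_dev * rng.uniform(0.4, 0.9, size=35)  # post: assumed more consistent

# Paired t-test on the same 35 EDs: did deviation from the overall average
# shrink after eCTAS implementation?
t_stat, p_value = ttest_rel(pre_dev, post_dev)
```

A paired (rather than independent-samples) test is the right choice here because each ED contributes both a pre- and a post-implementation measurement, so within-ED variation is removed from the comparison.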