The MITIGATE toolkit was developed to assist urgent care and emergency departments in the development of antimicrobial stewardship programs. At the University of Washington, we adopted the MITIGATE toolkit in 10 urgent care centers, 9 primary care clinics, and 1 emergency department. We encountered and overcame challenges: a complex data build, choosing feasible outcomes to measure, issues with accurate coding, and maintaining positive stewardship relationships. Herein, we discuss solutions to challenges we encountered to provide guidance for those considering using this toolkit.
Compulsory admission procedures for patients with mental disorders vary between countries in Europe. The Ethics Committee of the European Psychiatric Association (EPA) launched a survey on involuntary admission procedures in 40 countries, gathering information from all National Psychiatric Associations that are members of the EPA, in order to develop recommendations for improving involuntary admission processes and promoting voluntary care.
The survey focused on legislation of involuntary admissions and key actors involved in the admission procedure as well as most common reasons for involuntary admissions.
We analyzed the survey's categorical data thematically; the resulting themes highlight that both medical and legal actors are involved in involuntary admission procedures.
We conclude that legal reasons for compulsory admission should be reworded in order to remove stigmatization of the patient; that raising awareness about involuntary admission procedures and patient rights with both patients and family advocacy groups is paramount; that communication about procedures should be widely available in lay language for the general population; and that training sessions and guidance should be available for legal and medical practitioners. Finally, people working in the field need to be constantly aware of the ethical challenges surrounding compulsory admissions.
COVID-19, or ‘Coronavirus’, has become a global pandemic since its initial report in Wuhan, China, on November 17, 2019. It is highly infectious and poses significant health risks for those in vulnerable populations. This article aims to provide perspective into an Irish experience, through the eyes of a practicing psychiatric nurse, who has recently graduated medical school and intends to work as an intern doctor.
Introduction: Cannabinoid Hyperemesis Syndrome (CHS) in pediatric patients is poorly characterized. Literature is scarce, making identification and treatment challenging. This study's objective was to describe demographics and visit data of pediatric patients presenting to the emergency department (ED) with suspected CHS, in order to improve understanding of the disorder. Methods: A retrospective chart review was conducted of pediatric patients (12-17 years) with suspected CHS presenting to one of two tertiary-care EDs; one pediatric and one pediatric/adult (combined annual pediatric census 40,550) between April 2014-March 2019. Charts were selected based on discharge diagnosis of abdominal pain or nausea/vomiting with positive cannabis urine screen, or discharge diagnosis of cannabis use, using ICD-10 codes. Patients with confirmed or likely diagnosis of CHS were identified and data including demographics, clinical history, and ED investigations/treatments were recorded by a trained research assistant. Results: 242 patients met criteria for review. 39 were identified as having a confirmed or likely diagnosis of CHS (mean age 16.2, SD 0.85 years with 64% female). 87% were triaged as either CTAS-2 or CTAS-3. 80% of patients had cannabis use frequency/duration documented. Of these, 89% reported at least daily use, the mean consumption was 1.30g/day (SD 1.13g/day), and all reported ≥6 months of heavy use. 69% of patients had at least one psychiatric comorbidity. When presenting to the ED, all had vomiting, 81% had nausea, 81% had abdominal pain, and 30% reported weight loss. Investigations done included venous blood gas (30%), pregnancy test in females (84%), liver enzymes (57%), pelvic or abdominal ultrasound (19%), abdominal X-ray (19%), and CT head (5%). 89% of patients received treatment in the ED with 81% receiving anti-emetics, 68% receiving intravenous (IV) fluids, and 22% receiving analgesics. 
Normal saline was the most used IV fluid (80%) and ondansetron was the most used anti-emetic (90%). Cannabis was suspected to account for symptoms in 74%, with 31% of these given the formal diagnosis of CHS. 62% of patients had another visit to the ED within 30 days (prior to or post sentinel visit), 59% of these for similar symptoms. Conclusion: This study of pediatric CHS reveals unique findings including a preponderance of female patients, a majority that consume cannabis daily, and weight loss reported in nearly one third. Many received extensive workups and most had multiple clustered visits to the ED.
This retrospective, case series audit assessed the clinical and health-economic impact of long-term treatment with quetiapine (‘Seroquel’), a new atypical antipsychotic, in patients with chronic schizophrenia.
The study design was of a case series format, comprising patients entered from one centre into the open-label extension of a multicentre 6-week efficacy study. Twenty-one patients (15 male, 6 female; mean age 39 years) were studied, of whom 17 (81%) had been rated as ‘partially responsive’ to previous antipsychotics. Data on hospitalisations and information on symptoms were collected retrospectively for the 12 months before quetiapine treatment was initiated and for the 12 months after.
Quetiapine was effective in reducing psychotic symptoms with mean BPRS scores reducing significantly, from 38 to 21 (P < 0.005). Motor function was also significantly improved with mean Simpson scale scores reducing from 15 to 12 (P < 0.005). Average inpatient days were reduced by 11% in year two (97 compared with 109 days) while the overall costs of treatment, including drug costs, fell by 5% (I£20,843 to I£19,827).
Four patients had been hospitalised for longer than 5 years before starting quetiapine; these chronically institutionalised patients remained in hospital, despite improved clinical outcomes (mean BPRS scores after treatment of 34, compared with 43 before), for the full 12 months of quetiapine treatment. Were the data from this audit to be re-analysed excluding these four patients, then average inpatient days would have been reduced by 33% (45 to 30 days) and overall cost of treatment by 19% (I£8617 to I£7011).
This audit suggests that treatment with quetiapine over this 1-year period was associated with both clinical improvements and a decreased usage of inpatient services. The reduction in hospitalisation costs would appear to compensate for the increased cost of drug treatment. Significantly, potential savings appear to be greatest for those patients with a ‘revolving door’ pattern of repeated readmission.
There is evidence that social support predicts self-esteem and related moods for individuals with psychotic disorders. There has, however, been little investigation of the relative importance of specific components of social support.
Evidence from social psychology suggests that perceived relational evaluation (PRE) or the extent to which individuals see others as valuing them, is a particularly important determinant of self-esteem and mood.
The current study compared the importance of PRE and other types of social support, in predicting self-esteem and depressive mood, anxiety and anger hostility in a sample of patients in an early intervention program for psychotic disorders.
One hundred and two patients of the Prevention and Early Intervention Program for Psychoses (PEPP) in London, Ontario completed measures of PRE, appraisal, tangible and general emotional social support, self-esteem and mood. In addition, ratings of positive and negative symptoms were completed for all participants.
In general, perceived relational evaluation was the most important predictor of self-esteem and mood. These relationships were not a result of confounding with positive or negative symptoms.
The extent to which an individual perceives himself or herself as being positively valued by those in his or her immediate social environment is a particularly important component of social support in predicting self-esteem and affect of individuals with a psychotic disorder.
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere, enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, these programs have recorded 20 000 h of observations, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer, with a ‘slew’ time less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and 100 MHz pulse is on the order of 1–10 min for dispersion measures of 100–2000 pc cm−3. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network (GCN) packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well-suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system has the capability to trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and using the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–Dec 2018), and the VCS and buffered mode triggers will become available for observing in a future semester.
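The quoted dispersion delay can be checked with a short calculation. The following is a minimal sketch using the standard cold-plasma dispersion constant (≈4.149 × 10³ s MHz² pc⁻¹ cm³); the function name is our own illustration, not part of the MWA software:

```python
# Cold-plasma dispersion delay between two observing frequencies.
K_DM = 4.149e3  # dispersion constant, s MHz^2 cm^3 pc^-1

def dispersion_delay_s(dm, nu_lo_mhz, nu_hi_mhz):
    """Extra arrival delay (seconds) of the nu_lo pulse relative to nu_hi,
    for a dispersion measure dm in pc cm^-3."""
    return K_DM * dm * (nu_lo_mhz ** -2 - nu_hi_mhz ** -2)

# Delays between 100 MHz and 1 GHz for the DM range quoted in the text:
for dm in (100, 2000):
    minutes = dispersion_delay_s(dm, 100.0, 1000.0) / 60.0
    print(f"DM={dm} pc cm^-3: {minutes:.1f} min")
```

For DM = 100 the delay is roughly 0.7 min, and for DM = 2000 roughly 13.7 min, consistent with the 1–10 min order of magnitude stated above.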
The search for life in the Universe is a fundamental problem of astrobiology and modern science. The current progress in the detection of terrestrial-type exoplanets has opened a new avenue in the characterization of exoplanetary atmospheres and in the search for biosignatures of life with the upcoming ground-based and space missions. To specify the conditions favourable for the origin, development and sustainment of life as we know it in other worlds, we need to understand the nature of the global (astrospheric) and local (atmospheric and surface) environments of exoplanets in the habitable zones (HZs) around G-K-M dwarf stars, including our young Sun. The global environment is shaped by disturbances propagating from the planet-hosting star in the form of stellar flares, coronal mass ejections, energetic particles and winds, collectively known as astrospheric space weather. Its characterization will help in understanding how an exoplanetary ecosystem interacts with its host star, as well as in the specification of the physical, chemical and biochemical conditions that can create favourable and/or detrimental conditions for planetary climate and habitability, along with the evolution of planetary internal dynamics over geological timescales. A key linkage of (astro)physical, chemical and geological processes can only be understood in the framework of interdisciplinary studies incorporating progress in heliophysics, astrophysics, planetary and Earth sciences. The assessment of the impacts of host stars on the climate and habitability of terrestrial (exo)planets will significantly expand the current definition of the HZ to the biogenic zone and provide new observational strategies for searching for signatures of life.
The major goal of this paper is to describe and discuss the current status and recent progress in this interdisciplinary field in light of presentations and discussions during the NASA Nexus for Exoplanetary System Science funded workshop ‘Exoplanetary Space Weather, Climate and Habitability’ and to provide a new roadmap for the future development of the emerging field of exoplanetary science and astrobiology.
We apply two methods to estimate the 21-cm bispectrum from data taken within the Epoch of Reionisation (EoR) project of the Murchison Widefield Array (MWA). Using data acquired with the Phase II compact array allows a direct bispectrum estimate to be undertaken on the multiple redundantly spaced triangles of antenna tiles, as well as an estimate based on data gridded to the uv-plane. The direct and gridded bispectrum estimators are applied to 21 h of high-band (167–197 MHz; z = 6.2–7.5) data from the 2016 and 2017 observing seasons. Analytic predictions for the bispectrum bias and variance for point-source foregrounds are derived. We compare the output of these approaches, the foreground contribution to the signal, and future prospects for measuring the bispectra with redundant and non-redundant arrays. We find that some triangle configurations yield bispectrum estimates that are consistent with the expected noise level after 10 h, while equilateral configurations are strongly foreground-dominated. Careful choice of triangle configurations may be made to reduce foreground bias that hinders power spectrum estimators, and the 21-cm bispectrum may be accessible in less time than the 21-cm power spectrum for some wave modes, with detections in hundreds of hours.
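The direct estimator described above reduces, in essence, to averaging the triple product of visibilities around closed triangles of antenna tiles. The following toy sketch illustrates the idea on synthetic noise data; the array shapes and names are our own assumptions, not the MWA pipeline's:

```python
import numpy as np

def direct_bispectrum(v1, v2, v3):
    """Direct bispectrum estimate: time-average of the visibility triple
    product for each redundant triangle (baseline vectors summing to zero).

    v1, v2, v3 : complex arrays of shape (n_triangles, n_times)
    """
    return (v1 * v2 * v3).mean(axis=1)

rng = np.random.default_rng(42)
shape = (8, 10_000)  # 8 redundant triangles, 10 000 time samples

def noise():
    # Zero-mean complex Gaussian noise visibilities
    return rng.normal(size=shape) + 1j * rng.normal(size=shape)

# With pure noise the estimate scatters around zero, illustrating the
# noise-consistent behaviour of some triangle configurations noted above.
b_noise = direct_bispectrum(noise(), noise(), noise())
```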
Introduction: Emergency department (ED) congestion is an ongoing threat to quality care. Traditional measures of ED efficiency use census and wait times over extended time intervals (e.g. per year, per day), failing to capture the hourly variations in ED flow. Borrowing from the traffic theory framework used to describe cars on a freeway, ED flow can instead be characterized by three fundamental parameters: flux (patients traversing a care segment per unit time), density (patients in a care segment per unit time), and duration (length of stay in a care segment). This method allows for the calculation of near-instantaneous ED flux and density. To illustrate, we examined the association between stretcher occupancy and time to physician initial assessment (PIA), seeking to identify thresholds where flux and PIA deteriorate. Methods: We used administrative data as reported to government agencies for 115,559 ED visits from April 1, 2014 to March 31, 2016 at a tertiary academic hospital. Time stamps collected at triage, PIA, and departure were verified by nosologists and used to define two care segments: awaiting assessment or receiving care. Using open-source software developed in-house, we calculated flow measures for each segment at 90-minute intervals. Graphical analysis was supplemented by regression analysis, examining PIA times of high (CTAS 1-3) or low (CTAS 4-5) acuity patients against ED occupancy (=density/staffed stretchers) adjusting for the day of the week, season and fiscal year. Results: At occupancy levels below 50%, PIA times remain stable and flux increases with density, reflecting free flow. Beyond 50% occupancy, PIA times increase linearly and flux plateaus, indicating congestion. While PIA times further deteriorate above 100% occupancy, flow is maintained, reflecting care delivery in non-traditional spaces (e.g. hallways). An inflection point where flux decreased with increased crowding was not identified, despite lengthening queues. 
Conclusion: The operational performance of a modern ED can be captured and visualized using techniques borrowed from the analysis of vehicular traffic. Unlike cars on a jammed roadway, patients behave more like a compressible fluid and ED care continues despite high degrees of crowding. Nevertheless, congestion begins well below 100% occupancy, presumably reflecting the need for stretcher turnover and saturation in subsegmental work processes. This methodology shows promise to analyze and mitigate the many factors contributing to ED crowding.
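The three traffic parameters can be computed directly from visit time stamps. Below is a minimal sketch assuming simple (triage, PIA, departure) records; the record layout and function names are illustrative, not the authors' in-house software:

```python
from datetime import datetime, timedelta

# Hypothetical visit records: (triage, physician initial assessment, departure)
t0 = datetime(2016, 3, 1, 8, 0)
visits = [
    (t0, t0 + timedelta(minutes=30), t0 + timedelta(hours=4)),
    (t0 + timedelta(minutes=10), t0 + timedelta(minutes=50), t0 + timedelta(hours=3)),
    (t0 + timedelta(minutes=20), t0 + timedelta(hours=2), t0 + timedelta(hours=6)),
]

def density(visits, t):
    """Patients in the 'receiving care' segment (PIA to departure) at time t."""
    return sum(pia <= t < dep for _, pia, dep in visits)

def flux(visits, t_start, t_end):
    """Patients completing the care segment per hour over [t_start, t_end)."""
    n = sum(t_start <= dep < t_end for _, _, dep in visits)
    return n / ((t_end - t_start).total_seconds() / 3600.0)

def occupancy(visits, t, staffed_stretchers):
    """Instantaneous occupancy (density divided by staffed stretchers)."""
    return density(visits, t) / staffed_stretchers
```

In practice these measures would be evaluated on a rolling grid (the study used 90-minute intervals) and regressed against PIA times.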
We read with interest the recent editorial, “The Hennepin Ketamine Study,” by Dr. Samuel Stratton commenting on the research ethics, methodology, and the current public controversy surrounding this study.1 As researchers and investigators of this study, we strongly agree that prospective clinical research in the prehospital environment is necessary to advance the science of Emergency Medical Services (EMS) and emergency medicine. We also agree that accomplishing this is challenging as the prehospital environment often encounters patient populations who cannot provide meaningful informed consent due to their emergent conditions. To ensure that fellow emergency medicine researchers understand the facts of our work so they may plan future studies, and to address some of the questions and concerns in Dr. Stratton’s editorial, the lay press, and in social media,2 we would like to call attention to some inaccuracies in Dr. Stratton’s editorial, and to the lay media stories on which it appears to be based.
Ho JD, Cole JB, Klein LR, Olives TD, Driver BE, Moore JC, Nystrom PC, Arens AM, Simpson NS, Hick JL, Chavez RA, Lynch WL, Miner JR. The Hennepin Ketamine Study investigators’ reply. Prehosp Disaster Med. 2019;34(2):111–113.
Background: Central neuropathic pain syndromes are a result of central nervous system injury, most commonly related to stroke, traumatic spinal cord injury, or multiple sclerosis. These syndromes are distinctly less common than peripheral neuropathic pain, and less is known regarding the underlying pathophysiology, appropriate pharmacotherapy, and long-term outcomes. The objective of this study was to determine the long-term clinical effectiveness of the management of central neuropathic pain relative to peripheral neuropathic pain at tertiary pain centers. Methods: Patients diagnosed with central (n=79) and peripheral (n=710) neuropathic pain were identified for analysis from a prospective observational cohort study of patients with chronic neuropathic pain recruited from seven Canadian tertiary pain centers. Data regarding patient characteristics, analgesic use, and patient-reported outcomes were collected at baseline and 12-month follow-up. The primary outcome measure was the composite of a reduction in average pain intensity and pain interference. Secondary outcome measures included assessments of function, mood, quality of life, catastrophizing, and patient satisfaction. Results: At 12-month follow-up, 13.5% (95% confidence interval [CI], 5.6-25.8) of patients with central neuropathic pain and complete data sets (n=52) achieved a ≥30% reduction in pain, whereas 38.5% (95% CI, 25.3-53.0) achieved a reduction of at least 1 point on the Pain Interference Scale. The proportion of patients with central neuropathic pain achieving both these measures, and thus the primary outcome, was 9.6% (95% CI, 3.2-21.0). Patients with peripheral neuropathic pain and complete data sets (n=463) were more likely to achieve this primary outcome at 12 months (25.3% of patients; 95% CI, 21.4-29.5) (p=0.012). 
Conclusion: Patients with central neuropathic pain syndromes managed in tertiary care centers were less likely to achieve a meaningful improvement in pain and function compared with patients with peripheral neuropathic pain at 12-month follow-up.
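The composite primary outcome described above can be expressed as a simple predicate. The sketch below is our own illustration of that definition; the argument names are hypothetical, and the actual instruments and cut-offs follow the study's protocol:

```python
def composite_responder(pain_base, pain_follow, interf_base, interf_follow):
    """True if the patient achieved BOTH a >=30% reduction in average pain
    intensity AND a drop of at least 1 point on the interference scale."""
    pain_reduced = pain_base > 0 and (pain_base - pain_follow) >= 0.30 * pain_base
    interference_reduced = (interf_base - interf_follow) >= 1.0
    return pain_reduced and interference_reduced

# e.g. pain 8 -> 5 (37.5% drop) and interference 6 -> 4 meets both criteria
```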
The US Food Safety Modernization Act (FSMA) gives food safety regulators increased authority to require implementation of safety measures to reduce the contamination of produce. To evaluate the future impact of FSMA on food safety, a better understanding is needed regarding outbreaks attributed to the consumption of raw produce. Data reported to the US Centers for Disease Control and Prevention's Foodborne Disease Outbreak Surveillance System during 1998–2013 were analysed. During 1998–2013, 972 raw produce outbreaks were reported, resulting in 34 674 outbreak-associated illnesses, 2315 hospitalisations, and 72 deaths. Overall, the total number of foodborne outbreaks reported decreased by 38% during the study period and the number of raw produce outbreaks decreased 19% during the same period; however, the percentage of outbreaks attributed to raw produce among outbreaks with a food reported increased from 8% during 1998–2001 to 16% during 2010–2013. Raw produce outbreaks were most commonly attributed to vegetable row crops (38% of outbreaks), fruits (35%) and seeded vegetables (11%). The most common aetiologic agents identified were norovirus (54% of outbreaks), Salmonella enterica (21%) and Shiga toxin-producing Escherichia coli (10%). Food-handling errors were reported in 39% of outbreaks. The proportion of all foodborne outbreaks attributable to raw produce has been increasing. Evaluation of safety measures to address contamination on farms and during processing and food preparation should take into account the trends occurring before FSMA implementation.
Bovine herpes virus 1 (BHV-1) manifests as a latent viral infection putatively affecting bovines. Understanding its effect on cattle herds is critical to maintaining sustainable beef and dairy production systems, as well as aiding in the development of herd health policies. The primary objective of the current study was, therefore, to use a whole-farm bio-economic model to evaluate the effect of herd seroprevalence to BHV-1 on the productive and economic performance of a spring calving beef cow herd. As part of a wider epidemiological study of herd pathogen status, a total of 4240 cows from 134 spring calving beef cow herds across the Republic of Ireland were blood sampled to measure the seroprevalence to BHV-1. Using data from a national breeding database, productive and reproductive performance indicators were used to parameterize a single year, static and deterministic whole-farm bio-economic model. A spring-calving, pasture-based suckler beef cow production system with an emphasis on calf-to-weanling production was simulated. The impact of BHV-1 seropositivity on whole-farm technical and economic performance was relatively small, with a marginal drop in the net margin of 4% relative to a baseline seronegative herd. Subsequent risk factors for increased pathogenicity were considered, such as total herd size, percentage of intra-herd movements and vaccination status for BHV-1. In contrast to all other scenarios, those representing herds that were either small in size or had an active vaccination policy for BHV-1 showed no reduction in net margin against the baseline as a result of seropositivity to BHV-1.
The stability of VFB catholytes was investigated using both light-scattering measurements and visual observation. V2O5 precipitates after an induction time τ which shows an Arrhenius variation with temperature. The value of τ increases with increasing [S] and with decreasing [VV] but the activation energy remains constant with a value of (1.791±0.020) eV. Plots of ln τ against [S] and [VV] show good linearity and the slopes give values of βS = 2.073 M⁻¹ and βV5 = −3.434 M⁻¹ for the fractional rates of variation of τ with [S] and [VV], respectively. Combining the Arrhenius Equation with the observed log-linear variation of τ with [S] and [VV] provides a model for simulating the stability of catholytes. The addition of H3PO4 has a strong stabilizing effect on catholytes at higher temperatures. For example, at 50°C the induction time for precipitation for a typical catholyte is enhanced ∼ 12.5-fold by 0.1 M added H3PO4. At concentrations of H3PO4 less than ∼0.04 M, the precipitation time increases with increasing concentration at all temperatures investigated (30–70°C). At higher concentrations, induction time begins to decrease with increasing concentration of H3PO4: the changeover concentration depends on the temperature. Experiments at 70°C using other phosphate additives (sodium triphosphate, Na5P3O10, and sodium hexametaphosphate, (NaPO3)6) showed similar results to H3PO4.
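The combined model described above amounts to ln τ = ln A + Ea/(kBT) + βS[S] + βV[VV]. A sketch using the fitted coefficients from the text; the pre-exponential offset ln A is a placeholder assumption, not a fitted value, so only relative behaviour is meaningful:

```python
KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

EA = 1.791       # activation energy, eV
BETA_S = 2.073   # M^-1, fractional variation of tau with [S]
BETA_V = -3.434  # M^-1, fractional variation of tau with [V(V)]

def ln_tau(temp_c, conc_s, conc_v, ln_a=0.0):
    """ln(induction time) from the Arrhenius + log-linear concentration model.

    temp_c in Celsius; conc_s, conc_v in mol/L; ln_a is an assumed offset.
    """
    temp_k = temp_c + 273.15
    return ln_a + EA / (KB_EV * temp_k) + BETA_S * conc_s + BETA_V * conc_v

# Qualitative behaviour matching the text: tau falls with temperature and
# with [V(V)], and rises with [S].
```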
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
We present techniques developed to calibrate and correct Murchison Widefield Array low-frequency (72–300 MHz) radio observations for polarimetry. The extremely wide field-of-view, excellent instantaneous (u, v)-coverage and sensitivity to degree-scale structure that the Murchison Widefield Array provides enable instrumental calibration, removal of instrumental artefacts, and correction for ionospheric Faraday rotation through imaging techniques. With the demonstrated polarimetric capabilities of the Murchison Widefield Array, we discuss future directions for polarimetric science at low frequencies to answer outstanding questions relating to polarised source counts, source depolarisation, pulsar science, low-mass stars, exoplanets, the nature of the interstellar and intergalactic media, and the solar environment.