First-degree relatives of patients with psychotic disorder have higher polygenic risk scores (PRS) for schizophrenia and higher levels of intermediate phenotypes.
We analysed the association between PRS and psychopathological and cognitive intermediate phenotypes of schizophrenia, using two different samples for discovery (n = 336 controls and 649 siblings of patients with psychotic disorder) and replication (n = 1208 controls and 1106 siblings), in a sample at average genetic risk (healthy controls) and a sample at higher than average risk (healthy siblings of patients). Two subthreshold psychosis phenotypes and a standardised measure of cognitive ability, based on a short form of the WAIS-III, were used. In addition, a measure of jumping-to-conclusions bias (replication sample only) was tested for association with PRS.
In both the discovery and the replication sample, evidence for an association between PRS and the subthreshold psychosis phenotypes was observed in the relatives of patients, whereas no association was observed in the controls. Jumping-to-conclusions bias was similarly associated with PRS only in the sibling group. Cognitive ability was weakly, negatively, and non-significantly associated with PRS in both the sibling and the control group.
The degree of endophenotypic expression of schizophrenia polygenic risk depends on having a sibling with psychotic disorder, suggestive of underlying gene–environment interaction. Cognitive biases may better index genetic risk of disorder than traditional measures of neurocognition, which instead may reflect the population distribution of cognitive ability impacting the prognosis of psychotic disorder.
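The group-dependent association reported above can be illustrated as a regression of a subthreshold psychosis phenotype on PRS, either stratified by group or with a PRS × group interaction term. The sketch below is illustrative only: the column names (prs, cape_positive, group), the input file, and the use of plain ordinary least squares are assumptions, and the original analyses likely included covariates (e.g., age, sex, ancestry components) omitted here.

```python
# Illustrative sketch (not the authors' pipeline): test whether the PRS-phenotype
# association differs between healthy controls and siblings of patients.
# Assumed columns: 'prs' (polygenic risk score), 'cape_positive' (a subthreshold
# psychosis phenotype), 'group' ('control' or 'sibling').
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("phenotypes_with_prs.csv")  # hypothetical input file

# PRS x group interaction: a significant interaction term indicates that the
# strength of the PRS-phenotype association depends on group membership.
interaction = smf.ols("cape_positive ~ prs * C(group)", data=df).fit()
print(interaction.summary())

# Stratified models: effect of PRS within each group separately.
for name, sub in df.groupby("group"):
    fit = smf.ols("cape_positive ~ prs", data=sub).fit()
    print(name, "beta:", round(fit.params["prs"], 3), "p:", round(fit.pvalues["prs"], 4))
```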
Public support is usually a precondition for the adoption and successful implementation of costly policies. We argue that such support is easier to achieve with policy-packages that combine primary and ancillary measures. We distinguish command-and-control and market-based measures as primary measures and argue that the former will usually attract more public support than the latter, because the costs of command-and-control measures tend to be less visible. If included in a policy-package, ancillary measures are likely to increase public support further by mitigating the negative effects of primary measures. Based on a choice experiment with a representative sample of 2,034 Swiss citizens, we assessed these arguments with respect to political efforts to reduce vehicle emissions. The empirical analysis supported the argument that policy-packaging increases public support, particularly when ancillary measures are added. We also observe that command-and-control measures obtain more public support than market-based instruments.
This paper presents a novel approach for determining True-Speed-Over-Ground for trains. Speed determination is accomplished by correlating the received signals of two side-looking radar sensors. The theoretically achievable precision is derived. Test measurements are carried out in two different scenarios as a proof of concept. Thereafter, a series of field measurements is performed to assess the practical suitability of the approach, and the results are evaluated thoroughly. Both the test and field measurements use a 24 GHz frequency-modulated continuous-wave radar.
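The underlying idea is that two side-looking sensors mounted a known distance apart along the direction of travel see nearly the same ground echo, offset in time; the speed over ground is the sensor spacing divided by the lag at which their signals are maximally correlated. The sketch below illustrates this correlation principle on synthetic data and is not the authors' signal chain; the sensor spacing, sampling rate, and noise level are assumed values.

```python
# Minimal sketch of correlation-based speed-over-ground estimation.
# Assumptions (not from the paper): two sensors spaced d = 0.5 m apart along the
# direction of travel, sampled at fs = 10 kHz; the rear sensor sees a delayed
# copy of the front sensor's ground echo.
import numpy as np

fs = 10_000          # sampling rate [Hz] (assumed)
d = 0.5              # sensor spacing [m] (assumed)
rng = np.random.default_rng(0)

ground_echo = rng.standard_normal(fs)          # 1 s of ground clutter signal
true_speed = 27.8                              # m/s (~100 km/h), for the demo
delay = int(round(d / true_speed * fs))        # lag in samples between sensors

front = ground_echo
rear = np.roll(ground_echo, delay) + 0.1 * rng.standard_normal(fs)  # delayed + noise

# Cross-correlate and locate the lag of maximum correlation.
corr = np.correlate(rear, front, mode="full")
lag = corr.argmax() - (len(front) - 1)         # samples by which 'rear' trails 'front'

estimated_speed = d * fs / lag                 # [m/s]
print(f"estimated speed: {estimated_speed:.1f} m/s")
```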
Dolfi, Guralnick, Praeger and Spiga asked whether there exist infinitely many primitive groups of twisted wreath type with non-trivial coprime subdegrees. Here, we settle this question in the affirmative. We construct infinite families of primitive twisted wreath permutation groups with non-trivial coprime subdegrees. In particular, we define a primitive twisted wreath group G(m, q) constructed from the non-abelian simple group PSL(2, q) and a primitive permutation group of diagonal type with socle PSL(2, q)m, and determine many subdegrees for this group. A consequence is that we determine all values of m and q for which G(m, q) has non-trivial coprime subdegrees. In the case where m = 2 and
, we obtain a full classification of all pairs of non-trivial coprime subdegrees.
Children represent a particularly vulnerable population in disasters. Disaster Risk Reduction refers to a systematic approach to identifying, assessing, and reducing disaster risks through sets of interventions that address disaster causes and population vulnerabilities. Disaster Risk Reduction through education of the population, and especially of children, is an emerging field requiring further study.
To test the hypothesis that an educational program on Disaster Risk Reduction can induce a sustained improvement in knowledge, risk perception, awareness, and attitudes toward preparedness behavior of children.
A Disaster Risk Reduction educational program for students aged 10-12 was completed in an earthquake-prone region of Jordan (Madaba). Subject students (A) and control groups of similarly aged untrained children in public (B) and private (C) schools were surveyed one year after the program. Surveys focused on disaster knowledge, risk perception, awareness, and preparedness behavior. Likert scales were used for some questions and binary yes/no for others. Results were collated and total scores averaged for each section. Average scores were compared between groups and analyzed using SPSS.
Students who had completed the Disaster Risk Reduction program were found through Levene’s test to have statistically significant improvement in earthquake knowledge (5.921 vs. 4.55 vs. 5.125), enhanced risk perception (3.966 vs. 3.580 vs. 3.789), and improved awareness of earthquakes (4.652 vs. 3.293 vs. 4.060) with heightened attitudes toward preparedness behavior (8.008 vs. 6.517 vs. 7.597) when compared to untrained public and private school control groups, respectively.
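The group comparison reported above can be illustrated with a short scipy sketch: Levene's test checks homogeneity of variances across the three groups, and a one-way ANOVA then compares the group means. This is a generic illustration with made-up scores, not the study's SPSS analysis; the group sizes and score distributions below are assumptions.

```python
# Illustrative sketch (hypothetical data, not the study's SPSS output):
# compare mean section scores across the trained group (A) and the two
# untrained control groups (B: public, C: private).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group_a = rng.normal(5.9, 1.0, 120)   # trained students (hypothetical scores)
group_b = rng.normal(4.6, 1.0, 120)   # public-school controls
group_c = rng.normal(5.1, 1.0, 120)   # private-school controls

# Levene's test: are the group variances homogeneous?
lev_stat, lev_p = stats.levene(group_a, group_b, group_c)

# One-way ANOVA on the mean scores across the three groups.
f_stat, anova_p = stats.f_oneway(group_a, group_b, group_c)

print(f"Levene p = {lev_p:.3f}, ANOVA F = {f_stat:.2f}, p = {anova_p:.4f}")
```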
Disaster Risk Reduction education programs can have lasting impacts when applied to children. They can improve students’ knowledge, risk perception, awareness, and attitudes towards preparedness. Further work is required to determine the frequency of re-education required and appropriate age groups for educational interventions.
In the past five decades, the region of Latin America and the Caribbean (LAC) has been subject to several types of terrorist attacks, most committed by local terrorist organizations, although there have also been attacks by international groups. Internationally, terrorist attacks are increasing in both frequency and complexity, and significant concerns exist regarding the use of Chemical Warfare Agents (CWAs) in civilian settings. Asphyxiants (e.g., cyanide), opioids (e.g., fentanyl), and nerve agents (e.g., sarin) represent some of the most lethal CWAs. To date, very little has been published on their use in the LAC region, even though the recent attacks in Syria have sparked international interest in the use and regulation of CWAs.
To improve civilian health-service preparedness in response to CWA attacks by describing the types of agents historically used within the LAC region.
Information regarding CWA use in the LAC region from January 1, 1970, to December 31, 2017, was extracted and analyzed from the open-source Global Terrorism Database hosted by the University of Maryland.
During the 47-year period reviewed, there were 29,846 terrorist attacks in the LAC region, 63.6% of which occurred in the southern region. Twenty-nine CWA attacks were reported, with the most common agents being tear gas (37%) and cyanide (29.6%). The most frequent targets were religious figures/institutions (22.2%), law enforcement (18.5%), and government agencies/personnel (18.5%).
Cyanide is one of the most prevalent agents used for chemical weapons attacks in the LAC region. Preparedness should be enhanced for CWA terrorist attacks, especially those involving cyanide, given its life-threatening nature, prevalence, and the existence of reversal agents. First responders, physicians, and nurses should be aware of this potential hazard and be trained to respond appropriately. Additionally, regional stockpiles of antidotes should be considered by governmental bodies within the LAC region.
Saudi Arabia, the largest country in the Middle East, has suffered numerous terrorist attacks and is the location of Hajj, one of the world’s largest annual mass gatherings. Healthcare providers’ pre-incident knowledge and understanding of basic disaster medicine (DM) concepts are crucial for a unified and effective health-system response. Introducing healthcare providers to best practices is a stated vision of the Saudi Commission for Health Specialties. Standardizing DM curriculum taught to physicians during their residency training will assist this goal.
To produce expert consensus on the most critical DM topics for the residency curriculum in emergency medicine (EM) in the Kingdom of Saudi Arabia.
Utilizing a Delphi approach, a panel of Saudi Arabian experts in DM and EM residency directors was surveyed regarding potential DM topics for EM residency curricula. The first round consisted of open-ended questions seeking lists of suggested DM curriculum topics. In subsequent rounds, each participant received a questionnaire asking them to review the items contributed in the first round, as summarized by the investigation team. Participants rated each item on a five-point Likert scale to establish preliminary priorities and added their comments. In further rounds, participants reviewed and prioritized the topics until they reached a consensus of ≥80%.
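As a rough illustration of the ≥80% consensus criterion, the sketch below computes, for each candidate topic, the share of panellists rating it 4 or 5 on the five-point scale and flags topics that reach the threshold. The rating matrix, the topic names, and the "agreement = rating of 4 or 5" rule are assumptions made for illustration, not details reported by the study.

```python
# Hypothetical sketch of a Delphi-round consensus check (assumed rule: a topic
# reaches consensus when >=80% of panellists rate it 4 or 5 out of 5).
import numpy as np

topics = ["Triage", "Incident command", "CBRN response", "Hospital surge"]
# rows = panellists, columns = topics; Likert ratings 1-5 (made-up data)
ratings = np.array([
    [5, 4, 3, 5],
    [4, 5, 2, 4],
    [5, 4, 4, 5],
    [4, 3, 3, 4],
    [5, 5, 2, 5],
])

agreement = (ratings >= 4).mean(axis=0)   # share of panellists rating 4 or 5
for topic, share in zip(topics, agreement):
    status = "consensus" if share >= 0.80 else "re-rate next round"
    print(f"{topic}: {share:.0%} agreement -> {status}")
```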
The study is ongoing and full data will be available in the new year.
This expert consensus from major stakeholders can be used to improve the foundation of the DM curriculum. The Delphi Method gives an evidence-based approach to identification and prioritization of subjects, which should be integrated within the Saudi Arabian Emergency Medicine Residency Curriculum. It also can be used as a cornerstone for implementation in other medical education programs across the Kingdom in the future.
Road traffic collisions (RTC) are the leading cause of preventable death among those aged 15–29 years worldwide. More than 1.2 million lives are lost each year on roads, and ninety percent of these deaths take place in low- and middle-income countries. The General Assembly of the United Nations (UN) proclaimed the period 2011–2020 the “Decade of Action for Road Safety,” with the objective of stabilizing and reducing the number of deaths by 50% worldwide. In this context, the government of Colombia established the National Road Safety Plan (PNSV) for the period 2011–2021 with the objective of reducing the number of fatalities by 26%. However, the effectiveness of road safety policies in Colombia is still unknown.
To evaluate the effect of road safety laws on the incidence of RTC, deaths, and injuries in Colombia.
RTC data and fatality numbers for the time period of January 1, 2010, to December 31, 2017, were collated from official Colombian governmental publications and analyzed for reductions and trends related to the introduction of new road safety legislation.
Data analysis is expected to be completed by January 2019.
RTC remain the leading preventable cause of death in Colombia despite the PNSV. Data are being mined to determine trends in crash and fatality rates and their relation to the introduction of national traffic laws. Overall, while the absolute numbers of RTC and deaths have been increasing, the rate of RTC per 10,000 cars has been decreasing. This suggests that although the goals of the PNSV may not be fully realized, some of the laws emanating from it may be beneficial and warrant further detailed analysis.
Healthcare facilities frequently use disaster codes as a way to communicate with employees that an emergency or incident is occurring. As increasing numbers of providers work at multiple facilities, and healthcare systems continue to build disaster response teams and protocols covering multiple facilities, standardization of disaster code terminology is critical. A lack of consistency in terminology can potentially have a devastating impact on the understanding and response of visiting or relief staff.
To evaluate the level of standardization in terminology of disaster codes in healthcare facilities.
A convenience sample was taken from a private Facebook™ group of emergency department nurses from a wide range of facilities. Group members were asked to share their hospitals’ disaster codes. Of the 40,179 total members, 78 responded: 55 shared photos of quick-reference badges and the remainder provided descriptions or lists of codes. One badge was excluded because of a blurry photograph. Results were collated and analyzed for trends and standardization.
The most common codes were “Code Red” for fire (72.7%), “Code Blue” for cardiac arrest (44.9%), “Code Silver” for an active shooter/weapons event (37.7%), and “Code Orange” for hazardous materials (33.8%). There were 168 instances of a code term being associated with a particular event by five or fewer facilities. Two facilities used numeric systems, and 11 used plain-language descriptions.
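Collating responses of this kind reduces to counting (code term, event) pairs and flagging terms that map to more than one event. A minimal sketch with invented entries (not the actual survey responses) is shown below.

```python
# Illustrative tally of reported (code term, event) pairs from a survey
# (entries below are made up, not the actual Facebook-group responses).
from collections import Counter

responses = [
    ("Code Red", "fire"), ("Code Red", "fire"), ("Code Blue", "cardiac arrest"),
    ("Code Silver", "active shooter"), ("Code Orange", "hazardous materials"),
    ("Code Orange", "mass casualty"),  # same colour, different meaning
]

pair_counts = Counter(responses)
for (code, event), n in pair_counts.most_common():
    print(f"{code!r} used for {event!r} at {n} facilities")

# Flag colour codes mapped to more than one distinct event: a sign of
# non-standardized terminology.
meanings = {}
for code, event in responses:
    meanings.setdefault(code, set()).add(event)
ambiguous = {code: events for code, events in meanings.items() if len(events) > 1}
print("ambiguous codes:", ambiguous)
```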
Disaster code language is inconsistent. Few of the codes were consistently assigned to the same meaning, and none were universal. Color coding was the most common method, but there was little consistency even within color code systems. Additionally, some facilities used a combination of colors, numbers, terms, and plain language. Healthcare facilities should embrace standard terminology and create a consistent language for disaster codes to enhance response capabilities and medical security.
Opioid overdose deaths in the United States are increasing. Time to restoration of ventilation is critical. Rapid bystander administration of opioid antidote (naloxone) is an effective interim response but is historically constrained by legal restrictions.
To review and contextualize development of legislation facilitating layperson administration of naloxone across the United States.
Publicly accessible databases (1,2) were searched for legislation relevant to naloxone administration between January 2001 and July 2017.
All 51 jurisdictions implemented naloxone access laws between 2001 and 2017; 45 of these did so between 2012 and 2017. Nationwide mortality from opioid overdose increased from 3.3 per 100,000 population in 2001 to 13.3 in 2016. Forty-two and 35 jurisdictions enacted laws giving prescribers immunity from criminal prosecution, civil liability, and professional sanctions, respectively; 36, 41, and 35 jurisdictions implemented laws allowing dispensers immunity in the same domains. Thirty-eight and 46 jurisdictions gave laypeople administering naloxone immunity from criminal and civil liability, respectively. Forty-seven jurisdictions implemented laws allowing prescription of naloxone to third parties. All jurisdictions except Nebraska allowed pharmacists to dispense naloxone without a patient-specific prescription. Fifteen jurisdictions removed criminal liability for possession of non-prescribed naloxone. The 10 states with the highest average rates of opioid overdose-related mortality had not legislated in a higher number of domains compared to the 10 lowest states and the average of all jurisdictions (3.4 vs 2.9 vs 2.7, respectively).
Effective involvement of bystanders in early recognition and reversal of opioid overdose requires removal of legal deterrents to prescription, dispensing, distribution, and administration of naloxone. Jurisdictions have varied in degree and speed of creating this legal environment. Understanding the integration of legislation into epidemic response may inform the response to this and future public health crises.
Human Stampedes (HS) occur at religious mass gatherings. Religious events have a higher rate of morbidity and mortality than other events that experience HS. This study is a subset analysis of religious event HS data regarding the physics principles involved in HS, and the associated event morbidity and mortality.
To analyze reports of religious HS to determine the initiating physics principles and associated morbidity and mortality.
Thirty-four reports of religious HS were analyzed to identify shared variables. Thirty-three (97.1%) were written media reports with photographic, drawn, or video documentation; 29 (85.3%) cited footage or photographs and one (2.9%) was not associated with visual evidence. Descriptive phrases associated with the physics principles contributing to the onset of HS, along with morbidity data, were extracted and analyzed to evaluate their frequency before, during, and after events.
Thirty-four (39.1%) of the HS reports found in the literature review involved religious events. Of these, 83% took place in an open space and 82.3% were associated with changes in population density. In addition, 82.3% of events were associated with architectural nozzles (small streets, alleys, etc.). All events (100%) showed loss of XY-axis motion, and 89% reached an average velocity of zero; all (100%) showed loss of proxemics, and 91% involved Z-axis displacement (falls). The minimum reported attendance for a religious HS was 3,000. All religious HS (100%) had reported mortality at the event, and 56% had further associated morbidity.
HS are deadly events at religious mass gatherings. Religious events are often recurring, planned gatherings in specific geographic locations. They are frequently associated with an increase in population density, loss of proxemics and velocity, followed by Z-axis displacements, leading to injury and death. This is frequently due to architectural nozzles, which those organizing religious mass gatherings can predict and utilize to mitigate future events.
Antimicrobial stewardship programs (ASPs) are effective in developed countries. In this study, we assessed the effectiveness of an infectious disease (ID) physician–driven post-prescription review and feedback as an ASP strategy in India, a low middle-income country (LMIC).
Design and setting:
This prospective cohort study was carried out for 18 months in 2 intensive care units of a tertiary-care hospital, consisting of 3 phases: baseline, intervention, and follow up. Each phase spanned 6 months.
Patients aged ≥15 years receiving 48 hours of study antibiotics were recruited for the study.
During the intervention phase, an ID physician reviewed the included cases and gave alternate recommendations if the antibiotic use was inappropriate. Acceptance of the recommendations was measured after 48 hours. The primary outcome of the study was days of therapy (DOT) per 1,000 study patient days (PD).
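The primary outcome, days of therapy (DOT) per 1,000 patient days (PD), counts each calendar day on which a patient receives a given antibiotic as one DOT, sums across patients and antibiotics, divides by total patient days, and multiplies by 1,000. The following is a worked illustration with made-up numbers, not the study's data pipeline.

```python
# Worked illustration of DOT per 1,000 patient days (hypothetical numbers).
# DOT: each calendar day a patient receives a given antibiotic counts as one
# day of therapy; two antibiotics on the same day count as two DOT.

patients = [
    # (days in ICU, {antibiotic: days administered})
    (10, {"meropenem": 7, "vancomycin": 5}),
    (5,  {"piperacillin-tazobactam": 5}),
    (8,  {"meropenem": 4, "colistin": 4}),
]

total_dot = sum(sum(abx_days.values()) for _, abx_days in patients)
total_pd = sum(days for days, _ in patients)

dot_per_1000_pd = total_dot / total_pd * 1000
print(f"{total_dot} DOT over {total_pd} PD -> {dot_per_1000_pd:.1f} DOT per 1,000 PD")
```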
Overall, 401, 381, and 379 patients were recruited in the baseline, intervention, and follow-up phases, respectively. Antimicrobial use decreased from 831.5 DOT per 1,000 PD in the baseline phase to 717 DOT per 1,000 PD in the intervention phase (P < .0001), and the effect was sustained in the follow-up phase (713.6 DOT per 1,000 PD). De-escalation according to culture susceptibility improved significantly in the intervention phase versus the baseline phase (42.7% vs 23.6%; P < .0001). Overall, 73.3% of antibiotic prescriptions were inappropriate. Recommendations by the ID team were accepted in 60.7% of cases.
The ID physician–driven implementation of an ASP was successful in reducing antibiotic utilization in an acute-care setting in India.
Road Traffic Crashes (RTC) are one of the most preventable causes of death worldwide, yet are the number one cause of death in Nigeria. In March 2010, the United Nations General Assembly launched “The Decade of Action for Road Safety (2011-2020)” to “reduce road traffic deaths and injuries by 50% by 2020.”
To analyze trends in RTC and deaths in relation to current road safety laws in Nigeria, and possible future interventions.
Annual reports from 2013-2017 were obtained from the Federal Road Safety Corps (FRSC) of Nigeria. These reports were analyzed for trends in RTC, deaths, and reported causes to find areas of possible improvement.
The number of RTC and deaths declined over the period 2013-2017. Crashes decreased by 23.4% from 2013 to 2014, by 6.2% in 2015, and by 0.4% in 2016, before increasing by 3.2% in 2017. Fatalities from RTC decreased by 8.4% from 2013 to 2014, by 9.3% in 2015, and by 7.1% in 2016, but increased by 1.3% from 2016 to 2017. Analysis showed that speed violations (SPV) were the top cause of RTC: crashes attributed to SPV fell from 5,495 (32% of RTC) in 2013 to 3,496 (29%) in 2014 and 3,195 (26.5%) in 2015, then rose to 3,848 (33.9%) in 2016 and 4,840 (44.1%) in 2017. Reports of RTC caused by driving under the influence (DAD) declined from 1% in 2013 to 0.8% in 2014 and 0.5% in 2015 and 2016.
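The year-on-year percentage changes quoted above follow from annual totals as (current − previous) / previous × 100. A minimal sketch with invented annual crash counts (not the FRSC figures):

```python
# Year-over-year percentage change in crash counts (illustrative numbers only,
# not the FRSC data).
annual_crashes = {2013: 13_583, 2014: 10_400, 2015: 9_750, 2016: 9_710, 2017: 10_020}

years = sorted(annual_crashes)
for prev, curr in zip(years, years[1:]):
    change = (annual_crashes[curr] - annual_crashes[prev]) / annual_crashes[prev] * 100
    print(f"{prev}->{curr}: {change:+.1f}%")
```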
Current road safety laws have been effective in decreasing the total number of RTC and deaths. While certain laws such as those regarding DAD have been effective, other laws such as speed limits have been less successful and may require further changes in legal codes and/or enforcement.
The evidence of the character and purpose of settlements previously described as defended ‘small towns’ is reviewed in the light of knowledge accrued since the implementation of Planning Policy Guidance 16 in 1990, the same year as the publication of Burnham and Wacher's survey, The ‘Small Towns’ of Roman Britain. This review focuses on four of the more extensively excavated settlements: Alcester, Cambridge, Godmanchester and Worcester. In the absence of convincing urban attributes, it is suggested that this category of settlement should more appropriately be regarded as defended villages (vici). These cluster in and around the West Anglian plain and on Ermine Street, suggesting a strategic function to protect grain and other food supplies and their movement, potentially either to the northern frontier or south to London and, perhaps, export to the Continent.
The Stac Fada Member of the Stoer Group, within the Torridonian succession of NW Scotland, is a melt-rich, impact-related deposit that has not been conclusively correlated with any known impact structure. However, a gravity low approximately 50 km east of the preserved Stac Fada Member outcrops has recently been proposed as the associated impact site. We investigate the location of the impact structure through a provenance study of detrital zircon and apatite in five samples from the Stoer Group. Our zircon U–Pb data are dominated by Archaean grains (> 2.5 Ga), consistent with earlier interpretations that the detritus was largely derived from local Lewisian Gneiss Complex, whereas the apatite data (the first for the Stoer Group) display a single major peak at c. 1.7 Ga, consistent with regional Laxfordian metamorphism. The almost complete absence of Archaean-aged apatite is best explained by later heating of the > 2.5 Ga Lewisian basement (the likely source region) above the closure temperature of the apatite U–Pb system (c. 375–450°C). The U–Pb age distributions for zircon and apatite show no significant variation with stratigraphic height. This may be interpreted as evidence that there was no major change in provenance during the course of deposition of the Stoer Group or, if there was any significant change, the different source regions were characterized by similar apatite and zircon U–Pb age populations. Consequently, the new data do not provide independent constraints on the location of the structure associated with the Stac Fada Member impact event.
OBJECTIVES/SPECIFIC AIMS: Parents often make errors in comprehending and executing their child’s inpatient discharge instructions, putting their child at risk for adverse post-discharge outcomes. Suboptimal provider-caregiver communication has been linked to errors in comprehension and execution of provider instructions, especially for parents with limited health literacy. Few studies have systematically examined features of pediatric inpatient written discharge instructions that may contribute to errors. Our objective was to assess the readability, understandability, and actionability of pediatric inpatient written discharge instructions.
METHODS/STUDY POPULATION: This was a cross-sectional analysis of the written discharge instructions (standardized template, content not standardized) provided to parents at an urban public hospital, enrolled as part of a prospective cohort study (n=171) focused on parent ability to comprehend their child’s discharge instructions. Inclusion criteria were: English/Spanish-speaking parents of children ≤12 years old discharged on ≥1 daily medicine. Discharge instructions were assessed for: 1) readability (average of 5 formulas: Flesch Reading Ease, Flesch-Kincaid, Gunning Fog, Simple Measure of Gobbledygook [SMOG], and FORCAST); 2) understandability and actionability (AHRQ Patient Education Materials Assessment Tool; 2 independent reviewers, κ>0.8 for both).
RESULTS/ANTICIPATED RESULTS: Mean (SD) reading grade level was 11.4 (0.7); none of the instructions were written at the recommended reading level of 6th to 8th grade or below. Mean (SD) understandability was 37.7% (6.9%); mean actionability was 41.7% (8.4%). All 171 sets of instructions used medical terminology without adequate plain-language explanations and included information that was not relevant to the child’s diagnosis and associated care (e.g., obesity counseling and smoking cessation given to a child with appendicitis). None of the sets of instructions presented information in a logical sequence (e.g., diet instructions appeared in more than one location) or included any pictographic information or other visual aids to support the text (e.g., a diagram of the medication dose within a dosing tool).
DISCUSSION/SIGNIFICANCE OF IMPACT: Written discharge instructions provided in the pediatric inpatient setting were suboptimal. Use of a systematic approach to improve discharge instructions, using a health literacy perspective, has the potential to improve post-discharge outcomes in children.
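Grade-level readability of instruction text can be approximated with standard formulas. The sketch below averages Flesch-Kincaid Grade Level, Gunning Fog, and SMOG using a crude syllable heuristic; the formulas are the published ones, but the heuristic and the use of only three of the study's five indices (Flesch Reading Ease and FORCAST are omitted) are simplifications, not the study's exact procedure.

```python
# Rough readability sketch: average of Flesch-Kincaid Grade, Gunning Fog and
# SMOG. The syllable counter is a crude heuristic; the study averaged five
# indices (including Flesch Reading Ease and FORCAST), which are omitted here.
import math
import re

def syllables(word: str) -> int:
    # Count groups of consecutive vowels as a rough syllable estimate.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def grade_level(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(syllables(w) for w in words)
    complex_words = sum(1 for w in words if syllables(w) >= 3)

    fk = 0.39 * (n_words / sentences) + 11.8 * (n_syll / n_words) - 15.59
    fog = 0.4 * ((n_words / sentences) + 100 * complex_words / n_words)
    smog = 1.043 * math.sqrt(30 * complex_words / sentences) + 3.1291
    return (fk + fog + smog) / 3

sample = ("Give the antibiotic twice a day with food. "
          "Return to the emergency department if the fever does not improve.")
print(f"approximate grade level: {grade_level(sample):.1f}")
```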
Buprenorphine/samidorphan (BUP/SAM), a combination of BUP (a µ-opioid receptor partial agonist and κ-antagonist) and SAM (a sublingually bioavailable µ-opioid antagonist), is an investigational opioid system modulator for depression. BUP/SAM has shown efficacy versus placebo as an adjunctive treatment for major depressive disorder (MDD) and a consistent safety profile in previously reported, placebo-controlled clinical studies.1,2
1. To characterize the safety profile following long-term treatment with BUP/SAM
2. To explore depression symptoms and remission rates in patients with MDD following long-term treatment with BUP/SAM
FORWARD-2 (Clinicaltrials.gov ID: NCT02141399) enrolled patients who had participated in 1 of 4 controlled studies as well as de novo patients. All patients had a confirmed diagnosis of MDD, had a history of inadequate response to standard antidepressant therapies (ADTs), and had been treated with an adequate dose of an established ADT for ≥8 weeks before BUP/SAM initiation. ADT dosage could be titrated, but the ADT could not be changed. During the study, patients received open-label, sublingual BUP/SAM 2 mg/2 mg as adjunctive treatment for up to 52 weeks. Safety (primary objective) was assessed via adverse events (AEs), vital signs, laboratory analytes, and electrocardiography. Suicidal ideation or behavior (SIB) was evaluated by the Columbia Suicide Severity Rating Scale. Abuse potential, dependence, and withdrawal were assessed by AEs and the Clinical Opiate Withdrawal Scale. Exploratory efficacy endpoints included mean Montgomery–Åsberg Depression Rating Scale (MADRS) scores and remission rate (MADRS ≤10).
Of 1454 total patients, 49% completed the 52-week study, 11% discontinued due to an AE, and 40% discontinued because of other reasons as of the interim data cutoff date (April 30, 2017). Most AEs were of mild/moderate severity. Serious AEs were reported in 3.2% of patients. AEs occurring in ≥10% of patients were nausea, headache, constipation, dizziness, and somnolence. There was no evidence of increased risk of SIB with BUP/SAM. Incidence of euphoria-related events was low (1.2%). After abrupt discontinuation of BUP/SAM, there was little evidence of withdrawal. BUP/SAM was not associated with meaningful changes in laboratory or metabolic parameters or in bodyweight. The mean MADRS score decreased from 22.9 (±9.7) at baseline to 9.8 (±8.8) after 52 weeks. The remission rate at 52 weeks was 52.5%.
Long-term treatment with BUP/SAM did not reveal any new safety findings and confirmed that the risk of abuse and dependence with BUP/SAM was low. BUP/SAM maintained an antidepressant effect for up to 52 weeks of treatment in patients with MDD.