Introduction: Prehospital field trauma triage (FTT) standards were reviewed and revised in 2014 based on the recommendations of the Centers for Disease Control and Prevention. The FTT standard allows a hospital bypass and direct transport, within 30 min, to a lead trauma hospital (LTH). Our objectives were to assess the impact of the newly introduced prehospital FTT standard and to describe the emergency department (ED) management and outcomes of patients who had bypassed closer hospitals. Methods: We conducted a 12-month multi-centred health record review of paramedic and ED records following the implementation of the 4-step FTT standard (step 1: vital signs and level of consciousness (physiologic), step 2: anatomical injury, step 3: mechanism and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported as urgent that met the FTT standard, regardless of transport time. We developed and piloted a data collection tool and obtained consensus on all definitions. The primary outcome was the rate of appropriate triage to a LTH, defined as any of: ISS ≥12, admission to an intensive care unit (ICU), non-orthopedic surgery, or death. We report descriptive statistics. Results: 570 patients were included: mean age 48.8, male 68.9%, falls 29.6%, motor vehicle collisions 20.2%, stab wounds 10.5%, transported to a LTH 76.5% (n = 436). 72.2% (n = 315) of patients transported to a LTH had bypassed a closer hospital, and 126/306 (41.2%) of those were determined to be an appropriate triage to a LTH (9 patients had missing outcomes). ED management included: CT head/cervical spine 69.9%, ultrasound 53.6%, x-ray 51.6%, intubation 15.0%, sedation 11.1%, tranexamic acid 9.8%, blood transfusion 8.2%, fracture reduction 6.9%, tube thoracostomy 5.9%. Outcomes included: ISS ≥12 32.7%, admitted to ICU 15.0%, non-orthopedic surgery 11.1%, death 8.8%.
Other outcomes included: admission to hospital 57.5%, mean LOS 12.8 days, orthopedic surgery 16.3% and discharged from ED 37.3%. Conclusion: Despite a high number of admissions, the majority of trauma patients bypassed to a LTH were considered over-triaged, with a low number of ED procedures and non-orthopedic surgeries. Continued work is needed to appropriately identify patients requiring transport to a LTH.
Single nucleotide polymorphisms (SNPs) contribute small increases in risk for late-onset Alzheimer's disease (LOAD). LOAD SNPs cluster around genes with similar biological functions (pathways). Polygenic risk scores (PRS) aggregate the effect of SNPs genome-wide. However, this approach has not been widely used for SNPs within specific pathways.
We investigated whether pathway-specific PRS were significant predictors of LOAD case/control status.
We mapped SNPs to genes within 8 pathways implicated in LOAD. For our polygenic analysis, the discovery sample comprised 13,831 LOAD cases and 29,877 controls. LOAD risk alleles for SNPs in our 8 pathways were identified at a P-value threshold of 0.5. Pathway-specific PRS were calculated in a target sample of 3332 cases and 9832 controls. The genetic data were pruned with R2 > 0.2 while retaining the SNPs most significantly associated with AD. We tested whether pathway-specific PRS were associated with LOAD using logistic regression, adjusting for age, sex, country, and principal components. We report the proportion of variance in liability explained by each pathway.
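As an illustration of the scoring step described above, the sketch below computes a pathway-specific PRS as the sum of risk-allele dosages weighted by log odds ratios, for SNPs passing the P < 0.5 inclusion threshold. The SNP identifiers, effect sizes, and genotypes are hypothetical placeholders, not values from the study; real analyses use dedicated tools (e.g. PLINK or PRSice) on full genotype data after LD pruning.

```python
import math

# Hypothetical discovery-GWAS summary statistics for SNPs already
# mapped to one pathway (e.g. immune response) and LD-pruned.
# Each entry: SNP id -> (effect-allele log odds ratio, GWAS P-value)
summary_stats = {
    "rs0001": (math.log(1.15), 1e-8),
    "rs0002": (math.log(0.92), 0.03),
    "rs0003": (math.log(1.05), 0.40),
    "rs0004": (math.log(1.20), 0.70),  # fails the P < 0.5 threshold
}

P_THRESHOLD = 0.5  # inclusion threshold used in the abstract

def pathway_prs(dosages):
    """Sum risk-allele dosages (0, 1 or 2) weighted by log(OR)
    over SNPs passing the P-value threshold."""
    score = 0.0
    for snp, (beta, pval) in summary_stats.items():
        if pval < P_THRESHOLD and snp in dosages:
            score += beta * dosages[snp]
    return score

# One hypothetical target-sample individual's genotype dosages
person = {"rs0001": 2, "rs0002": 1, "rs0003": 0, "rs0004": 2}
print(round(pathway_prs(person), 4))
```

The resulting scores would then enter a logistic regression on case/control status with the covariates listed above.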
The most strongly associated pathways were the immune response (NSNPs = 9304, P = 5.63 × 10−19, R2 = 0.04) and hemostasis (NSNPs = 7832, P = 5.47 × 10−7, R2 = 0.015). Regulation of endocytosis, hematopoietic cell lineage, cholesterol transport, clathrin and protein folding were also significantly associated but accounted for less than 1% of the variance. With APOE excluded, all pathways remained significant except proteasome-ubiquitin activity and protein folding.
Genetic risk for LOAD can be split into contributions from different biological pathways. These offer a means to explore disease mechanisms and to stratify patients.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
The USA is the largest consumer of legally, internationally-traded wildlife. A proportion of this trade consists of species listed in the Appendices of CITES, and recorded in the CITES Trade Database. Using this resource, we quantified wildlife entering the USA for 82 of the most frequently recorded wildlife products and a range of taxonomic groups during 1979–2014. We examined trends in legal trade and seizures of illegally traded items over time, and relationships between trade and four national measures of biodiversity. We found that: (1) there is an overall positive relationship between legal imports and seizures; (2) Asia was the main region exporting CITES-listed wildlife products to the USA; (3) bears, crocodilians and other mammals (i.e. other than Ursidae, Felidae, Cetacea, Proboscidea, Primates or Rhinocerotidae) increased in both reported legal trade and seizures over time; (4) legal trade in live specimens was reported to be primarily from captive-produced, artificially-propagated or ranched sources, whereas traded meat was primarily wild sourced; (5) both seizures and legally traded items of felids and elephants decreased over time; and (6) volumes of both legally traded and seized species were correlated with four attributes of exporting countries: species endemism, species richness, number of IUCN threatened species, and country size. The goal of our analysis was to inform CITES decision-making and species conservation efforts.
Introduction: Trauma and injury play a significant role in the population's burden of disease. Limited research exists evaluating the role of trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, allowing paramedics to bypass a closer hospital and directly transport to a trauma centre (TC) provided transport times were within 30 minutes. Methods: We conducted a 12-month multi-centred health record review of paramedic call reports and emergency department health records following the implementation of the 4 step FTT standard (step 1: vital signs and level of consciousness, step 2: anatomical injury, step 3: mechanism and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported as an urgent transport to hospital, that met one of the 4 steps of the FTT standard and would allow for a bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admitted to an intensive care unit, underwent non-orthopedic operation, or death. We report descriptive and univariate analysis where appropriate. Results: 570 adult patients were included with the following characteristics: mean age 48.8, male 68.9%, attended by Advanced Care Paramedic 71.8%, mechanisms of injury: MVC 20.2%, falls 29.6%, stab wounds 10.5%, median initial GCS 14, mean initial BP 132, prehospital fluid administered 26.8%, prehospital intubation 3.5%, transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital prior to TC arrival. Of those that bypassed a closer hospital, 136 (44.2%) were determined to be “appropriate triage to TC”. 
Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). An appropriate triage to TC occurred in 104 (55.9%) patients who had met step 1 or 2 and in 32 (26.2%) patients meeting step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC; however, this comes at the cost of potentially burdening the system with over-triaged patients. More work is needed to develop a FTT standard that will assist paramedics in appropriately identifying patients who require a trauma centre.
Recommending nitrofurantoin to treat uncomplicated cystitis was associated with increased nitrofurantoin use from 3.53 to 4.01 prescriptions per 1,000 outpatient visits, but nitrofurantoin resistance in E. coli isolates remained stable at 2%. Concomitant levofloxacin resistance was a significant risk for nitrofurantoin resistance in E. coli isolates (odds ratio [OR], 2.72; 95% confidence interval [CI], 1.04–7.17).
Breakthrough Listen is a 10-yr initiative to search for signatures of technologies created by extraterrestrial civilisations at radio and optical wavelengths. Here, we detail the digital data recording system deployed for Breakthrough Listen observations at the 64-m aperture CSIRO Parkes Telescope in New South Wales, Australia. The recording system currently implements two modes: a dual-polarisation, 1.125-GHz bandwidth mode for single-beam observations, and a 26-input, 308-MHz bandwidth mode for the 21-cm multibeam receiver. The system is also designed to support a 3-GHz single-beam mode for the forthcoming Parkes ultra-wideband feed. In this paper, we present details of the system architecture, provide an overview of hardware and software, and present initial performance results.
A 3-yr watermelon experiment was established in fall 2013 to evaluate cover crop, polyethylene mulch, tillage, and herbicide application components for weed control, yield, and profitability. Conservation tillage, either with a cereal rye cover crop alone or integrated with polyethylene mulch, was compared to the standard industry practice of conventional tillage with bedded polyethylene mulch. The study also used a non-bedded conventional tillage system without polyethylene to determine polyethylene and cover crop residue effects. Within each of the four systems, herbicide treatments comprised halosulfuron applied (1) at 26.3 g ai ha–1 PRE, (2) at 26.3 g ai ha–1 POST, or (3) sequentially at 26.3 g ai ha–1 PRE and POST. Each system also had a nontreated control. In addition, clethodim was applied twice POST at 140 g ai ha–1 in all plots except the nontreated control in each system. In 2014, polyethylene or a cereal rye cover crop effectively controlled tall morningglory, coffee senna, and carpetweed early season in nontreated plots, whereas the integration of the two was effective at controlling common purslane. Tall morningglory and common purslane control was insufficient late season regardless of production system and herbicide application. In 2015, polyethylene effectively controlled cutleaf eveningprimrose, sicklepod, and arrowleaf sida early season in nontreated plots. Yellow nutsedge control was insufficient late season regardless of production system and herbicide application. Sequential halosulfuron applications did not increase weed control over PRE or POST alone in any year. Polyethylene use resulted in higher yields than systems without it in all years. Across all 3 yr, net returns were highest for polyethylene mulch systems. The results of this experiment underscore the need for further progress in developing integrated conservation systems for watermelon production.
Effective herbicides, low-disturbance cultivation, and/or hand weeding are most likely the key to success in conservation specialty crop systems.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool to identify patients with sepsis, and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis that are transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g. SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of 38.3°C (101°F) who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001).
44 (18.6%) patients met the RPPEO sepsis notification tool and of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
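The predictive values reported above follow from a standard 2 × 2 table. As a check, the sketch below reconstructs that table from the counts given in the abstract (236 patients, 34 ED sepsis diagnoses, 44 screen-positives, 12 of them true positives) and recovers the published sensitivity, specificity, PPV, and NPV.

```python
# 2x2 table reconstructed from the counts reported in the abstract:
# 236 patients, 34 with an ED diagnosis of sepsis,
# 44 screen-positive on the RPPEO tool, 12 of them true positives.
total, ed_sepsis, screen_pos, tp = 236, 34, 44, 12

fp = screen_pos - tp          # screen-positive without ED sepsis
fn = ed_sepsis - tp           # ED sepsis missed by the tool
tn = total - tp - fp - fn     # screen-negative without ED sepsis

sensitivity = tp / (tp + fn)  # 12/34
specificity = tn / (tn + fp)  # 170/202
ppv = tp / (tp + fp)          # 12/44
npv = tn / (tn + fn)          # 170/192

for name, value in [("sensitivity", sensitivity), ("specificity", specificity),
                    ("PPV", ppv), ("NPV", npv)]:
    print(f"{name}: {value:.1%}")
```

Rounded to one decimal place, these reproduce the abstract's 35.3%, 84.2%, 27.3%, and 88.5%.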
Objectives: Total intracranial volume (TICV) is an important control variable in brain–behavior research, yet its calculation has challenges. Manual TICV (Manual) is labor intensive, and automatic methods vary in reliability. To identify an accurate automatic approach we assessed the reliability of two FreeSurfer TICV metrics (eTIV and Brainmask) relative to manual TICV. We then assessed how these metrics alter associations between left entorhinal cortex (ERC) volume and story retention. Methods: Forty individuals with Parkinson’s disease (PD) and 40 non-PD peers completed a brain MRI and memory testing. Manual metrics were compared to FreeSurfer’s Brainmask (a skull strip mask with total volume of gray, white, and most cerebrospinal fluid) and eTIV (calculated using the transformation matrix into Talairach space). Volumes were compared with two-way intraclass correlations and Dice similarity indices. Associations between ERC volume and Wechsler Memory Scale-Third Edition Logical Memory retention were examined with and without correction using each TICV method. Results: Brainmask volumes were larger and eTIV volumes smaller than Manual. Both automated metrics correlated highly with Manual. All TICV metrics explained additional variance in the ERC-Memory relationship, although none were significant. Brainmask explained slightly more variance than other methods. Conclusions: Our findings suggest Brainmask is more reliable than eTIV for TICV correction in brain-behavioral research. (JINS, 2018, 24, 206–211)
Human-induced pluripotent stem cells (iPSCs) offer a novel, timely approach for investigating the aetiology of neuropsychiatric disorders. Although we are starting to gain more insight into the specific mechanisms that cause Alzheimer's disease and other forms of dementia, this has not resulted in therapies to slow the pathological processes. Animal models have been paramount in studying the neurobiological processes underlying psychiatric disorders. Nonetheless, these human conditions cannot be entirely recapitulated in rodents. Human cell models derived from patients’ cells now offer new hope for improving our understanding of the early molecular stages of these diseases, through to validating therapeutics. The impact of dementia is increasing, and a new model to investigate the early stages of this disease is heralded as an essential, new platform for translational research. In this paper, we review current literature using iPSCs to study Alzheimer's disease, describe drug discovery efforts using this platform, and discuss the future potential for this technology in psychiatry research.
We present preliminary results from a study exploring the origin of Milky Way substructures, and show initial evidence of a common “kicked-out” formation mechanism for two low-latitude substructures. In this scenario, stars in these substructures formed in the disk and were subsequently “kicked-out” by an external perturbation, such as the merger of an accreted satellite, which created an oscillation in the Galactic disk. To test this origin scenario, we found the fraction of different stellar populations – M giants and RR Lyrae stars – in the Monoceros Ring (also known as GASS) and A13, supplementing a study of stellar populations in the Triangulum-Andromeda cloud. This work provides: (1) the first analysis of the GASS and A13 features based upon their stellar populations; and (2) preliminary evidence of disk stars in the Milky Way that have been relocated to the disk-halo interface due to vertical oscillations of the Milky Way’s disk.
The Schistosoma mansoni cercarial elastase (SmCE) has previously been shown to be poorly immunogenic in mice. However, a minority of mice were able to produce antibodies against SmCE after multiple immunizations with crude preparations containing the enzyme. These mice were partially protected against challenge infections of S. mansoni. In the present study, we show that in contrast to the poor immunogenicity of the enzymatically active native form of SmCE derived from a crude preparation (cercarial transformation fluid), immunization of CBA/Ca mice with two enzymatically inactive forms, namely purified native SmCE or a recombinant SmCE fused to recombinant Schistosoma japonicum glutathione S-transferase (rSmCE-SjGST), after adsorption onto aluminum hydroxide adjuvant, induced specific anti-SmCE immunoglobulin G (IgG) in all mice within 2 weeks of the second immunization. The IgG antibody response to rSmCE-SjGST was mainly of the IgG1 subclass. These results suggest that inactive forms of the antigen could be used to obtain the optimum immunogenic effects as a vaccine candidate against schistosomiasis. Mice immunized with the rSmCE-SjGST on alum had smaller mean worm burdens and lower tissue egg counts when compared with adjuvant alone- and recombinant SjGST-injected controls. The native SmCE was antigenically cross-reactive with homologous enzymes of Schistosoma haematobium and Schistosoma margrebowiei.
Introduction: Safety culture is defined as the shared beliefs that an organization’s employees hold relative to workplace safety. Perceptions of workplace safety culture within paramedic services have been shown to be associated with patient and provider safety outcomes as well as safe work practices. We sought to characterize paramedics’ perceptions of the organizational safety culture across Eastern Ontario, Canada to provide important benchmarking data to evaluate future quality initiatives. Methods: This was a cross-sectional survey study conducted from September 2015 to January 2016 in 7 paramedic services across Eastern Ontario. We distributed an abridged version of Patterson’s previously published EMS-SAQ survey, measuring six domains of workplace safety culture, to 1,066 paramedics during continuing medical education sessions. The questions were presented for rating on a 5-point Likert scale (1 = strongly agree, 5 = strongly disagree), and a response of 1 or 2 was considered a ‘positive perception’ response. We present descriptive statistics and chi-square tests where appropriate. Results: We received responses from 1,041 paramedics (97.6%), with a response rate varying between 88.0% and 100% across the paramedic services. One third (33.6%) were Advanced Care Paramedics (ACPs) and 39.4% of paramedics had more than 10 years’ experience. The percentage of positive responses for each domain was: Safety Climate 31.2% (95% CI 28.4-34.1), Teamwork Climate 29.3% (95% CI 26.6-32.1), Stress Recognition 56.8% (95% CI 53.8-59.8), Perceptions of Management 67.0% (95% CI 64.0-69.8), Working Conditions 42.6% (95% CI 39.6-45.7), Job Satisfaction 41.6% (95% CI 38.6-44.6). Primary care paramedics had more positive perception responses for Job Satisfaction (45% vs 35%, p=0.002), whereas ACPs had more positive perception responses for Stress Recognition (61.5% vs 54.1%, p=0.022).
No association was found between gender or years of experience and a positive perception of any safety domain. Conclusion: The results provide valuable workplace safety culture data that will be used to target and evaluate needed quality improvement initiatives, while also raising paramedics’ awareness of important factors related to patient and provider safety.
This study examined the response of forage crops to composted dairy waste (compost) applied at low rates and investigated effects on soil health. The evenness of spreading compost by commercial machinery was also assessed. An experiment was established on a commercial dairy farm with target rates of compost up to 5 t ha−1 applied to a field containing millet [Echinochloa esculenta (A. Braun) H. Scholz] and Pasja leafy turnip (Brassica hybrid). A pot experiment was also conducted to monitor the response of a legume forage crop (vetch; Vicia sativa L.) on three soils with equivalent rates of compost up to 20 t ha−1 with and without ‘additive blends’ comprising gypsum, lime or other soil treatments. Few significant increases in forage biomass were observed with the application of low rates of compost in either the field or pot experiment. In the field experiment, compost had little impact on crop herbage mineral composition, soil chemical attributes or soil fungal and bacterial biomass. However, small but significant increases were observed in gravimetric water content resulting in up to 22.4 mm of additional plant available water calculated in the surface 0.45 m of soil, 2 years after compost was applied in the field at 6 t ha−1 dried (7.2 t ha−1 undried), compared with the nil control. In the pot experiment, where the soil was homogenized and compost incorporated into the soil prior to sowing, there were significant differences in mineral composition in herbage and in soil. A response in biomass yield to compost was only observed on the sandier and lower fertility soil type, and yields only exceeded that of the conventional fertilizer treatment where rates equivalent to 20 t ha−1 were applied. With few yield responses observed, the justification for applying low rates of compost to forage crops and pastures seems uncertain. 
Our collective experience from the field and the glasshouse suggests that farmers might increase the response to compost by: (i) increasing compost application rates; (ii) applying it prior to sowing a crop; (iii) incorporating the compost into the soil; (iv) applying only to responsive soil types; (v) growing only responsive crops; and (vi) reducing weed burdens in crops following application. Commercial machinery incorporating a centrifugal twin disc mechanism was shown to deliver double the quantity of compost in the area immediately behind the spreader compared with the edges of the spreading swathe. Spatial variability in the delivery of compost could be reduced but not eliminated by increased overlapping, but this might represent a potential 20% increase in spreading costs.
The Latin America Treatment and Innovation Network in Mental Health (LATIN-MH) is a research hub located in Brazil and Peru that conducts a research project to help reduce the treatment gap in mental health in Latin America (LA). Besides its research core, LATIN-MH has a Capacity Building (CB) component that aims to give young researchers the specific training needed to contribute to the growing scientific output in mental health in LA.
The LATIN-MH CB proposal includes a series of actions to prepare professionals for research. The main activities are described here: online study groups, promotion of scientific meetings, hands-on training at different levels, and information sharing.
LATIN-MH CB activities are at an early stage, but the proposed activities were well evaluated by participants. The first fellows to complete their fellowships are now contributing elsewhere to mental health treatment and to the training of human resources.
The impact of LATIN-MH CB actions and their evaluation, particularly on the training of human resources and the dissemination of information, shows that the hub is contributing to the critical training of young researchers and to the circulation of important information.
Pelvic inflammatory disease (PID), and more specifically salpingitis (visually confirmed inflammation), is the primary cause of tubal factor infertility and an important risk factor for ectopic pregnancy. The risk of these outcomes increases following repeated episodes of PID. We developed a homogeneous discrete-time Markov model for the distribution of PID history in the UK. We used a Bayesian framework to fully propagate parameter uncertainty into the model outputs. We estimated the model parameters from routine data, prospective studies, and other sources. We estimated that, among women aged 35–44 years, 33·6% and 16·1% have experienced at least one episode of PID and salpingitis, respectively (diagnosed or not); 10·7% have experienced one salpingitis episode and no further PID episodes, 3·7% one salpingitis episode and one further PID episode, and 1·7% one salpingitis episode and ⩾2 further PID episodes. Results are consistent with numerous external data sources, but not all. Studies of the proportion of PID that is diagnosed, of the proportion of PID episodes that are salpingitis (together with the severity distribution in different diagnostic settings), and of the overlap between routine data sources of PID would be valuable.
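As an illustration of the modelling approach (not the paper's actual parameters), the sketch below runs a homogeneous discrete-time Markov chain over three PID-history states with a fixed, hypothetical per-year transition matrix, propagating a cohort's state distribution forward in time.

```python
# States: 0 = no PID episode, 1 = one episode, 2 = two or more episodes.
# Hypothetical per-year transition probabilities (NOT the paper's values):
# each row gives P(next state | current state); rows sum to 1.
P = [
    [0.98, 0.02, 0.00],   # no episode -> first episode with prob 0.02
    [0.00, 0.97, 0.03],   # one episode -> a further episode with prob 0.03
    [0.00, 0.00, 1.00],   # "two or more" is absorbing in this sketch
]

def step(dist, P):
    """One discrete time step: new_dist[j] = sum_i dist[i] * P[i][j]."""
    return [sum(dist[i] * P[i][j] for i in range(len(P)))
            for j in range(len(P))]

# Start a cohort with no PID history and run 30 one-year steps.
dist = [1.0, 0.0, 0.0]
for _ in range(30):
    dist = step(dist, P)

print([round(p, 3) for p in dist])
```

In the actual analysis each transition probability would carry a prior distribution, and the chain would be re-run across posterior draws to propagate parameter uncertainty into the outputs.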
The Darwin region in northern Australia has experienced rapid population growth in recent years, and with it, an increased incidence of melioidosis. Previous studies in Darwin have associated the environmental presence of Burkholderia pseudomallei, the causative agent of melioidosis, with anthropogenic land usage and proximity to animals. In our study, we estimated the occurrence of B. pseudomallei and Burkholderia spp. relatives in faecal matter of wildlife, livestock and domestic animals in the Darwin region. A total of 357 faecal samples were collected and bacteria isolated through culture and direct DNA extraction after enrichment in selective media. Identification of B. pseudomallei, B. ubonensis, and other Burkholderia spp. was carried out using TTS1, Bu550, and recA BUR3–BUR4 quantitative PCR assays, respectively. B. pseudomallei was detected in seven faecal samples from wallabies and a chicken. B. cepacia complex spp. and Pandoraea spp. were cultured from wallaby faecal samples, and B. cenocepacia and B. cepacia were also isolated from livestock animals. Various bacteria isolated in this study represent opportunistic human pathogens, raising the possibility that faecal shedding contributes to the expanding geographical distribution of not just B. pseudomallei but other Burkholderiaceae that can cause human disease.