This study originated in collaboration with Thomas Dishion because of concerns that a group format for aggressive children might dampen the effects of cognitive-behavioral intervention. Three hundred sixty aggressive preadolescent children were screened through teacher and parent ratings. Schools were randomized to receive either an individual or a group format of the child component of the same evidence-based program. The results indicate variability in how group-based cognitive-behavioral intervention can affect aggressive children across a 4-year follow-up after the end of the intervention. Aggressive children who have higher skin conductance reactivity (potentially an indicator of poorer emotion regulation) and who carry a variant of the oxytocin receptor gene that may be associated with hyperinvolvement in social bonding show better teacher-rated externalizing behavior outcomes over time if they were seen individually rather than in groups. Analyses also indicated that higher levels of the group leaders’ clinical skills predicted reduced externalizing behavior problems. Implications of these results for the group versus individual format of cognitive-behavioral interventions for aggressive children, and for intensive training of group therapists, are discussed.
Field experiments were conducted in 2012 and 2013 across four locations, for a total of 6 site-years, in the midsouthern United States to determine the effect of growth stage at exposure on soybean sensitivity to sublethal rates of dicamba (8.8 g ae ha−1) and 2,4-D (140 g ae ha−1). Regression analysis revealed that soybean was most susceptible to injury from 2,4-D when exposed between 413 and 1,391 accumulated growing degree days (GDD) from planting, approximately between the V1 and R2 growth stages. In terms of terminal plant height, soybean was most susceptible to 2,4-D between 448 and 1,719 GDD, or from V1 to R4. For yield loss, however, maximum susceptibility to 2,4-D occurred only between 624 and 1,001 GDD, or from V3 to V5. As expected, soybean was sensitive to dicamba over longer spans of time, ranging from 0 to 1,162 GDD for visible injury, or from emergence to R2. Likewise, soybean height was most affected when dicamba exposure occurred between 847 and 1,276 GDD, or from V4 to R2. Regarding grain yield, soybean was most susceptible to dicamba between 820 and 1,339 GDD, or from V4 to R2. Consequently, these data indicate that soybean response to 2,4-D and dicamba can be variable within vegetative or reproductive growth stages; therefore, the specific growth stage at the time of exposure should be considered when evaluating injury from off-target movement. In addition, application of dicamba near susceptible soybean within the V4 to R2 growth stages should be avoided because this is the time of maximum susceptibility. Research regarding soybean sensitivity to 2,4-D and dicamba should focus on multiple exposure times and avoid generalizing across vegetative or reproductive growth stages.
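The GDD thresholds above are accumulated heat units since planting. As a minimal sketch of how accumulated GDD is computed, assuming the standard daily-average formula and a 10 °C base temperature for soybean (the base temperature is an assumption, not stated in the abstract):

```python
# Accumulated growing degree days (GDD) from daily temperature extremes.
# Assumes the common daily-average formula with a 10 C base for soybean.

def daily_gdd(t_max: float, t_min: float, t_base: float = 10.0) -> float:
    """GDD for one day: mean of the daily extremes minus the base, floored at 0."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def accumulated_gdd(daily_extremes, t_base: float = 10.0) -> float:
    """Sum daily GDD over a sequence of (t_max, t_min) pairs since planting."""
    return sum(daily_gdd(hi, lo, t_base) for hi, lo in daily_extremes)

# Two warm days: (30+20)/2 - 10 = 15 and (28+18)/2 - 10 = 13, so 28 GDD total.
total = accumulated_gdd([(30, 20), (28, 18)])
```

Exposure timing in the study would correspond to the running total of this sum from planting to the application date.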
The South China Sea (SCS) is a biodiversity hotspot; however, most biodiversity surveys in the region are confined to shallow-water reefs (SWRs). Here, we studied the benthic habitat and fish assemblages in the upper mesophotic coral ecosystems (MCEs; 30–40 m) and SWRs (8–22 m) at three geographic locations (the Luzon Strait, Palawan, and the Kalayaan Group of Islands) in the eastern SCS (also called the West Philippine Sea) using diver-based survey methods. Mean coral genus richness and fish species richness in MCEs ranged from 17–25 (per 25 m2) and 11–17 (per 250 m2), respectively, although none of these were novel genera/species. Coral and fish assemblages were structured more strongly by location than by depth. Location differences were associated with variability in benthic composition, wherein locations with higher hard coral cover had higher coral genus richness and abundance. Locations with higher algae and sand cover had higher diversity and density of herbivorous and benthic invertivorous fishes. Fishing effort may also have contributed to among-location differences, as the most heavily exploited location had the lowest fish biomass. The low variation between depths may be attributed to the similar benthic composition at each location, the interconnectivity between depths due to hydrological conditions, fish motility, and the common fishing gears used in the Philippines, which can likely extend beyond SWRs. Results imply that local-scale factors and anthropogenic disturbances probably dampen across-depth structuring in coral genus and fish species assemblages.
There is a national need to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and for public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject-matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of CBRNE science as a distinct competency, together with establishment of the CBRNE medical operations science support expert, informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
One generation's experience of childhood maltreatment is associated with that of the next. However, whether this intergenerational transmission is specific to distinct forms of maltreatment and what factors may contribute to its continuity remains unclear. Borderline personality pathology is predicted by childhood maltreatment and characterized by features (e.g., dysregulated emotion, relationship instability, impulsivity, and inconsistent appraisals of others) that may contribute to its propagation. Among 364 older adults and 573 of their adult children (total n = 937), self-reported exposure to distinct forms of childhood maltreatment (i.e., emotional, physical, and sexual abuse, and emotional and physical neglect as assessed by the Childhood Trauma Questionnaire) showed homotypic and heterotypic associations across generations with little evidence that latent factors unique to specific forms of maltreatment show generational continuity. General nonspecific indices of childhood maltreatment showed evidence of intergenerational transmission after accounting for demographic factors and parent socioeconomic status (b = 0.126, p = 9.21 × 10−4). This continuity was partially mediated by parental borderline personality pathology (assessed longitudinally through a variety of measures and sources, indirect effect: b = 0.031, 95% confidence interval [0.003, 0.060]). The intergenerational continuity of childhood maltreatment may largely represent general risk for nonspecific maltreatment that may, in part, be propagated by borderline personality pathology and/or shared risk factors.
Objective: To describe snacking characteristics and patterns in children and examine associations with diet quality and BMI.
Design: Children’s weight and height were measured. Participants/adult proxies completed multiple 24 h dietary recalls. Snack occasions were self-identified. Snack patterns were derived for each sample using exploratory factor analysis. Associations of snacking characteristics and patterns with Healthy Eating Index-2010 (HEI-2010) score and BMI were examined using multivariable linear regression models.
Setting: Childhood Obesity Prevention and Treatment Research (COPTR) Consortium, USA: NET-Works, GROW, GOALS and IMPACT studies.
Results: Two snack patterns were derived for three studies: a meal-like pattern and a beverage pattern. The IMPACT study had a similar meal-like pattern and a dairy/grains pattern. A positive association was observed between meal-like pattern adherence and HEI-2010 score (P for trend < 0.01), and between snack occasion frequency and HEI-2010 score (β coefficient (95 % CI): NET-Works, 0.14 (0.04, 0.23); GROW, 0.12 (0.02, 0.21)), among younger children. A preference for snacking while using a screen was inversely associated with HEI-2010 score in all studies except IMPACT (β coefficient (95 % CI): NET-Works, −3.15 (−5.37, −0.92); GROW, −2.44 (−4.27, −0.61); GOALS, −5.80 (−8.74, −2.86)). Associations with BMI were almost all null.
Conclusions: Meal-like and beverage patterns described most children’s snack intake, although patterns for non-Hispanic Blacks or adolescents may differ. Diets of 2–5-year-olds may benefit from frequent consumption of meal-like pattern snacks, and diets of all children may benefit from decreased screen use during eating occasions.
Objectives: The Wisconsin Card Sorting Test (WCST) is a complex measure of executive function that is frequently employed to investigate the schizophrenia spectrum. Successful completion of the task requires the interaction of multiple intact executive processes, including attention, inhibition, cognitive flexibility, and concept formation. Considerable cognitive heterogeneity exists among the schizophrenia spectrum population, with substantive evidence to support the existence of distinct cognitive phenotypes. The within-group performance heterogeneity of individuals with schizophrenia spectrum disorder (SSD) on the WCST has yet to be investigated. A data-driven cluster analysis was performed to characterise WCST performance heterogeneity. Methods: Hierarchical cluster analysis with k-means optimisation was employed to identify homogeneous subgroups in a sample of 210 schizophrenia spectrum participants. Emergent clusters were then compared to each other and to a group of 194 healthy controls (HC) on WCST performance and demographic/clinical variables. Results: Three clusters emerged and were validated via altered design iterations. Clusters were deemed to reflect a relatively intact patient subgroup, a moderately impaired patient subgroup, and a severely impaired patient subgroup. Conclusions: Considerable within-group heterogeneity exists on the WCST. Identification of subgroups of patients who exhibit homogeneous performance on measures of executive functioning may assist in optimising cognitive interventions. Previous associations found using the WCST among schizophrenia spectrum participants should be reappraised. (JINS, 2019, 25, 750–760)
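The clustering approach described in the Methods (hierarchical cluster analysis with k-means optimisation) can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's analysis: SciPy's Ward linkage and seeded k-means stand in for whatever software and WCST variables the authors actually used.

```python
# Sketch: hierarchical (Ward) clustering to form initial subgroups,
# then k-means refinement seeded with the hierarchical centroids.
# The data are synthetic stand-ins for participant performance scores.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
# Three synthetic "performance" subgroups (e.g., intact / moderate / severe).
X = np.vstack([
    rng.normal([0, 0], 0.3, size=(70, 2)),
    rng.normal([3, 0], 0.3, size=(70, 2)),
    rng.normal([0, 3], 0.3, size=(70, 2)),
])

k = 3
# Step 1: Ward hierarchical clustering, cut into k clusters (labels are 1..k).
labels_h = fcluster(linkage(X, method="ward"), t=k, criterion="maxclust")
# Step 2: centroids of those clusters seed the k-means optimisation.
seeds = np.vstack([X[labels_h == c].mean(axis=0) for c in range(1, k + 1)])
centroids, labels = kmeans2(X, seeds, minit="matrix")
```

Seeding k-means from the hierarchical solution avoids the instability of random initialisation, which is the usual rationale for combining the two methods.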
There is substantial evidence that many depressed individuals experience impaired executive functioning. Understanding the causes of executive dysfunction in depression is clinically important because cognitive impairment is a substantial contributor to functional impairment. This study investigated whether elevated levels of an inflammatory cytokine [interleukin-6 (IL-6)] and/or higher body mass index (BMI) concurrently and/or prospectively accounted for the relationship between depressive symptoms and impaired executive functioning in adolescents.
A diverse, community sample of adolescents (N = 288; mean age = 16.33; 51.4% female; 59.0% African-American) completed assessments of height and weight, IL-6, depressive symptoms, and self-report/behavioral measures of executive functioning (selective attention, switching attention) and future orientation annually over 3 years. Adolescents experiencing acute illness or medical conditions that affect inflammation were excluded from analyses. Path analysis within a structural equation modeling framework simultaneously examined the concurrent and prospective relationships between BMI, IL-6, depressive symptoms, and the measures of cognitive functioning across three timepoints.
Across all timepoints, higher BMI was prospectively associated with higher levels of IL-6 and depressive symptoms, while higher levels of IL-6 were associated with worse performance on three behavioral and self-report measures of cognitive functioning. Higher depressive symptoms also were prospectively associated with elevated IL-6 and both higher depressive symptoms and a higher BMI predicted worse future executive functioning via increased IL-6.
More severe depressive symptoms and increased BMI may disrupt executive functioning via elevated IL-6.
Objectives: Antisaccade error rate has been proposed to be one of the most promising endophenotypes for schizophrenia. Increased error rate in patients has been associated with working memory, attention and other executive function impairments. The relationship between antisaccade error rate and other neuropsychological processes in patients compared to healthy controls has not been explored in depth. This study aimed to replicate the finding of heightened antisaccade error rate in patients and determine which cognitive processes were most strongly associated with antisaccade error rate in both patients and controls. In addition, the study investigated whether different antisaccade task paradigms engage different cognitive processes. Methods: One hundred and ninety-one participants (54 patients with schizophrenia/schizoaffective disorder and 137 controls) completed the antisaccade task, which included both gap and step task parameters. Neuropsychological measures were obtained using the MCCB and the Stroop task. Results: The current study replicated a pronounced antisaccade error rate deficit in patients. In patients, working memory variance was most significantly associated with antisaccade errors made during the step condition, while attentional processes were most associated with errors made during the gap condition. In controls, overall global cognitive performance was most associated with antisaccade rates for both gap and step conditions. Conclusions: The current study demonstrates that in schizophrenia patients, but not controls, elevated antisaccade error rate is associated with attention and working memory, but not with global cognitive impairment or psychopathological processes. Our novel findings demonstrate that the gap and step conditions of the antisaccade task engage different cognitive processes. (JINS, 2019, 25, 174–183)
Adherence to dietary guidelines (DG) may result in higher intake of polyphenols via increased consumption of fruits, vegetables and whole grains. We compared polyphenol dietary intake and urinary excretion between two intervention groups in the Cardiovascular risk REduction Study: Supported by an Integrated Dietary Approach, a 12-week parallel-arm, randomised controlled trial (n 161; sixty-four males, ninety-seven females; aged 40–70 years). One group adhered to UK DG, whereas the other group consumed a representative UK diet (control). We estimated polyphenol dietary intake using a 4-d food diary (4-DFD) and FFQ, and analysed 24-h polyphenol urinary excretion by liquid chromatography-tandem MS on a subset of participants (n 46 control; n 45 DG). A polyphenol food composition database for 4-DFD analysis was generated using the Phenol-Explorer and USDA databases. Total polyphenol intake by 4-DFD at endpoint (geometric means with 95 % CI, adjusted for baseline and sex) was significantly higher in the DG group (1279 mg/d per 10 MJ; 1158, 1412) than in the control group (1084 mg/d per 10 MJ; 980, 1197). The greater total polyphenol intake in the DG group was attributed to higher intake of anthocyanins, proanthocyanidins and hydroxycinnamic acids, with the primary food sources being fruits, cereal products, nuts and seeds. FFQ estimates of flavonoid intake also detected greater intake in the DG group than in the control group. For six out of ten selected individual polyphenols, 24-h urinary excretion was consistent with the 4-DFD in discriminating between the dietary intervention groups. In conclusion, following UK DG increased total polyphenol intake by approximately 20 %, but not all polyphenol subclasses corresponded with this finding.
Among 39 pediatric hospitals, pediatric S. aureus hospitalizations decreased by 36%, from 26.3 to 16.8 infections per 1,000 admissions, between 2009 and 2016, with methicillin-resistant S. aureus (MRSA) infections decreasing by 52% and methicillin-susceptible S. aureus infections by 17%. Similar decreases were observed in days of therapy for anti-MRSA antibiotics.
Knowledge of the effects of burial depth and burial duration on seed viability and, consequently, seedbank persistence of Palmer amaranth (Amaranthus palmeri S. Watson) and waterhemp [Amaranthus tuberculatus (Moq.) J. D. Sauer] ecotypes can be used for the development of efficient weed management programs. This is of particular interest, given the great fecundity of both species and, consequently, their high seedbank replenishment potential. Seeds of both species collected from five different locations across the United States were investigated in seven states (sites) with different soil and climatic conditions. Seeds were placed at two depths (0 and 15 cm) for 3 yr. Each year, seeds were retrieved, and seed damage (shrunken, malformed, or broken) plus losses (deteriorated and futile germination) and viability were evaluated. Greater seed damage plus loss averaged across seed origin, burial depth, and year was recorded for lots tested at Illinois (51.3% and 51.8%) followed by Tennessee (40.5% and 45.1%) and Missouri (39.2% and 42%) for A. palmeri and A. tuberculatus, respectively. The site differences for seed persistence were probably due to higher volumetric water content at these sites. Rates of seed demise were directly proportional to burial depth (α=0.001), whereas the percentage of viable seeds recovered after 36 mo on the soil surface ranged from 4.1% to 4.3% compared with 5% to 5.3% at the 15-cm depth for A. palmeri and A. tuberculatus, respectively. Seed viability loss was greater in the seeds placed on the soil surface compared with the buried seeds. The greatest influences on seed viability were burial conditions and time and site-specific soil conditions, more so than geographical location. Thus, management of these weed species should focus on reducing seed shattering, enhancing seed removal from the soil surface, or adjusting tillage systems.
OBJECTIVES/SPECIFIC AIMS: The purpose of the present secondary data analysis was to examine the effect of moderate-severe disturbed sleep before the start of radiation therapy (RT) on subsequent RT-induced pain. METHODS/STUDY POPULATION: Analyses were performed on 676 RT-naïve breast cancer patients (mean age 58, 100% female) scheduled to receive RT from a previously completed nationwide, multicenter, phase II randomized controlled trial examining the efficacy of oral curcumin on radiation dermatitis severity. The trial was conducted at 21 community oncology practices throughout the US affiliated with the University of Rochester Cancer Center NCI’s Community Oncology Research Program (URCC NCORP) Research Base. Sleep disturbance was assessed using a single-item question from the modified MD Anderson Symptom Inventory (SI) on a 0–10 scale, with higher scores indicating greater sleep disturbance. Total subjective pain as well as the subdomains of pain (sensory, affective, and perceived) were assessed by the short-form McGill Pain Questionnaire. Pain at treatment site (pain-Tx) was also assessed using a single-item question from the SI. These assessments were included for pre-RT (baseline) and post-RT. For the present analyses, patients were dichotomized into 2 groups: those who had moderate-severe disturbed sleep at baseline (score ≥4 on the SI; n=101) versus those who had mild or no disturbed sleep (control group; score 0–3 on the SI; n=575). RESULTS/ANTICIPATED RESULTS: Prior to the start of RT, breast cancer patients with moderate-severe disturbed sleep at baseline were younger, less likely to have had lumpectomy or partial mastectomy while more likely to have had total mastectomy and chemotherapy, more likely to be on sleep, anti-anxiety/depression, and prescription pain medications, and more likely to suffer from depression or anxiety disorder than the control group (all p’s ≤0.02).
Spearman rank correlations showed that changes in sleep disturbance from baseline to post-RT were significantly correlated with concurrent changes in total pain (r=0.38; p<0.001), sensory pain (r=0.35; p<0.001), affective pain (r=0.21; p<0.001), perceived pain intensity (r=0.37; p<0.001), and pain-Tx (r=0.35; p<0.001). In total, 92% of patients with moderate-severe disturbed sleep at baseline reported post-RT total pain compared with 79% of patients in the control group (p=0.006). Generalized linear estimating equations, after controlling for baseline pain and other covariates (baseline fatigue and distress, age, sleep medications, anti-anxiety/depression medications, prescription pain medications, and depression or anxiety disorder), showed that patients with moderate-severe disturbed sleep at baseline had significantly higher mean values of post-RT total pain (by 39%; p=0.033), post-RT sensory pain (by 41%; p=0.046), and post-RT affective pain (by 55%; p=0.035) than the control group. Perceived pain intensity (p=0.066) and pain-Tx (p=0.086) at post-RT were not significantly different between the 2 groups. DISCUSSION/SIGNIFICANCE OF IMPACT: These findings suggest that moderate-severe disturbed sleep prior to RT is an important predictor for worsening of pain at post-RT in breast cancer patients. There could be several plausible reasons for this. Sleep disturbance, such as sleep loss and sleep continuity disturbance, could result in impaired sleep related recovery and repair of tissue damage associated with cancer and its treatment; thus, resulting in the amplification of pain. Sleep disturbance may also reduce pain tolerance threshold through increased sensitization of the central nervous system. In addition, pain and sleep disturbance may share common neuroimmunological pathways. Sleep disturbance may modulate inflammation, which in turn may contribute to increased pain. 
Further research is needed to confirm these findings and to determine whether interventions targeting sleep disturbance at an early stage could be a potential alternative approach to reducing pain after RT.
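The change-score correlations reported above can be illustrated with a Spearman rank correlation between change-from-baseline values. The data below are invented toy values, not the trial's sleep and pain measurements; `scipy.stats.spearmanr` is the standard implementation.

```python
# Illustrative Spearman rank correlation between change scores
# (post minus baseline), as in the sleep/pain analysis above.
# All numbers here are synthetic.
from scipy.stats import spearmanr

sleep_baseline = [2, 5, 1, 7, 3, 6, 0, 4]
sleep_post     = [4, 6, 1, 9, 2, 8, 1, 7]
pain_baseline  = [10, 20, 5, 30, 12, 25, 3, 18]
pain_post      = [14, 26, 4, 42, 10, 36, 5, 30]

# Change from baseline to post-RT for each patient.
d_sleep = [post - base for base, post in zip(sleep_baseline, sleep_post)]
d_pain  = [post - base for base, post in zip(pain_baseline, pain_post)]

r, p = spearmanr(d_sleep, d_pain)  # rank correlation of the change scores
```

Because Spearman operates on ranks, it captures monotone association between worsening sleep and worsening pain without assuming linearity.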
Background: Despite advances in neonatal care, neonates with moderate to severe hypoxic-ischemic encephalopathy (HIE) are at high risk of mortality and morbidity. We report the impact of a dedicated neonatal neurocritical care (NNCC) team on short-term mortality and morbidities. Methods: A retrospective cohort study of neonates with moderate to severe HIE between July 1st, 2008 and December 31st, 2017. Primary outcome: a composite of death and/or brain injury on MRI. Secondary outcomes: rate of cooling, length of hospital stay, anti-seizure medication burden, and use of inotropes. A regression analysis was done adjusting for gestational age, birth weight, gender, out-born status, Apgar score at 10 minutes, cord blood pH, and HIE clinical staging. Results: 216 neonates were included, 109 before NNCC implementation and 107 thereafter. The NNCC program resulted in a reduction in the primary outcome (AOR: 0.28, CI: 0.14-0.54, p<0.001) and in brain injury (AOR: 0.28, CI: 0.14-0.55, p<0.001). It decreased the average length of stay per infant by 5 days (p=0.03), improved the cooling rate (73% compared to 93%, p<0.001), and reduced seizure misdiagnosis (71% compared to 23%, p<0.001), anti-seizure medication burden (p=0.001), and inotrope use (34% compared to 53%, p=0.004). Conclusions: The NNCC program decreased mortality and brain injury, shortened the length of hospital stay, and improved care of neonates with significant HIE.
Different diagnostic interviews are used as reference standards for major depression classification in research. Semi-structured interviews involve clinical judgement, whereas fully structured interviews are completely scripted. The Mini International Neuropsychiatric Interview (MINI), a brief fully structured interview, is also sometimes used. It is not known whether interview method is associated with probability of major depression classification.
To evaluate the association between interview method and odds of major depression classification, controlling for depressive symptom scores and participant characteristics.
Data collected for an individual participant data meta-analysis of Patient Health Questionnaire-9 (PHQ-9) diagnostic accuracy were analysed and binomial generalised linear mixed models were fit.
A total of 17 158 participants (2287 with major depression) from 57 primary studies were analysed. Among fully structured interviews, odds of major depression were higher for the MINI compared with the Composite International Diagnostic Interview (CIDI) (odds ratio (OR) = 2.10; 95% CI = 1.15–3.87). Compared with semi-structured interviews, fully structured interviews (MINI excluded) were non-significantly more likely to classify participants with low-level depressive symptoms (PHQ-9 scores ≤6) as having major depression (OR = 3.13; 95% CI = 0.98–10.00), similarly likely for moderate-level symptoms (PHQ-9 scores 7–15) (OR = 0.96; 95% CI = 0.56–1.66) and significantly less likely for high-level symptoms (PHQ-9 scores ≥16) (OR = 0.50; 95% CI = 0.26–0.97).
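The odds ratios above come from binomial generalised linear mixed models; as a simpler illustration of the effect measure itself, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table. The counts below are invented for illustration, not taken from the meta-analysis.

```python
# Unadjusted odds ratio and Wald 95% CI from a 2x2 table
# [[a, b], [c, d]]: a, b = outcomes present/absent in group 1;
# c, d = outcomes present/absent in group 2. Toy counts only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR = (a*d)/(b*c); the CI is computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Example: 30/100 classified depressed by one interview vs 15/100 by another.
or_, lo, hi = odds_ratio_ci(30, 70, 15, 85)
```

The mixed models in the study additionally adjust for symptom scores, participant characteristics and study-level clustering, which a raw 2×2 table cannot do.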
The MINI may identify more people as depressed than the CIDI, and semi-structured and fully structured interviews may not be interchangeable methods, but these results should be replicated.
Declaration of interest
Drs Jetté and Patten declare that they received a grant, outside the submitted work, from the Hotchkiss Brain Institute, which was jointly funded by the Institute and Pfizer. Pfizer was the original sponsor of the development of the PHQ-9, which is now in the public domain. Dr Chan is a steering committee member or consultant for AstraZeneca, Bayer, Lilly, MSD and Pfizer. She has received sponsorships and honoraria for giving lectures and providing consultancy, and her affiliated institution has received research grants from these companies. Dr Hegerl declares that within the past 3 years, he was an advisory board member for Lundbeck, Servier and Otsuka Pharma; a consultant for Bayer Pharma; and a speaker for Medice Arzneimittel, Novartis, and Roche Pharma, all outside the submitted work. Dr Inagaki declares that he has received grants from Novartis Pharma, lecture fees from Pfizer, Mochida, Shionogi, Sumitomo Dainippon Pharma, Daiichi-Sankyo, Meiji Seika and Takeda, and royalties from Nippon Hyoron Sha, Nanzando, Seiwa Shoten, Igaku-shoin and Technomics, all outside of the submitted work. Dr Yamada reports personal fees from Meiji Seika Pharma Co., Ltd., MSD K.K., Asahi Kasei Pharma Corporation, Seishin Shobo, Seiwa Shoten Co., Ltd., Igaku-shoin Ltd., Chugai Igakusha and Sentan Igakusha, all outside the submitted work. All other authors declare no competing interests. No funder had any role in the design and conduct of the study; collection, management, analysis and interpretation of the data; preparation, review or approval of the manuscript; or the decision to submit the manuscript for publication.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor, and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool in identifying patients with sepsis, and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis who were transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g., SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of ≥38.3°C (101°F), who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive, t-test, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria, with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001).
44 (18.6%) patients met the RPPEO sepsis notification tool criteria and, of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
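The reported predictive values can be reconstructed from the counts given in the abstract: 236 patients, 34 with an ED diagnosis of sepsis, 44 flagged by the tool, of whom 12 were true positives.

```python
# Rebuilding the 2x2 table and predictive values from the abstract's counts.
tp = 12                  # tool-positive with ED sepsis diagnosis
fp = 44 - tp             # tool-positive without ED sepsis -> 32
fn = 34 - tp             # ED sepsis missed by the tool    -> 22
tn = 236 - tp - fp - fn  # tool-negative, no ED sepsis     -> 170

sensitivity = tp / (tp + fn)  # 12/34  ~ 35.3%
specificity = tn / (tn + fp)  # 170/202 ~ 84.2%
ppv = tp / (tp + fp)          # 12/44  ~ 27.3%
npv = tn / (tn + fn)          # 170/192 ~ 88.5%
```

These reproduce the abstract's reported values exactly, confirming the internal consistency of the results.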