Site-selectivity analysis of drilling predation traces may provide useful behavioral information concerning a predator interacting with its prey. However, traditional approaches exclude some spatial information (i.e., oversimplified trace position) and are dependent on the scale of analysis (e.g., arbitrary grid system used to divide the prey skeleton into sectors). Here we introduce the spatial point pattern analysis of traces (SPPAT), an approach for visualizing and quantifying the distribution of traces on shelled invertebrate prey, which includes improved collection of spatial information inherent to drillhole location (morphometric-based estimation), improved visualization of spatial trends (kernel density and hotspot mapping), and distance-based statistics for hypothesis testing (K-, L-, and pair correlation functions). We illustrate the SPPAT approach through case studies of fossil samples, modern beach-collected samples, and laboratory feeding trials of naticid gastropod predation on bivalve prey. Overall results show that kernel density and hotspot maps enable visualization of subtle variations in regions of the shell with higher density of predation traces, which can be combined with the maximum clustering distance metric to generate hypotheses on predatory behavior and anti-predatory responses of prey across time and geographic space. Distance-based statistics also capture the major features in the distribution of traces across the prey skeleton, including aggregated and segregated clusters, likely associated with different combinations of two modes of drilling predation, edge and wall drilling. The SPPAT approach is transferable to other paleoecologic and taphonomic data such as encrustation and bioerosion, allowing for standardized investigation of a wide range of biotic interactions.
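The distance-based statistics named above (Ripley's K and its variance-stabilized L transform) can be sketched in a few lines. This is an illustrative, naive estimator without edge correction, applied to hypothetical trace coordinates; it is not the SPPAT implementation, which operates on morphometrically registered drillhole positions.

```python
import numpy as np

def ripley_k(points, radii, area):
    """Naive Ripley's K estimator (no edge correction).

    points: (n, 2) array of trace coordinates
    radii:  distances r at which to evaluate K(r)
    area:   area of the study region (e.g., digitized shell outline)
    """
    points = np.asarray(points, dtype=float)
    n = len(points)
    diff = points[:, None, :] - points[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    off_diag = dist[~np.eye(n, dtype=bool)]   # exclude self-distances
    lam = n / area                            # intensity estimate
    return np.array([(off_diag <= r).sum() / (lam * n) for r in radii])

def l_function(k_values):
    # Variance-stabilized transform; under complete spatial randomness L(r) = r
    return np.sqrt(np.asarray(k_values) / np.pi)
```

Under complete spatial randomness K(r) is approximately pi*r^2, so L(r) - r > 0 at a given distance r suggests clustering of traces at that scale, the kind of aggregated pattern the abstract associates with particular drilling behaviors.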
Introduction: Prehospital field trauma triage (FTT) standards were reviewed and revised in 2014 based on the recommendations of the Centers for Disease Control and Prevention. The FTT standard allows a hospital bypass and direct transport, within 30 min, to a lead trauma hospital (LTH). Our objectives were to assess the impact of the newly introduced prehospital FTT standard and to describe the emergency department (ED) management and outcomes of patients who had bypassed closer hospitals. Methods: We conducted a 12-month multi-centred health record review of paramedic and ED records following the implementation of the 4 step FTT standard (step 1: vital signs and level of consciousness (physiologic), step 2: anatomical injury, step 3: mechanism and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported as urgent who met the FTT standard, regardless of transport time. We developed and piloted a data collection tool and obtained consensus on all definitions. The primary outcome was the rate of appropriate triage to an LTH, defined as any of: ISS ≥12, admission to an intensive care unit (ICU), non-orthopedic surgery, or death. We report descriptive statistics. Results: 570 patients were included: mean age 48.8, male 68.9%, falls 29.6%, motor vehicle collisions 20.2%, stab wounds 10.5%, transported to an LTH 76.5% (n = 436). 72.2% (n = 315) of patients transported to an LTH had bypassed a closer hospital, and 126/306 (41.2%) of those were determined to be an appropriate triage to an LTH (9 patients had missing outcomes). ED management included: CT head/cervical spine 69.9%, ultrasound 53.6%, X-ray 51.6%, intubation 15.0%, sedation 11.1%, tranexamic acid 9.8%, blood transfusion 8.2%, fracture reduction 6.9%, tube thoracostomy 5.9%. Outcomes included: ISS ≥ 12 32.7%, admitted to ICU 15.0%, non-orthopedic surgery 11.1%, death 8.8%. 
Other outcomes included: admission to hospital 57.5%, mean LOS 12.8 days, orthopedic surgery 16.3% and discharge from the ED 37.3%. Conclusion: Despite a high number of admissions, the majority of trauma patients bypassed to an LTH were considered over-triaged, with a low number of ED procedures and non-orthopedic surgeries. Continued work is needed to appropriately identify patients requiring transport to an LTH.
Acute change in mental status (ACMS), defined by the Confusion Assessment Method, is used to identify infections in nursing home residents. A medical record review revealed that none of 15,276 residents had an ACMS documented. Using the revised McGeer criteria with a possible ACMS definition, we identified 296 residents and 21 additional infections. The use of a possible ACMS definition should be considered for retrospective nursing home infection surveillance.
More than 80% of patients experiencing their first depressive episode will have at least one new episode. Effective interventions to reduce the risk of relapse and recurrence are needed. Psychoeducation is a form of interactive education that enhances patients’ knowledge about their illness, including its course, symptoms and treatment. Psychoeducational methods can also serve as family interventions, an approach that is already evidence-based practice in schizophrenia and bipolar disorder. Only a few studies have focused on the effect of psychoeducation, including family psychoeducation, in the prevention of new depressive episodes.
The purpose is to evaluate whether an intervention of psychoeducation for family members, compared with a control intervention, is effective in reducing the risk of new depressive episodes among patients who have achieved remission or partial remission of depressive symptoms after the acute phase of antidepressant treatment.
The project is based on a two-centre, randomized controlled trial in which the investigators and the raters conducting psychometric assessments will be blinded to treatment allocation. A total of 130 patients with unipolar depression in remission or partial remission will be included together with their closest relative. After baseline assessments, relatives will be randomized to either 4 sessions of a family psychoeducation program or 4 sessions in a social support group without any psychoeducational intervention. Patients will not participate in the group sessions and will continue their outpatient treatment as usual.
The primary outcome is evaluated after 9 months and includes rates of relapse, as measured by the HAM-D17, and rates of recurrence according to DSM-IV and the HAM-D17.
It has been hypothesized that the first five years after a first episode of psychosis constitute a critical period with opportunities for ameliorating the course of illness. Based on this rationale, specialized assertive early intervention services were developed. We wanted to investigate the evidence base for such interventions.
The evidence for the effectiveness of specialized assertive early intervention services is based mainly on one large randomized clinical trial, the OPUS trial, but it is supported by the findings of smaller trials such as the Lambeth Early Onset (LEO) trial, the Croydon Outreach and Assertive Support Team (COAST) trial and the Norwegian site of the Optimal Treatment (OTP) trial. There are positive effects on psychotic and negative symptoms, on substance abuse and on user satisfaction, but the clinical effects are not sustained when patients are transferred back to standard treatment. However, the positive effects on service use and the ability to live independently appear to be durable.
Implementation of specialized assertive early intervention services is recommended, but the evidence base needs to be strengthened through replication in large, high-quality trials. Recommendations regarding the duration of treatment must await the results of ongoing trials comparing two years of intervention with extended treatment periods.
Introduction: Trauma and injury play a significant role in the population's burden of disease. Limited research exists evaluating the role of trauma bypass protocols. The objective of this study was to assess the impact and effectiveness of a newly introduced prehospital field trauma triage (FTT) standard, allowing paramedics to bypass a closer hospital and transport directly to a trauma centre (TC) provided transport times were within 30 minutes. Methods: We conducted a 12-month multi-centred health record review of paramedic call reports and emergency department health records following the implementation of the 4 step FTT standard (step 1: vital signs and level of consciousness, step 2: anatomical injury, step 3: mechanism and step 4: special considerations) in nine paramedic services across Eastern Ontario. We included adult trauma patients transported urgently to hospital who met one of the 4 steps of the FTT standard and were therefore eligible for bypass consideration. We developed and piloted a standardized data collection tool and obtained consensus on all data definitions. The primary outcome was the rate of appropriate triage to a TC, defined as any of the following: injury severity score ≥12, admission to an intensive care unit, non-orthopedic surgery, or death. We report descriptive and univariate analyses where appropriate. Results: 570 adult patients were included with the following characteristics: mean age 48.8, male 68.9%, attended by an Advanced Care Paramedic 71.8%, mechanisms of injury: MVC 20.2%, falls 29.6%, stab wounds 10.5%, median initial GCS 14, mean initial BP 132, prehospital fluid administered 26.8%, prehospital intubation 3.5%, transported to a TC 74.6%. Of those transported to a TC, 308 (72.5%) had bypassed a closer hospital prior to TC arrival. Of those that bypassed a closer hospital, 136 (44.2%) were determined to be an “appropriate triage to TC”. 
Bypassed patients more often met step 1 or step 2 of the standard (186, 66.9%) than step 3 or step 4 (122, 39.6%). An appropriate triage to TC occurred in 104 (55.9%) patients who met step 1 or 2 and in 32 (26.2%) patients who met step 3 or 4 of the FTT standard. Conclusion: The FTT standard can identify patients who should be bypassed and transported to a TC. However, this comes at the cost of potentially burdening the system, as the standard's sensitivity remains poor. More work is needed to develop an FTT standard that will assist paramedics in appropriately identifying patients who require a trauma centre.
The US Navy utilizes numerous resources to encourage smoking cessation. Despite these efforts, cigarette smoking among service members remains high. Electronic cigarettes (EC) have provided an additional cessation resource. Little is known regarding the utilization efficacy of these cessation resources in the US Navy.
This study sought to explore the utilization and efficacy of ECs and other smoking cessation resources.
An anonymous cross-sectional survey was conducted at a military clinic from 2015 to 2016. Participants were active duty in the US Navy and reported demographics, smoking behaviors, and utilization of cessation resources.
Of the 977 participants in the study, 14.9% were current and 39.4% were former smokers. Most current smokers (83.6%) had previously attempted cessation, smoked an average of 2–5 cigarettes per day (34.7%), and smoked every day of the month (26.4%). The number of daily cigarettes smoked and the number of days cigarettes were smoked per month were not significantly different between cigarette-only smokers and EC dual users (p = 0.92, p = 0.75, respectively). Resources used by current and former smokers included: ‘cold turkey’ (44.6%, 57.1%, respectively), ECs (22.3%, 24.7%), nicotine patch (8.3%, 1.3%), medicine (6.6%, 3.9%), nicotine gum (5.8%, 10.4%), and quit programs (2.5%, 2.6%).
Current and former cigarette smokers utilized similar resources to quit smoking. Electronic cigarettes are being used for cessation but do not significantly reduce the number of cigarettes smoked on a daily or monthly basis. Future studies may benefit from exploring the use of cessation resources and ECs within the military as a whole.
During water entry, a projectile can entrain an air cavity that trails behind it. Most previous studies focus on the formation and pinch-off dynamics of the air cavity; only a few have investigated the long-term cavity dynamics after pinch-off. In this study, we examine the ripple formation following the pinch-off of an air cavity generated by a cone, for different cone angles and impact velocities. The amplitude and wavelength of these ripples are measured, and the force on the cone is determined experimentally. We observed that the ripple amplitude and wavelength increase linearly with the cone impact velocity, as predicted by our acoustic model of the compressible air cavity. In addition, the measured force exhibits distinct amplitudes and wavelengths. By measuring the length of the cavity, we averaged the resulting pressure variation inside the air cavity, leading to a theoretical force amplitude that matched our observations. The force wavelength also follows the same acoustic model and agrees very well with the wavelength of the ripples.
Introduction: Early recognition of sepsis can improve patient outcomes, yet recognition by paramedics is poor and research evaluating the use of prehospital screening tools is limited. Our objective was to evaluate the predictive validity of the Regional Paramedic Program for Eastern Ontario (RPPEO) prehospital sepsis notification tool for identifying patients with sepsis, and to describe and compare the characteristics of patients with an emergency department (ED) diagnosis of sepsis who are transported by paramedics. The RPPEO prehospital sepsis notification tool comprises 3 criteria: current infection, fever and/or history of fever, and 2 or more signs of hypoperfusion (e.g., SBP <90, HR >100, RR >24, altered LOA). Methods: We performed a review of ambulance call records and in-hospital records over two 5-month periods between November 2014 and February 2016. We enrolled a convenience sample of patients, assessed by primary and advanced care paramedics (ACPs), with a documented history of fever and/or a documented fever of 38.3°C (101°F), who were transported to hospital. In-hospital management and outcomes were obtained, and descriptive statistics, t-tests, and chi-square analyses were performed where appropriate. The RPPEO prehospital sepsis notification tool was compared to an ED diagnosis of sepsis. The predictive validity of the RPPEO tool was calculated (sensitivity, specificity, NPV, PPV). Results: 236 adult patients met the inclusion criteria with the following characteristics: mean age 65.2 yrs [range 18-101], male 48.7%, history of sepsis 2.1%, on antibiotics 23.3%, lowest mean systolic BP 125.9, treated by ACP 58.9%, prehospital temperature documented 32.6%. 34 (14.4%) had an ED diagnosis of sepsis. Patients with an ED diagnosis of sepsis, compared to those without, had a lower prehospital systolic BP (114.9 vs 127.8, p=0.003) and were more likely to have a prehospital shock index >1 (50.0% vs 21.4%, p=0.001). 
44 (18.6%) patients met the RPPEO sepsis notification tool and of these, 27.3% (12/44) had an ED diagnosis of sepsis. We calculated the following predictive values of the RPPEO tool: sensitivity 35.3%, specificity 84.2%, NPV 88.5%, PPV 27.3%. Conclusion: The RPPEO prehospital sepsis notification tool demonstrated modest diagnostic accuracy. Further research is needed to improve accuracy and evaluate the impact on patient outcomes.
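For readers wanting to reproduce the predictive values, they follow directly from the 2×2 contingency table implied by the abstract. The cell counts below are reconstructed from the reported figures (236 patients, 34 ED sepsis diagnoses, 44 tool-positive, 12 tool-positive with ED sepsis) and should be treated as an illustrative reconstruction, not source data:

```python
# Standard diagnostic-accuracy metrics from a 2x2 table.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Cell counts reconstructed from the abstract (assumptions, not raw data):
# tp = 12 tool-positive with ED sepsis, fp = 44 - 12, fn = 34 - 12,
# tn = 236 - 34 - 32
m = diagnostic_metrics(tp=12, fp=32, fn=22, tn=170)
# sensitivity ≈ 0.353, specificity ≈ 0.842, ppv ≈ 0.273, npv ≈ 0.885
```

These values match the sensitivity 35.3%, specificity 84.2%, NPV 88.5%, and PPV 27.3% reported in the abstract.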
Serious mental illness (SMI) is profoundly stigmatised; the stigma extends even to relatives of people with SMI.
To develop and validate a scale to comprehensively measure self-stigma among first-degree relatives of individuals with SMI.
We conducted group interviews focusing on self-stigma with first-degree relatives (n = 20) of people with SMI, from which 74 representative quotations were reframed as Likert-type items. Cognitive interviews with relatives (n = 11) identified 30 items for the Self-Stigma in Relatives of people with Mental Illness (SSRMI) scale. Relatives (n = 195) completed the scale twice, a month apart, together with four external correlate scales.
The 30-item SSRMI was reliable, with scores stable over time. Its single-factor structure allowed generation of a 10-item version. Construct validity of 30- and 10-item versions was supported by expected relationships with external correlates.
Both versions of the SSRMI scale are valid and reliable instruments appropriate for use in clinical and research contexts.
Building on the social identity approach and the intergroup helping as status relations model, the current research examined the effects of the stability of social stratification and the form of help on higher socioeconomic status (SES) members’ attitudes towards anti-poverty programs. Two studies were conducted in a 2 (stability of social stratification) × 2 (form of help) design on willingness to support anti-poverty programs. Study 1 examined how unstable versus stable social stratification might pattern differences in support for hypothetical anti-poverty programs construed as dependency-oriented or autonomy-oriented help. Study 2 replicated and extended Study 1 by examining higher (subjective) SES participants’ attitudes towards cash transfer programs (conditional vs. unconditional), as shaped by their perceptions of the stability of social stratification. Overall, the results of the two studies confirmed that attitudes towards anti-poverty programs depend on the specific form of help (dependency-oriented and/or autonomy-oriented) and on the nature of the intergroup relations (the stability of the social stratification). Finally, the theoretical contribution of the current research is discussed.
Introduction: Safety culture is defined as the shared beliefs that an organization’s employees hold relative to workplace safety. Perceptions of workplace safety culture within paramedic services have been shown to be associated with patient and provider safety outcomes as well as safe work practices. We sought to characterize paramedics’ perceptions of the organizational safety culture across Eastern Ontario, Canada, to provide benchmarking data for evaluating future quality initiatives. Methods: This was a cross-sectional survey study conducted September 2015-January 2016 in 7 paramedic services across Eastern Ontario. We distributed an abridged version of Patterson’s previously published EMS-SAQ survey, measuring six domains of workplace safety culture, to 1,066 paramedics during continuing medical education sessions. Questions were rated on a 5-point Likert scale (1 = strongly agree, 5 = strongly disagree), and a response of 1 or 2 was considered a ‘positive perception’ response. We present descriptive statistics and chi-square tests where appropriate. Results: We received responses from 1,041 paramedics (97.6%), with response rates varying between 88.0% and 100% across the paramedic services. One third (33.6%) were Advanced Care Paramedics (ACPs) and 39.4% of paramedics had more than 10 years’ experience. The percentage of positive responses for each domain was: Safety Climate 31.2% (95% CI 28.4-34.1), Teamwork Climate 29.3% (95% CI 26.6-32.1), Stress Recognition 56.8% (95% CI 53.8-59.8), Perceptions of Management 67.0% (95% CI 64.0-69.8), Working Conditions 42.6% (95% CI 39.6-45.7), Job Satisfaction 41.6% (95% CI 38.6-44.6). Primary care paramedics had more positive perception responses for Job Satisfaction (45% vs 35%, p=0.002), whereas ACPs had more positive perception responses for Stress Recognition (61.5% vs 54.1%, p=0.022). 
No association was found between gender or years of experience and a positive perception of any safety domain. Conclusion: The results provide valuable workplace safety culture data that will be used to target and evaluate needed quality improvement initiatives while also raising some awareness to paramedics of important factors related to patient and provider safety.
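The domain-level confidence intervals reported above are consistent with a standard normal-approximation (Wald) interval for a proportion, sketched below. The abstract does not state which interval method was used, so this is an assumption:

```python
import math

# Wald 95% confidence interval for a proportion.
def proportion_ci(p_hat, n, z=1.96):
    se = math.sqrt(p_hat * (1 - p_hat) / n)   # standard error of the proportion
    return p_hat - z * se, p_hat + z * se

# Safety Climate domain: 31.2% positive among 1,041 respondents (from the abstract)
lo, hi = proportion_ci(0.312, 1041)
# ≈ (0.284, 0.340), matching the reported 28.4-34.1% interval
```

For proportions near 0 or 1, or small n, a Wilson or exact interval would be preferable; here the normal approximation reproduces the reported figures.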
Segregation of polydisperse granular materials occurs in many natural and industrial settings, but general theoretical modelling approaches with predictive power have been lacking. Here we describe a model capable of accurately predicting segregation for both discrete and continuous particle size distributions based on a generalized expression for the percolation velocity. The predictions of the model depend on the kinematics of the flow and other physical parameters such as the diffusion coefficient and the percolation length scale, quantities that can be determined directly from experiment, simulation or theory and that are not arbitrarily adjustable. The model is applied to heap and chute flow, and the resulting predictions are consistent with experimentally validated discrete element method (DEM) simulations. Several different continuous particle size distributions are considered to demonstrate the broad applicability of the approach.
Scanning transmission electron microscope (STEM) through-focus imaging (TFI) has been used to determine the three-dimensional atomic structure of Bi segregation-induced brittle Cu grain boundaries (GBs). With TFI, it is possible to observe single Bi atom distributions along Cu twist GBs using an aberration-corrected STEM operating at 200 kV. The depth resolution is ~5 nm. Specimens with GBs intentionally inclined with respect to the microscope’s optic axis were used to investigate Bi segregant atom distributions along and through the Cu GB. It was found that Bi atoms exist at most once per Cu unit cell along the GB, meaning that no continuous GB film is present. Therefore, the reduced fracture toughness of this particular Bi-doped Cu boundary would not be caused by fracture of Bi–Bi bonds.
Introduction: Patients seen primarily for hypertension are common in the emergency department. The outcomes of these patients have not been described at a population level. In this study we describe the characteristics and outcomes of patients making these visits, as well as changes over time. Methods: This retrospective cohort study used linked health databases from the province of Ontario, Canada, to assess emergency department visits made between April 1, 2002 and March 31, 2012 with a primary diagnosis of hypertension. We determined the annual number of visits as well as age- and sex-standardized rates. We examined visit disposition and assessed mortality outcomes and potential hypertensive complications at 7, 30, 90, 365 days and 2 years after the ED visit. Results: There were 206,147 qualifying ED visits from 180 sites. Visits increased by 64% between 2002 and 2012, from 15,793 to 25,950 annual visits. The age- and sex-standardized rate increased from 170/100,000 persons to 228/100,000 persons over the same period, a 34% increase. Eight percent of visits ended in hospitalization, but this proportion decreased from 9.9% to 7.1% over the study period. Mortality was very low: less than 1% within 90 days, 2.5% within 1 year, and 4.1% within 2 years. Among subsequent hospitalizations for potential hypertensive complications, stroke was the most frequent admitting diagnosis, but its frequency was still <1% within 1 year. Together, hospitalizations for stroke, heart failure, acute myocardial infarction, atrial fibrillation, renal failure, hypertensive encephalopathy and dissection were <1% at 30 days. Conclusion: The number of visits made primarily for hypertension has increased dramatically over the last decade. While some of the increase is due to aging of the population, other forces are contributing to the increase. Subsequent mortality and complication rates are low and have declined. 
With current practice patterns, the feared complications of hypertension are extremely infrequent.
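Direct standardization, the method behind the age- and sex-standardized rates reported above, weights stratum-specific rates by a reference population. The sketch below uses entirely hypothetical strata and counts, not the study's data:

```python
# Directly standardized rate: a weighted average of stratum-specific rates,
# with weights taken from a reference (standard) population.
def standardized_rate(cases, population, standard_population, per=100_000):
    total_std = sum(standard_population)
    weights = [s / total_std for s in standard_population]
    return per * sum(w * c / p
                     for w, c, p in zip(weights, cases, population))

# Hypothetical two-stratum example (e.g., two age-sex groups):
rate = standardized_rate(
    cases=[40, 160],                        # ED visits per stratum
    population=[50_000, 80_000],            # observed population per stratum
    standard_population=[60_000, 40_000],   # reference population per stratum
)
# → 128.0 visits per 100,000 standard population
```

Standardization removes the effect of shifting population structure, which is why the abstract can separate the 34% rise in the standardized rate from the raw 64% rise in visit counts.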
Expert judgement has been used since the actuarial profession was founded. In the past, there has often been a lack of transparency regarding the use of expert judgement, even though those judgements could have a very significant impact on the outputs of calculations and the decisions made by organisations. The lack of transparency has a number of dimensions, including the nature of the underlying judgements, as well as the process used to derive those judgements. This paper aims to provide a practical framework regarding expert judgement processes, and how those processes may be validated. It includes a worked example illustrating how the process could be used for setting a particular assumption. It concludes with some suggested tools for use within expert judgement. Although primarily focussed on the insurance sector, the proposed process framework could be applied more widely without the need for significant changes.
Periods of rapid growth seen during the early stages of fetal development, including cell proliferation and differentiation, are greatly influenced by the maternal environment. We demonstrate here that over-nutrition, specifically exposure to a high-fat diet in utero, programmed the extent of atherosclerosis in the offspring of ApoE*3 Leiden transgenic mice. Pregnant ApoE*3 Leiden mice were fed either a control chow diet (2.8% fat, n=12) or a high-fat, moderate-cholesterol diet (MHF, 19.4% fat, n=12). Dams were fed the chow diet during the suckling period. At 28 days of postnatal age, wild-type and ApoE*3 Leiden offspring from chow- or MHF-fed mothers were fed either a control chow diet (n=37) or a diet rich in cocoa butter (15%) and cholesterol (0.25%) for 14 weeks to induce atherosclerosis (n=36). Offspring from MHF-fed mothers had 1.9-fold larger atherosclerotic lesions (P<0.001). There was no direct effect of prenatal diet on plasma triglycerides or cholesterol; however, transgenic ApoE*3 Leiden offspring displayed raised cholesterol when on an atherogenic diet compared with wild-type controls (P=0.031). Lesion size was correlated with plasma lipid parameters after adjustment for genotype, maternal diet and postnatal diet (R2=0.563, P<0.001). ApoE*3 Leiden mothers fed a MHF diet developed hypercholesterolemia (plasma cholesterol two-fold higher than in chow-fed mothers, P=0.011). The data strongly suggest that maternal hypercholesterolemia programs later susceptibility to atherosclerosis. This is consistent with previous observations in humans and animal models.