Four percent of ST-elevation myocardial infarctions (STEMIs) are complicated by an out-of-hospital cardiac arrest (OHCA). Research has shown that a shorter time to initial defibrillation in patients with ventricular fibrillation/tachycardia (VF/VT) arrests increases the rate of neurologically favourable survival. The purpose of this study was to determine whether routine application of defibrillation pads in patients with prehospital STEMI decreases the time to initial defibrillation in those who suffer OHCA.
This was a health records review of adult patients diagnosed with STEMI in the prehospital setting from January 2012 to July 2016. Patients were included if they had a 12-lead ECG indicative of STEMI and subsequently suffered VF/VT OHCA while in paramedic care. The study was designed to evaluate the effects of the “pads-on” protocol in a pre-implementation (Jan 2012–May 2014) / post-implementation (Jun 2014–Jul 2016) fashion. Records were reviewed for relevant patient and event features. A t-test was used to measure the difference between mean times to defibrillation.
Of 446 patients diagnosed with prehospital STEMI, 11 suffered OHCA while in paramedic care. The mean (SD) age was 66.0 (9.3) years and 55% were female. In the 4 patients treated with the “pads-on” protocol, the mean time to initial defibrillation was 17.7 seconds, compared to 72.7 seconds in patients who had pads applied following arrest (Δ = 55.0 s [95% CI 22.7–87.2 s]).
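As a quick consistency check (all figures taken from the abstract), the reported difference in mean times and the centre of the confidence interval can be verified directly:

```python
# Reported mean times to initial defibrillation (seconds), from the abstract
mean_pads_on = 17.7      # "pads-on" protocol group (n = 4)
mean_pads_after = 72.7   # pads applied after arrest

# Difference in means matches the reported delta of 55.0 s
diff = mean_pads_after - mean_pads_on
print(round(diff, 1))    # 55.0

# The reported 95% CI [22.7, 87.2] is centred on ~54.95 s,
# consistent with the reported difference to rounding
ci_centre = (22.7 + 87.2) / 2
```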
Routine application of defibrillation pads in STEMI patients who suffer OHCA decreases the time to initial defibrillation, a reduction previously demonstrated to improve neurologically favourable survival.
Extant research has established an empirical relationship between fatigue and safety-related outcomes. It is not clear if these findings are relevant to Canadian paramedicine. The purpose of this study was to determine if fatigue and shiftwork variables were related to safety outcomes in Canadian paramedics.
A survey was conducted with ten paramedic services in Ontario with a 40.5% response rate (n = 717). Respondents reported levels of fatigue, safety outcomes (injury, safety compromising behaviours, and medical errors/adverse events), work patterns (types of shifts, hours worked weekly) and demographic characteristics. Univariate and logistic regression analyses were used to assess for significant differences.
In this sample, 55% of paramedics reported being fatigued at work. Fatigued paramedics were over twice as likely to report injuries, three times as likely to report safety compromising behaviours, and 1.5 times as likely to report errors/adverse events. When controlling for fatigue, shift length variables did not consistently influence safety outcomes.
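The "twice as likely" and "three times as likely" findings are odds-ratio comparisons from logistic regression. As an illustration only (the counts below are hypothetical, not the study's data), an unadjusted odds ratio from a 2×2 table is computed as:

```python
def odds_ratio(exp_event, exp_none, unexp_event, unexp_none):
    """Unadjusted odds ratio: odds of the outcome in the exposed
    group divided by the odds in the unexposed group."""
    return (exp_event / exp_none) / (unexp_event / unexp_none)

# Hypothetical counts: 30 of 100 fatigued paramedics report an injury,
# vs. 15 of 100 non-fatigued paramedics
print(round(odds_ratio(30, 70, 15, 85), 2))  # 2.43 -> "over twice as likely"
```

The study's reported estimates are adjusted via logistic regression; this unadjusted version only illustrates the direction of the comparison.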
These results provide preliminary evidence of a relationship between fatigue and safety outcomes in Canadian paramedicine. While more research is needed, these findings point to the influence fatigue has on safety outcomes and suggest that fatigue mitigation efforts may be worthwhile.
Depression is a heterogeneous disorder with multiple aetiological pathways and multiple therapeutic targets. This study aims to determine whether atypical depression (AD) characterized by reversed neurovegetative symptoms is associated with a more pernicious course and a different sociodemographic, lifestyle, and comorbidity profile than nonatypical depression (nonAD).
Among 157 366 adults who completed the UK Biobank Mental Health Questionnaire (MHQ), N = 37 434 (24%) met the DSM-5 criteria for probable lifetime major depressive disorder (MDD) based on the Composite International Diagnostic Interview Short Form. Participants reporting both hypersomnia and weight gain were classified as AD cases (N = 2305), and the others as nonAD cases (N = 35 129). Logistic regression analyses were conducted to examine differences between AD and nonAD in depression features, sociodemographic and lifestyle factors, lifetime adversities, psychiatric and physical comorbidities.
Persons with AD experienced an earlier age of depression onset, longer, more severe and recurrent episodes, and higher help-seeking rates than nonAD persons. AD was associated with female gender, unhealthy behaviours (smoking, social isolation, low physical activity), more lifetime deprivation and adversity, higher rates of comorbid psychiatric disorders, obesity, cardiovascular disease (CVD), and metabolic syndrome. Sensitivity analyses comparing AD persons with those having typical neurovegetative symptoms (hyposomnia and weight loss) revealed similar results.
These findings highlight the clinical and public health significance of AD as a chronic form of depression, associated with high comorbidity and lifetime adversity. Our findings have implications for predicting depression course and comorbidities, guiding research on aetiological mechanisms, planning service use and informing therapeutic approaches.
This article explores the origins of youth engagement in school, community and democracy. Specifically, it considers the role of psychosocial or non-cognitive abilities, like grit or perseverance. Using an original large-scale longitudinal survey of students linked to school administrative records and a variety of modeling techniques – including sibling, twin and individual fixed effects – the study finds that psychosocial abilities are a strong predictor of youth civic engagement. Gritty students miss less class time and are more engaged in their schools, are more politically efficacious, are more likely to intend to vote when they become eligible, and volunteer more. Our work highlights the value of psychosocial attributes in the political socialization of young people.
We distributed a 16-question survey concerning noxious weed abundance, impacts, and management to livestock producers grazing on privately owned or leased grazing lands in Montana. The noxious weeds most commonly reported as being present on respondents’ grazing units were Canada thistle [Cirsium arvense (L.) Scop.] (64% of grazing units) and leafy spurge (Euphorbia esula L.) (45% of grazing units), and these species also reportedly caused the greatest reductions in livestock forage. Houndstongue (Cynoglossum officinale L.) was more prevalent than either spotted knapweed (Centaurea stoebe L.) or diffuse knapweed (Centaurea diffusa Lam.) (39% vs. 32% and 10%, respectively, of grazing units), but collectively C. stoebe and C. diffusa were reported to cause greater forage reductions than C. officinale. The top three strategies used to manage noxious weeds were chemical control, grazing, and biological control. Combining survey responses with forage-loss models derived from field data for C. stoebe and E. esula, we estimated the combined cost of noxious weed management and forage losses on privately owned rangeland to be $3.54 ha⁻¹ yr⁻¹, or $7,243 annually for an average-sized grazing unit (i.e., 2,046 ha [5,055 ac]). Our estimates of economic losses are lower than many estimates from previous studies, possibly because we focused only on direct costs related to private grazing land, while other studies often consider indirect impacts. Nonetheless, our estimates are substantial; for example, our estimated loss equates to 24% of the average per-hectare lease rate for Montana grazing land.
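The per-unit dollar figure follows directly from the per-hectare estimate and the stated average grazing-unit size:

```python
cost_per_ha = 3.54    # $ per hectare per year: combined management cost + forage loss
avg_unit_ha = 2046    # average grazing-unit size, ha (from the abstract)

annual_cost = cost_per_ha * avg_unit_ha
print(round(annual_cost))  # 7243, matching the reported annual estimate
```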
UK Biobank is a well-characterised cohort of over 500 000 participants that offers unique opportunities to investigate multiple diseases and risk factors.
An online mental health questionnaire completed by UK Biobank participants was expected to expand the potential for research into mental disorders.
An expert working group designed the questionnaire, using established measures where possible, and consulting with a patient group regarding acceptability. Case definitions were defined using operational criteria for lifetime depression, mania, anxiety disorder, psychotic-like experiences and self-harm, as well as current post-traumatic stress and alcohol use disorders.
157 366 completed online questionnaires were available by August 2017. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status than the general population across a range of indicators. Thirty-five per cent (55 750) of participants had at least one defined syndrome, of which lifetime depression was the most common at 24% (37 434). There was extensive comorbidity among the syndromes. Mental disorders were associated with high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
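The headline percentages can be reproduced from the raw counts given in the abstract:

```python
n_respondents = 157_366  # completed online questionnaires

print(round(100 * 55_750 / n_respondents))  # 35 -> at least one defined syndrome
print(round(100 * 37_434 / n_respondents))  # 24 -> lifetime depression, the most common
```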
The questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed owing to selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
Declaration of interest
G.B. received grants from the National Institute for Health Research during the study; and support from Illumina Ltd. and the European Commission outside the submitted work. B.C. received grants from the Scottish Executive Chief Scientist Office and from The Dr Mortimer and Theresa Sackler Foundation during the study. C.S. received grants from the Medical Research Council and Wellcome Trust during the study, and is the Chief Scientist for UK Biobank. M.H. received grants from the Innovative Medicines Initiative via the RADAR-CNS programme and personal fees as an expert witness outside the submitted work.
Surgical site infections (SSIs) following colorectal surgery (CRS) are among the most common healthcare-associated infections (HAIs). Reduction in colorectal SSI rates is an important goal for surgical quality improvement.
To examine rates of SSI in patients with and without cancer and to identify potential predictors of SSI risk following CRS.
American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) data files for 2011–2013 from a sample of 12 National Comprehensive Cancer Network (NCCN) member institutions were combined. Pooled SSI rates for colorectal procedures were calculated and risk was evaluated. The independent importance of potential risk factors was assessed using logistic regression.
Of 22 invited NCCN centers, 11 participated (50%). Colorectal procedures were selected by principal procedure Current Procedural Terminology (CPT) code. Cancer was defined by International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes.
The primary outcome of interest was 30-day SSI rate.
A total of 652 SSIs (11.06%) were reported among 5,893 CRSs. Risk of SSI was similar for patients with and without cancer. Among CRS patients with underlying cancer, disseminated cancer (SSI rate, 17.5%; odds ratio [OR], 1.66; 95% confidence interval [CI], 1.23–2.26; P=.001), ASA score ≥3 (OR, 1.41; 95% CI, 1.09–1.83; P=.001), chronic obstructive pulmonary disease (COPD; OR, 1.6; 95% CI, 1.06–2.53; P=.02), and longer duration of procedure were associated with development of SSI.
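The pooled SSI rate follows from the raw counts reported above:

```python
n_ssi = 652          # surgical site infections reported
n_procedures = 5893  # colorectal procedures in the pooled sample

print(round(100 * n_ssi / n_procedures, 2))  # 11.06 -> reported pooled SSI rate (%)
```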
Patients with disseminated cancer are at a higher risk of developing SSI. ASA score ≥3, COPD, and longer duration of surgery also predict SSI risk. Disseminated cancer should be further evaluated by the Centers for Disease Control and Prevention (CDC) in generating risk-adjusted outcomes.
Exotic annual grasses such as medusahead [Taeniatherum caput-medusae (L.) Nevski] and downy brome (Bromus tectorum L.) dominate millions of hectares of grasslands in the western United States. Applying picloram, aminopyralid, and other growth regulator herbicides at late growth stages reduces seed production of most exotic annual grasses. In this study, we applied aminopyralid to T. caput-medusae to determine how reducing seed production in the current growing season influenced cover in the subsequent growing season. At eight annual grassland sites, we applied aminopyralid at 55, 123, and 245 g ae ha⁻¹ in spring just before T. caput-medusae heading. The two higher rates were also applied pre-emergence (PRE) in fall to allow comparisons with this previously tested timing. When applied in spring during the roughly 10-d period between the flag leaf and inflorescence first becoming visible, just 55 g ae ha⁻¹ of aminopyralid greatly limited seed production and subsequently reduced T. caput-medusae cover to nearly zero. Fall aminopyralid applications were less effective against T. caput-medusae, even at a rate of 245 g ae ha⁻¹. In the growing season of application, fall treatments, but not spring treatments, sometimes reduced cover of desirable winter annual forage grasses. In the growing season after application, both spring and fall treatments tended to increase forage grasses, though spring treatments generally caused larger increases. Compared with other herbicide treatment options, preheading aminopyralid treatments are a relatively inexpensive, effective approach for controlling T. caput-medusae and increasing forage production.
The majority of market lamb produced in the UK results from crossing terminal sire rams with crossbred ewes. Selection of terminal sire breeds over the past 15 years for improved carcass composition has shown positive benefits for the carcass quality of their crossbred progeny (Simm et al. 2001). However, faster rates of genetic improvement could be achieved if information from these terminal sire crossbred lambs could be used in the genetic evaluation of terminal sire breeds. Incorporating crossbred information into purebred selection programmes has been modelled, and the results show that it is an effective way of improving carcass quality (Jones et al. 1999). However, this is currently compromised by limitations on how individual carcass data can be collected for incorporation into breeding programmes. An innovative technology based on Video Image Analysis (VIA) of lamb carcasses is being evaluated for introduction into UK lamb abattoirs. VIA systems can provide an objective, automatic, consistent and accurate way of measuring carcass composition. However, little is known of the genetic parameters for VIA measurements of lamb carcasses. Therefore, the aim of the present research project was to estimate the genetic parameters of VIA carcass measurements in a crossbred lamb population.
This paper explores the influence of institutions on indigenous entrepreneurship within the muttonbird economy of Ngāi Tahu (a New Zealand Māori tribe). It determines that colonisation removed the traditional Ngāi Tahu institution of executive authority which once regulated muttonbird exchange. Without this regulatory function, whānau (family) birders compete against each other at their own expense and to the benefit of traders. As a consequence, the birders are constrained in applying their birding knowledge and abilities to realise market opportunities. Furthermore, declining returns and harvesting pressure are in some cases reducing the financial and natural capital of whānau, whilst threats to the continuity of birding culture potentially undermine the socio-human capital contained within inherited traditions and the maintenance of kinship connections. It is argued that the development of a contemporary executive authority to regulate exchange and market product may reinvigorate entrepreneurial birding activities.
The Taipan galaxy survey (hereafter simply ‘Taipan’) is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≳ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative ‘Starbugs’ positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H0) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 years, will consist of a complete magnitude-limited sample (i ⩽ 17) of about 1.2 × 10⁶ galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ⩽ 18.1) of a luminous red galaxy sample of about 0.8 × 10⁶ galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using purpose-built automated ‘virtual observer’ software and an automated data reduction pipeline.
The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
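For the peculiar-velocity science case, the quantity being mapped is, in the standard low-redshift approximation, the difference between the recession velocity implied by a galaxy's redshift and that implied by its (e.g., Fundamental Plane) distance. A minimal illustrative sketch; the H0 value and the input galaxy are assumptions for illustration, not survey parameters:

```python
C_KM_S = 299_792.458  # speed of light, km/s

def peculiar_velocity(z, dist_mpc, h0=70.0):
    """Low-z approximation: v_pec ~ c*z - H0 * D.
    h0 in km/s/Mpc and dist_mpc in Mpc are illustrative assumptions."""
    return C_KM_S * z - h0 * dist_mpc

# A hypothetical galaxy at z = 0.01 with a distance estimate of 40 Mpc:
# it recedes ~198 km/s faster than the pure Hubble flow would predict
print(round(peculiar_velocity(0.01, 40.0), 1))
```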
While the North American archaeological record signals the presence of early humans along the northeastern Pacific coast by the Late Pleistocene, we know little about the technological systems employed by these coastally oriented colonizing groups. We here report the discovery of the earliest unequivocal evidence for the use and manufacture of shell fishhooks in the western hemisphere. Four single-piece shell fishhooks dating to the terminal Pleistocene/early Holocene transition (between ~11,300 and 10,700 cal B.P.) have been excavated on Isla Cedros, Baja California, Mexico. One hook is directly dated at 9495 ± 25 B.P. with a marine reservoir–corrected age of 11,165–9185 cal B.P. Radiocarbon ages associated with three other shell fishhooks range between 8900 ± 25 B.P. and 10,415 ± 25 B.P., while median ages for the earliest contexts confirm occupation of the island by at least 12,600–12,000 cal B.P. The stratigraphic levels from which the fishhooks were recovered contained a diverse assemblage of fish remains, including deepwater species, indicative of boat use. Thus, some of the earliest known inhabitants of the Pacific coast of the Americas employed shell hook and line technology for offshore marine fishing at least by the Pleistocene-Holocene transition, if not earlier.
As a discipline, design science has traditionally focused on designing products and associated technical processes to improve usability and performance. Although significant progress has been made in these areas, little research has yet examined the role of human behaviour in the design of socio-technical systems (e.g., organizations). Here, we argue that applying organizational psychology as a design science can address this omission and enhance the capability of both disciplines. Specifically, we propose a method to predict malfunctions in socio-technical systems (PreMiSTS), thereby enabling them to be designed out or mitigated. We introduce this method, describe its nine stages, and illustrate its application with reference to two high-profile case studies of such malfunctions: (1) the severe breakdowns in patient care at the UK’s Mid-Staffordshire NHS Foundation Trust hospital in the period 2005–2009, and (2) the fatal Grayrigg rail accident in Cumbria, UK, in 2007. Having first identified the socio-technical and behavioural antecedents of these malfunctions, we then consider how the PreMiSTS method could be used to predict and prevent future malfunctions of this nature. Finally, we evaluate the method, consider its advantages and disadvantages, and suggest where it can be most usefully applied.