We use comparable 2005 and 2018 population data to assess threats driving the decline of lion Panthera leo populations, and review information on threats structured by problem tree and root cause analysis. We define 11 threats and rank their severity and prevalence. Two threats emerged as affecting both the number of lion populations and numbers within them: livestock depredation leading to retaliatory killing of lions, and bushmeat poaching leading to prey depletion. Our data do not allow determination of whether any specific threat drives declines faster than others. Of 20 local extirpations, most were associated with armed conflicts as a driver of proximate threats. We discuss the prevalence and severity of proximate threats and their drivers, to identify priorities for more effective conservation of lions, other carnivores and their prey.
In traumatically injured patients, excessive blood loss necessitating the transfusion of red blood cell (RBC) units is common. Indicators of early RBC transfusion in the pre-hospital setting are needed. This study aims to evaluate the association between hypothermia (<36°C) and transfusion risk within the first 24 hours after arrival to hospital for a traumatic injury.
Methods
We completed an audit of all traumatically injured patients who had emergent surgery at a single tertiary care center between 2010 and 2014. Using multivariable logistic regression analysis, we evaluated the association between pre-hospital hypothermia and transfusion of ≥1 unit of RBC within 24 hours of arrival to the trauma bay.
Results
Of the 703 patients included to evaluate the association between hypothermia and RBC transfusion, 203 patients (29%) required a transfusion within 24 hours. After controlling for important confounding variables, including age, sex, coagulopathy (platelets and INR), hemoglobin, and vital signs (blood pressure and heart rate), hypothermia was associated with a 68% increased odds of transfusion in multivariable analysis (OR: 1.68; 95% CI: 1.11-2.56).
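For illustration, a minimal sketch of how a multivariable logistic regression of this form could be fit, assuming a long-format analytic file; this is not the study's code, and the file name and all column names (hypothermia, inr, sbp, etc.) are invented placeholders:

```python
# Hedged sketch only: model 24 h transfusion on pre-hospital hypothermia plus
# the confounders named in the abstract, then report odds ratios with 95% CIs.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trauma_cohort.csv")  # hypothetical analytic dataset

model = smf.logit(
    "transfused_24h ~ hypothermia + age + sex + platelets + inr + hgb + sbp + hr",
    data=df,
).fit()

or_ci = np.exp(model.conf_int())    # exponentiate log-odds CI bounds
or_ci["OR"] = np.exp(model.params)  # e.g. the abstract's OR 1.68 for hypothermia
print(or_ci)
```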
Conclusions
Hypothermia is strongly associated with RBC transfusion in a cohort of trauma patients requiring emergent surgery. This finding highlights the importance of early measures of temperature after traumatic injury and the need for intervention trials to determine if strategies to mitigate the risk of hypothermia will decrease the risk of transfusion and other morbidities.
Schizophrenia is a disorder characterized by pervasive deficits in cognitive functioning. However, few well-powered studies have used a wide range of cognitive and reinforcement-learning measures derived from cognitive neuroscience to examine the degree to which cognitive performance is impaired even among individuals with schizophrenia who are not currently on antipsychotic medications. Such research is particularly needed in the domain of reinforcement learning, given the central role of dopamine in reinforcement learning and the potential impact of antipsychotic medications on dopamine function.
Methods
The present study sought to fill this gap by examining healthy controls (N = 75), unmedicated (N = 48) and medicated (N = 148) individuals with schizophrenia. Participants were recruited across five sites as part of the CNTRaCS Consortium to complete tasks assessing processing speed, cognitive control, working memory, verbal learning, relational encoding and retrieval, visual integration and reinforcement learning.
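As background, the kind of learning process such tasks are designed to probe can be sketched with a simple delta-rule learner. This is purely illustrative, not the CNTRaCS task or analysis code, and all parameter values are arbitrary assumptions:

```python
# Minimal Rescorla-Wagner style learner: value estimates are nudged toward
# received rewards by a learning rate (all numbers here are assumed).
import random

alpha = 0.2                    # learning rate (assumed)
V = {"A": 0.5, "B": 0.5}       # initial value estimates for two options
p_reward = {"A": 0.8, "B": 0.2}

for _ in range(500):
    # epsilon-greedy choice: mostly pick the currently higher-valued option
    choice = max(V, key=V.get) if random.random() > 0.1 else random.choice("AB")
    reward = 1.0 if random.random() < p_reward[choice] else 0.0
    V[choice] += alpha * (reward - V[choice])  # delta-rule update

print(V)  # each estimate drifts toward that option's true reward probability
```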
Results
Individuals with schizophrenia who were not taking antipsychotic medications, as well as those taking antipsychotic medications, showed pervasive deficits across cognitive domains including reinforcement learning, processing speed, cognitive control, working memory, verbal learning and relational encoding and retrieval. Further, we found that chlorpromazine equivalency rates were significantly related to processing speed and working memory, while there were no significant relationships between anticholinergic load and performance on other tasks.
Conclusions
These findings add to a body of literature suggesting that cognitive deficits are an enduring aspect of schizophrenia, present in those off antipsychotic medications as well as those taking antipsychotic medications.
Lack of patient compliance in a general psychiatric outpatient clinic delays diagnosis and reduces treatment efficacy.
Method
A 30-year-old patient underwent psychiatric treatment lasting two years. He was treated with a combination of psychopharmacotherapy and psychotherapy.
Results
The patient contacted a psychiatrist for the first time exhibiting the following symptoms: loss of will and interests, weight gain, lowered general mood, and avoidance of social contacts. A depressive episode was diagnosed and an antidepressant (fluvoxamine) was introduced to his therapy. He did not comply with the assigned therapy, nor did he attend his scheduled examinations.
His drug therapy was intensified (fluvoxamine, alprazolam, promazine). After six months, the patient returned accompanied by his family, and his non-compliance with the therapy was revealed.
After successful therapy, an improvement in his mental state was noted. During the last year he attended his examinations regularly, took his medication, and received psychotherapy once a week.
Conclusion
Patient compliance is a prerequisite for diagnosis and successful treatment. Combined treatment methods (psychopharmacotherapy and psychotherapy) with an adherent patient support a good remission.
Evidence from previous small trials has suggested the effectiveness of early social communication interventions for autism.
Objectives
The Preschool Autism Communication Trial (PACT) investigated the efficacy of such an intervention in the largest psychosocial autism trial to date.
Aims
To provide a stringent test of a pre-school communication intervention for autism.
Methods
152 children with core autism, aged 2 years to 4 years 11 months, took part in a three-site, two-arm, single-blind (assessor-blinded) randomised controlled trial of the parent-mediated, communication-focused intervention added to treatment as usual (TAU) versus TAU alone. Primary outcome: severity of autism symptoms (modified social communication algorithm from the Autism Diagnostic Observation Schedule-Generic, ADOS-G). Secondary outcomes: blinded measures of parent-child interaction, child language, and adaptation in school.
Results
At the 13-month endpoint the treatment resulted in strong improvement in parental synchronous response to the child (adjusted between-group effect size 1.22; 95% CI 0.85, 1.59) and in child initiations with the parent (ES 0.41; 95% CI 0.08, 0.74), but only a small effect on autism symptomatology (ADOS-G ES -0.24; 95% CI -0.59, 0.11; ns). Parents (not blind to allocation) reported strong treatment effects on child language and social adaptation, but effects on blinded, researcher-assessed language and school adaptation were small.
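A hedged sketch of how a between-group effect size of this kind is typically computed (this is the plain, unadjusted Cohen's d; the trial's reported effect sizes were covariate-adjusted, and all numbers below are invented):

```python
# Cohen's d: difference in group means over the pooled standard deviation.
# Group sizes and scores are illustrative placeholders, not PACT data.
import numpy as np

def cohens_d(treat: np.ndarray, control: np.ndarray) -> float:
    n1, n2 = len(treat), len(control)
    pooled_var = ((n1 - 1) * treat.var(ddof=1) +
                  (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)
    return (treat.mean() - control.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
treat = rng.normal(1.2, 1.0, 77)    # hypothetical intervention-arm scores
control = rng.normal(0.0, 1.0, 75)  # hypothetical TAU-arm scores
print(f"d = {cohens_d(treat, control):.2f}")
```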
Conclusions
Addition of the PACT intervention showed clear benefit in improving parent-child dyadic social communication, but no substantive benefit over TAU in modifying objectively rated autism symptoms. This attenuation in generalisation from ‘proximal’ intervention effects to wider symptom change in other contexts remains a significant challenge for autism treatment and for measurement methodology.
To explore whether better diet quality scores, as measures of adherence to the Australian Dietary Guidelines (ADG) and the Mediterranean diet (MedDiet), are associated with a lower incidence of hypertension and non-fatal CVD.
Design:
Prospective analysis of the 1946–1951 cohort of the Australian Longitudinal Study on Women’s Health (ALSWH). The Australian Recommended Foods Score (ARFS) was calculated as an indicator of adherence to the ADG; the Mediterranean Diet Score (MDS) measured adherence to the MedDiet. Outcomes included hypertension and non-fatal CVD. Generalised estimating equations estimated OR and 95 % CI across quartiles of diet quality scores.
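A minimal sketch of a GEE of the type described, assuming a long-format dataset with one row per woman per survey wave; the file name, column names and covariates are placeholders, not the ALSWH variables:

```python
# Hedged illustration of generalised estimating equations for a binary
# outcome across diet-score quartiles; not the study's analysis code.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("alswh_4651_long.csv")  # hypothetical repeated-measures data

# Observations are clustered within women (`id`); an exchangeable working
# correlation accounts for within-woman correlation across survey waves.
model = smf.gee(
    "hypertension ~ C(arfs_quartile) + age + smoking + bmi",
    groups="id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()
print(model.summary())  # exponentiated coefficients give ORs per quartile
```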
Setting:
Australia, 2001–2016.
Participants:
1946–1951 cohort of the ALSWH (n 5324), without CVD, hypertension and diabetes at baseline (2001), with complete FFQ data.
Results:
There were 1342 new cases of hypertension and 629 new cases of non-fatal CVD over 15 years of follow-up. Multivariate analysis indicated that women reporting better adherence to the ARFS (≥38/74) had 15 % (95 % CI 1, 28 %; P = 0·05) lower odds of hypertension and 46 % (95 % CI 6, 66 %; P = 0·1) lower odds of non-fatal CVD. Women reporting better adherence to the MDS (≥8/17) had 27 % (95 % CI 15, 47 %; P = 0·0006) lower odds of hypertension and 30 % (95 % CI 2, 50 %; P = 0·03) lower odds of non-fatal CVD.
Conclusions:
Better adherence to diet quality scores is associated with lower risk of hypertension and non-fatal CVD. These results support the need for an updated evidence base for the ADG, as well as for public health nutrition policies in Australia.
Introduction: The number of seniors presenting to emergency departments after a fall is increasing. Head injury concerns in this population often lead to a head CT scan. The CT ordering rate varies among physicians, and the reasons for this are unknown. This study examined the role of patient characteristics and country of practice in the decision to order a CT. Methods: This study used a case-based survey of physicians across multiple countries. Each survey included 9 cases pertaining to an 82-year-old man who falls. Each case varied in one aspect compared to a base case (aspirin, warfarin, or rivaroxaban use, occipital hematoma, amnesia, dementia, and fall with no head trauma). For each case, participants indicated how “likely” they were to order a head CT scan, measured on a 100-point scale. A response of 80 or more was defined a priori as ‘likely to order a CT scan’. The survey was piloted among emergency residents for feedback on design and comprehension, and was published in French and English. Recruitment was through the Canadian Association of Emergency Physicians, Twitter and CanadiEM. For each case we compared the proportion of physicians who were ‘likely to scan’ relative to the base case. We also compared the proportion of participants who were ‘likely to scan’ each case in the USA, UK and Australia, relative to Canada. Results: Data were collected from 484 respondents (Canada-308, USA-64, UK-67, Australia-27, and 18 from other countries). Social media distribution limited our ability to estimate the response rate. Physicians were most likely to scan in the anticoagulation cases (90% likely to order a scan compared to 36% for the base case; p < 0.001). Other features associated with increased scans were occipital hematoma (48%), multiple falls (68%), and amnesia (68%) (all p < 0.005). Compared to Canada, US physicians were more likely to order CT scans for all cases (p < 0.05). Compared to Canada, UK physicians were significantly less likely to order CT for patients in every case except in the patient with amnesia. Finally, Australian physicians differed from Canada only for the occipital hematoma case, where they were significantly more likely to order a CT scan. Conclusion: Anticoagulation, amnesia and a history of multiple falls appear to drive the ordering of a head CT scan in elderly patients who have fallen. We observed variations in practice between countries. Future clinical decision rules will likely have variable impact on head CT scan rates depending on baseline practice variation.
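The per-case comparison described above amounts to a two-proportion test against the base case. A minimal sketch follows; the counts are back-calculated from the reported percentages, so treat them as approximate:

```python
# Hedged example: compare the share of physicians 'likely to scan' (rating
# >= 80) in the anticoagulation case vs the base case.
from statsmodels.stats.proportion import proportions_ztest

likely = [436, 174]  # ~90% and ~36% of 484 respondents (approximate counts)
nobs = [484, 484]
stat, p = proportions_ztest(likely, nobs)
print(f"z = {stat:.2f}, p = {p:.2g}")
```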
The consumption of nitrate-rich vegetables can acutely lower blood pressure and improve mediators of vascular health. However, we do not yet understand the impact of long-term habitual dietary nitrate intake and its association with CVD. Therefore, the aim of this investigation was to examine the relationship between habitual dietary nitrate intake and risk of CHD in women from the Nurses’ Health Study. We prospectively followed 62 535 women who were free from diabetes, CVD and cancer at baseline in 1986. Information on diet was updated every 4 years with validated FFQ. The main outcome was CHD, defined by the occurrence of non-fatal myocardial infarction or fatal CHD. Cox proportional hazards regression models were used to estimate the relative risks (RR) and 95 % CI. During 26 years of follow-up, 2257 cases of CHD were identified. When comparing the highest quintile of nitrate intake with the lowest quintile, in age-adjusted analysis there was a protective association for CHD (RR=0·77, 95 % CI 0·68, 0·97; P=0·0002), which dissipated after further adjustment for smoking, physical activity, BMI and race (RR=0·91; 95 % CI 0·80, 1·04; P=0·27). The magnitude of association was further attenuated once we adjusted for the Alternative Healthy Eating Index excluding vegetable and fruit consumption (RR=1·04, 95 % CI 0·91, 1·20; P=0·34). Dietary nitrate intake was not related to the risk of CHD after adjustment for other lifestyle and non-vegetable dietary factors in a large group of US women.
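For illustration, a Cox model of the form described could be fit as below. This is a sketch under assumed file and column names, not the Nurses' Health Study code:

```python
# Hedged sketch: Cox proportional hazards regression of CHD on nitrate
# quintile with example covariates, using the lifelines package.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("nhs_nitrate.csv")  # hypothetical analytic file

cph = CoxPHFitter()
cph.fit(
    df[["follow_up_years", "chd_event", "nitrate_quintile",
        "smoking", "activity", "bmi"]],
    duration_col="follow_up_years",
    event_col="chd_event",
)
cph.print_summary()  # hazard ratios approximate the reported RRs
```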
In Norway, the incidence of sporadic domestically acquired salmonellosis is low, and is most frequently due to Salmonella Typhimurium. We investigated the risk factors for sporadic Salmonella infections in Norway to improve control and prevention measures. Surveillance data for all Salmonella infections from 2000 to 2015 were analysed for seasonality and the proportion associated with domestic reservoirs, hedgehogs and wild birds. A prospective case–control study was conducted from 2010 to 2012 by recruiting cases from the Norwegian Surveillance System for Communicable Diseases and controls from the Norwegian Population Registry (389 cases and 1500 controls). Univariable analyses using logistic regression were conducted and a multivariable model was developed using regularised/penalised logistic regression. In univariable analysis, eating snow, dirt or sand, or playing in a sandbox (aOR 4.14; CI 2.15–7.97) was associated with salmonellosis. This was also the only exposure significantly associated with illness in the multivariable model. Since 2004, 34.2% (n = 354) of S. Typhimurium cases had an MLVA profile linked to a domestic reservoir. A seasonal trend, with a peak in August for all Salmonella types and in February for S. Typhimurium, was observed. Indirect exposure to domestic reservoirs remains a source of salmonellosis in Norway, particularly for children. Information to the public about avoiding environmental exposure should be strengthened, and initiatives to combat salmonellosis in the food chain should be reinforced.
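A hedged sketch of the regularised/penalised logistic regression step named above; the exposure and outcome column names are invented placeholders, not the study's variables:

```python
# Illustrative L2-penalised logistic regression for a case-control outcome;
# not the study's actual model or data.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("salmonella_case_control.csv")  # hypothetical
X = df[["ate_snow_dirt_sand", "wild_bird_contact", "hedgehog_contact"]]
y = df["case"]  # 1 = case, 0 = control

# C is the inverse regularisation strength; smaller C shrinks coefficients more
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)
print(dict(zip(X.columns, model.coef_[0])))
```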
Although food from grazed animals is increasingly sought by consumers because of perceived animal welfare advantages, grazing systems present the farmer and the animal with unique challenges. The system depends almost daily on the climate for feed supply, and the importation of large amounts of feed from off farm, with its associated labour and mechanisation costs, can reduce economic viability. Furthermore, the cow may have to walk long distances and must harvest feed efficiently in a highly competitive environment because of the need for high levels of pasture utilisation. She must also be: (1) highly fertile, with a requirement for pregnancy within ~80 days post-calving; (2) ‘easy care’, because large herds must be managed with limited labour; (3) able to walk long distances; and (4) robust to changes in feed supply and quality, so that short-term nutritional insults do not unduly influence her production and reproduction cycles. These demands differ from, and are in addition to, those placed on cows in housed systems offered pre-made mixed rations. Furthermore, additional demands for environmental sustainability and animal welfare, in conjunction with the need for greater system-level biological efficiency (i.e. ‘sustainable intensification’), will add to the ‘robustness’ requirements of cows in the future. Increasingly, there is evidence that certain genotypes of cows perform better or worse in grazing systems, indicating a genotype×environment interaction. This has led to the development of tailored breeding objectives within countries for important heritable traits to maximise the profitability and sustainability of each production system. To date, these breeding objectives have focussed on the more easily measured traits and those of highest relative economic importance. In the future, there will be greater emphasis on traits that are more difficult to measure but that are important to the quality of life of the animal in each production system and to reducing the system’s environmental footprint.
Introduction: Understanding the spatial distribution of opioid abuse at the local level may facilitate community intervention strategies. The purpose of this analysis was to apply spatial analytical methods to determine clustering of opioid-related emergency medical services (EMS) responses in the City of Calgary. Methods: Using opioid-related EMS responses in the City of Calgary between January 1st and October 31st, 2017, we estimated the dissemination area (DA) specific spatial randomness effects by incorporating the spatial autocorrelation using an intrinsic Gaussian conditional autoregressive model and generalized linear mixed models (GLMM). Global spatial autocorrelation was evaluated by Moran’s I index. Both Getis-Ord Gi and the LISA function in GeoDa were used to estimate the local spatial autocorrelation. Two models were applied: 1) Poisson regression with DA-specific non-spatial random effects; 2) Poisson regression with DA-specific G-side spatial random effects. A pseudo-likelihood approach was used for model comparison. Two types of cluster analysis were used to identify the spatial clustering. Results: There were 1488 opioid-related EMS responses available for analysis. Of the responses, 74% of the individuals were males. The median age was 33 years (IQR: 26-42 years), with 65% of individuals between 20 and 39 years and 27% between 40 and 64 years. In 62% of EMS responses, poisoning/overdose was the chief complaint. The global Moran’s I implied the presence of global spatial autocorrelation. Comparison of the two models suggested that the spatial model provided a better fit for the adjusted opioid-related EMS response rate. Calgary Center and East were identified as hot spots by both types of cluster analysis. Conclusion: Spatial modeling provides better predictive ability for assessing potential high-risk areas and identifying locations for community intervention strategies. The clusters identified in Calgary’s Center and East may have implications for future response strategies.
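For readers unfamiliar with the global statistic used here, a bare-bones Moran's I can be computed as below; the values and contiguity matrix are toy stand-ins for the DA-level response rates and their neighbourhood structure:

```python
# Hedged sketch of global Moran's I: values above the expectation -1/(n-1)
# indicate positive spatial autocorrelation (clustering of similar rates).
import numpy as np

def morans_i(values: np.ndarray, W: np.ndarray) -> float:
    z = values - values.mean()
    return (len(values) / W.sum()) * (z @ W @ z) / (z @ z)

rates = np.array([2.0, 3.0, 10.0, 12.0, 1.0])  # toy area-level response rates
W = np.array([[0, 1, 0, 0, 1],                 # toy binary contiguity matrix
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]])
print(f"Moran's I = {morans_i(rates, W):.3f}")
```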
Introduction: Emergency Department Systems Transformation (EDST) is a bundle of Toyota Production System-based interventions implemented in two Canadian tertiary care Emergency Departments (ED) between June 2014 and July 2016. The goals were to improve patient care by increasing value and reducing waste. Longer times to physician initial assessment (PIA), ED lengths of stay (LOS) and times to inpatient beds are associated with increased patient morbidity and potentially mortality. Some of the 17 primary interventions included computerized physician order entry optimization, staff schedule realignment, physician scorecards and a novel initial assessment process. ED access block has limited full implementation of EDST. An interim analysis was conducted to assess the impact of the interventions implemented to date on flow metrics. Methods: Daily ED visit volumes, boarding at 7am, time to PIA and LOS for non-admitted patients were collected from April 2014 to June 2016. Volume and boarding were compared from first to last quarter using an independent-samples median test. Linear regression of each variable versus time was conducted to determine unadjusted relationships. PIA and LOS for non-admitted low acuity (Canadian Triage and Acuity Scale (CTAS) 4,5) and non-admitted high acuity (CTAS 1,2,3) patients were subsequently adjusted for volume and/or boarding using a non-parametric correlation to control for these variables. Results: Overall, median ED boarding decreased at University Hospital (UH) (14.0 vs 6.0, p<0.01) and increased at Victoria Hospital (VH) (17.0 vs 21.0, p<0.01) from first to last quarter. Median ED volume increased significantly at UH from first to last quarter (129.0 vs 142.0, p<0.01) but remained essentially unchanged at VH. The 90th percentile LOS for non-admitted low acuity patients significantly decreased at UH (adjusted rs=-0.24, p<0.01) but did not significantly change at VH. For high acuity patients, 90th percentile LOS significantly decreased at both hospitals (UH: adjusted rs=-0.23, p<0.01; VH: adjusted rs=-0.21, p<0.01). The 90th percentile time to PIA improved slightly but significantly in both EDs (UH: adjusted rs=-0.10, p<0.01; VH: adjusted rs=-0.18, p<0.01). Conclusion: Persistent ED boarding impacted the ability to fully implement the EDST model of care. Partial EDST implementation has resulted in improvement in PIA at both LHSC EDs. At UH, where ED boarding decreased, LOS metrics improved significantly even after controlling for boarding.
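The adjusted non-parametric correlations reported above can be approximated by correlating residuals, as in this hedged sketch; all file and column names are assumptions, not the study's data:

```python
# Illustrative partial Spearman correlation: regress out volume and boarding,
# then rank-correlate the residual LOS metric with time. Not the study's code.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

df = pd.read_csv("ed_daily_metrics.csv")  # hypothetical daily ED data

resid = smf.ols("los_p90 ~ volume + boarding", data=df).fit().resid
rs, p = spearmanr(resid, df["day_index"])
print(f"adjusted rs = {rs:.2f}, p = {p:.3g}")
```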
The aim of this paper is to examine Canadian key informants’ perceptions of intrapersonal (within an individual) and interpersonal (among individuals) factors that influence successful primary care and public health collaboration.
Background
Primary health care systems can be strengthened by building stronger collaborations between primary care and public health. Although there is literature exploring interpersonal factors that can influence successful inter-organizational collaborations, few studies have specifically explored primary care and public health collaboration. Furthermore, no papers were found that considered factors at the intrapersonal level. This paper aims to address these gaps in a Canadian context.
Methods
This interpretative descriptive study involved key informants (service providers, managers, directors, and policy makers) who participated in one-hour telephone interviews exploring their perceptions of influences on successful primary care and public health collaboration. Transcripts were analyzed using NVivo 9.
Findings
A total of 74 participants [from the provinces of British Columbia (n=20), Ontario (n=19) and Nova Scotia (n=21), and representatives from other provinces or national organizations (n=14)] participated. Five interpersonal factors were found to influence public health and primary care collaboration: (1) trusting and inclusive relationships; (2) shared values, beliefs and attitudes; (3) role clarity; (4) effective communication; and (5) decision processes. Two influencing factors were found at the intrapersonal level: (1) personal qualities, skills and knowledge; and (2) personal values, beliefs, and attitudes. A few differences were found across the three core provinces involved. Several complex interactions were identified among all interpersonal and intrapersonal influencing factors; one key factor, effective communication, interacted with all of them. Results support and extend our understanding of what influences successful primary care and public health collaboration at these levels, and are important considerations in building and sustaining such collaborations.
Relations of sovereign inequality permeate international politics, and a growing body of literature grapples with the question of how states establish and sustain hierarchy amidst anarchy. I argue that existing literature on hierarchy, for all its diverse insights, misses what makes hierarchy unique in world politics. Hierarchy is not simply the presence of inequality or stratification among actors, but rather an authority relationship in which a dominant actor exercises some modicum of control over a subordinate one. This authority relationship, moreover, is dramatically different from those found in domestic hierarchies. It is shaped less by written laws or formal procedures than by subtle forms of manipulation and the development of informal practices. For this reason, hierarchy cannot simply be reduced to the dynamics of anarchy, and must be viewed as a relational phenomenon. Ties between actors create positions that permit dominant actors to appropriate and orchestrate the sharing of authority with subordinate intermediaries. This article develops this relational network approach, highlighting how concepts such as access, brokerage, and yoking can illuminate the processes by which authority is enlisted and appropriated in world politics.
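To make 'brokerage' concrete for readers from outside network analysis, a toy example (not from the article): betweenness centrality flags the actor who bridges otherwise disconnected clusters, the standard formal proxy for a brokerage position.

```python
# Hedged illustration: a node tying two closed triads together scores highest
# on betweenness centrality, since all cross-cluster paths run through it.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("A", "C"),    # cluster 1
                  ("D", "E"), ("E", "F"), ("D", "F"),    # cluster 2
                  ("C", "broker"), ("broker", "D")])     # broker links them
print(nx.betweenness_centrality(G))  # 'broker' has the top score
```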
This sensitivity study applies the offline Canadian Land Surface Scheme (CLASS) version 3.6 to simulate snowpack evolution in idealized topography using observations at Likely, British Columbia, Canada, from 1 July 2008 to 30 June 2009. A strategy for a subgrid-scale snow (SSS) parameterization is developed to incorporate two key features: ten elevation bands at 100 m intervals to capture air temperature lapse rates, and five slope angles on four aspects to resolve solar radiation impacts on the evolution of snow depth and snow water equivalent (SWE). Simulations reveal strong elevational dependencies of snow depth and SWE when temperatures are adjusted with elevation using a moist adiabatic lapse rate, with a 26% difference in peak SWE between the average elevation and the mean of the remaining elevation bands. Differences in peak SWE between north- and south-facing slopes increase from 3.0 mm at a 10° slope to 17.9 mm at a 50° slope. When applied to elevation, slope and aspect combinations derived from a high-resolution digital elevation model, elevation dominates the control of peak SWE values. Inclusion of the range of SSS effects in a regional climate model will improve snowpack and hydrological simulations of western North America's snow-dominated, mountainous watersheds.
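A back-of-envelope version of the elevation-band temperature adjustment follows; the 6.0°C/km moist adiabatic lapse rate and the station numbers are assumed illustrative values, not necessarily those used in the study:

```python
# Hedged sketch: shift a station temperature across ten 100 m elevation bands
# using a constant moist adiabatic lapse rate, as in the SSS strategy above.
station_temp_c = 5.0            # assumed observed temperature
station_elev_m = 900.0          # assumed station elevation
lapse_c_per_m = 6.0 / 1000.0    # assumed moist adiabatic lapse rate

for band_elev_m in range(600, 1600, 100):   # ten bands at 100 m intervals
    t_band = station_temp_c - lapse_c_per_m * (band_elev_m - station_elev_m)
    print(f"{band_elev_m:5d} m: {t_band:5.1f} C")
```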
Introduction: We examined persons transported to hospital after police use of force to determine whether Emergency Department (ED) assessment and/or mode of transport could be predicted. Methods: A multi-site prospective consecutive cohort study of police use of force, with data on ED assessment for individuals ≥18 yrs, was conducted over 36 months (Jan 2010-Dec 2012) in 4 cities in Canada. Police, EMS and hospital data were linked by study ID. Stepwise logistic regression examined the relationship of the police call for service and subject characteristics to subsequent ED assessment and mode of transport. Results: In 3310 use of force events, 86.7% of subjects were male; the median age was 29 yrs. ED transport occurred in 26% (n=726). The odds of ED assessment increased by 1.2 (CI 1.1, 1.3) for each force modality beyond the first. Other predictors of ED use were a police call for service under the Mental Health Act (MHA) (Odds 14.3, CI 10.6, 19.2), features of excited delirium (ExD) (Odds 2.7, CI 1.9, 3.7), police-assessed emotional distress (EDP) without an MHA call (Odds 2.1, CI 1.5, 3.0) and combined drugs, alcohol and EDP (Odds 1.7, CI 1.9, 3.7). Those with alcohol impairment alone were less likely to go to the ED from the scene: OR 0.6 (CI 0.5, 0.7). EMS transported 55% of all patients (n=401), although police transported ~100 people whom EMS attended at the scene but did not subsequently transport. Of patients brought to the ED, 70% had a retrievable chart (512/726) with a discernible primary diagnosis: 25% for physical injury, 32% for psychiatric complaints and 43% for drug and/or alcohol intoxication. For use of force events that began as MHA calls, patient transport was more often by police car than by ambulance, OR 1.8 (CI 1.2, 2.5), while those with drug intoxication or ≥3 ExD features were more often brought by ambulance: odds of police transport 0.5 (CI 0.3, 0.9) and 0.4 (CI 0.3, 0.7), respectively. Violence or aggression did not predict mode of transport in our study. Conclusion: About one quarter of police use of force events led to ED assessment; 1 in 4 patients transported had a physical injury of some description. Calls involving the Mental Health Act, or individuals with drug intoxication or excited delirium features, were most predictive of ED use following police use of force. In MHA calls with use of force, persons were nearly twice as likely to go to the ED by police car as by ambulance.
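A hedged sketch of forward stepwise selection for a logistic model of the kind named in the Methods; the candidate predictor names are invented placeholders, and real stepwise procedures vary in their entry/exit criteria:

```python
# Illustrative forward selection by AIC for predicting ED transport;
# not the study's actual procedure, variables or data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("use_of_force.csv")  # hypothetical linked dataset
candidates = ["mha_call", "exd_features", "edp", "alcohol", "force_modalities"]
selected = []

while candidates:
    # AIC of the current model (intercept-only when nothing is selected yet)
    current_aic = smf.logit(
        "ed_transport ~ " + (" + ".join(selected) if selected else "1"),
        data=df).fit(disp=0).aic
    # AIC of each one-variable extension of the current model
    aics = {v: smf.logit("ed_transport ~ " + " + ".join(selected + [v]),
                         data=df).fit(disp=0).aic for v in candidates}
    best = min(aics, key=aics.get)
    if aics[best] >= current_aic:
        break  # no candidate improves the model; stop
    selected.append(best)
    candidates.remove(best)

print("selected predictors:", selected)
```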