Introduction: The number of seniors presenting to emergency departments after a fall is increasing. Head injury concerns in this population often lead to a head CT scan. The CT rate among physicians is variable and the reasons for this are unknown. This study examined the role of patient characteristics and country of practice in the decision to order a CT. Methods: This study used a case-based survey of physicians across multiple countries. Each survey included 9 cases pertaining to an 82-year-old man who falls. Each case varied in one aspect compared to a base case (aspirin, warfarin, or rivaroxaban use, occipital hematoma, amnesia, dementia, and fall with no head trauma). For each case, participants indicated how “likely” they were to order a head CT scan, measured on a 100-point scale. A response of 80 or more was defined a priori as ‘likely to order a CT scan’. The survey was piloted among emergency residents for feedback on design and comprehension, and was published in French and English. Recruitment was through the Canadian Association of Emergency Physicians, Twitter and CanadiEM. For each case we compared the proportion of physicians who were ‘likely to scan’ relative to the base case. We also compared the proportion of participants who were ‘likely to scan’ each case in the USA, UK and Australia, relative to Canada. Results: Data were collected from 484 respondents (Canada-308, USA-64, UK-67, Australia-27, and 18 from other countries). Social media distribution limited our ability to estimate the response rate. Physicians were most likely to scan in the anticoagulation cases (90% likely to order a scan compared to 36% for the base case; p < 0.001). Other features associated with increased scans were occipital hematoma (48%), multiple falls (68%), and amnesia (68%) (all p < 0.005). Compared to Canada, US physicians were more likely to order CT scans for all cases (p < 0.05).
Compared to Canada, UK physicians were significantly less likely to order CT in every case except the patient with amnesia. Finally, Australian physicians differed from Canada only in the occipital hematoma case, where they were significantly more likely to order a CT scan. Conclusion: Anticoagulation, amnesia and a history of multiple falls appear to drive the ordering of a head CT scan in elderly patients who have fallen. We observed variations in practice between countries. Future clinical decision rules will likely have variable impact on head CT scan rates depending on baseline practice variation.
The consumption of nitrate-rich vegetables can acutely lower blood pressure and improve mediators shown to optimise vascular health. However, we do not yet understand the impact of long-term habitual dietary nitrate intake and its association with CVD. Therefore, the aim of this investigation was to examine the relationship between habitual dietary nitrate intakes and risk of CHD in women from the Nurses’ Health Study. We prospectively followed 62 535 women who were free from diabetes, CVD and cancer at baseline in 1986. Information on diet was updated every 4 years with validated FFQ. The main outcome was CHD defined by the occurrence of non-fatal myocardial infarction or fatal CHD. Cox proportional hazard regression models were used to estimate the relative risks (RR) and 95 % CI. During 26 years of follow-up, 2257 cases of CHD were identified. When comparing the highest quintile of nitrate intake with the lowest quintile, in age-adjusted analysis there was a protective association for CHD (RR=0·77, 95 % CI 0·68, 0·97; P=0·0002), which dissipated after further adjustment for smoking, physical activity, BMI and race (RR=0·91; 95 % CI 0·80, 1·04; P=0·27). The association was further attenuated once we adjusted for the Alternative Healthy Eating Index excluding vegetable and fruit consumption (RR=1·04, 95 % CI 0·91, 1·20; P=0·34). Dietary nitrate intake was not related to the risk of CHD after adjustment for other lifestyle and non-vegetable dietary factors in a large group of US women.
In Norway, the incidence of sporadic domestically acquired salmonellosis is low, and most frequently due to Salmonella Typhimurium. We investigated the risk factors for sporadic Salmonella infections in Norway to improve control and prevention measures. Surveillance data for all Salmonella infections from 2000 to 2015 were analysed for seasonality and the proportion associated with domestic reservoirs, hedgehogs and wild birds. A prospective case–control study was conducted from 2010 to 2012 by recruiting cases from the Norwegian Surveillance System for Communicable Diseases and controls from the Norwegian Population Registry (389 cases and 1500 controls). Univariable analyses using logistic regression were conducted and a multivariable model was developed using regularised/penalised logistic regression. In univariable analysis, eating snow, dirt, sand or playing in a sandbox (aOR 4.14; CI 2.15–7.97) was associated with salmonellosis. This was also the only exposure significantly associated with illness in the multivariable model. Since 2004, 34.2% (n = 354) of S. Typhimurium cases had an MLVA profile linked to a domestic reservoir. A seasonal trend with a peak in August for all Salmonella types and in February for S. Typhimurium was observed. Indirect exposure to domestic reservoirs remains a source of salmonellosis in Norway, particularly for children. Information to the public about avoiding environmental exposure should be strengthened and initiatives to combat salmonellosis in the food chain should be reinforced.
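The multivariable model above was built with penalised logistic regression. A minimal sketch of that kind of model on simulated data is shown below, assuming scikit-learn; the exposure variable, its prevalences and the penalty settings are hypothetical, not the study's actual covariates or tuning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 389 cases and 1500 controls, mirroring the study's sample sizes
y = np.concatenate([np.ones(389), np.zeros(1500)])

# hypothetical binary exposure ("ate snow/dirt/sand or played in a sandbox"),
# simulated to be more common among cases than controls
p_exposed = np.where(y == 1, 0.30, 0.10)
x = rng.binomial(1, p_exposed).reshape(-1, 1)

# L2-penalised (regularised) logistic regression; C controls penalty strength
model = LogisticRegression(penalty="l2", C=1.0).fit(x, y)

# exponentiated coefficient gives the (penalised) odds ratio for the exposure
odds_ratio = np.exp(model.coef_[0][0])
print(f"estimated OR for exposure: {odds_ratio:.2f}")
```

With the simulated prevalences, the estimated odds ratio comes out well above 1, illustrating how an exposure's association with case status is read off a fitted model.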
Although food from grazed animals is increasingly sought by consumers because of perceived animal welfare advantages, grazing systems provide the farmer and the animal with unique challenges. The system is dependent almost daily on the climate for feed supply, with the importation of large amounts of feed from off farm, and associated labour and mechanisation costs, sometimes reducing economic viability. Furthermore, the cow may have to walk long distances and be able to harvest feed efficiently in a highly competitive environment because of the need for high levels of pasture utilisation. She must also be: (1) highly fertile, with a requirement for pregnancy within ~80 days post-calving; (2) ‘easy care’, because of the need for the management of large herds with limited labour; (3) able to walk long distances; and (4) robust to changes in feed supply and quality, so that short-term nutritional insults do not unduly influence her production and reproduction cycles. These demands are very different from, and in addition to, those placed on cows in housed systems offered pre-made mixed rations. Furthermore, additional demands in environmental sustainability and animal welfare, in conjunction with the need for greater system-level biological efficiency (i.e. ‘sustainable intensification’), will add to the ‘robustness’ requirements of cows in the future. Increasingly, there is evidence that certain genotypes of cows perform better or worse in grazing systems, indicating a genotype×environment interaction. This has led to the development of tailored breeding objectives within countries for important heritable traits to maximise the profitability and sustainability of their production system. To date, these breeding objectives have focussed on the more easily measured traits and those of highest relative economic importance.
In the future, there will be greater emphasis on traits that are more difficult to measure but are important to the quality of life of the animal in each production system and to reducing the system’s environmental footprint.
Introduction: Understanding the spatial distribution of opioid abuse at the local level may facilitate community intervention strategies. The purpose of this analysis was to apply spatial analytical methods to determine clustering of opioid-related emergency medical services (EMS) responses in the City of Calgary. Methods: Using opioid-related EMS responses in the City of Calgary from January 1 through October 31, 2017, we estimated the dissemination area (DA)-specific spatial randomness effects by incorporating the spatial autocorrelation using an intrinsic Gaussian conditional autoregressive model and generalized linear mixed models (GLMM). Global spatial autocorrelation was evaluated by Moran’s I index. Both Getis-Ord Gi and the LISA function in GeoDa were used to estimate the local spatial autocorrelation. Two models were applied: 1) Poisson regression with DA-specific non-spatial random effects; 2) Poisson regression with DA-specific G-side spatial random effects. A pseudolikelihood approach was used for model comparison. Two types of cluster analysis were used to identify the spatial clustering. Results: There were 1488 opioid-related EMS responses available for analysis. Of the responses, 74% of the individuals were males. The median age was 33 years (IQR: 26-42 years), with 65% of individuals between 20 and 39 years and 27% between 40 and 64 years. In 62% of EMS responses, poisoning/overdose was the chief complaint. The global Moran’s I implied the presence of global spatial autocorrelation. Comparison of the two models suggested that the spatial model provided a better fit for the adjusted opioid-related EMS response rate. Calgary Center and East were identified as hot spots by both types of cluster analysis. Conclusion: Spatial modeling provides better predictive ability to assess potential high-risk areas and identify locations for community intervention strategies.
The clusters identified in Calgary’s Center and East may have implications for future response strategies.
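Global Moran’s I, used above to test for spatial autocorrelation, can be computed directly from area values and a spatial weight matrix. A minimal sketch on a toy four-area example (the areas, counts and weights are hypothetical, not the Calgary dissemination-area data):

```python
import numpy as np

def morans_i(values, weights):
    """Global Moran's I: I = (n / W) * sum_ij w_ij z_i z_j / sum_i z_i^2,
    where z_i are deviations from the mean and W is the sum of all weights.
    weights[i, j] > 0 when areas i and j are neighbours."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = len(x)
    z = x - x.mean()
    numerator = n * (w * np.outer(z, z)).sum()
    denominator = w.sum() * (z ** 2).sum()
    return numerator / denominator

# toy example: 4 areas along a line, with counts clustered at one end
counts = [10, 9, 1, 0]
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(round(morans_i(counts, w), 3))
```

A clearly positive value (here roughly 0.39) indicates positive spatial autocorrelation, i.e. neighbouring areas have similar counts; values near zero suggest spatial randomness.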
Introduction: Emergency Department Systems Transformation (EDST) is a bundle of Toyota Production System-based interventions implemented in two Canadian tertiary care Emergency Departments (ED) between June 2014 and July 2016. The goals were to improve patient care by increasing value and reducing waste. Longer times to physician initial assessment (PIA), ED length of stay (LOS) and times to inpatient beds are associated with increased patient morbidity and potentially mortality. Some of the 17 primary interventions included computerized physician order entry optimization, staff schedule realignment, physician scorecards and a novel initial assessment process. ED access block has limited full implementation of EDST. An interim analysis was conducted to assess the impact of interventions implemented to date on flow metrics. Methods: Daily ED visit volumes, boarding at 7 am, time to PIA and LOS for non-admitted patients were collected from April 2014 to June 2016. Volume and boarding were compared from first to last quarter using an independent samples median test. Linear regression for each variable versus time was conducted to determine unadjusted relationships. PIA and LOS for non-admitted low acuity (Canadian Triage and Acuity Scale (CTAS) 4,5) and non-admitted high acuity (CTAS 1,2,3) patients were subsequently adjusted for volume and/or boarding to control for these variables using a non-parametric correlation. Results: Overall, median ED boarding decreased at University Hospital (UH) (14.0 vs 6.0, p<0.01) and increased at Victoria Hospital (VH) (17.0 vs 21.0, p<0.01) from first to last quarter. Median ED volume increased significantly at UH from first to last quarter (129.0 vs 142.0, p<0.01) but remained essentially unchanged at VH. The 90th percentile LOS for non-admitted low acuity patients significantly decreased at UH (adjusted rs=-0.24, p<0.01) but did not significantly change at VH.
For high acuity patients, the 90th percentile LOS significantly decreased at both hospitals (UH: adjusted rs=-0.23, p<0.01; VH: adjusted rs=-0.21, p<0.01). The 90th percentile time to PIA improved slightly but significantly in both EDs (UH: adjusted rs=-0.10, p<0.01; VH: adjusted rs=-0.18, p<0.01). Conclusion: Persistent ED boarding impacted the ability to fully implement the EDST model of care. Partial EDST implementation has resulted in improvement in PIA at both LHSC EDs. At UH, where ED boarding decreased, LOS metrics improved significantly even after controlling for boarding.
The aim of this paper is to examine Canadian key informants’ perceptions of intrapersonal (within an individual) and interpersonal (among individuals) factors that influence successful primary care and public health collaboration.
Primary health care systems can be strengthened by building stronger collaborations between primary care and public health. Although there is literature exploring interpersonal factors that can influence successful inter-organizational collaborations, few studies have specifically explored primary care and public health collaboration. Furthermore, no papers were found that considered factors at the intrapersonal level. This paper aims to explore these gaps in a Canadian context.
This interpretative descriptive study involved key informants (service providers, managers, directors, and policy makers) who participated in one-hour telephone interviews to explore their perceptions of influences on successful primary care and public health collaboration. Transcripts were analyzed using NVivo 9.
A total of 74 participants [from the provinces of British Columbia (n=20), Ontario (n=19), Nova Scotia (n=21), and representatives from other provinces or national organizations (n=14)] participated. Five interpersonal factors were found that influenced public health and primary care collaborations: (1) trusting and inclusive relationships; (2) shared values, beliefs and attitudes; (3) role clarity; (4) effective communication; and (5) decision processes. Two influencing factors were found at the intrapersonal level: (1) personal qualities, skills and knowledge; and (2) personal values, beliefs, and attitudes. A few differences were found across the three core provinces involved. Several complex interactions were identified among all interpersonal and intrapersonal influencing factors; one key factor – effective communication – interacted with all of them. Results support and extend our understanding of what influences successful primary care and public health collaboration at these levels and are important considerations in building and sustaining primary care and public health collaborations.
Relations of sovereign inequality permeate international politics, and a growing body of literature grapples with the question of how states establish and sustain hierarchy amidst anarchy. I argue that existing literature on hierarchy, for all its diverse insights, misses what makes hierarchy unique in world politics. Hierarchy is not simply the presence of inequality or stratification among actors, but rather an authority relationship in which a dominant actor exercises some modicum of control over a subordinate one. This authority relationship, moreover, is dramatically different from those found in domestic hierarchies. It is shaped less by written laws or formal procedures than by subtle forms of manipulation and the development of informal practices. For this reason, hierarchy cannot simply be reduced to the dynamics of anarchy, and must be viewed as a relational phenomenon. Ties between actors create positions that permit dominant actors to appropriate and orchestrate the sharing of authority with subordinate intermediaries. This article develops this relational network approach, highlighting how concepts such as access, brokerage, and yoking can illuminate the processes by which authority is enlisted and appropriated in world politics.
This sensitivity study applies the offline Canadian Land Surface Scheme (CLASS) version 3.6 to simulate snowpack evolution in idealized topography using observations at Likely, British Columbia, Canada, over 1 July 2008 to 30 June 2009. A strategy for a subgrid-scale snow (SSS) parameterization is developed to incorporate two key features: ten elevation bands at 100 m intervals to capture air temperature lapse rates, and five slope angles on four aspects to resolve solar radiation impacts on the evolution of snow depth and snow water equivalent (SWE). Simulations reveal strong elevational dependencies of snow depth and SWE when air temperatures are adjusted with elevation using a moist adiabatic lapse rate, with a 26% difference in peak SWE between the average elevation and the mean of the remaining elevation bands. Differences in peak SWE on north- and south-facing slopes increase from 3.0 mm at 10° slope to 17.9 mm at 50° slope. When applied to elevation, slope and aspect combinations derived from a high-resolution digital elevation model, elevation dominates the control of peak SWE values. Inclusion of the range of SSS effects into a regional climate model will improve snowpack and hydrological simulations of western North America's snow-dominated, mountainous watersheds.
Introduction: We examined persons transported to hospital after police use of force to determine whether Emergency Department (ED) assessment and/or mode of transport could be predicted. Methods: A multi-site prospective consecutive cohort study of police use of force with data on ED assessment for individuals ≥18 yrs was conducted over 36 months (Jan 2010-Dec 2012) in 4 cities in Canada. Police, EMS and hospital data were linked by study ID. Stepwise logistic regression examined the relationship between the police call for service and subject characteristics on subsequent ED assessment and mode of transport. Results: In 3310 use of force events, 86.7% of subjects were male, with a median age of 29 yrs. ED transport occurred in 26% (n=726). Odds of ED assessment increased by 1.2 (CI 1.1, 1.3) for each force modality >1. Other predictors of ED use were: a police call under the Mental Health Act (MHA) (OR 14.3, CI 10.6, 19.2), features of excited delirium (ExD) (OR 2.7, CI 1.9, 3.7), police-assessed emotional distress (EDP) without an MHA call (OR 2.1, CI 1.5, 3.0) and combined drugs, alcohol and EDP (OR 1.7, CI 1.9, 3.7). Those with alcohol impairment alone were less likely to go to ED from the scene: OR 0.6 (CI 0.5, 0.7). EMS transported 55% of all patients (n=401), although police transported ~100 people whom EMS attended at the scene but did not subsequently transport. For patients brought to the ED, 70% had a retrievable chart (512/726) with a discernible primary diagnosis: 25% for physical injury, 32% for psychiatric and 43% for drug and/or alcohol intoxication. For use of force events that began as MHA calls, patient transport was more often by police car than ambulance (OR 1.8, CI 1.2, 2.5), while those with drug intoxication or ≥3 ExD features were more often brought by ambulance: odds of police transport 0.5 (CI 0.3, 0.9) and 0.4 (CI 0.3, 0.7), respectively. Violence or aggression did not predict mode of transport in our study.
Conclusion: About one quarter of police use of force events lead to ED assessment; 1 in 4 patients transported had a physical injury of some description. Calls under the Mental Health Act, or involving individuals with drug intoxication or excited delirium features, are most predictive of ED use following police use of force. In MHA calls with use of force, persons are nearly twice as likely to go to ED by police car as by ambulance.
Water is the foundation of all ecosystems, whether terrestrial or aquatic. In terrestrial ecosystems freshwater not only provides critical water supply for transpiration during plant photosynthesis and drinking water for animals, but also transports, redistributes and stores energy, nutrients and contaminants. In aquatic and snow ecosystems, water is the medium in which the ecosystem functions and so its state mediates all transactions in these systems. Ecosystems are not passive responders to water but through their structure and function can manage water and associated microclimate – forests, grasslands, organic terrain wetlands, and beaver ponds being just a few examples.
This chapter will examine the surface water budget in terms of the water continuity equation as a manifestation of the hydrological cycle. To solve the continuity equation for water, the chapter will review hydrological processes and how they interact with vegetation, animals, soils, geomorphology and climate in the context of the catchment. The coupling of the mass and energy continuity equations in controlling hydrological processes will be discussed. How hydrological processes and their ecosystem interactions are managed by humans will be introduced. Then the chapter will review calculation schemes for the surface water budget via one-dimensional land surface schemes and catchment-based hydrological models, noting the data requirements, uncertainty and limitations of these models and the balance required between model complexity and physical representation of hydrology. This will give the conceptual ideas and basic mathematics of conservation laws and transport processes that form the basis of many models in the forthcoming chapters.
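In its simplest catchment form, the water continuity equation referred to above can be sketched as follows; this is a minimal statement, and the chapter develops fuller forms with additional storage and flux terms:

```latex
\frac{\mathrm{d}S}{\mathrm{d}t} = P - E - Q
```

where \(S\) is water storage in the catchment, \(P\) is precipitation, \(E\) is evapotranspiration and \(Q\) is discharge (runoff), all expressed per unit catchment area. Solving this balance for each term, and coupling it with the energy continuity equation, is the core task of the land surface schemes and hydrological models reviewed in the chapter.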
Hydrological Processes as a Fundamental Component of Aquatic and Terrestrial Ecosystems
The hydrological cycle is the flow and storage of water, as liquid, solid or vapor, on and near the Earth's surface. This cycling is a fundamental function of the Earth system and, through its associated latent energy transformations and other influences on land surface characteristics, ensures the habitability of the planet. A representation of the global hydrological cycle is found in Figure 4.1, where it can be seen that there are substantial flows between ocean and land: evaporation from land and river discharge transfer water to the oceans, while evaporated ocean water forms precipitation over land.
The aims of the study were to describe the patterning and persistence of anxiety and depressive symptoms from adolescence to young adulthood and to examine long-term developmental relationships with earlier patterns of internalizing behaviours in childhood.
We used parallel processes latent growth curve modelling to build trajectories of internalizing from adolescence to adulthood, using seven waves of follow-ups (ages 11–27 years) from 1406 participants of the Australian Temperament Project. We then used latent factors to capture the stability of maternal reported child internalizing symptoms across three waves of early childhood follow-ups (ages 5, 7 and 9 years), and examined relationships among these patterns of symptoms across the three developmental periods, adjusting for gender and socio-economic status.
We observed strong continuity in depressive symptoms from adolescence to young adulthood. In contrast, adolescent anxiety was not persistent across the same period, nor was it related to later depressive symptoms. Anxiety was, however, related to non-specific stress in young adulthood, but only moderately so. Although childhood internalizing was related to adolescent and adult profiles, the associations were weak and indirect by adulthood, suggesting that other factors are important in the development of internalizing symptoms.
Once established, adolescent depressive symptoms are not only strongly persistent, but also have the potential to differentiate into anxiety in young adulthood. Relationships with childhood internalizing symptoms are weak, suggesting that early adolescence may be an important period for targeted intervention, but also that further research into the childhood origins of internalizing behaviours is needed.
Introduction: This study provides an estimate of the number of EMS calls related to police use of force events that involve struggling, intoxicated and/or emotionally distressed patients. We hypothesized there would be under-reporting of EMS risk by paramedic agencies due to the lack of standardized reporting of police events by EMS services and the lack of a common linked case number between prehospital agencies in Canada. Methods: Data were collected during a multi-site, prospective, consecutive cohort study of police use of force in 4 Canadian cities using standardized data forms. Use of force was defined a priori and the application of handcuffs was not considered a force modality. Inclusion criteria: all subjects ≥18 years of age involved in a use of force police-public encounter. We defined risk to EMS as the presence of police- and/or paramedic-assessments of violent or struggling subjects on the scene. Three separate data forms (police report of use of force, EMS encounter, and Emergency Department (ED) visit) were linked in the study by unique ID. When police reported that EMS was activated, investigators hand-searched the EMS service reports at the relevant agencies for matching call sheets. Results: From Jan 2010 to Dec 2012, we studied 3310 consecutive public-police interactions involving use of force above simple joint lock application. Subjects were male (86%) with a mean age of 33 yrs; 85% were assessed by police as emotionally disturbed, intoxicated with drugs and/or alcohol, or a combination of those. 45% were violent at the scene. Police reported EMS attendance in 24% (809/3310) of use of force events, of which only 43% (349/809) of EMS run sheets were available. In events with violent subjects, EMS transported 51% to ED compared to 35% by police transport (χ2=79.7, p<0.005). Conclusion: We identified periods of professional and physical risk to paramedics attending police use of force events and found that this risk is significantly underrepresented in EMS data.
Paramedical training would benefit from policy and procedures for response to police calls and the violent patient, the majority of whom are struggling. A common linked case number in prehospital care would enable more specific quantification of the risk for EMS providers involved in police events.
Residual feed intake (RFI) is the difference between the actual and predicted dry matter intake (DMI) of individual animals. Recent studies with Holstein-Friesian calves have identified an ~20% difference in RFI during growth (calf RFI), and these groups remained divergent in RFI during lactation. The objective of the experiment described here was to determine if cows selected for divergent RFI as calves differed in milk production, reproduction or in the profiles of BW and body condition score (BCS) change during lactation, when grazing pasture. The cows used in the experiment (n=126) had an RFI of −0.88 and +0.75 kg DM intake/day for growth as calves (efficient and inefficient calf RFI groups, respectively) and were intensively grazed at four stocking rates (SR) of 2.2, 2.6, 3.1 and 3.6 cows/ha on self-contained farmlets over 3 years. Each SR treatment had an equal number of cows identified as low and high calf RFI, with 24, 28, 34 and 40 cows per 11-ha farmlet, respectively. The cows divergent for calf RFI were randomly allocated to each SR. Although SR affected production, calf RFI group (low or high) did not affect milk production, reproduction, BW, BCS or changes in these parameters throughout lactation. The most efficient animals (low calf RFI) lost similar BW and BCS to the least efficient (high calf RFI) immediately post-calving, and regained similar BW and BCS before their next calving. These results indicate that selection for RFI as calves to increase efficiency of feed utilisation did not negatively affect farm productivity variables (milk production, BCS, BW and reproduction) as adults when managed under an intensive pastoral grazing system.
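The RFI definition above (actual minus predicted dry matter intake) can be illustrated with a small worked example; the intake values here are hypothetical, chosen only to produce divergences similar in magnitude to the −0.88 and +0.75 kg DM/day groups in the study:

```python
def residual_feed_intake(actual_dmi, predicted_dmi):
    """RFI = actual minus predicted dry matter intake (kg DM/day).
    Negative values indicate a more efficient animal: it eats less
    than predicted for its level of growth and maintenance."""
    return actual_dmi - predicted_dmi

# hypothetical calves with the same predicted intake but different actual intake
efficient = residual_feed_intake(actual_dmi=6.1, predicted_dmi=7.0)    # ~ -0.9
inefficient = residual_feed_intake(actual_dmi=7.8, predicted_dmi=7.0)  # ~ +0.8
print(efficient, inefficient)
```

Ranking animals by RFI in this way separates feed efficiency from body size: two calves with identical predicted intakes can differ in efficiency only through what they actually consume.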
Expert judgement has been used since the actuarial profession was founded. In the past, there has often been a lack of transparency regarding the use of expert judgement, even though those judgements could have a very significant impact on the outputs of calculations and the decisions made by organisations. The lack of transparency has a number of dimensions, including the nature of the underlying judgements, as well as the process used to derive those judgements. This paper aims to provide a practical framework regarding expert judgement processes, and how those processes may be validated. It includes a worked example illustrating how the process could be used for setting a particular assumption. It concludes with some suggested tools for use within expert judgement. Although primarily focussed on the insurance sector, the proposed process framework could be applied more widely without the need for significant changes.
The statistics on the angular structures of quasars have been more than doubled. Quasars are discussed from both morphological and statistical viewpoints and the angular diameter-redshift relation has been confirmed.