This volume brings together the full range of modalities of social influence - from crowding, leadership, and norm formation to resistance and mass mediation - to set out a challenge-and-response 'cyclone' model. The authors use real-world examples to ground this model and review each modality of social influence in depth. A 'periodic table of social influence' is constructed that characterises and compares exercises of influence in practical terms. The wider implications of social influence are considered, such as how each exercise of a single modality stimulates responses from other modalities and how any everyday process is likely to arise from a mix of influences. The book demonstrates that different modalities of social influence are tactics that defend, question, and develop 'common sense' over time, and offers advice to those studying political and social movements, social change, and management.
Mindfulness meditation has been practiced in the Eastern world for more than 25 centuries but has only recently become popular in the West. Today, therapeutic interventions such as ‘Mindfulness Based Stress Reduction’ are used within health services throughout Europe as a means of improving patient wellbeing. Whilst these interventions have proved successful in reducing stress and depression, a limitation is that they tend to apply the practices of mindfulness in an ‘out of context’ manner. Meditation Based Awareness Training (MBAT), on the other hand, includes a composite array of ‘spiritual-based’ trainings, which are traditionally assumed to enhance the cultivation of a more sustainable quality of wellbeing within the meditator.
The purpose of this program is to design, implement, and evaluate MBAT as an approach to meditation and mindfulness that can be adapted to meet the needs of various populations. In the current phase, MBAT was developed in a general format for individuals from the general population who want to increase their levels of wellbeing. A controlled comparison trial has been run to evaluate this version of MBAT: Participants of the study undertook an 8-week MBAT program and comparisons were made with a control group on perceived psychological wellbeing (depression, anxiety, and anger management) and stress. In a second phase (not included in this presentation) MBAT will be adapted to populations with special needs, e.g., elderly people, trauma victims, and forensic inmates.
Findings from the trial will be reported and implications for further development of MBAT will be discussed.
This introductory chapter of volume II of the Cambridge World History of Violence, which focuses on the thousand years between 500 and 1500, or what is also known as the Middle Millennium, examines institutions and forms of violence in the geographical area including Japan and China, Central Asia, North Africa, and Europe, with two additional chapters extending coverage into Aztec and Mayan culture. The topics of this introduction are set in four contexts in which violence occurred across this broad chronology and vast territory. They are: the formation of centralized polities through war and conquest; institution building and ideological expression by these same polities; control of extensive trade networks; and the emergence and dominance of religious ecumenes. Attention is also given to the idea of how theories of violence are relevant to the specific historical circumstances discussed in the volume’s chapters. A final section on the depiction of violence, both visual and literary, demonstrates the ubiquity of societal efforts to confront meanings of violence during this longue durée.
Biodiversity offsetting aims to achieve at least no net loss of biodiversity by fully compensating for residual development-induced biodiversity losses after the mitigation hierarchy (avoid, minimize, remediate) has been applied. Actions used to generate offsets can include securing site protection, or maintaining or enhancing the condition of targeted biodiversity at an offset site. Protection and maintenance actions aim to prevent future biodiversity loss, so such offsets are referred to as averted loss offsets. However, the benefits of such approaches can be highly uncertain and opaque, because assumptions about the change in likelihood of loss as a result of the offset action are often implicit. As a result, the gain generated by averting losses can be intentionally or inadvertently overestimated, leading to offset outcomes that are insufficient for achieving no net loss of biodiversity. We present a method and decision tree to guide consistent and credible estimation of the likelihood of biodiversity loss for a proposed offset site with and without protection, for use when calculating the amount of benefit associated with the protection component of averted loss offsets. In circumstances such as when a jurisdictional offset policy applies to most impacts, plausible estimates of averted loss can be very low. Averting further loss of biodiversity is desirable, and averted loss offsets can be a valid approach for generating tangible gains. However, overestimation of averted loss benefits poses a major risk to biodiversity.
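The gain calculation at the heart of averted loss offsets can be made explicit: the creditable benefit is the expected biodiversity loss without protection minus the expected loss with protection. The following Python sketch illustrates only this arithmetic; the function name, probabilities, and site values are all hypothetical and are not drawn from the method or decision tree described above.

```python
def averted_loss_gain(biodiversity_value, p_loss_without, p_loss_with):
    """Expected biodiversity gain from protection, in the same units as
    biodiversity_value (e.g. hectares of habitat of a given condition).

    p_loss_without: probability the site's biodiversity is lost over the
        assessment period without the offset action (the counterfactual).
    p_loss_with:    probability of loss despite protection.
    """
    expected_loss_without = biodiversity_value * p_loss_without
    expected_loss_with = biodiversity_value * p_loss_with
    return expected_loss_without - expected_loss_with


# Hypothetical example: a 100 ha site with a 20% chance of loss if
# unprotected, falling to 2% once protected.
gain = averted_loss_gain(100.0, 0.20, 0.02)   # ~18 ha of averted expected loss

# If a jurisdictional policy already prevents most loss, the counterfactual
# probability is low, so the creditable gain shrinks accordingly:
gain_regulated = averted_loss_gain(100.0, 0.03, 0.02)   # ~1 ha
```

The second call shows the point made in the abstract: when loss is already unlikely without the offset, plausible averted-loss estimates can be very low, and assuming a high counterfactual probability of loss would overstate the gain.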
Fully slatted concrete floors are labour-efficient, cost-effective and thus common in beef cattle housing. However, the welfare of cattle accommodated on them has been questioned. The objective of this study was to evaluate the effect of floor and diet on hoof health and lying behaviours of housed dairy-origin bulls, from a mean age of 8 months to slaughter at 15.5 months old. Forty-eight bulls, which had a mean initial live weight of 212 (SD = 23.7) kg, were allocated to one of four treatments, which consisted of two floors and two diets arranged in a 2 × 2 factorial design. The floors evaluated were a fully slatted concrete floor and a fully slatted concrete floor overlaid with rubber, while the diets offered were either a high concentrate diet or a grass-silage-based diet supplemented with concentrates. The mean total duration of the study was 216 days. Floor had no significant effect on claw measurements taken on day 62 or 139. However, bulls accommodated on slats overlaid with rubber tended to have greater front toe length pre-slaughter than those accommodated on concrete slats (P = 0.063). Floor had no significant effect on the net growth of toes or heels over the duration of the study. The number of bruises (P < 0.01) and the bruising score (P < 0.05) were significantly higher on day 62 in bulls accommodated on fully slatted concrete floors than in those accommodated on concrete slats overlaid with rubber, but there was no significant effect of floor on these parameters on day 139 or at the measurement taken pre-slaughter. There was a tendency for bulls accommodated on concrete slats to have a higher probability of sole bruising at the end of the experiment than those accommodated on slats overlaid with rubber (P = 0.052). Diet had no significant effect on toe length or heel height, number of bruises, or overall bruising score at any time point of the study.
There was little evidence in the current study to suggest that bulls lying on fully slatted concrete floors could not express lying postures similar to those on concrete slats overlaid with rubber.
There is evidence indicating that using the current UK energy feeding system to ration the present sheep flocks may underestimate their nutrient requirements. The objective of the present study was to address this issue by developing updated maintenance energy requirements for the current sheep flocks and evaluating if these requirements were influenced by a range of dietary and animal factors. Data (n = 131) used were collated from five experiments with sheep (5 to 18 months old and 29.0 to 69.8 kg BW) undertaken at the Agri-Food and Biosciences Institute of the UK from 2013 to 2017. The trials were designed to evaluate the effects of dietary type, genotype, physiological stage and sex on nutrient utilization and energetic efficiencies. Energy intake and output data were measured in individual calorimeter chambers. Energy balance (Eg) was calculated as the difference between gross energy intake and the sum of fecal energy, urine energy, methane energy and heat production. Data were analysed using restricted maximum likelihood (REML) analysis to develop the linear relationship between Eg or heat production and metabolizable energy (ME) intake, with the effects of a range of dietary and animal factors removed. The net energy (NEm) and ME (MEm) requirements for maintenance derived from the linear relationship between Eg and ME intake were 0.358 and 0.486 MJ/kg BW0.75, respectively, which are 40% to 53% higher than those recommended in energy feeding systems currently used to ration sheep in the USA and the UK. Further analysis of the current dataset revealed that concentrate supplement, sire type or physiological stage had no significant effect on the derived NEm values. However, female lambs had a significantly higher NEm (0.352 v. 0.306 or 0.288 MJ/kg BW0.75) or MEm (0.507 v. 0.441 or 0.415 MJ/kg BW0.75) than those for male or castrated lambs.
The present results indicate that using the present UK energy feeding systems, which were developed over 40 years ago, to ration current sheep flocks could underestimate their maintenance energy requirements. There is an urgent need to update these systems to reflect the higher metabolic rates of current sheep flocks.
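The energy balance definition and the linear derivation described above can be sketched numerically. In the Python fragment below the data points are fabricated for illustration; the study's own NEm and MEm values (0.358 and 0.486 MJ/kg BW0.75) were derived by REML from 131 real observations, not by the simple least-squares fit shown here. One conventional reading of such a regression is that the intercept at zero ME intake approximates −NEm, and the ME intake at which Eg = 0 gives MEm.

```python
import numpy as np


def energy_balance(ge, fecal, urine, methane, heat):
    """Energy balance (Eg): gross energy intake minus all measured
    energy losses (all values in MJ/day)."""
    return ge - (fecal + urine + methane + heat)


# Single illustrative observation: 25 MJ/day gross energy intake.
eg_example = energy_balance(ge=25.0, fecal=7.0, urine=1.0, methane=2.0, heat=13.0)

# Fabricated per-animal data, already scaled to MJ per kg BW^0.75.
me_intake = np.array([0.45, 0.60, 0.75, 0.90, 1.05])
eg = np.array([-0.02, 0.06, 0.14, 0.21, 0.30])

# Linear regression Eg = k * MEI + intercept (polyfit returns the
# highest-order coefficient first).
k, intercept = np.polyfit(me_intake, eg, 1)

nem = -intercept       # Eg at zero ME intake approximates -NEm
mem = -intercept / k   # ME intake at which Eg = 0
```

With these invented points the fit happens to land near the study's coefficients, but that is a property of the fabricated data, not a reproduction of the analysis.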
Lithostratigraphical studies coupled with the development of new dating methods have led to significant progress in understanding the Late Pleistocene terrestrial record in Scotland. Systematic analysis and re-evaluation of key localities have provided new insights into the complexity of the event stratigraphy in some regions and the timing of Late Pleistocene environmental changes, but few additional critical sites have been described in the past 25 years. The terrestrial stratigraphic record remains important for understanding the timing, sequence and patterns of glaciation and deglaciation during the last glacial/interglacial cycle. Former interpretations of ice-free areas at the ice-sheet periphery during the Last Glacial Maximum (LGM) are inconsistent with current stratigraphic and dating evidence. Significant challenges remain in determining the events and patterns of glaciation during the Early and Middle Devensian, particularly in the context of offshore evidence and ice sheet modelling that indicate significant build-up of ice throughout much of the period. The terrestrial evidence broadly supports recent reconstructions of a highly dynamic and climate-sensitive British–Irish Ice Sheet (BIIS), which apparently reached its greatest thickness in Scotland between 30 and 27 ka, before the global LGM. A thick (relative to topography) integrated ice sheet reaching the shelf edge with a simple ice-divide structure was replaced after the LGM by a much thinner one comprising multiple dispersion centres and a more complex flow structure.
Fully slatted concrete floors are prevalent in beef cattle housing. However, concerns have been raised about the welfare of cattle accommodated on slats. The objective of this study was to evaluate the effect of diet and floor type on the intake, performance and cleanliness of dairy-origin bulls from a mean age of 8 months to slaughter at 15.5 months old. Forty-eight bulls, which had a mean initial live weight of 212 kg (SD = 23.7), were allocated to one of four treatments which consisted of two floors and two diets, arranged in a 2×2 factorial design. The floors evaluated were a fully slatted concrete floor and a fully slatted concrete floor covered with rubber, while the diets offered were either a high concentrate diet or a grass silage-based diet supplemented with concentrates. Over the entire experimental period, floor type had no significant effect on intake. Interestingly, however, when bulls were offered concentrates ad libitum, those accommodated on rubber covered slats consumed more concentrates than those accommodated on concrete slats. No effect of floor type on intake was noted when bulls were offered the grass silage supplemented with concentrate diet. There were no significant interactions between floor and diet on animal performance. Animals accommodated on rubber covered slats had a significantly better performance than those accommodated on concrete slats, as assessed by live weight at slaughter and live weight gain/day (P < 0.01) and estimated carcass gain/day (P < 0.05). The diet offered had no significant effect on animal performance. Bulls accommodated on rubber covered slats were significantly cleaner than those accommodated on concrete slats on day 97 (P < 0.001), but there was no significant effect of floor type when measured at other time points in the experiment. It is concluded from this study that diet plays an important role in determining bulls’ performance response to covering concrete slatted floors with rubber.
Bulls offered a high concentrate diet had a higher concentrate intake, higher performance but a similar feed conversion ratio (FCR) when accommodated on rubber covered slats compared to those accommodated on fully concrete slatted floors. Animals offered this intensive diet were less efficient (as measured by a higher FCR) than those offered a supplemented grass silage-based diet.
To determine the responsiveness of primary care chaplaincy (PCC) to the current variety of presenting symptoms seen in primary care. This was done with a focus on complex and undifferentiated illness.
Current presentations to primary care are often complex, undifferentiated and display risk factors for social isolation and loneliness. These are frequently associated with loss of well-being and spiritual issues. PCC provides holistic care for such patients but its efficacy is unknown in presentations representative of such issues. There is therefore a need to assess the characteristics of those attending PCC. The effectiveness of PCC relative to the type and number of presenting symptoms should also be analysed whilst evaluating impact on GP workload.
This was a retrospective observational study based on routinely collected data. In total, 164 patients attended PCC; 75 were co-prescribed antidepressants (AD) and 89 were not (No-AD). Pre- and post-PCC well-being was assessed by the Warwick–Edinburgh mental well-being score. Presenting issue(s) data were collected on a separate questionnaire. GP appointment utilisation was measured for three months pre- and post-PCC.
Those displaying undifferentiated illness and risk factors for social isolation and loneliness accessed PCC. PCC (No-AD) was associated with a clinically meaningful and statistically significant improvement in well-being in all presenting issues. This effect was maintained in those with multiple presenting issues. PCC was associated with a reduction in GP appointment utilisation in those not co-prescribed AD.
Concentrate inclusion levels in dairy cow diets are often adjusted so that the milk yield responses remain economic. While the effect of changes in concentrate level on performance is well known, their impact on other biological parameters, including immune function, is less well understood. The objective of this study was to evaluate the effect of concentrate inclusion level in a grass silage-based mixed ration on immune function. Following calving, 63 (45 multiparous and 18 primiparous) Holstein Friesian dairy cows were allocated to one of three isonitrogenous diets for the first 70 days of lactation. Diets comprised a mixture of concentrates and grass silage, with concentrates comprising either a low (30%, LC), medium (50%, MC) or high (70%, HC) proportion of the diet on a dry matter (DM) basis. Daily DM intakes, milk yields and BW were recorded, along with weekly body condition score, milk composition and vaginal mucus scores. Blood biochemistry was measured using a chemistry analyzer, neutrophil phagocytic and oxidative burst activity assessed using commercial kits and flow cytometry, and interferon-γ production evaluated by ELISA after whole blood stimulation. Over the study period cows on HC had a higher total DM intake, milk yield, fat yield, protein yield, fat+protein yield, protein content, mean BW and mean daily energy balance, and a lower BW loss than cows on MC, whose respective values were higher than cows on LC. Cows on HC and MC had a lower serum non-esterified fatty acid concentration than cows on LC (0.37, 0.37 and 0.50 mmol/l, respectively, P=0.005, SED=0.032), while cows on HC had a lower serum β-hydroxybutyrate concentration than cows on MC and LC (0.42, 0.55 and 0.55 mmol/l, respectively, P=0.002, SED=0.03). Concentrate inclusion level had no effect on vaginal mucus scores.
At week 3 postpartum, cows on HC tended to have a higher percentage of oxidative burst positive neutrophils than cows on LC (43.2% and 35.3%, respectively, P=0.078, SED=3.11), although at all other times concentrate inclusion level in the total mixed ration had no effect on neutrophil phagocytic or oxidative burst characteristics, or on interferon-γ production by pokeweed mitogen stimulated whole blood culture. This study demonstrates that for high yielding Holstein Friesian cows managed on a grass silage-based diet, concentrate inclusion levels in early lactation affect performance but have no effect on neutrophil or lymphocyte immune parameters.
Background: Central neuropathic pain syndromes are a result of central nervous system injury, most commonly related to stroke, traumatic spinal cord injury, or multiple sclerosis. These syndromes are distinctly less common than peripheral neuropathic pain, and less is known regarding the underlying pathophysiology, appropriate pharmacotherapy, and long-term outcomes. The objective of this study was to determine the long-term clinical effectiveness of the management of central neuropathic pain relative to peripheral neuropathic pain at tertiary pain centers. Methods: Patients diagnosed with central (n=79) and peripheral (n=710) neuropathic pain were identified for analysis from a prospective observational cohort study of patients with chronic neuropathic pain recruited from seven Canadian tertiary pain centers. Data regarding patient characteristics, analgesic use, and patient-reported outcomes were collected at baseline and 12-month follow-up. The primary outcome measure was the composite of a reduction in average pain intensity and pain interference. Secondary outcome measures included assessments of function, mood, quality of life, catastrophizing, and patient satisfaction. Results: At 12-month follow-up, 13.5% (95% confidence interval [CI], 5.6-25.8) of patients with central neuropathic pain and complete data sets (n=52) achieved a ≥30% reduction in pain, whereas 38.5% (95% CI, 25.3-53.0) achieved a reduction of at least 1 point on the Pain Interference Scale. The proportion of patients with central neuropathic pain achieving both these measures, and thus the primary outcome, was 9.6% (95% CI, 3.2-21.0). Patients with peripheral neuropathic pain and complete data sets (n=463) were more likely to achieve this primary outcome at 12 months (25.3% of patients; 95% CI, 21.4-29.5) (p=0.012). 
Conclusion: Patients with central neuropathic pain syndromes managed in tertiary care centers were less likely to achieve a meaningful improvement in pain and function compared with patients with peripheral neuropathic pain at 12-month follow-up.
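The composite primary outcome defined in the Methods (a ≥30% reduction in average pain intensity together with a ≥1-point reduction on the Pain Interference Scale) can be expressed directly. The patient records in the Python sketch below are invented purely to show how the responder proportion is computed; they are not study data.

```python
def is_responder(baseline_pain, followup_pain,
                 baseline_interference, followup_interference):
    """Composite primary outcome: >=30% reduction in average pain
    intensity AND >=1-point reduction on the Pain Interference Scale."""
    pain_reduced = (baseline_pain - followup_pain) >= 0.30 * baseline_pain
    interference_reduced = (baseline_interference - followup_interference) >= 1.0
    return pain_reduced and interference_reduced


# Invented example patients: (pain at baseline, pain at 12 months,
# interference at baseline, interference at 12 months).
patients = [
    (8.0, 5.0, 6.0, 4.0),   # both criteria met -> responder
    (8.0, 6.5, 6.0, 4.0),   # pain fell by <30% -> not a responder
    (8.0, 5.0, 6.0, 5.5),   # interference fell by <1 point -> not a responder
]

responders = sum(is_responder(*p) for p in patients)
proportion = responders / len(patients)
```

Requiring both criteria is what makes the composite stricter than either measure alone, which is why the 9.6% figure for central neuropathic pain is lower than either of its two component proportions.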
This study aimed to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on lameness. In all, 80 dairy origin bulls were blocked according to live weight and breed into 20 groups, and randomly allocated within groups to one of four treatments. The floor types studied were fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period and then moved to a solid floor covered with straw bedding during the finishing period (CS-S) and fully slatted flooring during the growing period and then moved to fully slatted flooring covered with rubber strips during the finishing period (CS-RS). The total duration of the study was 204 days. The first 101 days was defined as the growing period, with the remainder of the study defined as the finishing period. During the growing period, there was a tendency for bulls accommodated on CS to have a higher locomotion score compared with those accommodated on RS (P=0.059). However, floor type had no significant effect on locomotion score during the finishing period. There was also no significant effect of floor type on digital dermatitis during either the growing or the finishing period. Floor type had no significant effect on swelling at the leg joints at the end of the finishing period. Bulls accommodated on RS had the lowest probability of bruised soles during both the growing and finishing periods (P<0.01). Growing bulls accommodated on CS had significantly greater front heel height net growth compared with those accommodated on RS (P<0.05). However, bulls accommodated on RS had a tendency to have greater front toe net growth compared with those accommodated on CS (P=0.087). Finishing bulls accommodated on CS-RS had the greatest front toe net growth (P<0.001). Heel height net growth was greatest in bulls accommodated on CS-S (P<0.001).
Floor type had no significant effect on mean maximum hoof temperature during the growing period. Finishing bulls accommodated on CS-S had a significantly lower mean maximum hoof temperature compared with those accommodated on any other floor type (P<0.001). The study concluded that rubber flooring is a suitable alternative to fully slatted flooring, reducing the prevalence of bruised soles. Despite greater toe net growth in bulls accommodated on rubber flooring, there was no effect of floor type on locomotion score, suggesting that increased toe net growth does not adversely affect walking ability. In addition, although mean maximum hoof temperature was lowest in bulls accommodated on straw bedding, there was no evidence to suggest this is indicative of improved hoof health.
The proportion of bovine tuberculosis (bTB) breakdowns attributable to a herd purchasing infected animals has not previously been quantified using data from the Animal and Public Health Information System (APHIS) database in Northern Ireland. We used a case–control study design to account for the infection process occurring in the disclosing bTB breakdown herds. Cases (N = 6926) were cattle moving to a future confirmed bTB breakdown where they would disclose as a confirmed bTB reactor or a Lesion at Routine Slaughter (LRS). Controls (N = 303 499) were cattle moving to a future confirmed bTB breakdown where they did not become a bTB reactor or LRS. Our study showed that cattle leaving herds which disclosed bTB within 450 days had increased odds of becoming a confirmed bTB reactor or LRS compared with cattle which left herds that remained free for 450 days (odds ratio (OR) = 2·09: 95% CI 1·96–2·22). Of the 12 060 confirmed bTB breakdowns included in our study (2007–2015 inclusive), 31% (95% CI 29·8–31·5) contained a confirmed bTB reactor(s) or LRS(s) at the disclosing test which entered the herd within the previous 450 days. After controlling for the infection process occurring in the disclosing bTB breakdown herd, our study showed that 6·4% (95% CI 5·9–6·8) of bTB breakdowns in Northern Ireland were directly attributable to the movement of infected animals.
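The odds ratio reported above follows the standard 2×2 case–control calculation. In the Python sketch below the cell counts are hypothetical (the abstract does not give the exposure breakdown of the 6926 cases and 303 499 controls), and the confidence interval uses the familiar log-OR normal approximation rather than whatever adjusted model the study itself fitted.

```python
import math


def odds_ratio_ci(exposed_cases, unexposed_cases,
                  exposed_controls, unexposed_controls, z=1.96):
    """Crude odds ratio from a 2x2 table, with a 95% CI computed via
    the normal approximation on the log odds ratio scale."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                   + 1 / exposed_controls + 1 / unexposed_controls)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper


# Hypothetical counts, where "exposed" means the animal left a herd that
# disclosed bTB within 450 days. These are NOT the study's cell counts.
or_, lower, upper = odds_ratio_ci(1200, 5726, 27000, 276499)
```

With these invented counts the crude OR comes out near 2.1, in the same neighbourhood as the reported estimate, but only the published figure (OR = 2·09, 95% CI 1·96–2·22) reflects the actual data.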
The aim of this study was to evaluate the effect of using different floor types to accommodate growing and finishing beef cattle on their performance, cleanliness, carcass characteristics and meat quality. In total, 80 dairy origin young bulls (mean initial live weight 224 kg (SD=28.4 kg)) were divided into 20 blocks with four animals each according to live weight. The total duration of the experimental period was 204 days. The first 101 days was defined as the growing period, with the remainder of the study defined as the finishing period. Cattle were randomly assigned within blocks to one of four floor type treatments, which included fully slatted flooring throughout the entire experimental period (CS); fully slatted flooring covered with rubber strips throughout the entire experimental period (RS); fully slatted flooring during the growing period and moved to a solid floor covered with straw bedding during the finishing period (CS-S) and fully slatted flooring during the growing period and moved to fully slatted flooring covered with rubber strips during the finishing period (CS-RS). Bulls were offered ad libitum grass silage supplemented with concentrates during the growing period. During the finishing period, bulls were offered concentrates supplemented with chopped barley straw. There was no significant effect of floor type on total dry matter intake (DMI), feed conversion ratio, daily live weight gain or back fat depth during the growing and finishing periods. Compared with bulls accommodated on CS, RS and CS-RS, bulls accommodated on CS-S had a significantly lower straw DMI (P<0.01). Although bulls accommodated on CS and CS-S were significantly dirtier compared with those accommodated on RS and CS-RS on days 50 (P<0.05) and 151 (P<0.01), there was no effect of floor type on the cleanliness of bulls at the end of the growing and finishing periods. There was also no significant effect of floor type on carcass characteristics or meat quality. 
However, bulls accommodated on CS-S had a tendency for less channel, cod and kidney fat (P=0.084) compared with those accommodated on CS, RS and CS-RS. Overall, floor type had no effect on the performance, cleanliness, carcass characteristics or meat quality of growing or finishing beef cattle.
The most important factors known to influence the eating quality of beef are well established and include both pre- and post-slaughter events, with many of the determinants interacting with each other. A substantial programme of work has been conducted by the Agri-Food and Biosciences Institute in Northern Ireland aimed at quantifying those factors of most importance to the local beef industry. Post-slaughter effects such as carcase chilling and electrical stimulation, ageing, carcase hanging and cooking method were shown to have a significant impact on eating quality when compared with pre-slaughter activities such as animal handling and lairage time in the Northern Ireland studies. However, animal breed was also influential, with the use of dairy breed animals shown to significantly improve eating quality. Many of these factors were found to interact with each other.