The surrealist imagination is an imagination at war. Born out of the horrors of the European trenches and catapulted into the nightmares of fascism, the Spanish Civil War, World War II, and the Holocaust, surrealism has always responded to the historical violence that has shaped and energized it. At the same time, however, surrealist responses to war are all too aware of their struggle to articulate their political nature. How can surrealism write war? What is the political import of surrealism’s indirect aesthetics? How might surrealist writing advance our understanding of the complexities of wartime subjectivity? This chapter explores these questions by turning its attention to two dark allegorical novels: Ruthven Todd’s Over the Mountain (1939) and Rex Warner’s The Aerodrome (1941). To date, discussions of British surrealist writing have confined themselves to the aesthetic and political contexts of interwar and wartime poetry. But there is a need to complicate this literary history if we are to better understand the diversity of British surrealist writing before, during, and after the Second World War. Whilst the novel was very much a marginal practice in 1930s and 1940s surrealist circles, it nevertheless emerged in the wartime period as a dark form of literary political enquiry; one that, carrying forward the counter-Enlightenment impulses of the Gothic, poses disquieting questions about wartime human appetites for violence, corruption, and absolute power.
Many captive giraffes perform oral stereotypies, in particular tongue-playing, licking of objects (including conspecifics) and vacuum chewing. Typically, the diet of these large ruminants in captivity consists mostly of food concentrates, which are consumed rapidly and do not provide stimulation for their long, prehensile tongues. In the wild, browsing requires extensive use of this organ but in captivity material upon which to browse is limited. Consequently, vacuum activities, such as mock leaf-feeding behaviour, and stereotypies may develop. Rumination is also a major component of a giraffe's behavioural repertoire. It is essential for proper digestion, but may also be connected with non-REM sleep. Inadequate opportunities for rumination may also contribute to the development of oral stereotypies. In this study of captive giraffes, we examined the effect of increasing dietary fibre on the time spent ruminating and feeding and the extent to which oral stereotypies were performed. Two giraffes of different age, sex and sub-species were studied at Paignton Zoo Environmental Park. Dietary fibre was increased by the addition of coarse meadow hay to their existing diet. Following the addition of hay, time spent feeding did not change significantly but there was a significant increase in the time spent ruminating and a significant reduction in time spent performing oral stereotypies by both giraffes, suggesting that oral stereotypies may be connected with rumination rather than feeding. Stereotypic behaviour is generally accepted to be an indicator of sub-optimal welfare. Thus, the reduction in this behaviour by the simple addition of coarse fibre to the diet can be interpreted as enhancing the welfare of these animals.
Artificial rearing involves removing piglets from their mother at seven days of age and feeding them milk replacer until weaning. Early-life rearing conditions can influence piglets’ mental development, as reflected by their emotional state and reactivity. This study compared the post-weaning emotional state and reactivity of pigs which were either sow-reared or artificially reared pre-weaning. Behavioural tests (startle test, novel object test, human-animal relationship test and open door test) were conducted one week post-weaning (weaner 1, 34 [± 0.6] days old) and one week after movement to the weaner 2 (69 [± 1.2] days old) and finisher (100 [± 1.3] days old) stages. Qualitative Behavioural Assessments (QBA) were conducted on the same days in the weaner 2 and finisher stages. QBA descriptors were analysed by principal component analysis (PCA) and all other data were analysed using linear models. Artificially reared pigs were less fearful of human contact in the weaner 1 (45.1 [± 8.43] vs 81.3 [± 7.89]%) and finisher (25.8 [± 5.19] vs 45.7 [± 6.00]%) stages, but there was no difference in the other tests. Artificially reared pigs had a higher (more positive) QBA score than sow-reared pigs in the weaner 2 stage (54.49 [± 10.102] vs 17.88 [± 9.94]) but not in the finisher stage (70.71 [± 8.860] vs 52.76 [± 9.735]). In conclusion, artificially reared pigs appeared to have a transiently more positive emotional state post-weaning and a lower fearfulness towards humans, which are likely mediated by their pre-weaning conditions. These data emphasise the need to consider the entire life of the animals to fully evaluate the long-term impacts of a rearing system.
Pregnant (dry) sows (Sus scrofa) are fed a rationed amount of feed to maintain healthy weight and production, but this does not satisfy their hunger. This study measured the extent of feed restriction compared to sows’ desired intake. Forty-seven Large White × Landrace sows were housed in small groups with straw bedding and individual feeding stalls. Following three days on a standard ration of 2.5 kg, they were offered 10 kg a day of commercial dry sow feed for three days, split into four 2.5-kg meals a day, which enabled individual intakes to be measured. This quantity was effectively ad libitum (maximum daily intake 9.4 kg). Mean (± SEM) intake per day over the three ad libitum days was 5.67 (± 0.24) kg, compared to the 2.5-kg standard ration. The ration thus provides less than half (44.1%) of sows’ desired intake. Behaviour on their third rationed day was compared with behaviour on the third day of ad libitum feeding. Eating rate and the display of hunger-related behaviours, particularly following the morning feed, were greater under ration feeding; sows spent more time in the food stall and less in the straw bed, and more time active rather than resting. During ration feeding, sows also chewed and nosed more at straw bedding and pen equipment and used the drinker more after their morning meal than when they were fed ad libitum. Eating rate on the last rationed day was positively correlated with feed intake on each of the ad libitum days. Despite an EU requirement for fibre to be added to diets to ameliorate this problem, and the provision of straw bedding, hunger resulting from food restriction remains a welfare concern for dry sows.
New livestock housing systems designed to improve animal welfare will only see large-scale commercial adoption if they improve profitability, or are at least cost neutral to the farm business. Economic evaluation of new system developments is therefore essential to determine their effect on cost of production and hence the extent of any market premium necessary to stimulate adoption. This paper describes such an evaluation in relation to high welfare farrowing systems for sows, where any potential system needs to reconcile the behavioural needs of the sow with piglet survivability, acceptable capital and running costs, farm practicality and ease of management. In the Defra-sponsored PigSAFE project, a new farrowing system has been developed which comprises a loose, straw-bedded pen with embedded design features which promote piglet survival. Data on this and four other farrowing systems (new systems: 360° Farrower and a Danish pen; existing systems: crate and outdoor paddock) were used to populate a model of production cost taking account of both capital and running costs (feed, labour, bedding, etc). Assuming equitable pig performance across all indoor farrowing systems, the model estimated per-sow production costs 1.6, 1.7 and 3.5% higher than the crate for the 360° Farrower, Danish and PigSAFE systems, respectively. The outdoor production system had the lowest production cost. An online survey of pig producers confirmed that, whilst some producers would consider installing a non-crate system, the majority remain cautious about alternatives to the farrowing crate. If pig performance in alternative indoor systems could be improved from the crate baseline (eg through reduced piglet mortality, improved weaning weight or sow re-breeding), the differential cost of production could be reduced. Indeed, with further innovation by pig producers, management of alternative farrowing systems may evolve to a point where both welfare and pig production improve. However, larger data sets from alternative systems on commercial farms will be needed to explore fully the welfare/production interface before such a relationship can be confirmed for those pig producers who will be replacing their units in the next ten years.
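To make the cost comparison concrete, here is a minimal sketch of how a per-sow production cost might be assembled from annualised capital plus running costs and compared against a crate baseline. All figures, lifetimes and interest rates below are hypothetical placeholders, not the PigSAFE project's data, and the real model itemises feed, labour and bedding in far more detail.

```python
# Sketch of a per-sow annual production-cost comparison between farrowing
# systems. All numbers below are hypothetical placeholders, not PigSAFE data.

def annualised_capital(capital: float, lifetime_years: float, rate: float) -> float:
    """Spread a capital outlay over its lifetime with the standard annuity formula."""
    return capital * rate / (1 - (1 + rate) ** -lifetime_years)

def cost_per_sow_year(capital, lifetime_years, rate, running):
    # 'running' bundles feed, labour, bedding, etc into one annual figure.
    return annualised_capital(capital, lifetime_years, rate) + running

systems = {  # hypothetical inputs per farrowing place
    "crate":   dict(capital=3000, lifetime_years=15, rate=0.05, running=620),
    "PigSAFE": dict(capital=4200, lifetime_years=15, rate=0.05, running=650),
}

baseline = cost_per_sow_year(**systems["crate"])
for name, p in systems.items():
    cost = cost_per_sow_year(**p)
    print(f"{name}: {cost:.0f}/sow-year ({100 * (cost / baseline - 1):+.1f}% vs crate)")
```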
Increasing litter size has long been a goal of pig breeders and producers, and may have implications for pig (Sus scrofa domesticus) welfare. This paper reviews the scientific evidence on biological factors affecting sow and piglet welfare in relation to large litter size. It is concluded that, in a number of ways, large litter size is a risk factor for decreased animal welfare in pig production. Increased litter size is associated with increased piglet mortality, which is likely to be associated with significant negative animal welfare impacts. In surviving piglets, many of the causes of mortality can also occur in non-lethal forms that cause suffering. Intense teat competition may increase the likelihood that some piglets do not gain adequate access to milk, causing starvation in the short term and possibly long-term detriments to health. Also, increased litter size leads to more piglets with low birth weight, which is associated with a variety of negative long-term effects. Finally, increased production pressure placed on sows bearing large litters may produce health and welfare concerns for the sow. However, biological approaches to mitigating the health and welfare issues associated with large litters are already being implemented. An important mitigation strategy is genetic selection encompassing traits that promote piglet survival, vitality and growth. Sow nutrition and the minimisation of stress during gestation could also contribute to improving outcomes in terms of piglet welfare. Awareness of the possible negative welfare consequences of large litter size in pigs should lead to further active measures being taken to mitigate these effects.
Increasing litter size has long been a goal of pig (Sus scrofa domesticus) breeders and producers in many countries. Whilst this has economic and environmental benefits for the pig industry, there are also implications for pig welfare. Certain management interventions are used when litter size routinely exceeds the ability of individual sows to successfully rear all the piglets (ie viable piglets outnumber functional teats). Such interventions include: tooth reduction; split suckling; cross-fostering; use of nurse sow systems and early weaning, including split weaning; and use of artificial rearing systems. These practices raise welfare questions for both the piglets and sow and are described and discussed in this review. In addition, possible management approaches which might mitigate health and welfare issues associated with large litters are identified. These include early intervention to provide increased care for vulnerable neonates and improvements to farrowing accommodation to mitigate negative effects, particularly for nurse sows. An important concept is that management at all stages of the reproductive cycle, not simply in the farrowing accommodation, can impact on piglet outcomes. For example, poor stockhandling at earlier stages of the reproductive cycle can create fearful animals with increased likelihood of showing poor maternal behaviour. Benefits of good sow and litter management, including positive human-animal relationships, are discussed. Such practices apply to all production situations, not just those involving large litters. However, given that interventions for large litters involve increased handling of piglets and increased interaction with sows, there are likely to be even greater benefits for management of hyper-prolific herds.
In many countries, including the UK, the majority of domestic sows are housed in farrowing crates during the farrowing and lactation periods. Such systems raise welfare problems due to the close confinement of the sow. Although many alternative housing systems have been developed, no commercially viable option has emerged for large-scale units. Current scientific and practical knowledge of farrowing systems was reviewed in this study to identify alternative systems and their welfare and production potential. The aim was to establish acceptable trade-offs between profit and welfare within alternative farrowing systems. Linear programming (LP) was used to examine possible trade-offs and to support the design of welfare-friendly yet commercially viable alternatives. The objective of the LP was to optimise the economic performance of conventional crates, simple pens and designed pens subject to both managerial and animal welfare constraints. Quantitative values for constraints were derived from the literature. The potential effects of each welfare component on productivity were assessed by a group of animal welfare scientists and used in the model. The modelled welfare components (inputs) were extra space, substrate and temperature. Results showed that, when piglet survival rates in the LP were based on data drawn from the literature and the costs of extra inputs were incorporated in the model, the crates obtained the highest annual net margin, with the designed pens and the simple pens in second and third place, respectively. The designed pens and the simple pens improved their annual net margins once alternative reference points, following expert-derived production functions, were used to adjust piglet survival rates in response to extra space, extra substrate and modified pen heating. The non-crate systems then provided higher welfare for sows and piglets, and a higher net margin, than crates, implying the possibility of a win-win situation.
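As an illustration of the kind of linear programme described, the sketch below maximises a linearised net margin over the three welfare inputs (extra space, substrate, heating) subject to an assumed budget constraint and assumed welfare minima, using scipy.optimize.linprog. Every coefficient is an invented placeholder; the study's actual objective, constraints and expert-derived production functions are not reproduced here.

```python
# Hypothetical sketch of the LP idea: each welfare input is assumed to raise
# piglet survival (hence revenue) linearly while adding cost. Coefficients
# are illustrative placeholders, not values from the study.
import numpy as np
from scipy.optimize import linprog

# Net-margin contribution per unit of each input:
# (marginal revenue from improved survival) - (marginal cost of the input).
marginal_revenue = np.array([40.0, 15.0, 8.0])   # GBP per unit per sow-year
marginal_cost    = np.array([25.0, 12.0, 10.0])  # GBP per unit per sow-year
c = -(marginal_revenue - marginal_cost)          # linprog minimises, so negate

# Managerial constraint (assumed): a shared budget for the extra inputs.
A_ub = np.array([[25.0, 12.0, 10.0]])  # cost of each input
b_ub = np.array([60.0])                # GBP available per sow-year

# Welfare constraints (assumed): minimum levels of each input, plus caps.
bounds = [(1.0, 3.0),   # extra space, m^2
          (0.5, 2.0),   # substrate, kg/day
          (0.0, 1.0)]   # extra heating, kW

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("optimal inputs:", res.x, "net-margin gain:", -res.fun)
```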
Observational studies suggest that 25-hydroxy vitamin D (25(OH)D) concentration is inversely associated with pain. However, findings from intervention trials are inconsistent. We assessed the effect of vitamin D supplementation on pain using data from a large, double-blind, population-based, placebo-controlled trial (the D-Health Trial). 21 315 participants (aged 60–84 years) were randomly assigned to a monthly dose of 60 000 IU vitamin D3 or matching placebo. Pain was measured using the six-item Pain Impact Questionnaire (PIQ-6), administered 1, 2 and 5 years after enrolment. We used regression models (linear for continuous PIQ-6 score and log-binomial for binary categorisations of the score, namely ‘some or more pain impact’ and ‘presence of any bodily pain’) to estimate the effect of vitamin D on pain. We included 20 423 participants who completed ≥1 PIQ-6. In blood samples collected from 3943 randomly selected participants (∼800 per year), the mean (sd) 25(OH)D concentrations were 77 (sd 25) and 115 (sd 30) nmol/l in the placebo and vitamin D groups, respectively. Most (76 %) participants were predicted to have 25(OH)D concentration >50 nmol/l at baseline. The mean PIQ-6 was similar in all surveys (∼50·4). The adjusted mean difference in PIQ-6 score (vitamin D cf placebo) was 0·02 (95 % CI (−0·20, 0·25)). The proportion of participants with some or more pain impact and with the presence of bodily pain was also similar between groups (both prevalence ratios 1·01, 95 % CI (0·99, 1·03)). In conclusion, supplementation with 60 000 IU of vitamin D3/month had negligible effect on bodily pain.
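For readers unfamiliar with the two model types named above, the following sketch fits a linear model to a continuous score and a log-binomial model (binomial family with a log link, so exponentiated coefficients are prevalence ratios) to a binary outcome, using statsmodels on simulated data. Variable names, sample size and effect sizes are hypothetical and bear no relation to the trial's actual analysis code.

```python
# Minimal sketch of the two model types named above, on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
treated = rng.integers(0, 2, n)            # 1 = vitamin D, 0 = placebo
X = sm.add_constant(treated.astype(float))

# Linear model for the continuous PIQ-6 score (no true effect simulated).
piq6 = 50.4 + 0.0 * treated + rng.normal(0, 8, n)
linear = sm.OLS(piq6, X).fit()

# Log-binomial model for a binary outcome ('presence of any bodily pain'):
# binomial family, log link, so exp(coefficient) is a prevalence ratio.
pain = rng.binomial(1, 0.6, n)
logbin = sm.GLM(pain, X,
                family=sm.families.Binomial(link=sm.families.links.Log())).fit()

print("mean difference (treated - placebo):", linear.params[1])
print("prevalence ratio:", np.exp(logbin.params[1]))
```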
A neural network framework is used to design a new Ni-based superalloy that surpasses the performance of IN718 for laser-blown-powder directed-energy-deposition repair applications. The framework utilizes a large database comprising physical and thermodynamic properties for different alloy compositions to learn both composition-to-property and property-to-property relationships. The alloy composition space was based on IN718, although W was additionally included and the limits on Al and Co content were allowed to increase relative to standard IN718, thereby allowing the alloy to approach the composition of ATI 718Plus® (718Plus). The composition with the highest probability of satisfying target properties, including phase stability, solidification strain, and tensile strength, was identified. The alloy was fabricated, and its properties were experimentally investigated. The testing confirms that this alloy offers advantages for additive repair applications over standard IN718.
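A stripped-down sketch of the screening idea, not the paper's framework: fit a neural-network surrogate from composition to a property, estimate its residual scatter, and rank candidate compositions by the probability of exceeding a target. The compositions, property model, bounds and threshold below are synthetic stand-ins.

```python
# Sketch of the screening loop: train a surrogate (composition -> property),
# then rank candidates by probability of beating a target. All data synthetic.
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Synthetic training data: (wt% Al, Co, W) -> tensile strength in MPa.
lo, hi = [0.2, 0.0, 0.0], [1.5, 9.0, 2.0]
X_train = rng.uniform(lo, hi, size=(500, 3))
y_train = (900 + 120 * X_train[:, 0] + 15 * X_train[:, 1]
           + 40 * X_train[:, 2] + rng.normal(0, 20, 500))

surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)
sigma = np.std(y_train - surrogate.predict(X_train))  # residual scatter

# Score random candidates by P(strength > 1050 MPa), assuming Gaussian error.
candidates = rng.uniform(lo, hi, size=(10_000, 3))
p_target = norm.sf(1050.0, loc=surrogate.predict(candidates), scale=sigma)

best = np.argmax(p_target)
print("best (Al, Co, W wt%):", candidates[best].round(2),
      "P(meets target):", round(float(p_target[best]), 3))
```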
Smutgrass is an invasive weed species that can quickly outcompete bahiagrass because of its aggressive growth, prolific seed production, and rhizomatous nature. Total renovation of bahiagrass pastures or hayfields is generally not a feasible or economically viable option for most producers. Therefore, controlling the continual spread of smutgrass will require an integrated weed management (IWM) plan that incorporates multiple strategies. The objective of this study was to test the interactions of herbicides and fertilizers on smutgrass control in bahiagrass and to determine the most efficacious and economical IWM plan for low-input bahiagrass systems. This research was conducted on a mixture of ‘Tifton 9’ and ‘Pensacola’ bahiagrass at the Alapaha Beef Station in Alapaha, GA. The study used a randomized complete block design with a three-by-four factorial treatment arrangement and six replications. Fertility treatments included 56 kg N ha–1 (ammonium nitrate, 34% N) + 56 kg K2O ha–1, 56 kg N ha–1, and an unfertilized control. Smutgrass was reduced to <15% ground coverage when a postemergent herbicide was applied. The addition of a preemergent herbicide and/or fertilizer further reduced smutgrass coverage (P < 0.01). As smutgrass declined, bahiagrass ground coverage increased; other vegetation and dead material did not differ by treatment. Generally, herbage accumulation and crude protein were affected only following the second N application (P < 0.01). Treatments that included preemergent (indaziflam) and postemergent (hexazinone) herbicides in addition to N and K2O resulted in an improved bahiagrass stand: timely weed suppression removed competition, while fertilizer provided essential nutrients for optimum growth to fill in the gaps. Combining herbicide and fertilizer is a more economical option for producers than a complete bahiagrass renovation.
Positive deviance is an asset-based improvement approach. At its core is the belief that solutions to problems already exist within communities, and that identifying, understanding, and sharing these solutions enables improvements at scale. Originating in the field of international public health in the 1960s, positive deviance is now, with some adaptations, seeing growing application in healthcare. We present examples of how positive deviance has been used to support healthcare improvement. We draw on an emerging view of safety, known as Safety II, to explain why positive deviance has drawn the interest of researchers and improvers alike. In doing so, we identify a set of fundamental values associated with the positive deviance approach and consider how far they align with current use. Throughout, we consider the untapped potential of the approach, reflect on its limitations, and offer insights into the possible challenges of using it in practice.
Social care to assist with the activities of daily living is a necessity for many older people; while informal care provided by family members can be a first step to meeting care needs, formal care provided by professionals is often needed or preferred by older people and their families. In England, the number of older people paying for formal care is set to rise, driven by an ageing population and the limited resources of local authorities. Little is known about how older people and their families experience the potentially disruptive processes of deciding upon, searching for and implementing such care, including the financial implications. This paper explores accounts of seeking self-funded social care in England, told by older people and their families in 39 qualitative interviews. These accounts, which we call ‘care chronicles’, include stories about the emergence of care needs and informal care-giving, the search for formal care, including interacting with new systems and agencies, and getting formal (paid) care, either as the recipient or an involved family member. Stories are analysed through the lens of biographical disruption, and analysis demonstrates that such disruptions can occur for older people and their families across the entirety of the care chronicle. Needing, seeking and getting care all have the potential to cause practical and symbolic disruptions; moreover, these disruptions can be cumulative and cyclical, as attempts to resolve or minimise one disruption can lead to new ones. While the concept of biographical disruption is a mainstay in medical sociology, it is less frequently applied to issues relating to social care, and most often takes embodiment as a key focus. This study is novel in its application of the concept to experiences of seeking self-funded care, and in its introduction of the concept of ‘care chronicles’, which invite a longer and broader view of biographical disruptions in the lives of older people with care needs and their families.
In many systems consisting of interacting subsystems, the complex interactions between elements can be represented using multilayer networks. However, percolation, key to understanding connectivity and robustness, does not generalise trivially to multiple layers. This Element describes a generalisation of percolation to multilayer networks: weak multiplex percolation. A node belongs to a connected component if at least one of its neighbours in each layer is in this component. The authors fully describe the critical phenomena of this process. In two layers with finite second moments of the degree distributions, they observe an unusual continuous transition with quadratic growth above the threshold. When the second moments diverge, the singularity is determined by the asymptotics of the degree distributions, creating a rich set of critical behaviours. In three or more layers, they find a discontinuous hybrid transition which persists even in highly heterogeneous degree distributions, becoming continuous only when the power-law exponent reaches $1+1/(M-1)$ for $M$ layers.
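The membership rule stated above lends itself to a simple pruning algorithm: repeatedly delete any node that lacks a surviving neighbour in at least one layer; the survivors are the candidates for the weak multiplex percolating cluster. Below is a minimal sketch on a made-up two-layer multiplex (illustrative code, not taken from the Element).

```python
# Pruning sketch for weak multiplex percolation: a node survives only if it
# keeps at least one live neighbour in every layer. The toy network is made up.
def weak_multiplex_core(num_nodes, layers):
    """layers: one dict {node: set(neighbours)} per layer (undirected)."""
    alive = set(range(num_nodes))
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            # v is removed if some layer gives it no surviving neighbour
            if any(not (layer.get(v, set()) & alive) for layer in layers):
                alive.discard(v)
                changed = True
    return alive  # components are then computed within this surviving set

# Two toy layers on 5 nodes, stored as neighbour sets.
layer_a = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}, 4: set()}
layer_b = {0: {2}, 1: {3}, 2: {0}, 3: {1}, 4: {0}}
print(weak_multiplex_core(5, [layer_a, layer_b]))  # node 4 is pruned
```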
Objective:
Feeding practices used by educators in Early Childhood Education and Care (ECEC) settings can influence the diet quality of young children. However, Australian data are scarce and limited to describing barriers to responsive feeding. This study describes the use of feeding practices amongst a group of Australian educators.
Design:
Direct observation of feeding practices and assessment of centre policy were conducted using the ‘Environment and Policy Assessment and Observation’ tool. Self-reported feeding practices and demographic data were collected via online survey using the Childcare Food and Activity Practices Questionnaire.
Setting:
Ten centre-based ECEC services in South East Queensland, Australia.
Participants:
Educators working in ECEC.
Results:
A total of 120 meals were observed and 88 educators provided self-report data (n 84 female). Centre policy supported the use of responsive feeding practices, and this was reflected in the high frequency with which children could decide what and how much to eat, across both observed and self-report data, as well as in low levels of pressure to eat and use of food as a reward (observed at 19·9 % and 0 % of meals, respectively). The only apparent discrepancy concerned modelling. The median score for self-reported role modelling was 5·0 (4·3–5·0) and educators were observed to sit with children at 75 % of meals; however, enthusiastic role modelling was observed at only 22 % (0–33·3) of meals.
Conclusions:
Research addressing how educators conceptualise feeding practices, as well as the circumstances under which they are used, particularly in centres with different models of food provision, may shed light on why modelling is rarely implemented in practice.
In this chapter, working with scientific evidence, I build up a picture of the psychopathic personality which can be applied to my preferred account of moral responsibility. After sketching an introduction to the history of psychopathy as a clinical construct, I consider some disputes and controversies surrounding its diagnosis. I distinguish psychopathy from Antisocial Personality Disorder (APD), a rival construct commonly used in clinical settings. I also sketch the implications of evidence for a distinction between ‘successful’ and ‘unsuccessful’ psychopaths for the overall construct. I conclude that the Hare Psychopathy Checklist is the most robust measure available, a measure which describes psychopathy as a condition characterised primarily by emotional deficiencies. I then review neuroscientific evidence for structural and functional correlates and causes of psychopathy. I also review evidence for the treatability of the condition, concluding based on the current psychological and psychiatric evidence that psychopathy appears to be highly recalcitrant to the treatment methods that have been tried so far, and that some of these methods may even be counter-productive.