While rarely at the center of debates around censorship in the United States, horror narratives have been profoundly shaped by pressures to constrain their provocative and shocking nature. This chapter explores the history of censorship efforts by government agencies, media companies, and public organizations and the impact they have had on horror across all forms of media. Tracing these efforts across literature, comic books, motion pictures, radio, and television, the chapter details the various entities that have tried to constrain the horror genre and the ways horror has adapted to changing conditions. Throughout this history, regulatory efforts have consistently sought to limit shocking imagery as well as restrict the evocation of feelings of shock and horror. Examining this regulatory history gives insight into a dynamic, evolving public dialogue about the limits of social acceptance and how much transgression society will tolerate.
Barrett’s oesophagus (BE) is the precursor of oesophageal adenocarcinoma, which has become the most common type of oesophageal cancer in many Western populations. Existing evidence on diet and risk of BE comes predominantly from case–control studies, which are subject to recall bias in the measurement of diet. We aimed to investigate the potential effect of diet, including macronutrients, carotenoids, food groups, specific food items, beverages and dietary scores, on risk of BE in over 20 000 participants of the Melbourne Collaborative Cohort Study. Diet at baseline (1990–1994) was measured using a food frequency questionnaire. The outcome was BE diagnosed between baseline and follow-up (2007–2010). Logistic regression models were used to estimate OR and 95 % CI for diet in relation to risk of BE. Intakes of leafy vegetables and fruit were inversely associated with risk of BE (highest v. lowest quartile: OR = 0·59; CI: 0·38, 0·94; P-trend = 0·02 and OR = 0·58; CI: 0·37, 0·93; P-trend = 0·02, respectively), as were dietary fibre and carotenoids. Stronger associations were observed for foods than for the nutrients they contain. Positive associations were observed for discretionary food (OR = 1·54; CI: 0·97, 2·44; P-trend = 0·04) and total fat intake (OR per 10 g/d = 1·11; CI: 1·00, 1·23), although the association for fat was less robust in sensitivity analyses. No association was observed for meat, protein, dairy products or diet scores. Diet is a potentially modifiable risk factor for BE. Public health and clinical guidelines that incorporate dietary recommendations could contribute to reducing the risk of BE and, thereby, oesophageal adenocarcinoma.
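The odds ratios and confidence intervals quoted above come from logistic regression models. As a minimal illustration (not the authors' code), a fitted coefficient and its standard error convert to an OR and 95 % CI as follows; the coefficient and standard error used in the example are hypothetical values chosen to roughly match the leafy-vegetable estimate:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds) and its
    standard error into an odds ratio with a 95% confidence interval."""
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return math.exp(beta), (lower, upper)

# Hypothetical inputs: beta = ln(0.59) with se = 0.24 approximately
# reproduces an estimate of the form OR = 0.59 (CI roughly 0.37, 0.94).
or_point, (ci_lo, ci_hi) = odds_ratio_ci(math.log(0.59), 0.24)
```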
To examine associations between diet and risk of developing gastro-oesophageal reflux disease (GERD).
Prospective cohort with a median follow-up of 15·8 years. Baseline diet was measured using a FFQ. GERD was defined as self-reported current or history of daily heartburn or acid regurgitation beginning at least 2 years after baseline. Sex-specific logistic regressions were performed to estimate OR for GERD associated with diet quality scores and intakes of nutrients, food groups and individual foods and beverages. The effect of substituting saturated fat for monounsaturated or polyunsaturated fat on GERD risk was examined.
A cohort of 20 926 participants (62 % women) aged 40–59 years at recruitment between 1990 and 1994.
For men, total fat intake was associated with increased risk of GERD (OR 1·05 per 5 g/d; 95 % CI 1·01, 1·09; P = 0·016), whereas total carbohydrate (OR 0·89 per 30 g/d; 95 % CI 0·82, 0·98; P = 0·010) and starch intakes (OR 0·84 per 30 g/d; 95 % CI 0·75, 0·94; P = 0·005) were associated with reduced risk. For women, no nutrient intakes were associated with risk. For both sexes, substituting saturated fat for polyunsaturated or monounsaturated fat did not change risk. For both sexes, fish, chicken, cruciferous vegetables and carbonated beverages were associated with increased risk, whereas total fruit and citrus were associated with reduced risk. No association was observed with diet quality scores.
Diet is a possible risk factor for GERD, but foods considered triggers of GERD symptoms might not necessarily contribute to disease development. Potential differential associations for men and women warrant further investigation.
Individuals with schizophrenia are at higher risk of physical illnesses, which are a major contributor to their 20-year reduced life expectancy. It is currently unknown what causes the increased risk of physical illness in schizophrenia.
To link genetic data from a clinically ascertained sample of individuals with schizophrenia to anonymised National Health Service (NHS) records. To assess (a) rates of physical illness in those with schizophrenia, and (b) whether physical illness in schizophrenia is associated with genetic liability.
We linked genetic data from a clinically ascertained sample of individuals with schizophrenia (Cardiff Cognition in Schizophrenia participants, n = 896) to anonymised NHS records held in the Secure Anonymised Information Linkage (SAIL) databank. Physical illnesses were defined from the General Practice Database and Patient Episode Database for Wales. Genetic liability for schizophrenia was indexed by (a) rare copy number variants (CNVs), and (b) polygenic risk scores.
Individuals with schizophrenia in SAIL had increased rates of epilepsy (standardised rate ratio (SRR) = 5.34), intellectual disability (SRR = 3.11), type 2 diabetes (SRR = 2.45), congenital disorders (SRR = 1.77), ischaemic heart disease (SRR = 1.57) and smoking (SRR = 1.44) in comparison with the general SAIL population. In those with schizophrenia, carrier status for schizophrenia-associated CNVs and neurodevelopmental disorder-associated CNVs was associated with height (P = 0.015–0.017), with carriers being 7.5–7.7 cm shorter than non-carriers. We did not find evidence that the increased rates of poor physical health outcomes in schizophrenia were associated with genetic liability for the disorder.
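The standardised rate ratios above compare observed case counts in the schizophrenia sample with the counts expected from general-population rates. A minimal sketch of an indirectly standardised rate ratio, using hypothetical numbers rather than the study data:

```python
def standardised_rate_ratio(observed_cases, strata):
    """Indirect standardisation: observed cases divided by the cases
    expected if stratum-specific reference rates applied to the cohort.
    strata: iterable of (reference_rate, cohort_count) pairs, one per
    age/sex stratum."""
    expected = sum(rate * n for rate, n in strata)
    return observed_cases / expected

# Hypothetical example: 40 observed cases against two strata whose
# reference rates imply 0.010*500 + 0.005*600 = 8 expected cases.
srr = standardised_rate_ratio(40, [(0.010, 500), (0.005, 600)])
```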
This study demonstrates the value of and potential for linking genetic data from clinically ascertained research studies to anonymised health records. The increased risk for physical illness in schizophrenia is not caused by genetic liability for the disorder.
Ruminants are recognised to suffer from Cu-responsive disorders. Present understanding of Cu transport and metabolism is limited and inconsistent across veterinarians and other veterinary professionals. Much progress has been made since the studies of the 1980s and early 1990s in cellular Cu transport and liver Cu metabolism, but it has not been translated into agricultural practice. Cu metabolism operates through regulated pathways of Cu trafficking rather than through pools of labile Cu. Within the cell, Cu is chaperoned into enzyme production, retained within metallothionein or excreted via the Golgi into the blood. The hepatocyte differs in that it can synthesise Cu-containing caeruloplasmin to provide systemic Cu supply and can excrete excess Cu via bile. The aim of the present review is to improve understanding and highlight the relevant progress in relation to ruminants by translating newer findings from medicine and non-ruminant animal models to ruminants.
Children are at increased risk for experiencing negative physical and mental health outcomes as a result of disasters. Millions of children spend their days in childcare centers or in residential family childcare settings. The purpose of this study was to describe childcare providers’ perceived levels of preparedness capabilities and to assess differences in levels of perceived preparedness between different types of childcare providers.
A national convenience sample of childcare center administrators and residential family childcare administrators completed a brief online survey about their preparedness efforts.
Overall, there were few differences in preparedness between childcare centers and residential family childcare providers. However, childcare centers were more likely to report having written plans (94.47%) than residential family childcare providers were (83.73%) (χ²(1) = 15.62; P < .001). Both types of providers were more likely to report being very prepared/prepared for fires (91.31%) than for any other type of emergency (flooding, active shooter, etc.; 45.08% to 79.34%).
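The chi-squared comparison of written-plan proportions (1 df) can be sketched with the standard Pearson statistic for a 2×2 table. The counts below are hypothetical, since the abstract reports only percentages and the test statistic:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-squared statistic (1 df, no continuity correction)
    for a 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 189/200 centers vs 167/200 family providers
# reporting written plans.
stat = chi_square_2x2(189, 11, 167, 33)
```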
Future work should assess how childcare providers respond to and recover from emergencies, as well as explore the types of resources childcare providers need in order to feel comfortable caring for children during such emergency situations. (Disaster Med Public Health Preparedness. 2019;13:704–708)
Forms of non-random copying error provide sources of inherited variation yet their effects on cultural evolutionary dynamics are poorly understood. Focusing on variation in granny and reef knot forms, we present a mathematical model that specifies how these variant frequencies are affected by non-linear interactions between copying fidelity, mirroring, handedness and repetition biases. Experiments on adult humans allowed these effects to be estimated using approximate Bayesian computation and the model is iterated to explain the prevalence of granny over reef knots in the wild. Our study system also serves to show conditions under which copying fidelity drives heterogeneity in cultural variants at equilibrium, and that interaction between unbiased forms of copying error can skew cultural variation.
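Approximate Bayesian computation of the kind used to estimate the copying-bias parameters can, in its simplest rejection form, be sketched as: draw parameters from a prior, simulate the experiment, and keep draws whose simulated summary statistic lies close to the observed one. This is a generic sketch, not the authors' estimation pipeline:

```python
import random

def rejection_abc(observed, simulate, prior_sample, tol, n_draws=10_000):
    """Rejection ABC: retain prior draws whose simulated summary
    statistic lies within `tol` of the observed summary."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed) <= tol:
            accepted.append(theta)
    return accepted

# Toy example: identity "simulator" with a uniform prior on [0, 1];
# accepted draws approximate a posterior concentrated near 0.5.
random.seed(0)
posterior = rejection_abc(0.5, lambda t: t, random.random, tol=0.1)
```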
The goal of the present study was twofold: to use the Nutrition Environment Measures Survey in Stores (NEMS-S) as a methodology that accurately and reliably describes the availability, price and quality of healthy foods at both the store and community levels, and to propose a spatial methodology for integrating these store and community data into objective measures of food access.
Two hundred and sixty-five retail food stores in and within 2 miles (3·2 km) of Flint, Michigan, USA, were mapped using ArcGIS mapping software.
A survey based on the validated NEMS-S was conducted at each retail food store. Scores were assigned to each store based on a modified version of the NEMS-S scoring system and linked to the mapped locations of stores. Neighbourhood characteristics (race and socio-economic distress) were appended to each store. Finally, spatial and kernel density analyses were run on the mapped store scores to obtain healthy food density metrics.
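The kernel density step can be illustrated with a simple Gaussian kernel weighted by each store's NEMS-S score. This is a schematic stand-in for the ArcGIS kernel density tool the study used; the coordinates, scores and bandwidth below are hypothetical:

```python
import math

def score_weighted_density(point, stores, bandwidth):
    """Gaussian kernel density of healthy-food access at `point`,
    weighting each store by its NEMS-S score.
    stores: iterable of (x, y, nems_score) tuples."""
    x0, y0 = point
    total = 0.0
    for x, y, score in stores:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        total += score * math.exp(-d2 / (2 * bandwidth ** 2))
    return total / (2 * math.pi * bandwidth ** 2)

# Hypothetical layout: two well-scoring stores near the grid point
# and one distant store that contributes almost nothing.
d = score_weighted_density((0.0, 0.0),
                           [(0.1, 0.0, 20), (0.0, 0.2, 15), (5.0, 5.0, 30)],
                           bandwidth=0.5)
```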
Regression analyses revealed that neighbourhoods with higher socio-economic distress had significantly lower dairy sub-scores compared with their lower-distress counterparts (β coefficient=−1·3; P=0·04). Additionally, supermarkets were present only in neighbourhoods with <60 % African-American population and low socio-economic distress. Two areas in Flint had an overall NEMS-S score of 0.
By identifying areas with poor access to healthy foods via a validated metric, this research can be used to help local governments and organizations target interventions to high-need areas. Furthermore, the survey and mapping methodology can be replicated in other cities to provide comparable results.
Northern bobwhite quail (Colinus virginianus), a popular gamebird among hunters, have been declining over recent decades in the Rolling Plains ecoregion. Investigations in recent years have revealed a high prevalence of eyeworms (Oxyspirura petrowi) and caecal worms (Aulonocephalus pennula) in this ecoregion, prompting a need to better understand the host–parasite interaction and other factors that influence infection. In this study, the efficiency of a mobile laboratory was tested by deploying it to three field sites in the Rolling Plains between July and August of 2017 and collecting cloacal swabs from bobwhites. DNA was extracted from the swabs and run by quantitative PCR in both the mobile and the reference laboratory to detect A. pennula and O. petrowi infection. Compared with the Wildlife Toxicology reference laboratory, the mobile laboratory showed 97% and 99% agreement for A. pennula and O. petrowi, respectively. There were no significant differences in infection levels between field sites. Given its efficiency, the mobile laboratory is proposed as an effective way to monitor infection levels, in addition to factors that may affect infection such as climate, diapause, and intermediate host populations.
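The 97% and 99% agreement figures are simple proportions of matching qPCR infection calls between the two laboratories. A minimal sketch, with a hypothetical list of positive/negative calls:

```python
def percent_agreement(calls_a, calls_b):
    """Percentage of samples for which two laboratories returned
    the same infection call (1 = positive, 0 = negative)."""
    matches = sum(a == b for a, b in zip(calls_a, calls_b))
    return 100.0 * matches / len(calls_a)

# Hypothetical: 10 swab samples, one discordant call between labs.
agreement = percent_agreement([1, 1, 0, 1, 0, 0, 1, 1, 0, 1],
                              [1, 1, 0, 1, 0, 1, 1, 1, 0, 1])
```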
Oxyspirura petrowi is a heteroxenous nematode found in northern bobwhite (Colinus virginianus) of the Rolling Plains ecoregion of Texas. Despite its impact on this popular gamebird, O. petrowi remains relatively unexplored at the genetic level. To address this, we targeted the previously studied nuclear 18S rDNA region and the mitochondrial COX1 gene region of O. petrowi to investigate phylogenetic relations between O. petrowi and other nematode species. We generated primers using multiple alignments and universal nematode primers to obtain a near-complete 18S sequence and a partial COX1 sequence of O. petrowi. Phylogenetic trees for the 18S and COX1 gene regions were constructed using the Maximum Likelihood and Maximum Parsimony methods. A comparative analysis was conducted based on the nuclear and mitochondrial region similarities between O. petrowi and other nematode species that infect both humans and animals. Results revealed a close relation to the zoonotic eyeworm Thelazia callipaeda, as well as to members of the filarial superfamily (Filarioidea) such as the human eyeworm Loa loa and Dirofilaria repens, an eyeworm of dogs and other carnivores.
This study aimed to explore, describe and enhance understanding of women’s experiences, beliefs and knowledge of urinary symptoms in the postpartum period and also sought to understand the perceptions of health professionals of these issues.
Women often take no action with regard to urinary symptoms, particularly in the postnatal period, which can lead to the adoption of coping mechanisms or the normalisation of symptoms. The true prevalence is difficult to assess because studies differ in age groups and time spans. There is only a small body of work attempting to understand this lack of action on the part of women, and even less on the attitudes of health professionals.
Grounded theory was selected for a qualitative inductive approach, to attempt to understand the social processes involved and generate new knowledge by examining the different interactions. Recruitment was by theoretical sampling. In total, 15 women were interviewed and two focus groups of health professionals were undertaken. In addition, an antenatal clinic and a postnatal mothers group were observed. All information was analysed manually using constant comparison.
The findings revealed that poor communication, a lack of clear education and the power of relatives' stories of the past were at times barriers to help-seeking, disempowering women and creating a climate for normalisation. Women were willing to talk but preferred the health professional to initiate the discussion. In addition, health professionals were concerned about a lack of time and knowledge, and were uncertain of the effect of pelvic floor muscle exercises because some research indicates that improvement may not be maintained over time. The core category was 'overcoming barriers to facilitate empowerment', indicating that improving communication and education could reduce barriers and enable women to seek help.
A range of precision farming technologies are used commercially for variable-rate application of nitrogen (N) to cereals, yet these usually adjust N rates from a pre-set value rather than predicting economically optimal N requirements on an absolute basis. This paper reports chessboard experiments set up to examine variation in N requirements, to develop and test systems for its prediction, and to assess its predictability. Results showed very substantial variability in fertiliser N requirements within fields, typically >150 kg ha−1, and large variation in optimal yields, typically >2 t ha−1. Despite this, the calculated increases in yield and gross margin with N rates perfectly matched to requirements across fields were surprisingly modest compared with the uniform average rate. Implications are discussed, including the causes of the large variation in grain yield that remained after N limitations were removed.
Oxyspirura petrowi is a heteroxenous parasitic nematode reported at high prevalence in birds of the Order Galliformes experiencing population declines in the USA. There is a paucity of information regarding the natural history of O. petrowi, including its life cycle and the effects of infection on wild bird populations. To study the life cycle of this parasite, we collected plains lubber grasshoppers (Brachystola magna) from a field location in Mitchell County, Texas. We found third-stage larvae (L3) in 37.9% (66/174) of B. magna. We determined that they were O. petrowi through morphological comparison with L3 from experimentally infected Acheta domesticus and by sequence analysis. We then showed through experimental infection in a laboratory setting that B. magna is a potential intermediate host for O. petrowi in northern bobwhites (Colinus virginianus). Using a fecal float technique, we first detected shedding of eggs in feces 52 days post-infection. In addition, we recovered 87 O. petrowi from experimentally infected northern bobwhites. Although we detected shedding in feces, egg recovery was low (<5 eggs/g). Future work is needed to understand the shedding routes and shedding patterns of northern bobwhites infected with O. petrowi.
We examine the game theoretic properties of a model of crime first introduced by Short et al. (2010, Phys. Rev. E 82, 066114) as the SBD Adversarial Game. We identify the rationalizable strategies and one-shot equilibria under multiple equilibrium refinements. We further show that SBD's main result about the effectiveness of defecting-punishers (“Informants”) in driving the system to evolve to the cooperative equilibrium under an imitation dynamic generalizes to a best response dynamic, though only under certain parameter regimes. The nature of this strategy's role, however, differs significantly between the two dynamics: in the SBD imitation dynamic, Informants are sufficient but not necessary to achieve the cooperative equilibrium, while under the best response dynamic, Informants are necessary but not sufficient for convergence to cooperation. Since a policy of simply converting citizens to Informants will not guarantee success under best response dynamics, we identify alternative strategies that may help the system reach cooperation in this case, e.g., the use of moderate but not too severe punishments on criminals.