The Black Death first reduced England’s population by nearly one half, then prevented demographic recovery. Volatility characterised the 1350s and 1360s, owing to extreme weather conditions, poor harvests, contracting output, disrupted markets, labour shortages and a high turnover of people. Towns struggled to assimilate the influx of migrants. The availability of land on favourable terms, and of well-paid employment, greatly benefited the lower orders of society but caused consternation among the ruling elite. The government responded with a wave of legislation to regulate labour mobility, prices and wages, so as to impose upon workers the discipline of manual labour deemed essential to the common profit. By the 1380s equilibrium had replaced volatility. The economy had contracted, and had shifted from arable production to pastoral and manufactured products. Towns were smaller, but their residents tended to be wealthier. The attitude of the authorities to labour had become more realistic and less idealistic, emphasising its noble qualities rather than denouncing its vices.
Although conceptual design is a fundamental process through which design decisions are made, its focus is on finding the right solution. But is finding the right solution enough for a good design? Merely defining the problem, or applying a solution-focused process, may not be enough to create the differentiation demanded by today's variable conditions. This limitation can be overcome by seeking meaning instead of seeking a solution. The purpose of this article is to develop an approach that focuses on seeking meaning for products, starting from a design-thinking approach to the conceptual design process in engineering design. Focusing on a search for meaning in engineering design offers advantages such as creating unique value and sustainable competitive advantage.
Ferrierite is the name for a series of zeolite-group minerals that includes three species sharing the same ferrierite framework (FER) crystal structure but differing in their extra-framework cations. Recent studies have shown that ferrierite can exhibit a fibrous-asbestiform crystal habit and may possess the same properties as carcinogenic fibrous erionite. Characterisation of the ferrierite in and around a mine site will help in assessing the potential for toxic outcomes of exposure among mine workers and any local population.
The zeolite-rich tuff deposit of Lovelock, Nevada, USA, is the largest occurrence of diagenetic ferrierite-Mg. A previous survey reported that the ferrierite hosted in these rocks displays a fibrous morphology. However, those observations concerned a limited number of samples, and until now there has been little evidence of a widespread occurrence of fibrous ferrierite in the Lovelock deposit.
The main goal of this study was to perform a mineralogical and morphometric characterisation of the tuff deposit at Lovelock and evaluate the distribution of fibrous ferrierite in the outcrop. For this purpose, a multi-analytical approach including powder X-ray diffraction, scanning and transmission microscopies, micro-Raman spectroscopy, thermal analyses, and surface-area determination was applied.
The results show that fibrous ferrierite is widespread and intermixed with mordenite and orthoclase, although its spatial distribution in the bedrock varies. The crystal habit of the ferrierite ranges from prismatic to asbestiform (elongated, thin and slightly flexible), and the fibres are aggregated in bundles. According to the WHO counting criteria, most of the ferrierite fibres can be classified as respirable. Pending confirmatory in vitro and in vivo tests to assess the actual toxicity/pathogenicity of this mineral fibre, a precautionary approach is recommended for mining operations in this area to reduce the risk of exposure.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = 0.011 (95 percent confidence interval, CI 0.010 to 0.013). Patients in the intervention phase also spent more days in hospital per year: adjusted delta = 0.029 (95 percent CI 0.026 to 0.031). Both effects were consistent across risk groups.
Primary care activity also increased overall in the intervention phase: adjusted delta = 0.011 (95 percent CI 0.007 to 0.014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pounds sterling to the resources utilized by each patient. Cost differences between the two study phases were combined with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted delta = GBP76, 95 percent confidence interval, CI GBP46 to GBP106), an effect that was consistent across risk levels and generally increased with risk level.
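Taken together with the trial's primary outcome (an adjusted difference of 0.011 emergency admissions per participant per year, reported in the companion analysis above), the cost difference can be combined into a simple cost-per-outcome ratio. The sketch below is purely illustrative arithmetic using the reported point estimates, not the authors' own calculation; because the intervention both cost more and was associated with more admissions, it is dominated in cost-effectiveness terms.

```python
# Illustrative arithmetic only, using the point estimates reported in the
# abstracts: an adjusted cost difference of GBP 76 per patient per year and
# an adjusted difference of 0.011 emergency admissions per participant per
# year (both intervention minus control).
cost_difference_gbp = 76.0
admissions_difference = 0.011

# Ratio of extra cost to extra admissions. The "effect" here is an
# *increase* in admissions, so the intervention costs more AND performs
# worse on the primary outcome: it is dominated, and this ratio is not a
# conventional ICER in favour of the intervention.
cost_per_extra_admission = cost_difference_gbp / admissions_difference
print(f"GBP {cost_per_extra_admission:,.0f} of extra cost per extra admission")
```

With the reported point estimates this evaluates to roughly GBP 6,909 of additional cost per additional emergency admission, which underlines the paper's conclusion that the effect ran opposite to the model's intended purpose.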
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify at-risk patients not already known to practitioners, and about whether there were sufficient community-based services to respond to the care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its short-term uptake, with a focus on the highest-risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
A gravity survey was conducted on the Windmill Islands, East Antarctica, during the 2004–05 summer season. The aim of the study was to investigate the subsurface geology of the Windmill Islands area. Ninety-seven gravity stations were established. Additionally, 49 observations from a survey in 1993–94 were re-reduced and merged with the 2004–05 data. A three-dimensional subsurface model was constructed from the merged gravity dataset to determine the subsurface geology of the Windmill Islands. The main country rock in the Windmill Islands is a Garnet-bearing Granite Gneiss. A relatively dense intrusive charnockite unit, the Ardery Charnockite, generates the dominant gravity high of the study area and has been modelled to extend to depths of 7–13 km. It has moderate to steep contacts against the surrounding Garnet-bearing Granite Gneiss. The Ardery Charnockite surrounds a less dense granite pluton, the Ford Granite, which is modelled to a depth of 6–12 km and creates a localized gravity low. This granitic pluton extends at depth towards the east. The modelling process has also shown that Mitchell Peninsula is linked to the adjacent Law Dome ice cap by an ‘ice ramp’ of approximately 100 m thickness.
This paper asks whether where someone lives bears any association with their attitudes to inequality and income redistribution, focusing on the relative contribution of neighbourhood income, density and ethnic composition. People on higher incomes showed higher support for redistribution when living in more deprived neighbourhoods. People with lower levels of altruism had higher levels of support for redistribution in neighbourhoods of higher density. People living in more ethnically mixed neighbourhoods had higher levels of support for redistribution on average, but this support declined for Whites with low levels of altruism as the deprivation of the neighbourhood increased. Current trends which sustain or extend income and wealth inequalities, reflected in patterns of residence, may undermine social cohesion in the medium- to long-term. This may be offset to some extent by trends of rising residential ethnic diversity.
The vast majority of people in medieval England were rural dwellers who eked out a living as peasant smallholders, landless labourers and petty traders. In c.1300 around one half of these people were ‘servile’, ‘villeins’, or ‘unfree’. These terms were used to describe peasants who were ‘bonded in some fashion to a lord or a particular piece of land. This tie was hereditary rather than merely contractual or temporary, and it placed the serf under the jurisdiction (and sometimes the arbitrary power) of the lords.’ This general condition, known as serfdom, was found throughout medieval Europe. Serfdom constrained the freedom of time and action of individuals by imposing legal restrictions upon their movement, and by subjecting them to inescapable economic burdens and rents of various sorts. These restrictions and burdens were widely regarded as degrading, and, according to leading historians such as Hilton, they enabled lords to extract from peasant families all the labour and landed produce which was surplus to their subsistence and to their basic operating needs. Serfdom is therefore regarded as an inherently exploitative relationship, skewed heavily to the benefit of landlords.
Serfdom was not present in every part of medieval Europe and, where it existed, its specific characteristics varied so much that ‘serfdom’ in one place could be very different from that in another. ‘It is such an elastic concept as to lead some historians to doubt whether the term has any analytical value … its precise form must be established empirically for each particular time and place.’
Merton College, Oxford, was established in 1262, and its estates were mainly scattered across the southern Midlands. Two of the three manors in this case study were conventional demesne manors, but they were situated a considerable distance from one another. Cuxham (Oxfordshire) lies six miles north-east of Wallingford and twelve miles from Merton College itself, and the manor was small but dominated by its demesne and unfree holdings: there were few free tenures. Kibworth Harcourt (Leicestershire) was a large manor, but it was detached from the main estate and situated twelve miles south-east of Leicester. Merton acquired both manors soon after its foundation. The third manor, Holywell, had been acquired in 1266 and was a much smaller manor, located just outside Oxford. Cuxham and Holywell were chosen for study because of their excellent surviving archives, their relative geographical proximity, and their contrasting manorial structure. Kibworth was added to the case study because much is already known about it from Cecily Howell's detailed research, and because its detachment from the main estate enhances the scope for comparison with two manors close to Merton itself.
Customary land tenures
Before 1348–9 villein tenure dominated peasant landholding in Cuxham. A custumal of 1298 listed 21 villein tenants (13 holders of half virgates, each of 12 acres, and eight cottars) and five free tenants.
All sections of medieval English society were liable to render various payments and services to a superior lord, which either marked key moments in the life cycle of an individual or were associated with the terms of their landholding. The nobility and gentry paid aid upon the marriage of their lord's eldest daughter, and were liable for military service; freemen paid relief upon entry to a landholding; and all but the poorest paid mortuary (a death duty) to the church. Villeins and villein tenures were also liable for a range of incidents, dues and services, but these differed from those owed by the rest of society in two ways: they were generally more onerous and demeaning, and the lord could – in strict legal theory – impose them arbitrarily. In reality, by c.1300 the arbitrary imposition of servile dues was exceptionally rare, and instead their frequency and duration were determined and largely fixed by local custom. However, there still remained some elements of uncertainty and unpredictability, in the sense that the precise package of dues, and the way in which they were levied, varied from manor to manor; the lord could determine how much was charged for some dues; and the lord might challenge or ignore custom.
This chapter surveys each of the main dues and incidents associated with villeinage in England in c.1300; briefly explains their character, origin and prevalence; and then constructs as tight a chronology of their decline as the current historical literature permits.