In 1817–21, the Indian subcontinent was ravaged by a series of epidemics which marked the beginning of what has since become known as the First Cholera Pandemic. Despite their far-reaching consequences, these epidemics have received remarkably little attention and have never been considered as historical subjects in their own right. This article examines the epidemics of 1817–21 in greater detail and assesses their significance for the social and political history of the Indian subcontinent. Additionally, it examines the meanings that were attached to the epidemics in the years running up to the first appearance of cholera in the West. In so doing, the article makes comparisons between responses to cholera in India and in other contexts, and tests the applicability of concepts used in the study of epidemics in the West. It is argued that the official reaction to cholera in India was initially ameliorative, in keeping with the East India Company's response to famines and other supposedly natural disasters. However, this view was gradually supplemented and replaced by a view of cholera as a social disease, requiring preventive action. These views were initially rejected in Britain, but found favour after cholera epidemics in 1831–32. Secondly, in contrast to later epidemics, it is argued that those of 1817–21 did little to exacerbate tensions between rulers and the ruled. On the rare occasions when cholera did elicit a violent reaction, it tended to be intra-communal rather than anti-colonial in nature.
Rapeseed is a popular cover crop choice due to its deep-growing taproot, which creates soil macropores and increases water infiltration. However, Brassicaceae spp. that are mature or at later growth stages can be troublesome to control. Experiments were conducted in Delaware and Virginia to evaluate herbicides for terminating rapeseed cover crops. Two separate, adjacent experiments were established to evaluate rapeseed termination by 14 herbicide treatments at two timings. Termination timings included an early and a late termination to simulate rapeseed termination prior to planting corn and soybean, respectively, for the region. At three locations where rapeseed height averaged 12 cm at early termination and 52 cm at late termination, glyphosate + 2,4-D was most effective, controlling rapeseed 96% 28 d after early termination (DAET). Paraquat + atrazine + mesotrione (92%), glyphosate + saflufenacil (91%), glyphosate + dicamba (91%), and glyphosate alone (86%) all provided at least 80% control 28 DAET. Rapeseed biomass followed a similar trend. Paraquat + 2,4-D (85%), glyphosate + 2,4-D (82%), and paraquat + atrazine + mesotrione (81%) were the only treatments that provided at least 80% control 28 d after late termination (DALT). Herbicide efficacy was lower at Painter in 2017, where rapeseed height was 41 cm at early termination and 107 cm at late termination; no herbicide treatment controlled rapeseed >80% 28 DAET or 28 DALT at this location. Herbicide termination of rapeseed is best when the plant is small; termination of large rapeseed plants may require mechanical or other methods beyond herbicides.
Geochemical and related studies have been made of near-surface sediments from the River Clyde estuary and adjoining areas, extending north from Glasgow and west as far as the Holy Loch on the west coast of Scotland, UK. Multibeam echosounder, sidescan sonar and shallow seismic data, taken with core information, indicate that a shallow layer of modern sediment, often less than a metre thick, rests on earlier glacial and post-glacial sediments. The offshore Quaternary history can be aligned with onshore sequences, with the recognition of buried drumlins, the settling of muds from quieter water, probably behind an ice dam, and later tidal delta deposits. The geochemistry of contaminants within the cores likewise indicates shallow contaminated sediments, often resting on pristine pre-industrial deposits at depths of less than 1 m. The distribution of different contaminants with depth in the sediment, such as Pb (and Pb isotopes), organics and radionuclides, allows chronologies of contamination from different sources to be suggested. Dating was also attempted using microfossils, radiocarbon and ²¹⁰Pb, but with limited success. Some of the spatial distribution of contaminants in the surface sediments can be related to grain-size variations. Contaminants are highest, both in absolute terms and in enrichment relative to the natural background, in the urban inner estuary and in the Holy Loch, reflecting the concentration of industrial activity.
Child welfare policy making is a highly contested area of public policy. Child abuse scandals prompt critical appraisals of parents, professionals and the child protection system, creating a tipping point for reform. One hundred and six transcripts of debates in the West Australian Parliament from August to December 2006 relating to child welfare and child deaths were analysed using qualitative content analysis. The analysis found that statistics about child deaths were conflated with other levels of childhood vulnerability, promoting themes of blame, fear, risk and individual responsibility. The key rhetorical strategy was the use of numbers to generate emotion, credibility and authority, framing child maltreatment narrowly as a moral crime. Rhetoric and emotion are about telling causal stories and will remain ubiquitous in social policy making. In order to guide policy debate and creation, ground their claims, and manage ambiguity and uncertainty, policy makers, researchers and practitioners working with complex social issues would do well to step into this public and political discourse and be strategic in shaping more nuanced alternative frames.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
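The core calculation such a cost-effectiveness analysis rests on, dividing the between-phase cost difference by the between-phase difference in the primary outcome, can be sketched as follows. All figures here are invented placeholders for illustration, not the trial's data, and the function name is hypothetical:

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER)
# calculation of the kind described above. Figures are illustrative only.

def incremental_cost_effectiveness(cost_intervention, cost_control,
                                   admissions_intervention, admissions_control):
    """Return (cost difference, admissions avoided, ICER).

    ICER = extra cost per emergency admission avoided. Raises if the
    outcome difference is zero, since the ratio is then undefined.
    """
    delta_cost = cost_intervention - cost_control
    delta_effect = admissions_control - admissions_intervention  # avoided
    if delta_effect == 0:
        raise ValueError("no difference in admissions; ICER undefined")
    return delta_cost, delta_effect, delta_cost / delta_effect

# Illustrative per-patient-per-year numbers (not trial data):
d_cost, d_eff, icer = incremental_cost_effectiveness(
    cost_intervention=1076.0, cost_control=1000.0,
    admissions_intervention=0.20, admissions_control=0.22)
print(f"delta cost = GBP{d_cost:.0f}, admissions avoided = {d_eff:.2f}, "
      f"ICER = GBP{icer:.0f} per admission avoided")
```

Note that when the intervention is both more expensive and no more effective, as the findings below suggest, the ICER framing breaks down and the intervention is simply dominated by the control.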
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95% CI: GBP46 to GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
As part of a project to investigate the flow of ice at low effective stress, two independent strain-gauge systems were used to measure vertical strain rate as a function of depth and time at Siple Dome, Antarctica. The measurements were made from January 1998 until January 2002 at the ice divide and a site 7km to the northeast on the flank. The strain-rate profiles place constraints on the rheology of ice at low stress, show the expected differences between divide and flank flow (with some structure due to firn compaction and probably ice stratigraphy), and suggest that the flow of the ice sheet has not changed much in the last 8.6 kyr. The strain rates show an unexpected time dependence on scales ranging from several months to hours, including discrete summer events at the divide. Time dependence in strain rate, water pressure, seismicity, velocity and possibly basal motion has been seen previously on the Siple Coast ice streams, but it is especially surprising on Siple Dome because the bed is cold.
We used observations and modeling of Siple Dome, West Antarctica, a ridge ice divide, to infer the importance of linear deformation mechanisms in ice-sheet flow. We determined the crossover stress (a threshold value of the effective deviatoric stress below which linear flow mechanisms dominate over nonlinear flow mechanisms) by combining measurements of ice properties with in situ deformation rate measurements and a finite-element ice flow model that accounts for the effects of viscous anisotropy induced by preferred crystal-orientation fabric. We found that a crossover stress of 0.18 bar produces the best match between predicted and observed deformation rates. For Siple Dome, this means that including a linear term in the flow law is necessary, but generally the flow is still dominated by the nonlinear (Glen; n = 3) term. The pattern of flow near the divide at Siple Dome is also strongly affected by crystal fabric. Measurements of sonic velocity, which is a proxy for vertically oriented crystal fabric, suggest that a bed-parallel shear band exists several hundred meters above the bed within the Ice Age ice.
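The flow-law structure described above, a linear term alongside the nonlinear Glen (n = 3) term, with the crossover stress defined as the effective deviatoric stress at which the two contribute equally, can be sketched numerically. This is a schematic scalar illustration, not the paper's anisotropic finite-element model; the linear coefficient `A1` is an arbitrary assumed value, and only the crossover stress of 0.18 bar comes from the abstract:

```python
# Schematic hybrid flow law: strain rate = A1*tau + A3*tau**3, where the
# crossover stress tau_c is defined by A1*tau_c == A3*tau_c**3, so that
# A3 = A1 / tau_c**2. Below tau_c the linear term dominates; above, the
# cubic (Glen) term does. Units and A1 are illustrative assumptions.

def hybrid_strain_rate(tau, tau_c=0.18, A1=1.0):
    """Effective strain rate from linear + cubic terms (tau in bar)."""
    A3 = A1 / tau_c**2
    return A1 * tau + A3 * tau**3

for tau in (0.05, 0.18, 0.50):
    linear = 1.0 * tau
    cubic = (1.0 / 0.18**2) * tau**3
    regime = "linear-dominated" if linear > cubic else "cubic-dominated"
    print(f"tau = {tau:.2f} bar: {regime}")
```

At tau = 0.18 bar the two terms are equal by construction, which is exactly what the fitted crossover stress expresses.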
As part of a larger program to measure and model vertical strain around Siple Dome on the West Antarctic ice sheet, we developed a new sensor to accurately and stably record displacements. The sensors consist of optical fibers, encased in thin-wall stainless-steel tubes, frozen into holes drilled with hot water, and stretched from the surface to various depths (up to 985 m) in the ice sheet. An optical system, connected annually to the fibers, reads out their absolute lengths with a precision of about 2 mm. Two sets of five sensors were installed in the 1997/98 field season: one set is near the Siple Dome core hole (an ice divide), and a second set is on the flank 7 km to the north (the ice thickness at both sites is approximately 1000 m). The optical-fiber length observations taken in four field seasons spanning a 3 year interval reveal vertical strain rates ranging from −229 ± 4 ppm a⁻¹ to −7 ± 9 ppm a⁻¹. In addition to confirming a non-linear constitutive relationship for deep ice, our analysis of the strain rates indicates the ice sheet is thinning at the flank and is in steady state at the divide.
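The conversion from repeated absolute-length readouts to a vertical strain rate in ppm a⁻¹ is a straightforward ratio. A minimal sketch, with invented fiber lengths (the real sensors were read out with roughly 2 mm precision over up to 985 m):

```python
# Hedged sketch: average engineering strain rate between two length
# measurements of the same fiber, in parts per million per year.
# Negative values indicate vertical shortening (compression).
# The lengths below are illustrative, not field data.

def strain_rate_ppm_per_year(L_start_m, L_end_m, years):
    """(L_end - L_start) / L_start per year, scaled to ppm/a."""
    return (L_end_m - L_start_m) / L_start_m / years * 1e6

# A 985 m fiber shortening by 0.677 m over 3 years gives about -229 ppm/a:
rate = strain_rate_ppm_per_year(985.000, 984.323, 3.0)
print(f"{rate:.0f} ppm/a")
```

With ~2 mm length precision on a ~1000 m fiber, a single length difference resolves strain to a few ppm, consistent with the quoted uncertainties of ±4 to ±9 ppm a⁻¹.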
The Vietnam War has long been regarded as pivotal in the history of the Republic of Korea, although its involvement in this conflict remains controversial. While most scholarship has focused on the political and economic ramifications of the war – and allegations of brutality by Korean troops – few scholars have considered the impact of the conflict upon medicine and public health. This article argues that the war had a transformative impact on medical careers and public health in Korea, and that this can be most clearly seen in efforts to control parasitic diseases. These diseases were a major drain on military manpower and a matter of growing concern domestically. The deployment to Vietnam boosted research into parasitic diseases of all kinds and accelerated the domestic campaign to control malaria and intestinal parasites. It also had a formative impact upon the development of overseas aid.
This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.
ABSTRACT. Sanitary issues are at the heart of the history of mankind in its relationship with the sea. The most serious pandemics were spread via the maritime routes. Increased trip duration from the 16th century and the development of steam power in the 19th century also contributed to the introduction of new illnesses. Experimentation to find ‘solutions’ to these problems and the discovery of different local therapeutics led on-board practitioners to be pioneers in the art of curing illnesses and initiators of new disciplines in the medical domain. Even today, the oceans continue to provide new challenges and opportunities in the field of medicine.
Throughout history, the ocean has presented enormous challenges to human health. During the modern period, as long-distance voyages became increasingly common, familiar problems such as motion sickness and exposure to the elements were joined by new ones, especially those caused by dietary deficiency and infectious disease. With these challenges came opportunities. Travel to foreign locations provided medical practitioners with access to new drugs and ideas, as well as freedom from the constraints imposed by law and custom at home. As a result, maritime practitioners began to make important and distinct contributions to many areas of health and medicine.
Before we examine some of these innovations, we need to consider how the epidemiological landscape was changed by maritime navigation. By the 18th century, long-distance voyages were bringing many parts of the world into regular contact with one another; not simply the Atlantic and Indian Oceans but also, increasingly, the Pacific, including the remotest southern seas.
Giant ragweed has been increasing as a major weed of row crops in the last 30 yr, but quantitative data regarding its pattern and mechanisms of spread in crop fields are lacking. To address this gap, we conducted a Web-based survey of certified crop advisors in the U.S. Corn Belt and Ontario, Canada. Participants were asked questions regarding giant ragweed and crop production practices for the county of their choice. Responses were mapped and correlation analyses were conducted among the responses to determine factors associated with giant ragweed populations. Respondents rated giant ragweed as the most or one of the most difficult weeds to manage in 45% of 421 U.S. counties responding, and 57% of responding counties reported giant ragweed populations with herbicide resistance to acetolactate synthase inhibitors, glyphosate, or both herbicides. Results suggest that giant ragweed is increasing in crop fields outward from the east-central U.S. Corn Belt in most directions. Crop production practices associated with giant ragweed populations included minimum tillage, continuous soybean, and multiple-application herbicide programs; ecological factors included giant ragweed presence in noncrop edge habitats, early and prolonged emergence, and presence of the seed-burying common earthworm in crop fields. Managing giant ragweed in noncrop areas could reduce giant ragweed migration from noncrop habitats into crop fields and slow its spread. Where giant ragweed is already established in crop fields, including a more diverse combination of crop species, tillage practices, and herbicide sites of action will be critical to reduce populations, disrupt emergence patterns, and select against herbicide-resistant giant ragweed genotypes. Incorporation of a cereal grain into the crop rotation may help suppress early giant ragweed emergence and provide chemical or mechanical control options for late-emerging giant ragweed.