Background: Infections are a frequent cause of hospital (re)admissions for older adults receiving home health care (HHC) in the United States. However, previous investigators have likely underestimated the prevalence of infections leading to hospitalization because of the limitations of identifying infections using the Outcome and Assessment Information Set (OASIS), the standardized assessment tool mandated for all Medicare-certified HHC agencies. By linking OASIS data with inpatient data from the Medicare Provider Analysis and Review (MedPAR) file, we were able to better quantify infection hospitalization trends and subsequent mortality among HHC patients.

Method: After stratification (by census region, ownership, and urban or rural location) and random sampling, our data set consisted of 2,258,113 Medicare beneficiaries who received HHC services between January 1, 2013, and December 31, 2018, from 1,481 Medicare-certified HHC agencies. The 60-day HHC episodes were identified in OASIS. Hospital transfers reported in OASIS were linked with corresponding MedPAR records. Our outcomes of interest were (1) hospitalization with infection present on admission (POA); (2) hospitalization with infection as the primary cause; and (3) 30-day mortality following hospitalization with infection as the primary cause. We identified bacterial (including suspected) infections based on International Classification of Diseases, Ninth Revision (ICD-9) and ICD-10 codes in MedPAR. We classified infections by site: respiratory, urinary tract, skin/soft tissue, intravenous catheter-related, and all (including other or unspecified infection sites). We also identified sepsis diagnoses.

Result: From 2013 through 2018, the percentage of 60-day HHC episodes with 1 or more hospital transfers ranged from 15% to 16%. Approximately half of all hospitalized HHC patients had an infection POA. Over the 6 years studied, infection (of any type) was the primary cause of hospitalization in more than a quarter of all transfers (25.86%–27.57%). The percentage of hospitalizations due to sepsis increased from 7.51% in 2013 to 11.49% in 2018, whereas the percentage of hospitalizations due to respiratory, urinary tract, or skin/soft-tissue infections decreased (p < 0.001). Thirty-day mortality following a transfer due to infection ranged from 14.14% in 2013 to 14.98% in 2018; mortality rates were highest following transfers caused by sepsis (23.14%–26.51%) and respiratory infections (13.07%–14.27%).

Conclusion: HHC is an important source of post-acute care for those aging in place. Our findings demonstrate that infections are a persistent problem in HHC and are associated with substantial 30-day mortality, particularly following hospitalizations caused by sepsis, emphasizing the importance of infection prevention in HHC. Effective policies promoting best practices for infection prevention and control in the home environment are needed to mitigate infection risk.
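The linkage step described above can be pictured with a minimal sketch. The column names, the transfer-to-admission matching window, and the ICD-10 prefixes below are hypothetical illustrations, not the study's actual specification:

```python
# Hedged sketch: link OASIS hospital transfers to MedPAR inpatient stays
# and flag infection diagnoses. All field names are invented for illustration.
import pandas as pd

oasis = pd.DataFrame({
    "bene_id": ["A1", "A2"],
    "transfer_date": pd.to_datetime(["2017-03-02", "2017-05-10"]),
})
medpar = pd.DataFrame({
    "bene_id": ["A1", "A2"],
    "admit_date": pd.to_datetime(["2017-03-02", "2017-05-11"]),
    "dx1": ["J18.9", "I50.9"],          # primary diagnosis (ICD-10)
    "poa_dx": [["J18.9"], ["N39.0"]],   # diagnoses flagged present on admission
})

# Illustrative ICD-10 prefixes only; the study used a full bacterial-infection code list.
INFECTION_PREFIXES = ("A", "J1", "N39", "L0")

def is_infection(code: str) -> bool:
    return code.startswith(INFECTION_PREFIXES)

linked = oasis.merge(medpar, on="bene_id")
# Tolerate a short lag between the OASIS-reported transfer and the admission date.
linked = linked[(linked.admit_date - linked.transfer_date).dt.days.between(0, 2)].copy()
linked["infection_poa"] = linked.poa_dx.map(lambda dxs: any(map(is_infection, dxs)))
linked["infection_primary"] = linked.dx1.map(is_infection)
print(linked[["bene_id", "infection_poa", "infection_primary"]])
```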
The Patient Health Questionnaire-9 (PHQ-9) is a widely used measure of depression in primary care. It was, however, originally designed as a diagnostic screening tool, and not for measuring change in response to antidepressant treatment. Although the Quick Inventory of Depressive Symptomology (QIDS-SR-16) has been extensively validated for outcome measurement, it is poorly adopted in UK primary care, and, although free for clinicians, has licensing restrictions for healthcare organisation use.
We aimed to develop a modified version of the PHQ-9, the Maudsley Modified PHQ-9 (MM-PHQ-9), for tracking symptom changes in primary care. We tested the measure's validity, reliability and factor structure.
A sample of 121 participants was recruited across three studies, and comprised 78 participants with major depressive disorder and 43 controls. MM-PHQ-9 scores were compared with the QIDS-SR-16 and Clinical Global Impressions improvement scale, for concurrent validity. Internal consistency of the scale was assessed, and principal component analysis was conducted to determine the items’ factor structure.
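As a rough illustration of the two scale analyses just named (internal consistency and factor structure), here is a minimal numpy sketch on simulated item scores. The item count and 0–3 scoring range are assumptions, and real MM-PHQ-9 responses would be correlated rather than random:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated data: 121 respondents x 9 items scored 0-3 (assumed PHQ-style range).
items = rng.integers(0, 4, size=(121, 9)).astype(float)

def cronbach_alpha(x: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = x.shape[1]
    return (k / (k - 1)) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

# Principal components from the item correlation matrix (eigenvalues, descending).
eigvals = np.linalg.eigvalsh(np.corrcoef(items, rowvar=False))[::-1]

print(f"alpha = {cronbach_alpha(items):.2f}")
print("proportion of variance, first two components:", (eigvals / eigvals.sum())[:2].round(2))
```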
The MM-PHQ-9 demonstrated good concurrent validity with the QIDS-SR-16, and excellent internal consistency. Sensitivity to change over a 14-week period was d = 0.41 compared with d = 0.61 on the QIDS-SR-16. Concurrent validity between the paper and mobile app versions of the MM-PHQ-9 was r = 0.67.
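The sensitivity-to-change figures above are standardized effect sizes. One common form of Cohen's d for a pre-post comparison is shown below; the abstract does not state which variant was used, so this is the generic pooled-SD definition:

```latex
d \;=\; \frac{\bar{x}_{\text{pre}} - \bar{x}_{\text{post}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```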
These results indicate that the MM-PHQ-9 is a valid and reliable measure of depressive symptoms in paper and mobile app format, although further validation is required. The measure was sensitive to change, demonstrating suitability for use in routine outcome assessment.
This article explores different approaches to assessing the effectiveness of non-state-based non-judicial grievance mechanisms (NSBGMs) in achieving access to remedy for rightsholders. It queries the approach that has been widely adopted as a result of the United Nations Guiding Principles on Business and Human Rights (UNGPs), which focuses on the procedural aspects of grievance mechanisms. Rather, it stresses the importance of analysing the outcomes of cases for rightsholders. This article tests this hypothesis by undertaking comprehensive empirical research into the complaint mechanism of the Roundtable on Sustainable Palm Oil (RSPO). RSPO is found to perform well when judged according to the UNGPs’ effectiveness criteria. However, it performs poorly when individual cases are assessed to ascertain the outcomes that are achieved for rightsholders. The article therefore argues for the importance of equivalent scrutiny of outcomes in relation to other NSBGMs and provides an approach and accompanying methodology that can be utilized for that purpose.
Most techniques for pollen-based quantitative climate reconstruction use modern assemblages as a reference data set. We examine the implication of methodological choices in the selection and treatment of the reference data set for climate reconstructions using Weighted Averaging Partial Least Squares (WA-PLS) regression and records of the last glacial period from Europe. We show that the training data set used is important because it determines the climate space sampled. The range and continuity of sampling along the climate gradient is more important than sampling density. Reconstruction uncertainties are generally reduced when more taxa are included, but combining related taxa that are poorly sampled in the data set to a higher taxonomic level provides more stable reconstructions. Excluding taxa that are climatically insensitive, or systematically overrepresented in fossil pollen assemblages because of known biases in pollen production or transport, makes no significant difference to the reconstructions. However, the exclusion of taxa overrepresented because of preservation issues does produce an improvement. These findings are relevant not only for WA-PLS reconstructions but also for similar approaches using modern assemblage reference data. There is no universal solution to these issues, but we propose a number of checks to evaluate the robustness of pollen-based reconstructions.
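To make the weighted-averaging idea concrete, here is a minimal sketch of plain WA regression and calibration, the core that WA-PLS extends with partial-least-squares components. The data are toy values and the usual deshrinking step is omitted:

```python
# Hedged sketch of weighted averaging (WA), not the paper's WA-PLS implementation.
import numpy as np

# Modern training set: rows = sites, cols = pollen taxa (proportions).
modern = np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.5, 0.2],
                   [0.1, 0.3, 0.6]])
temp = np.array([2.0, 8.0, 14.0])  # e.g. observed mean July temperature at each site

# WA regression: each taxon's optimum = abundance-weighted mean of the climate variable.
optima = (modern * temp[:, None]).sum(axis=0) / modern.sum(axis=0)

# WA calibration: reconstruct climate for a fossil assemblage from those optima.
fossil = np.array([0.5, 0.4, 0.1])
reconstruction = (fossil * optima).sum() / fossil.sum()
print(f"taxon optima: {optima.round(1)}, reconstructed value: {reconstruction:.1f}")
```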
Residual strain in electrodeposited Li films may affect safety and performance in Li metal battery anodes, so it is important to understand how to detect residual strain in electrodeposited Li and the conditions under which it arises. To explore this, Li films electrodeposited onto Cu metal substrates were prepared under an applied pressure of either 10 or 1000 kPa and subsequently tested for the presence or absence of residual strain via sin²ψ analysis. X-ray diffraction (XRD) analysis of Li films required preparation and examination within an inert environment; hence, a Be-dome sample holder was employed during XRD characterization. Results show that the Li film grown under 1000 kPa displayed detectable in-plane compressive strain (−0.066%), whereas the Li film grown under 10 kPa displayed no detectable in-plane strain. The underlying Cu substrate revealed an in-plane residual strain near zero. Texture analysis via pole-figure determination was also performed for both Li and Cu and revealed a mild fiber texture in the Li metal and a strong biaxial texture in the Cu substrate. Experimental procedures for the preparation, alignment, and analysis of the highly air-sensitive Li films are also described in detail. This work shows that Li metal exhibits residual strain when electrodeposited under compressive stress and that XRD can be used to quantify that strain.
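For context, sin²ψ analysis rests on the standard isotropic-elasticity relation below: the lattice strain measured at specimen tilt ψ varies linearly with sin²ψ, so a linear fit yields the in-plane stress σ_φ from the slope, given Young's modulus E and Poisson's ratio ν. This is the textbook form, not necessarily the exact treatment used in the paper:

```latex
\varepsilon_{\phi\psi} \;=\; \frac{d_{\phi\psi} - d_0}{d_0}
\;=\; \frac{1+\nu}{E}\,\sigma_\phi \sin^2\psi \;-\; \frac{\nu}{E}\,(\sigma_{11} + \sigma_{22})
```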
In 1817–21, the Indian subcontinent was ravaged by a series of epidemics which marked the beginning of what has since become known as the First Cholera Pandemic. Despite their far-reaching consequences, these epidemics have received remarkably little attention and have never been considered as historical subjects in their own right. This article examines the epidemics of 1817–21 in greater detail and assesses their significance for the social and political history of the Indian subcontinent. Additionally, it examines the meanings that were attached to the epidemics in the years running up to the first appearance of cholera in the West. In so doing, the article makes comparisons between responses to cholera in India and in other contexts, and tests the applicability of concepts used in the study of epidemics in the West. It is argued that the official reaction to cholera in India was initially ameliorative, in keeping with the East India Company's response to famines and other supposedly natural disasters. However, this view was gradually supplemented and replaced by a view of cholera as a social disease, requiring preventive action. These views were initially rejected in Britain, but found favour after cholera epidemics in 1831–32. Secondly, in contrast to later epidemics, it is argued that those of 1817–21 did little to exacerbate tensions between rulers and the ruled. On the rare occasions when cholera did elicit a violent reaction, it tended to be intra-communal rather than anti-colonial in nature.
Rapeseed is a popular cover crop choice due to its deep-growing taproot, which creates soil macropores and increases water infiltration. However, Brassicaceae spp. that are mature or at later growth stages can be troublesome to control. Experiments were conducted in Delaware and Virginia to evaluate herbicides for terminating rapeseed cover crops. Two separate experiments, adjacent to each other, were established to evaluate rapeseed termination by 14 herbicide treatments at two timings. Termination timings included an early and a late termination to simulate rapeseed termination prior to planting corn and soybean, respectively, for the region. At three locations where rapeseed height averaged 12 cm at early termination and 52 cm at late termination, glyphosate + 2,4-D was most effective, controlling rapeseed 96% 28 d after early termination (DAET). Paraquat + atrazine + mesotrione (92%), glyphosate + saflufenacil (91%), glyphosate + dicamba (91%), and glyphosate (86%) all provided at least 80% control 28 DAET. Rapeseed biomass followed a similar trend. Paraquat + 2,4-D (85%), glyphosate + 2,4-D (82%), and paraquat + atrazine + mesotrione (81%) were the only treatments that provided at least 80% control 28 d after late termination (DALT). Herbicide efficacy was lower at Painter in 2017, where rapeseed height was 41 cm at early termination and 107 cm at late termination. No herbicide treatment controlled rapeseed >80% 28 DAET or 28 DALT at this location. Herbicide termination of rapeseed is best when the plant is small; termination of large rapeseed plants may require mechanical or other methods beyond herbicides.
Child welfare policy making is a highly contested area of public policy. Child abuse scandals prompt critical appraisals of parents, professionals and the child protection system, creating a tipping point for reform. One hundred and six transcripts of debates in the West Australian Parliament from August to December 2006 relating to child welfare and child deaths were analysed using qualitative content analysis. The analysis found that statistics about child deaths were conflated with other levels of childhood vulnerability, promoting themes of blame, fear, risk and individual responsibility. The key rhetorical strategy was the use of numbers to generate emotion, credibility and authority in order to frame child maltreatment narrowly as a moral crime. Rhetoric and emotion are about telling causal stories, and both will remain ubiquitous in social policy making. To guide policy debate and creation, ground their claims, and manage ambiguity and uncertainty, policy makers, researchers and practitioners working with complex social issues would do well to step into this public and political discourse and be strategic in shaping more nuanced alternative frames.
The Vietnam War has long been regarded as pivotal in the history of the Republic of Korea, although its involvement in this conflict remains controversial. While most scholarship has focused on the political and economic ramifications of the war – and allegations of brutality by Korean troops – few scholars have considered the impact of the conflict upon medicine and public health. This article argues that the war had a transformative impact on medical careers and public health in Korea, and that this can be most clearly seen in efforts to control parasitic diseases. These diseases were a major drain on military manpower and a matter of growing concern domestically. The deployment to Vietnam boosted research into parasitic diseases of all kinds and accelerated the domestic campaign to control malaria and intestinal parasites. It also had a formative impact upon the development of overseas aid.
Geochemical and related studies have been made of near-surface sediments from the River Clyde estuary and adjoining areas, extending from Glasgow northwards and westwards as far as the Holy Loch on the west coast of Scotland, UK. Multibeam echosounder, sidescan sonar and shallow seismic data, taken with core information, indicate that a shallow layer of modern sediment, often less than a metre thick, rests on earlier glacial and post-glacial sediments. The offshore Quaternary history can be aligned with onshore sequences, with the recognition of buried drumlins, settlement of muds from quieter water, probably behind an ice dam, and later tidal delta deposits. The geochemistry of contaminants within the cores likewise indicates shallow contaminated sediments, often resting on pristine pre-industrial deposits at depths of less than 1 m. The distribution of different contaminants with depth in the sediment, such as Pb (and Pb isotopes), organics and radionuclides, allows chronologies of contamination from different sources to be suggested. Dating was also attempted using microfossils, radiocarbon and ²¹⁰Pb, but with limited success. Some of the spatial distribution of contaminants in the surface sediments can be related to grain-size variations. Contaminants are highest, both in absolute terms and in enrichment relative to the natural background, in the urban and inner estuary and in the Holy Loch, reflecting the concentration of industrial activity.
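Enrichment relative to natural background is commonly expressed as an enrichment factor normalized to a conservative element such as Al. The form below is one standard definition, offered for orientation only, since the abstract does not specify the study's exact normalization:

```latex
\mathrm{EF} \;=\; \frac{\left(C_{\mathrm{M}}/C_{\mathrm{Al}}\right)_{\text{sample}}}{\left(C_{\mathrm{M}}/C_{\mathrm{Al}}\right)_{\text{background}}}
```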
ABSTRACT. Sanitary issues are at the heart of the history of mankind in its relationship with the sea. The most serious pandemics were spread via the maritime routes. Increased trip duration from the 16th century and the development of steam power in the 19th century also contributed to the introduction of new illnesses. Experimentation to find ‘solutions’ to these problems and the discovery of different local therapeutics led on-board practitioners to be pioneers in the art of curing illnesses and initiators of new disciplines in the medical domain. Even today, the oceans continue to provide new challenges and opportunities in the field of medicine.
Throughout history, the ocean has presented enormous challenges to human health. During the modern period, as long-distance voyages became increasingly common, familiar problems such as motion sickness and exposure to the elements were joined by new ones, especially those caused by dietary deficiency and infectious disease. With these challenges came opportunities. Travel to foreign locations provided medical practitioners with access to new drugs and ideas, as well as freedom from the constraints imposed by law and custom at home. As a result, maritime practitioners began to make important and distinct contributions to many areas of health and medicine.
Before we examine some of these innovations, we need to consider how the epidemiological landscape was changed by maritime navigation. By the 18th century, long-distance voyages were bringing many parts of the world into regular contact with one another: not simply the Atlantic and Indian Oceans but also, increasingly, the Pacific, including the remotest southern seas.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
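The comparison described here follows the usual incremental form of a cost-effectiveness analysis, sketched below with emergency admissions avoided as the effect measure. This is the generic formulation; the trial's own analysis may differ in detail:

```latex
\mathrm{ICER} \;=\; \frac{C_{\text{intervention}} - C_{\text{control}}}{E_{\text{intervention}} - E_{\text{control}}}
```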
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than in the control phase (adjusted δ = GBP76; 95% confidence interval, GBP46–GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
This article describes a formal proof of the Kepler conjecture on dense sphere packings in a combination of the HOL Light and Isabelle proof assistants. This paper constitutes the official published account of the now completed Flyspeck project.
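For readers unfamiliar with the conjecture, it asserts that no packing of congruent spheres in three-dimensional Euclidean space has density exceeding that of the face-centred cubic packing:

```latex
\delta \;\le\; \frac{\pi}{3\sqrt{2}} \;\approx\; 0.74048
```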
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Giant ragweed has been increasing as a major weed of row crops in the last 30 yr, but quantitative data regarding its pattern and mechanisms of spread in crop fields are lacking. To address this gap, we conducted a Web-based survey of certified crop advisors in the U.S. Corn Belt and Ontario, Canada. Participants were asked questions regarding giant ragweed and crop production practices for the county of their choice. Responses were mapped and correlation analyses were conducted among the responses to determine factors associated with giant ragweed populations. Respondents rated giant ragweed as the most or one of the most difficult weeds to manage in 45% of 421 U.S. counties responding, and 57% of responding counties reported giant ragweed populations with herbicide resistance to acetolactate synthase inhibitors, glyphosate, or both herbicides. Results suggest that giant ragweed is increasing in crop fields outward from the east-central U.S. Corn Belt in most directions. Crop production practices associated with giant ragweed populations included minimum tillage, continuous soybean, and multiple-application herbicide programs; ecological factors included giant ragweed presence in noncrop edge habitats, early and prolonged emergence, and presence of the seed-burying common earthworm in crop fields. Managing giant ragweed in noncrop areas could reduce giant ragweed migration from noncrop habitats into crop fields and slow its spread. Where giant ragweed is already established in crop fields, including a more diverse combination of crop species, tillage practices, and herbicide sites of action will be critical to reduce populations, disrupt emergence patterns, and select against herbicide-resistant giant ragweed genotypes. Incorporation of a cereal grain into the crop rotation may help suppress early giant ragweed emergence and provide chemical or mechanical control options for late-emerging giant ragweed.
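The correlation analyses mentioned above can be pictured with a short sketch. The variable names and data are invented for illustration; the survey's actual variables and statistical methods are not specified in the abstract:

```python
# Hedged sketch: rank correlations between county-level survey responses.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 421  # responding U.S. counties
ragweed_difficulty = rng.integers(1, 6, n)  # hypothetical 1-5 difficulty rating
pct_min_till = rng.uniform(0, 100, n)       # hypothetical % minimum tillage
pct_continuous_soy = rng.uniform(0, 100, n) # hypothetical % continuous soybean

for name, var in [("minimum tillage", pct_min_till),
                  ("continuous soybean", pct_continuous_soy)]:
    rho, p = spearmanr(ragweed_difficulty, var)
    print(f"{name}: rho={rho:+.2f}, p={p:.3f}")
```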
This paper reports findings from a qualitative study exploring the experiences of teenage mothers using a nurse-led, home-based contraceptive service designed to prevent repeat unplanned pregnancies. The aim was to understand whether, and how, the service was effective in equipping teenage mothers to make informed choices about contraception, thus preventing a second pregnancy.
Unplanned teenage pregnancy remains a significant focus of health and social policy in the United Kingdom (UK). Despite a long-term pattern of declining conception rates, the UK continues to report higher rates than comparable countries elsewhere in Europe. Current estimates suggest that approximately one fifth of births among under-18s are repeat pregnancies (Teenage Pregnancy Independent Advisory Group, 2009). Services designed to reduce second unplanned pregnancies are therefore an important element in promoting teenage sexual health. However, there has been no UK research exploring this kind of service and the experiences of its users.
We conducted a qualitative interview study. Between 2013 and 2014, we interviewed 40 teenage mothers who had engaged with the nurse-led, home-based contraceptive service.
The data demonstrate that the service was effective in preventing repeat pregnancies in a number of cases. Aspects of the service found to contribute to its effectiveness included privacy, convenience, flexibility, appropriately timed access, the non-judgemental attitude of staff, and ongoing support.
Disease has followed trade, exploration, and conflict, and has magnified their consequences. The middle of the eighteenth century saw few great shifts in patterns of disease, but the advent of what would become a near-global conflict between the European powers, the Seven Years' War, brought heavy mortality to the affected regions. By 1801, the disease had crossed the Atlantic, where it intermittently ravaged the Mediterranean coast of Spain for two decades, severely affecting cities such as Cadiz and Barcelona. As cholera disappeared from the developed world, a new and more terrifying threat emerged from the Orient. Epidemic diseases such as cholera remained a problem in the most deprived parts of Asia and Africa, particularly at times of famine and unrest. Civilian populations suffered as a result of infection and the destruction of sanitary infrastructure. The influenza of 1918–19 marked the end of a century of pandemic disease, but the great upheavals of the preceding decades affected many species other than humans.