When we think about law and empire, it is most accurate, if inelegant, to pluralize everything: empires, colonies, peoples, cultures, sources of law. The transnational turn has dramatic implications for the history of law in the Americas. Most obviously, especially for the period from 1500 to 1812, scholars have become increasingly sensitive to the role of European empires – including the British, French, and Spanish – in shaping America’s legal cultures. Groups of colonists from across Europe brought a multiplicity of understandings of law and social order with them, encountering Indigenous nations with their own rich legal traditions. Colonists used law to justify their wars, govern their settlements, and express their politics. They also used law to forge relationships with the polities around them, including Indigenous nations and rival colonies. During this period of legal superabundance, the questions of whose law, and what law, applied in colonial jurisdictions were always live, and often impossible to answer with certainty. Law changed as it traveled across oceans, plains, and mountains, incorporating local traditions here and obliterating them there, sometimes deliberately, sometimes by accident.
Cooperation among militant organizations contributes to capability but also presents security risks. This is particularly the case when organizations face substantial repression from the state. As a consequence, for cooperation to emerge and persist when it is most valuable, militant groups must have means of committing to cooperation even when the incentives to defect are high. We posit that shared ideology plays this role by providing community monitoring, authority structures, trust, and transnational networks. We test this theory using new, expansive, time-series data on relationships between militant organizations from 1950 to 2016, which we introduce here. We find that when groups share an ideology, and especially a religion, they are more likely to sustain material cooperation in the face of state repression. These findings contextualize and expand upon research demonstrating that connections between violent nonstate actors strongly shape their tactical and strategic behavior.
We present new results on the Galactic bar/bulge transverse velocity structure using Gaia and the VISTA Variables in the Vía Láctea (VVV) survey. Gaia is complemented in high-extinction regions by the multi-epoch infrared VVV observations, for which derived relative proper motions can be tied to Gaia's absolute frame. We extract kinematic maps (both 2D and 3D) of the Galactic bar/bulge, from which we measure the pattern speed of the bar using a novel technique. We focus on the evidence of an X-shaped bulge from the kinematic maps.
Thalénite-(Y), ideally Y3Si3O10F, is a heavy-rare-earth-rich silicate phase occurring in granite pegmatites that may help to illustrate rare-earth element (REE) chemistry and behaviour in natural systems. The crystal structure and mineral chemistry of thalénite-(Y) were analysed by electron microprobe analysis, X-ray diffraction and micro-Raman spectroscopy from a new locality in the peralkaline granite of the Golden Horn batholith, Okanogan County, Washington State, USA, in comparison with new analyses from the White Cloud pegmatite in the Pikes Peak batholith, Colorado, USA. The Golden Horn thalénite-(Y) occurs as late-stage sub-millimetre euhedral bladed transparent crystals in small miarolitic cavities in an arfvedsonite-bearing biotite granite. It exhibits growth zoning with distinct heavy-rare-earth element (HREE) vs. light-rare-earth element (LREE) enriched zones. The White Cloud thalénite-(Y) occurs in two distinct anhedral and botryoidal crystal habits of mostly homogeneous composition. In addition, minor secondary thalénite-(Y) is recognized by its distinct Yb-rich composition (up to 0.8 atoms per formula unit (apfu) Yb). Single-crystal X-ray diffraction analysis and structure refinement reveal Y-site ordering with preferential HREE occupation of Y2 vs. the Y1 and Y3 REE sites. Chondrite normalization shows continuous enrichment of HREE in White Cloud thalénite-(Y), in contrast to Golden Horn thalénite-(Y), which shows a slight depletion of the heaviest REE (Tm, Yb and Lu). The results suggest a hydrothermal origin for the Golden Horn miarolitic thalénite-(Y), whereas a combination of primary magmatic and subsequent hydrothermal processes produced the multiple generations, over a range of spatial scales, of White Cloud thalénite-(Y).
Militant groups, like all organizations, carefully consider the tactics and strategies that they employ. We assess why some militant organizations diversify into multiple tactics while others limit themselves to just one or a few. This is an important puzzle because militant organizations that employ multiple approaches to violence are more likely to stretch state defenses, achieve tactical success, and threaten state security. We theorize that militant organizations respond to external pressure by diversifying their tactics to ensure their survival and continued relevance, and that the primary sources of such pressure are government repression and interorganizational competition. We find consistent support for these propositions in tests of both the Global Terrorism Database (GTD) and Minorities at Risk Organizational Behavior (MAROB) data sets. We bolster these findings with an additional specification that employs ethnic fractionalization in the first stage of a multi-process recursive model. These findings are relevant not only for academic research but for policy as well. While it is difficult for countries to anticipate the character of future tactical choices, they may be able to anticipate which groups will most readily diversify and thereby complicate counterterrorism efforts.
New approaches are needed to safely reduce emergency admissions to hospital by targeting interventions effectively in primary care. A predictive risk stratification tool (PRISM) identifies each registered patient's risk of an emergency admission in the following year, allowing practitioners to identify and manage those at higher risk. We evaluated the introduction of PRISM in primary care in one area of the United Kingdom, assessing its impact on emergency admissions and other service use.
We conducted a randomized stepped wedge trial with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. PRISM was implemented in eleven primary care practice clusters (total thirty-two practices) over a year from March 2013. We analyzed routine linked data outcomes for 18 months.
We included outcomes for 230,099 registered patients, assigned to ranked risk groups.
Overall, the rate of emergency admissions was higher in the intervention phase than in the control phase: adjusted difference in number of emergency admissions per participant per year at risk, delta = .011 (95 percent Confidence Interval, CI .010, .013). Patients in the intervention phase spent more days in hospital per year: adjusted delta = .029 (95 percent CI .026, .031). Both effects were consistent across risk groups.
Primary care activity increased in the intervention phase overall (adjusted delta = .011; 95 percent CI .007, .014), except in the two highest risk groups, which showed a decrease in the number of days with recorded activity.
Introduction of a predictive risk model in primary care was associated with increased emergency episodes across the general practice population and at each risk level, in contrast to the intended purpose of the model. Future evaluation work could assess the impact of targeting of different services to patients across different levels of risk, rather than the current policy focus on those at highest risk.
Emergency admissions to hospital are a major financial burden on health services. In one area of the United Kingdom (UK), we evaluated a predictive risk stratification tool (PRISM) designed to support primary care practitioners to identify and manage patients at high risk of admission. We assessed the costs of implementing PRISM and its impact on health services costs. At the same time as the study, but independent of it, an incentive payment (‘QOF’) was introduced to encourage primary care practitioners to identify high risk patients and manage their care.
We conducted a randomized stepped wedge trial in thirty-two practices, with cluster-defined control and intervention phases, and participant-level anonymized linked outcomes. We analysed routine linked data on patient outcomes for 18 months (February 2013 – September 2014). We assigned standard unit costs in pound sterling to the resources utilized by each patient. Cost differences between the two study phases were used in conjunction with differences in the primary outcome (emergency admissions) to undertake a cost-effectiveness analysis.
We included outcomes for 230,099 registered patients. We estimated a PRISM implementation cost of GBP0.12 per patient per year.
Costs of emergency department attendances, outpatient visits, emergency and elective admissions to hospital, and general practice activity were higher per patient per year in the intervention phase than control phase (adjusted δ = GBP76, 95 percent Confidence Interval, CI GBP46, GBP106), an effect that was consistent and generally increased with risk level.
Despite low reported use of PRISM, it was associated with increased healthcare expenditure. This effect was unexpected and in the opposite direction to that intended. We cannot disentangle the effects of introducing the PRISM tool from those of imposing the QOF targets; however, since across the UK predictive risk stratification tools for emergency admissions have been introduced alongside incentives to focus on patients at risk, we believe that our findings are generalizable.
A predictive risk stratification tool (PRISM) to estimate a patient's risk of an emergency hospital admission in the following year was trialled in general practice in an area of the United Kingdom. PRISM's introduction coincided with a new incentive payment (‘QOF’) in the regional contract for family doctors to identify and manage the care of people at high risk of emergency hospital admission.
Alongside the trial, we carried out a complementary qualitative study of processes of change associated with PRISM's implementation. We aimed to describe how PRISM was understood, communicated, adopted, and used by practitioners, managers, local commissioners and policy makers. We gathered data through focus groups, interviews and questionnaires at three time points (baseline, mid-trial and end-trial). We analyzed data thematically, informed by Normalisation Process Theory (1).
All groups showed high awareness of PRISM, but raised concerns about whether it could identify patients not yet known, and about whether there were sufficient community-based services to respond to care needs identified. All practices reported using PRISM to fulfil their QOF targets, but after the QOF reporting period ended, only two practices continued to use it. Family doctors said PRISM changed their awareness of patients and focused them on targeting the highest-risk patients, though they were uncertain about the potential for positive impact on this group.
Though external factors supported its uptake in the short term, with a focus on the highest risk patients, PRISM did not become a sustained part of normal practice for primary care practitioners.
Mass losses originating from supraglacial ice cliffs at the lower tongues of debris-covered glaciers are a potentially large component of the mass balance, but have rarely been quantified. In this study, we develop a method to estimate ice cliff volume losses based on high-resolution topographic data derived from terrestrial and aerial photogrammetry. We apply our method to six cliffs monitored in May and October 2013 and 2014 using four different topographic datasets collected over the debris-covered Lirung Glacier of the Nepalese Himalayas. During the monsoon, the mean cliff backwasting rate was relatively consistent across cliffs in 2013 (3.8 ± 0.3 cm w.e. d⁻¹) but more heterogeneous in 2014 (3.1 ± 0.7 cm w.e. d⁻¹), when geometric variations between cliffs were also larger. The mean backwasting rate was significantly lower in winter (October 2013–May 2014), at 1.0 ± 0.3 cm w.e. d⁻¹. These results are consistent with estimates of cliff ablation from an energy-balance model developed in a previous study. The ice cliffs lose mass at rates six times higher than estimates of glacier-wide melt under debris, which seems to confirm that ice cliffs provide a large contribution to total glacier melt.
Environmental offsetting involves compensating for the residual adverse impacts of an action on the environment by generating an equivalent benefit elsewhere. As the prevalence of environmental offsetting grows, so does the challenge of translating no-net-loss goals into workable policy. During 2011–2012, the Australian Government developed an Environmental Offsets Policy and an accompanying metric (the Offsets Assessment Guide) to support decision making about offset requirements under the Environment Protection and Biodiversity Conservation Act 1999. Through extensive stakeholder consultation and in collaboration with academic researchers, the Guide was developed with the aim of accounting appropriately for ecological equivalence in a transparent and flexible manner. This paper outlines the Australian Government's environmental offset policy development process, and describes the approach adopted for evaluating the suitability of proposed offsets in meeting the policy goals. The Guide explicitly estimates the extent to which an offset will improve the target biota and/or avert future losses, the degree of confidence that the offset will be implemented successfully, and the time it will take to deliver a conservation benefit. Since implementation of the Environmental Offsets Policy and the Guide, there has been a shift in focus from estimating offset requirements based on simplistic area ratios, toward directly evaluating the components of an offset action that determine its environmental performance. Achieving a balance between scientific robustness and policy workability is an ongoing challenge. The Environmental Offsets Policy and Guide represent an important step towards consistency and transparency in environmental offset decision-making.
The full extent of the height and scale of the Sentinel Range, Antarctica, was not known until reconnaissance flights and scientific traverses in the International Geophysical Year (IGY), 1957–1958. These explorations revealed the range to be twenty miles in length, with a large number of high peaks culminating in Mt. Vinson, the highest on the Antarctic continent at nearly 4900 meters. The discoveries captured the interest of the U.S. and world mountaineering communities, setting off a competition to achieve the first climb of Vinson. The challenge was tempered only by the range's remoteness from the coast of Antarctica and the formidable logistics of mounting a mountaineering expedition. The US, which had the most advanced ski-equipped cargo aircraft, had an established post-IGY policy that prohibited adventure expeditions that could divert logistic resources from the scientific programme. This paper discusses the Mt. Vinson competition within the US and international climbing communities, mounting national pressures to achieve the first climb, and a reversal in policy by the US Antarctic Policy Group that resulted in the 1966–1967 American Antarctic Mountaineering Expedition's first ascents of Vinson and five other high peaks. Today, between 100 and 200 persons climb Mt. Vinson each austral summer.
A series of editorials in this Journal has argued that psychiatry is in the midst of a crisis. The various solutions proposed would all involve a strengthening of psychiatry's identity as essentially ‘applied neuroscience’. Although not discounting the importance of the brain sciences and psychopharmacology, we argue that psychiatry needs to move beyond the dominance of the current, technological paradigm. This would be more in keeping with the evidence about how positive outcomes are achieved and could also serve to foster more meaningful collaboration with the growing service user movement.