Buffelgrass (Pennisetum ciliare (L.) Link) is a drought-tolerant invasive grass that threatens native biodiversity in the drylands of the Americas and Australia. Despite efforts by land managers to control P. ciliare, management approaches tend to have mixed success, treatment results can be poorly communicated among entities, and there are few long-term controlled studies. In this literature review we synthesize data from both peer-reviewed and “grey” literature on the efficacy of management techniques for controlling P. ciliare and their secondary impacts on native plant communities. Our search yielded 42 unique sources containing a total of 229 studies, which we categorized into 10 treatment types: herbicide, seeding, manual removal, fire, grazing, biocontrol, fire + additional treatments, manual removal + additional treatments, herbicide + additional treatments, and herbicide + manual removal. We found that treatments combining multiple techniques in tandem, as well as follow-up treatments, were the most effective at controlling P. ciliare. Fewer than one-third of the studies reported the impacts of management on native species, and the most commonly studied treatment (herbicide, N = 130) showed detrimental impacts on native plant communities. However, the average time between treatment and outcome measurement was only 15 months; we suggest the need for more long-term studies of treatment efficacy as well as secondary impacts on the ecosystem. Lastly, we conducted a second literature review of P. ciliare biology and traits, looking for mechanisms that allow it to alter the invaded environment and gain a competitive advantage over native species. We found evidence of self-reinforcing feedbacks of invasion generated by P. ciliare through its interactions with water availability, nutrient cycling, and disturbance regimes. We developed a conceptual model of P. ciliare invasion based on these feedback loops and offer management considerations based on its invasion dynamics and biology.
The aim of this research was to examine the emergence of wearable technology and the internet of things (IoT) and their current and potential use in the health and care area. There is a wide and ever-expanding range of wearables, devices, apps, data aggregators and platforms allowing the measurement, tracking and aggregation of a multitude of health and lifestyle measures, information and behaviours. The use and application of such technology, and the corresponding richness of data that it can provide, bring the health and care insurance market both potential opportunities and challenges. Insurers across a range of fields are already engaging with this type of technology in their proposition designs in areas such as customer engagement, marketing and underwriting. However, it seems we are just at the start of the journey, on a learning curve to find the optimal practical applications of such technology, with many aspects as yet untried, untested or indeed unsupported by quantifiable evidence. It is clear, though, that technology is only part of the solution: on its own it will not engage or change behaviours, and insurers will need to consider this in terms of implementation and goals. In the first weeks of forming this working party, it became evident that the potential scope of this technology, the information already out there and the pace of its development are almost overwhelming. With many questions as yet unanswered, the paper focuses on pulling together in one place relevant information for the consideration of the health and care actuary, and on opening the reader’s eyes to potential future innovations by drawing on uses of the technology in other markets and spheres, and the “science fiction–like” new technology that is just around the corner. The paper explores:
an overview of wearables and IoT and available measures,
examples of how this technology is currently being used,
risks and challenges,
future technology developments and
what this may mean for the future of insurance.
Insurers who engage now are likely to embark on an evolving business-case and product-development journey, over which they can build up their understanding and interpretation of the data that this technology can provide. An exciting area full of potential – when and how will you get involved?
The Canadian Stroke Best Practice Recommendations suggest that patients suspected of transient ischemic attack (TIA)/minor stroke receive urgent brain imaging, preferably computed tomography angiography (CTA). Yet high requisition rates for non-cerebrovascular patients overburden limited radiological resources, putting patients at risk. We hypothesize that our clinical decision support tool (CDST), developed for risk stratification of TIA in the emergency department (ED) and incorporating Canadian guidelines, could improve CTA utilization.
This was a retrospective study using clinical information from ED patient referrals to an outpatient TIA unit in Victoria, BC, from 2015 to 2016. Actual CTA orders by ED and TIA unit staff were compared with hypothetical CTA ordering had our CDST been used in the ED upon patient arrival.
For 1,679 referrals, clinicians ordered 954 CTAs. Our CDST would have ordered a total of 977 CTAs for these patients. Overall, this would have increased the number of imaged-TIA patients by 89 (10.1%) while imaging 98 (16.1%) fewer non-cerebrovascular patients over the 2-year period. Our CDST would have ordered CTA for 18 (78.3%) of the recurrent stroke patients in the sample.
Our CDST could enhance CTA utilization in the ED for suspected TIA patients, and facilitate guideline-based stroke care. Use of our CDST would increase the number of TIA patients receiving CTA before ED discharge (rather than later at TIA units) and reduce the burden of imaging stroke mimics in radiological departments.
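As a purely illustrative sketch, a rule-based CDST of the kind described above might be structured as follows; the features and thresholds here are hypothetical and are not the actual criteria of the tool:

```python
# Purely illustrative sketch of a rule-based clinical decision support
# check for CTA ordering. The features and thresholds below are
# hypothetical; the actual CDST criteria are not described here.

def cdst_recommends_cta(motor_weakness: bool,
                        speech_disturbance: bool,
                        symptom_duration_min: int) -> bool:
    """Return True if the (hypothetical) rules flag the patient as
    high risk for a cerebrovascular cause, warranting CTA."""
    high_risk_features = motor_weakness or speech_disturbance
    prolonged = symptom_duration_min >= 10
    return high_risk_features and prolonged

# Example: a patient with unilateral weakness lasting 30 minutes
print(cdst_recommends_cta(True, False, 30))  # True under these toy rules
```

A real tool would weigh many more guideline-based features; the point is only that encoding explicit criteria makes ordering decisions reproducible and auditable.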
Globalization blurs the traditional distinction between high and low politics, creating connections between previously discrete issue areas. An important existing literature focuses on how states may intentionally tie policy areas together to enhance cooperation. Building on recent scholarship in historical institutionalism, the authors emphasize how the extent of political discretion enjoyed by heads of state to negotiate and implement international agreements varies across issue areas. When policy domains are linked, so too are different domestic political configurations, each with its own opportunity structures or points of leverage. Opening up the possibility for such variation, the article demonstrates how actors other than states, such as nonstate and substate actors, use the heterogeneity of opportunity structures to influence negotiations and their institutional consequences. The authors examine the theory's purchase on international cooperation over intelligence, privacy, and data exchange in the transatlantic space in the wake of the terrorist attacks of September 11, 2001, and the revelations made public by Edward Snowden in 2013. The findings speak to critical international relations debates, including the role of nonstate actors in diplomacy, the interaction between domestic and international politics, and the consequences of globalization and digital technologies for the relationship between international political economy and security.
This paper reviews recent research into predicting the eating qualities of beef. A range of instrumental and grading approaches are discussed, highlighting implications for the European beef industry. Studies incorporating a number of instrumental and spectroscopic techniques illustrate the potential for online systems to non-destructively measure muscle pH, colour, fat and moisture content of beef with R2 (coefficient of determination) values >0.90. Direct predictions of eating quality (tenderness, flavour, juiciness) and fatty acid content using these methods are also discussed, though success is highly variable. R2 values for instrumental measures of tenderness have been quoted as high as 0.85, though R2 values for sensory tenderness can be as low as 0.01. Discriminant analysis models can improve prediction of variables such as pH and shear force, correctly classifying beef samples into categorical groups with >90% accuracy. Prediction of beef flavour continues to challenge researchers and the industry alike, with R2 values rarely quoted above 0.50, regardless of the instrumental or statistical analysis used. Beef grading systems such as the EUROP and United States Department of Agriculture systems provide carcase classification and some indication of yield. Other systems attempt to classify the whole carcase according to expected eating quality. These are being supplemented by schemes such as Meat Standards Australia (MSA), based on consumer satisfaction for individual cuts. In Australia, MSA has grown steadily since its inception, generating a 10% premium worth $187 million to the beef industry in 2015-16. There is evidence that European consumers would respond to an eating quality guarantee provided it is simple and independently controlled.
A European beef quality assurance system might encompass environmental and nutritional measures as well as eating quality and would need to be profitable, simple, effective and sufficiently flexible to allow companies to develop their own brands.
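The discriminant-analysis approach mentioned above, classifying samples into categorical quality groups from instrumental measurements, can be sketched as follows; the classes, features and values are synthetic, not taken from any cited study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two hypothetical quality classes, three instrumental features each
# (e.g. pH, colour, shear force) -- all values are synthetic.
X0 = rng.normal([5.6, 40.0, 3.0], 0.3, size=(100, 3))
X1 = rng.normal([5.9, 42.0, 5.0], 0.3, size=(100, 3))

mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
# Pooled within-class covariance
S = 0.5 * (np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False))
w = np.linalg.solve(S, mu1 - mu0)          # Fisher discriminant direction
threshold = w @ (mu0 + mu1) / 2

def classify(x):
    return int(w @ x > threshold)

preds = np.array([classify(x) for x in np.vstack([X0, X1])])
truth = np.array([0] * 100 + [1] * 100)
accuracy = np.mean(preds == truth)
print(f"resubstitution accuracy: {accuracy:.2f}")
```

With well-separated synthetic classes the sketch classifies essentially perfectly; the >90% accuracies quoted above refer to real spectral data, where class overlap is the limiting factor.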
Pressure ridges impact the mass, energy and momentum budgets of the sea-ice cover and present an obstacle to transportation through ice-infested waters. Quantifying ridge characteristics is important for understanding total sea-ice mass and for improving the representation of sea-ice dynamics in high-resolution models. Multi-sensor measurements collected during annual Operation IceBridge (OIB) airborne surveys of the Arctic provide new opportunities to assess the sea ice at the end of winter. We present a new methodology to derive ridge sail height from high-resolution OIB Digital Mapping System (DMS) visible imagery. We assess the efficacy of the methodology by mapping the full sail height distribution along 12 pressure ridges in the western and central Arctic. Comparisons against coincident Airborne Topographic Mapper (ATM) elevation anomalies are used to demonstrate the methodology and evaluate DMS-derived sail heights. Sail heights and elevation anomalies were correlated at 0.81 or above. On average mean and maximum sail height agreed with ATM elevation to within 0.11 and 0.49 m, respectively. Of the ridges mapped, mean sail height ranged from 0.99 to 2.16 m, while maximum sail height ranged from 2.1 to 4.8 m. DMS also delivered higher sampling along ridge crests than coincident ATM data.
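The agreement statistics reported above (correlation, mean and maximum sail-height differences) can be illustrated with a toy comparison; the two profiles below are invented, not OIB data:

```python
import numpy as np

# Hypothetical along-ridge profiles (metres): sail heights derived from
# imagery vs coincident laser-altimeter elevation anomalies.
dms_sail = np.array([1.2, 1.5, 2.0, 1.8, 2.4, 1.1, 1.7, 2.2])
atm_anom = np.array([1.3, 1.4, 2.1, 1.7, 2.5, 1.0, 1.8, 2.3])

r = np.corrcoef(dms_sail, atm_anom)[0, 1]      # Pearson correlation
mean_diff = abs(dms_sail.mean() - atm_anom.mean())
max_diff = abs(dms_sail.max() - atm_anom.max())
print(f"r = {r:.2f}, mean diff = {mean_diff:.2f} m, "
      f"max diff = {max_diff:.2f} m")
```

These are the same three summary statistics the study uses to evaluate the DMS-derived sail heights against the ATM reference.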
The discovery of the first electromagnetic counterpart to a gravitational wave signal has generated follow-up observations by over 50 facilities world-wide, ushering in the new era of multi-messenger astronomy. In this paper, we present follow-up observations of the gravitational wave event GW170817 and its electromagnetic counterpart SSS17a/DLT17ck (IAU label AT2017gfo) by 14 Australian telescopes and partner observatories as part of Australian-based and Australian-led research programs. We report early- to late-time multi-wavelength observations, including optical imaging and spectroscopy, mid-infrared imaging, radio imaging, and searches for fast radio bursts. Our optical spectra reveal that the transient source emission cooled from approximately 6 400 K to 2 100 K over a 7-d period and produced no significant optical emission lines. The spectral profiles, cooling rate, and photometric light curves are consistent with the expected outburst and subsequent processes of a binary neutron star merger. Star formation in the host galaxy probably ceased at least a Gyr ago, although there is evidence for a galaxy merger. Binary pulsars with short (100 Myr) decay times are therefore unlikely progenitors, but pulsars like PSR B1534+12 with its 2.7 Gyr coalescence time could produce such a merger. The displacement (~2.2 kpc) of the binary star system from the centre of the main galaxy is not unusual for stars in the host galaxy or stars originating in the merging galaxy, and therefore any constraints on the kick velocity imparted to the progenitor are poor.
With European laser facilities such as the Extreme Light Infrastructure (ELI) and the Helmholtz International Beamline for Extreme Fields (HIBEF) scheduled to come online within the next couple of years, General Atomics, as a major supplier of targets and target components for the High Energy Density Physics community in the United States, is gearing up to meet their demand for large numbers of low-cost targets. Using the production of a subassembly for the National Ignition Facility’s fusion targets as an example, we demonstrate that through automation of assembly tasks, the design of targets and their experimental setup can be fairly complex while keeping the assembly time and cost to a minimum. A six-axis Mitsubishi robot is used in combination with vision feedback and a force–torque sensor to assemble target subassemblies of different scales and designs with minimal change of tooling, allowing for design flexibility and short assembly setup times. Implementing automated measurement routines on a Nikon NEXIV microscope further reduces the effort required for target metrology, while electronic data collection and transfer complete a streamlined target production operation that can be adapted to a large variety of target designs.
With the conclusion of the science phase of the Ice, Cloud and land Elevation Satellite (ICESat) mission in late 2009, and the planned launch of ICESat-2 in late 2015, NASA has recently established the IceBridge program to provide continuity between missions. A major goal of IceBridge is to obtain a sea-ice thickness time series via airborne surveys over the Arctic and Southern Oceans. Typically two laser altimeters, the Airborne Topographic Mapper (ATM) and the Land, Vegetation and Ice Sensor (LVIS), are utilized during IceBridge flights. Using laser altimetry simulations of conventional analogue systems such as ICESat, LVIS and ATM, with the multi-beam system proposed for ICESat-2, we investigate differences in measurements gathered at varying spatial resolutions and the impact on sea-ice freeboard. We assess the ability of each system to reproduce the elevation distributions of two sea-ice models and discuss potential biases in lead detection and sea-surface elevation, arising from variable footprint size and spacing. The conventional systems accurately reproduce mean freeboard over 25 km length scales, while ICESat-2 offers considerable improvements over its predecessor ICESat. In particular, its dense along-track sampling of the surface will allow flexibility in the algorithmic approaches taken to optimize the signal-to-noise ratio for accurate and precise freeboard retrieval.
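A minimal sketch of the lead-based freeboard retrieval underlying these comparisons: the local sea surface is estimated from the lowest elevations, assumed to sample leads, and subtracted from the ice elevations. The numbers are synthetic, not actual ATM, LVIS or ICESat-2 data:

```python
import numpy as np

# Synthetic along-track surface elevations (metres, relative to a
# reference): low values near zero represent open-water leads.
elev = np.array([0.02, 0.01, 0.45, 0.52, 0.03, 0.60, 0.48, 0.02, 0.55])

# Local sea-surface height: mean of the two lowest returns
# (assumed to sample leads)
ssh = np.sort(elev)[:2].mean()
freeboard = elev - ssh
print(f"sea surface = {ssh:.3f} m, "
      f"mean freeboard = {freeboard.mean():.3f} m")
```

Footprint size and spacing matter here because coarse footprints blend lead and floe returns, biasing the sea-surface estimate and hence the freeboard, which is the effect the simulations above are designed to quantify.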
Airborne and spaceborne altimeters provide measurements of sea-ice elevation, from which sea-ice freeboard and thickness may be derived. Observations of the Arctic ice pack by satellite altimeters indicate a significant decline in ice thickness, and volume, over the last decade. NASA’s Ice, Cloud and land Elevation Satellite-2 (ICESat-2) is a next-generation laser altimeter designed to continue key sea-ice observations through the end of this decade. An airborne simulator for ICESat-2, the Multiple Altimeter Beam Experimental Lidar (MABEL), has been deployed to gather pre-launch data for mission development. We present an analysis of MABEL data gathered over sea ice in the Greenland Sea and assess the capabilities of photon-counting techniques for sea-ice freeboard retrieval. We compare freeboard estimates in the marginal ice zone derived from MABEL photon-counting data with coincident data collected by a conventional airborne laser altimeter. We find that freeboard estimates agree to within 0.03 m in the areas where sea-ice floes were interspersed with wide leads, and to within 0.07 m elsewhere. MABEL data may also be used to infer sea-ice thickness, and when compared with coincident but independent ice thickness estimates, MABEL ice thicknesses agreed to within 0.65 m or better.
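Freeboard-to-thickness conversion of the kind mentioned above is conventionally done via hydrostatic equilibrium; a sketch follows, with nominal density values that are assumptions for illustration rather than those used in the study:

```python
# Sea-ice thickness from total (snow) freeboard via hydrostatic
# equilibrium. Densities and inputs are assumed nominal values.
RHO_W, RHO_I, RHO_S = 1024.0, 915.0, 320.0   # kg/m^3: water, ice, snow

def ice_thickness(freeboard_m, snow_depth_m):
    """Total freeboard (snow surface above sea level) -> ice thickness.

    Balancing buoyancy against the weight of ice and snow gives
    h_i = (rho_w * f - (rho_w - rho_s) * h_s) / (rho_w - rho_i).
    """
    return (RHO_W * freeboard_m
            - (RHO_W - RHO_S) * snow_depth_m) / (RHO_W - RHO_I)

# 0.40 m total freeboard with 0.20 m of snow
print(f"{ice_thickness(0.40, 0.20):.2f} m")
```

Because the ice-to-water density ratio is close to one, small freeboard errors (the 0.03-0.07 m quoted above) are amplified roughly tenfold in thickness, consistent with the ~0.65 m thickness agreement reported.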
We demonstrate high vertical yield InAs1-xSbx (0 < x ≤ 0.18) nanowire arrays grown on InP (111)B substrates by catalyst-free selective-area metal-organic chemical vapor deposition. High antimony composition is achieved by pulsing the arsenic flow to reduce the effective arsenic partial pressure while keeping the antimony partial pressure fixed. This increases the antimony vapor phase composition while allowing the antimony partial pressure to be kept low enough to avoid antimony condensation on the growth mask. InAsSb nanowire arrays show strong emission by photoluminescence at 77 K, covering a wavelength range of 3.77-5.08 μm. These results pave the way to engineering optical properties and enabling hybrid integration for nanoscale mid-wavelength infrared optical devices.
We describe a versatile infrared camera/spectrograph, IRIS, designed and constructed at the Anglo-Australian Observatory for use on the Anglo-Australian Telescope. A variety of optical configurations can be selected under remote control to provide several direct image scales and a few low-resolution spectroscopic formats. Two cross-dispersed transmission echelles are of novel design, as is the use of a modified Bowen-Burch system to provide a fast f/ratio in the widest-field option. The drive electronics includes a choice of readout schemes for versatility, and continuous display when the array is not taking data, to facilitate field acquisition and focusing.
The linearity of the detector has been studied in detail. Although outwardly good, slight nonlinearities prevent removal of fixed-pattern noise from the data without application of a cubic linearising function.
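The cubic linearising correction described above can be sketched as a polynomial fit of measured counts against ideal linear counts; the detector response model and coefficients here are assumed for illustration, not the measured IRIS characteristics:

```python
import numpy as np

# Simulated detector: ideal linear counts vs a mildly saturating
# measured response (response model is an assumption for illustration).
true = np.linspace(0.0, 30000.0, 200)          # ideal linear counts
measured = true * (1.0 - 1e-6 * true)          # slight nonlinearity

# Cubic polynomial mapping measured counts back to linear counts
coeffs = np.polyfit(measured, true, deg=3)
corrected = np.polyval(coeffs, measured)

max_err = np.max(np.abs(corrected - true))
print(f"max residual after cubic correction: {max_err:.1f} counts")
```

Once counts are linearised this way, the fixed-pattern component becomes a stable multiplicative term that flat-fielding can remove, which is why the correction precedes that step.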
Specific control and data-reduction software has been written. We also describe a scanning mode developed for spectroscopic imaging.
What is the relationship between globalization and the political power of business? Much of the existing literature focuses on the ability of mobile capital to threaten exit in order to press for more business-friendly rules. In this article, we refine arguments about exit options in global markets by arguing that the relative exit options available to business and other actors are neither fixed nor exogenous consequences of some generically conceived process of globalization. Instead, they themselves are the result of struggles between actors with different interests and political opportunities. Since exit options play a crucial role in determining the relative structural power of business vis-à-vis other actors, we dub the power to shape exit options structuring power, distinguishing it from structural power, and argue that structuring power is crucial to explaining structural power. We identify two channels through which actors can shape exit options – extending jurisdictional reach and reshaping the rules of other jurisdictions – and the factors that will make regulators and business more or less capable of exercising structuring power. We then use two exploratory case studies – one involving privacy regulation, the other accountancy standards – to illustrate how structuring power can work to shape exit options, and thus structural power. We conclude by considering the relationship between structuring power, structural power, and the existing literature in comparative and international political economy.
Landers and Behrend (2015) call for editors and reviewers to resist using heuristics when evaluating samples in research as well as for researchers to cautiously consider choosing the samples appropriate for their research questions. Whereas we fully agree with the former conclusion, we believe the latter can be extended even further to encourage researchers to embrace the strengths of their samples for understanding their research rather than simply defending their samples. We believe that samples are not inherently better or worse but rather better suited for different research objectives. In this commentary, we identify three continua on which research goals can differ to demonstrate that all samples can inform science. Depending on the position of one's research on these continua, different samples exhibit different strengths; the continua described below can be used to anchor one's sample to demonstrate how it can benefit, rather than limit, research conclusions. As discussed in the focal article, researchers will often apologize for their convenience samples as one of a litany of limitations; we hope that researchers will move sampling issues out of the limitations section and into the main discussion.
The objective of the present study was to explore the influence of participation in community-supported agriculture (CSA) on vegetable exposure, vegetable intake during and after the CSA season, and preference related to locally produced vegetables acquired directly from CSA growers.
Quantitative surveys were administered at three time points in two harvest seasons to four groups of CSA participants: new full-paying, returning full-paying, new subsidized and returning subsidized members. Questionnaires included a vegetable frequency measure and measures of new and changed vegetable preference. Comparisons were made between new and returning CSA members and between those receiving subsidies and full-paying members.
The research was conducted in a rural county in New York, USA.
Participants were CSA members who agreed to take part in the study.
Analysis was based on 151 usable questionnaires. CSA participants reported higher intake of eleven different vegetables during the CSA season, with a sustained increase in some winter vegetables. Over half of the respondents reported trying at least one, and up to eleven, new vegetables. Sustained preferences for CSA items were reported.
While those who choose to join a CSA may be more likely to acquire new and expanded vegetable preferences than those who do not, the CSA experience has the potential to enhance vegetable exposure, augment vegetable preference and increase overall vegetable consumption. Dietary patterns encouraged through CSA participation can promote preferences and consumer demand that support local production and seasonal availability. Emphasis on fresh and fresh stored locally produced vegetables is consistent with sustainable community-based food systems.
Owing to the inability of cartilage to heal even minor defects, as well as the prevalence of osteoarthritis, the biological repair of this tissue has been the primary focus of decades of basic science and pre-clinical research. This research focussed on cartilage repair has witnessed marked advances via developments in biomaterials science as well as in tissue engineering methodologies. In this chapter, we review select topics in cartilage tissue engineering, describe current clinical cartilage repair procedures, and discuss ongoing considerations relating to the realization of these advances through pre-clinical animal models.
Cartilage is a collagenous, proteoglycan-rich, and water-saturated flexible soft connective tissue. A single cell type, the chondrocyte, is responsible for cartilage tissue maintenance and homeostasis. The tissue is aneural and avascular in the adult and relies on diffusion for nutrient and waste exchange (Brodin, 1955; Strangeways, 1920). The structure and function of cartilage categorizes these soft connective tissues into three broad groupings: elastic cartilage, fibrocartilage, and hyaline cartilage (Gray and Goss, 1973).
We present a comparative study of the anomalous Nernst effect (ANE), measured at room temperature for magnetite thin films deposited on different substrates, in order to study the effects induced by the substrate, compressive or tensile strain, and structural defects such as anti-phase boundaries (APBs) on the observed ANE. From our preliminary results we observe an increase in the measured ANE under compressive strain compared with tensile strain. Moreover, our results also suggest that the density of APBs plays an important role in the ANE values.
What is the relationship between domestic and international politics in a world of economic interdependence? This article discusses and organizes an emerging body of scholarship, which the authors label the new interdependence approach, addressing how transnational interactions shape domestic institutions and global politics in a world of economic interdependence. This literature makes three important contributions. First, it examines how domestic institutions affect the ability of political actors to construct the rules and norms governing interdependent relations and thus present a source of asymmetric power. Second, it explores how interdependence alters domestic political institutions through processes of diffusion, transgovernmental coordination, and extraterritorial application and in turn how it changes the national institutions mediating internal debates on globalization. Third, it studies the shifting boundaries of political contestation through which substate actors affect decision making in foreign jurisdictions. Given the importance of institutional change to the new interdependence agenda, the authors suggest several instances where historical institutionalist tools might be exploited to address these transnational dynamics, in particular, mechanisms of cross-national sequencing and change strategies of substate actors. As globalization continues, it will be ever more difficult to examine national trajectories of institutional change in isolation from each other. Equally, it will be difficult to understand international institutions without paying attention to the ways in which they both transform and are transformed by domestic institutional politics. 
While the new interdependence approach does not yet cohere as a single voice, the authors believe that it offers an innovative agenda that holds tremendous promise for both comparative and international relations research as it calls on scholars to reconsider the dynamic nature of globalization for global politics.