Triage is a tool used to determine the severity of a patient’s illness or injury within minutes of arrival. This study prospectively assesses the reliability and validity of ANKUTRIAGE, a new computer-based triage decision support tool.
Methods:
ANKUTRIAGE, a 5-level triage tool, was established based on 2 major factors: the patient’s vital signs and the characteristics of the admission complaint. Adult patients admitted to the ED between July and October 2019 were consecutively and independently double-triaged by 2 assessors using the ANKUTRIAGE system. To measure inter-rater reliability, quadratic-weighted kappa coefficients (Kw) were calculated. For validity, associations among urgency levels, resource use, and clinical outcomes were evaluated.
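As a purely illustrative aside (the rating data below are hypothetical, not taken from the study), a minimal sketch of how a quadratic-weighted kappa between two raters assigning 5 ordinal triage levels can be computed:

```python
import numpy as np

def quadratic_weighted_kappa(rater_a, rater_b, n_levels=5):
    """Quadratic-weighted kappa (Kw) for two raters using ordinal levels 1..n_levels."""
    a = np.asarray(rater_a) - 1                      # shift to 0-based category indices
    b = np.asarray(rater_b) - 1
    observed = np.zeros((n_levels, n_levels))
    for i, j in zip(a, b):                           # observed joint rating distribution
        observed[i, j] += 1
    observed /= observed.sum()
    expected = np.outer(observed.sum(axis=1),        # expected distribution if the
                        observed.sum(axis=0))        # two raters were independent
    idx = np.arange(n_levels)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_levels - 1) ** 2  # quadratic penalty
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

# Hypothetical double-triage assignments (levels 1-5) for ten patients
print(quadratic_weighted_kappa([1, 2, 2, 3, 3, 4, 4, 5, 5, 2],
                               [1, 2, 3, 3, 3, 4, 5, 5, 5, 2]))
```

For real analyses the same statistic is available off the shelf, for example as scikit-learn's cohen_kappa_score(a, b, weights="quadratic").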
Results:
The inter-rater reliability between users of ANKUTRIAGE was excellent, with an agreement coefficient (Kw) greater than 0.8 in all compared groups. In the validity phase, the hospitalization rate, intensive care unit admission rate, and mortality rate all decreased from level 1 to level 5. Likewise, resource use decreased significantly with decreasing urgency, from level 1 to level 5 (P < 0.05).
Conclusions:
ANKUTRIAGE proved to be a valid and reliable tool in the emergency department. The results showed that displaying the key discriminator for each complaint to support decision-making leads to high inter-rater agreement, with good correlation between urgency levels and clinical outcomes, as well as between urgency levels and resource consumption.
With the rise of digital technologies, the number and diversity of related tools (phones, computers, 3-D printers, and so on) have markedly increased. This chapter examines how digital objects and other new technologies alter human experiences with the material world.
I argue that Open Science as currently conceptualised and implemented does not take sufficient account of epistemic diversity within research. I use three case studies to exemplify how Open Science threatens to privilege some forms of inquiry over others, thus exacerbating divides within and across systems of practice, and overlooking important sources and forms of epistemic diversity. Building on insights from pluralist philosophy, I then identify four aspects of diverse research practices that should serve as reference points for debates around Open Science: (1) specificity to local conditions, (2) entrenchment within repertoires, (3) permeability to newcomers and (4) demarcation strategies.
NeXL is a collection of Julia language packages (libraries) for X-ray microanalysis data processing. NeXLCore provides basic atomic and X-ray physics data and models, including support for microanalysis-related data types for materials and k-ratios. NeXLMatrixCorrection provides algorithms for matrix correction and iteration. NeXLSpectrum provides utilities and tools for energy-dispersive X-ray spectrum and hyperspectrum analysis, including display, manipulation, and fitting. NeXL is integrated with the Julia language infrastructure and builds on the Gadfly plotting library and the DataFrames tabular data library. When combined with the DrWatson package, NeXL can provide a highly reproducible environment in which to process microanalysis data. Data availability and reproducible data analysis are two keys to scientific reproducibility. Not only should readers of journal articles have access to the data, they should also be able to reproduce the analysis steps that take the data to final results. This paper both discusses the NeXL framework and provides examples of how it can be used for reproducible data analysis.
The IoT raises several questions germane to traditional products liability law and the UCC’s warranty provisions. These include how best to evaluate and remedy consumer harms related to insecure devices, malfunctioning devices, and the termination of services and software integral to a device’s operations. Consider that the modern IoT vehicle with an infotainment system generates massive quantities of data about drivers, and that mobile applications can be used to impact the operations of these vehicles.
Providing high-quality electron images and hyperspectral X-ray maps is a focus of many modern electron microscopy laboratories. Nevertheless, further image processing and annotation are often needed to prepare them for publications and reports. For multi-user facilities, access to processing software can be limited by license costs or by the availability of processing stations. Open-source software running on multiple platforms allows post-acquisition data processing in the lab or on user-owned devices. We developed Probelab ReImager as an efficient and highly customizable replacement for the export functions of our vendor-supplied acquisition software. This article describes its main features and capabilities.
Species inventories are essential to the implementation of conservation policies that mitigate biodiversity loss and maintain ecosystem services and their value to society. This is particularly topical with respect to climate change and direct anthropogenic effects on Antarctic biodiversity, with the identification of the most at-risk taxa and geographical areas becoming a priority. Identification tools are often neglected and considered helpful only to taxonomists. However, the development of new online information technologies and computer-aided identification tools provides an opportunity to promote them to a wider audience, especially considering the emerging generation of scientists who apply an integrative approach to taxonomy. This paper aims to clarify essential concepts and to provide convenient, accessible tools, tips and suggested systems for using and developing knowledge bases (KBs). The software Xper3 was selected as an example of a user-friendly KB management system to give a general overview of existing tools and functionalities through two applications: the ‘Antarctic Echinoids’ and ‘Odontasteridae Southern Ocean (Asteroids)’ KBs. We highlight the advantages provided by KBs over more classical tools and discuss potential future uses, including the production of field guides to aid in the compilation of species inventories for biodiversity conservation purposes.
This chapter explores core arguments surrounding the political use of bots. It details the brief history of their use online. It surveys the academic literature to highlight key themes on the subject of what some researchers call computational propaganda and others call “information operations,” “information warfare,” “influence operations,” “online astroturfing,” “cybertufing,” and many other terms. Computational propaganda, and each of these other concepts to one degree or another, focuses on the ways in which algorithms, automation (most often in the form of political bots), and human curation are used to purposefully distribute misleading information over social media networks.
This chapter explains the meaning of finite element analysis (FEA) and why it is a popular computational method for engineering analysis. Using the structural analysis of a thin plate as an example, it illustrates the procedure of a typical FEA. The history of the development of the method, its applications, and software commercialization are introduced briefly, followed by a description of some of the trends in the current development of the method. The chapter also provides a list of commercial and open-source FEA codes that are popular today.
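To make the typical FEA procedure concrete without reproducing the chapter's thin-plate example, here is a minimal, hypothetical sketch on a far simpler problem, a one-dimensional bar of two linear elements under an axial tip load; the material values and load are illustrative only:

```python
import numpy as np

# Sketch of the standard FEA workflow: discretise, assemble element stiffness
# matrices, apply boundary conditions, solve for the nodal unknowns.
E, A, L = 210e9, 1e-4, 1.0          # Young's modulus (Pa), cross-section (m^2), element length (m)
n_nodes = 3
k_elem = (E * A / L) * np.array([[1.0, -1.0],
                                 [-1.0, 1.0]])   # axial stiffness of one linear element

# 1) Assemble the global stiffness matrix from the element matrices
K = np.zeros((n_nodes, n_nodes))
for e in range(n_nodes - 1):        # element e connects nodes e and e+1
    K[e:e + 2, e:e + 2] += k_elem

# 2) Apply the load and the essential boundary condition (node 0 fixed)
F = np.zeros(n_nodes)
F[-1] = 1000.0                      # 1 kN tip load
free = slice(1, n_nodes)            # degrees of freedom with unknown displacement

# 3) Solve the reduced system K_ff u_f = F_f
u = np.zeros(n_nodes)
u[free] = np.linalg.solve(K[free, free], F[free])
print(u)                            # nodal displacements; tip value equals 2*F*L/(E*A)
```

Real FEA packages follow this same assemble/constrain/solve pattern, only with 2D or 3D element stiffness matrices and far more degrees of freedom.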
This chapter consists of two parts. The first part covers basic aspects of machine autonomy as a technical concept, explaining how it constitutes a form of control over a machine and how degrees of autonomous capability manifest in complex machines. This part is not about weapon systems in particular, but about electromechanical systems in general. It briefly outlines adaptive and intelligent control methodologies and explains that autonomy does not sever the connection between a machine and its operator but only alters the relationship between them. The second part discusses some aspects of how autonomous systems are used in military applications. Specifically, autonomous behaviour will extend to systems ancillary to combat operations, and autonomous systems will be employed in roles wherein they effectively ‘collaborate’ with human soldiers and with each other. Assessing the legal consequences of using autonomous systems in military operations is therefore not simply a matter of studying the properties of a new type of weapon; it is about understanding a new relationship between soldiers and weapons.
For policymakers, this book explains the ramifications under international humanitarian law of a major new field of weapon development, with a focus on questions currently being debated by governments, the United Nations and other bodies. Based on a clear explanation of the principles of autonomous systems and a survey of technologies under active development, as well as some that are in use today, it provides a thorough legal analysis grounded in an understanding of the technological realities of autonomous weapon systems. For legal practitioners and scholars, it describes the legal constraints that will apply to the use of autonomous systems in armed conflict and the measures that will be needed to ensure that the efficacy of the law is maintained. More generally, it serves as a case study in identifying the legal consequences of the use of autonomous systems in partnership with, or in place of, human beings.
The Murchison Widefield Array (MWA) is an electronically steered low-frequency (<300 MHz) radio interferometer with a ‘slew’ time of less than 8 s. Low-frequency (∼100 MHz) radio telescopes are ideally suited for rapid response follow-up of transients due to their large field of view, the inverted spectrum of coherent emission, and the fact that the dispersion delay between a 1 GHz and a 100 MHz pulse is of the order of 1–10 min for dispersion measures of 100–2000 pc/cm3. The MWA has previously been used to provide fast follow-up for transient events including gamma-ray bursts (GRBs), fast radio bursts (FRBs), and gravitational waves, using systems that respond to Gamma-ray Coordinates Network packet-based notifications. We describe a system for automatically triggering MWA observations of such events, based on Virtual Observatory Event standard triggers, which is more flexible, capable, and accurate than previous systems. The system can respond to external multi-messenger triggers, which makes it well suited to searching for prompt coherent radio emission from GRBs, the study of FRBs and gravitational waves, single-pulse studies of pulsars, and rapid follow-up of high-energy superflares from flare stars. The new triggering system can trigger observations in both the regular correlator mode (limited to ≥0.5 s integrations) and the Voltage Capture System (VCS, 0.1 ms integration) of the MWA, and represents a new mode of operation for the MWA. The upgraded standard correlator triggering capability has been in use since MWA observing semester 2018B (July–December 2018), and the VCS and buffered-mode triggers will become available for observing in a future semester.
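For reference, the quoted delay range follows directly from the standard cold-plasma dispersion relation, Δt ≈ 4.15 ms × DM × [(ν_low/GHz)⁻² − (ν_high/GHz)⁻²]; the short arithmetic check below uses the usual 4.15 ms approximation for the dispersion constant:

```python
def dispersion_delay_s(dm_pc_cm3, nu_low_ghz, nu_high_ghz):
    """Arrival-time delay (s) of a pulse between two observing frequencies.

    Standard approximation: dt = 4.15 ms * DM * (nu_low^-2 - nu_high^-2),
    with DM in pc/cm^3 and frequencies in GHz.
    """
    return 4.15e-3 * dm_pc_cm3 * (nu_low_ghz ** -2 - nu_high_ghz ** -2)

# Delay between a 1 GHz and a 100 MHz pulse at the DM extremes quoted above
for dm in (100, 2000):
    print(f"DM = {dm:4d} pc/cm^3: {dispersion_delay_s(dm, 0.1, 1.0) / 60:5.1f} min")
# DM = 100 gives ~0.7 min and DM = 2000 gives ~13.7 min, i.e. of order 1-10 min
```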
A new era in radio astronomy will begin with the upcoming large-scale surveys planned for the Australian Square Kilometre Array Pathfinder (ASKAP). ASKAP started its Early Science programme in October 2017, and several target fields were observed during the array commissioning phase. The Scorpio field was the first observed in the Galactic Plane, in Band 1 (792–1032 MHz), using 15 commissioned antennas. The achieved sensitivity and large field of view already allow new sources to be discovered and thousands of existing ones to be surveyed with improved precision with respect to previous surveys. Data analysis is currently ongoing to deliver the first source catalogue. Given the increased scale of the data, source extraction and characterisation, even in this Early Science phase, have to be carried out in a mostly automated way. This process presents significant challenges due to the presence of extended objects and diffuse emission close to the Galactic Plane.
In this context, we have extended and optimised a novel source finding tool, named Caesar, to allow the extraction of both compact and extended sources from radio maps. A number of developments have been made, driven by the analysis of the Scorpio map and in view of the future ASKAP Galactic Plane survey. The main goals are improved algorithm performance and scalability, as well as better software maintainability and usability within the radio community. In this paper, we present the current status of Caesar and report a first systematic characterisation of its performance for both compact and extended sources using simulated maps. Future prospects are discussed in light of the obtained results.
Chapter 4 analyzes the doctrine of patentable subject matter. Delving into American, European, and Japanese patent jurisprudence, it first describes how these legal systems handle software-related inventions in general. Next, it applies that jurisprudence to 3D printable files to demonstrate why only one of the three 3D printing file formats is likely to constitute patentable subject matter. More intriguingly, it turns out that this file format is of least interest to would-be patent holders. In other words, a patent protection gap exists. Chapter 4 also analyzes jurisdictions’ differential treatment of patent claims directed to electronic signals. The Japanese and European patent systems consider these claims to be patentable subject matter, whereas the U.S. system does not. The upshot is that patent protection for software and 3D printable files is weaker in the United States because most 3D printable files are sold as internet signal transmissions. I argue that the United States should provide protection for signal claims.
Chapter 8 focuses on a specific issue created by 3D printing technology: whether DMFs of purely (or primarily) utilitarian objects should receive copyright protection. Tangible objects dominated by utilitarian concerns do not receive copyright protection. Neither should the corresponding DMF, I argue. This novel argument has attracted criticism, but I defend it as a matter of doctrine and policy. Doctrinally, I argue that U.S. law excludes copyright protection not only for useful articles, but also for designs of (i.e., the shape of) useful articles, even if depicted in a two-dimensional drawing. Moreover, most jurisdictions around the world extend copyright protection only to works containing creativity, and I argue that DMFs of utilitarian objects contain no copyrightable creativity – they are exact, uncreative representations of the unprotected tangible objects. As a matter of policy, allowing copyright protection for DMFs of useful articles would cause copyright law, which is geared toward aesthetic works, to trespass on patent law, which is geared toward utilitarian works. In short, granting these DMFs copyright protection would inhibit the progress of utilitarian innovation.
This article introduces an intuitive understanding of electron Ronchigrams and how they are affected by aberrations. This is accomplished through a portable web application, http://Ronchigram.com. The history of the Ronchigram, the physics which define it, and its visual features are reviewed in the context of aberration-corrected scanning transmission electron microscopy.
Analysis of numerous filamentous structures in an image is often limited by the ability of algorithms to accurately segment complex structures or structures within a dense population. It is even more problematic if these structures grow continuously while a time-series of images is being recorded. To overcome these issues we present DSeg, an image analysis program designed to process time-series image data, as well as single images, to segment filamentous structures. The program includes a robust binary level-set algorithm modified to use size constraints, edge intensity, and past information. We verify our algorithms using synthetic data, differential interference contrast images of filamentous prokaryotes, and transmission electron microscopy images of bacterial adhesion fimbriae. DSeg includes automatic segmentation, tools for analysis, and drift correction, and outputs statistical data such as persistence length, growth rate, and growth direction. The program is available on SourceForge.
The internet has considerable potential to improve health-related food choice at low cost. Online solutions in this field can be deployed quickly and inexpensively, especially if they are not dependent on bespoke devices or offline processes such as the provision and analysis of biological samples. One key challenge is the automated delivery of personalised dietary advice in a replicable, scalable and inexpensive way, using valid nutrition assessment methods and effective recommendations. We have developed a web-based personalised nutrition system (eNutri) which assesses dietary intake using a validated graphical FFQ and automatically provides personalised food-based dietary advice. Its effectiveness was evaluated in an online randomised controlled dietary intervention trial (the EatWellUK study) in which personalised dietary advice was compared with general population recommendations (control) delivered online. The present paper reviews the literature relevant to this work and describes the strategies used during the development of the eNutri app. Its design and source code have been made publicly available under a permissive open-source license, so that other researchers and organisations can benefit from this work. In a context where personalised dietary advice has great potential for health promotion and disease prevention at scale, and yet is not currently offered in the most popular mobile apps, the strategies and approaches described here can help to inform and advance the design and development of technologies for personalised nutrition.