Instrument delivery is a critical part of vascular intervention surgery. Due to the soft-body structure of the instruments, the relationship between manipulation commands and instrument motion is non-linear, making instrument delivery challenging and time-consuming. Reinforcement learning has the potential to learn manipulation skills and automate instrument delivery, enhancing success rates and reducing physicians' workload. However, owing to sample inefficiency when learning from high-dimensional images, existing reinforcement learning algorithms are of limited use on realistic vascular robotic systems. To alleviate this problem, this paper proposes discrete soft actor-critic with auto-encoder (DSAC-AE), which augments SAC-discrete with an auxiliary reconstruction task. The algorithm is applied with distributed sample collection and parameter updates in a robot-assisted preclinical environment. Experimental results indicate that guidewire delivery can be implemented automatically after 50k sampling steps in less than 15 h, demonstrating that the proposed algorithm has great potential to learn manipulation skills for vascular robotic systems.
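The soft value target that SAC-discrete optimises can be written in closed form over a discrete action set. The sketch below illustrates that one formula only; the array shapes and helper names are my own, it is not the paper's implementation, and the auxiliary auto-encoder reconstruction loss is omitted:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_state_value(q_values, policy_logits, alpha):
    """Discrete-SAC soft state value:
    V(s) = sum_a pi(a|s) * (Q(s,a) - alpha * log pi(a|s)),
    i.e. the expected Q plus an entropy bonus weighted by temperature alpha.
    Computable exactly because the action set is finite."""
    pi = softmax(policy_logits)
    log_pi = np.log(pi + 1e-12)  # guard against log(0)
    return (pi * (q_values - alpha * log_pi)).sum(axis=-1)

# Toy check: uniform policy over 4 actions, all Q = 1, alpha = 0.5
# gives V = 1 + 0.5 * ln(4), since the entropy of a uniform
# 4-way distribution is ln(4).
v = soft_state_value(np.ones(4), np.zeros(4), 0.5)
```

Unlike continuous-action SAC, no sampling or reparameterisation is needed here: the expectation over actions is an exact sum, which is one reason the discrete variant pairs well with sample-efficiency tricks such as the reconstruction auxiliary task.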
This chapter describes five “action areas” in which politically achievable changes over the coming two decades could render humankind a lot safer than it is today. For climate change, these include urgent measures for rapid decarbonization, coupled with ramped-up research on technologies for carbon removal and for solar radiation management; new international pacts among small groups of nations for emissions reductions with mutual accountability and incentives; and pre-adaptation measures for dealing effectively with unavoidable harms caused by global warming. For nuclear weapons, these include preparing contingency plans for major or limited nuclear wars, as well as risk-reduction measures that can be implemented today. For pandemics, experts point to four sensible and affordable measures that would greatly reduce the harms of future pandemics. For AI, an immediate challenge will be to prepare for chronic mass unemployment due to rising levels of automation. Finally, the chapter proposes the creation of a new federal agency, the Office for Emerging Biotechnology, to oversee and regulate cutting-edge developments in this field.
This chapter explores narratives that informed two influential attempts to automate and consolidate mathematics in large computing systems during the second half of the twentieth century – the QED system and the MACSYMA system. These narratives were both political (aligning the automation of mathematics with certain cultural values) and epistemic (each laid out a vision of what mathematics entailed such that it could and should be automated). These narratives united political and epistemic considerations especially with regard to representation: how will mathematical objects and procedures be translated into computer languages and operations and encoded in memory? How much freedom or conformity will be required of those who use and build these systems? MACSYMA and QED represented opposite approaches to these questions: preserving pluralism with a heterogeneous modular design versus requiring that all mathematics be translated into one shared root logic. The narratives explored here shaped, explained and justified the representational choices made in each system and aligned them with specific political and epistemic projects.
Workplace automation fueled by technological innovations has been generating social policy implications. Defying the prevalent argument that automation risk triggers employment insecurity and prompts individuals to favour redistribution, this study finds no empirical evidence for that link in the Chinese context. Analysing national survey data, the study instead reveals a strong association between automation risk and popular preference for government responsibility in old-age support. Further analysis suggests that more generous local welfare systems generate a reinforcing effect between automation risk and individuals’ support for government involvement in old-age support. In a welfare system in which major redistributive policies are not employment-dependent, automation risk may not necessarily trigger stronger preferences for short-term immediate protection through redistributive programmes, but may stimulate individuals to project their need for social protection towards middle- or longer-term and employment-related policies. The generosity of subnational welfare systems moderates the formation of individuals’ social policy preferences through policy feedback.
A genre that glorifies brutish masculinity and late Victorian imperialism, boys' 'lost world' adventure fiction has traditionally been studied for its politically problematic content. While attuned to these concerns, this Element approaches the genre from a different angle, viewing adventure fiction as not just a catalogue of texts but a corpus of books. Examining early editions of Treasure Island, King Solomon's Mines, and The Lost World, the Element argues that fin-de-siècle adventure fiction sought to resist the nineteenth-century industrialisation of book production from within. As the Element points out, the genre is filled with nostalgic simulations of material anachronisms – 'facsimiles' of fictional pre-modern paper, printing, and handwriting that re-humanise the otherwise alienating landscape of the modern book and modern literary production. The Element ends by exploring a subversive revival of lost world adventure fiction that emerged in response to ebooks at the beginning of the twenty-first century.
Technological change has squeezed the demand for middle-skill jobs, which typically involve routine-intense tasks. This squeeze has coincided with an increase in the number of part-time working individuals who wish to work more hours. We argue that these two trends are linked. Due to the decline of middle-skill employment, medium-educated workers shift into low-skill employment, increasing the supply of labour for jobs in this segment of the labour market. This pushes those dependent on these jobs to accept part-time jobs, even if these involve fewer hours than they prefer. To empirically assess this claim, we analyse involuntary part-time employment across 16 European countries between 1999 and 2010. Our analysis confirms that a decline in middle-skill employment is associated with an increase in involuntary part-time employment at the bottom end of the labour market. This finding implies that the automation of routine-intense labour worsens employment possibilities in this segment of the labour market. However, we show that training and job creation schemes mitigate this effect. These programmes cushion competition either by providing medium-educated workers with the necessary skills to shift into high-skill jobs or by increasing employment possibilities. Thus, governments have the tools to support workers facing challenges in the knowledge economy.
We propose techniques for the automatic processing of measurement results in the context of golden (typical) device selection and noise figure measurement. Automating measurement-result processing and microwave element modeling speeds up the modeling routine and decreases the risk of errors. The techniques are validated through modeling of 0.15 μm GaAs pHEMTs with 4 × 40 μm and 4 × 75 μm total gate widths. Two test amplifiers were designed using the developed models. The amplifier modeling results agree well with measurements, which confirms the validity of the proposed techniques. The proposed algorithm is potentially applicable to other circuit types (switches, digital circuits, power amplifiers, mixers, oscillators, etc.), though it may require different settings in those cases; in the present work, we validated the algorithm for linear and low-noise amplifiers only.
Artificial intelligence (AI) promises to reshape scientific inquiry and enable breakthrough discoveries in areas such as energy storage, quantum computing, and biomedicine. Scanning transmission electron microscopy (STEM), a cornerstone of the study of chemical and materials systems, stands to benefit greatly from AI-driven automation. However, present barriers to low-level instrument control, as well as generalizable and interpretable feature detection, make truly automated microscopy impractical. Here, we discuss the design of a closed-loop instrument control platform guided by emerging sparse data analytics. We hypothesize that a centralized controller, informed by machine learning combining limited a priori knowledge and task-based discrimination, could drive on-the-fly experimental decision-making. This platform may unlock practical, automated analysis of a variety of material features, enabling new high-throughput and statistical studies.
In this chapter, we summarize the content of our book and discuss current limitations of PLC technology. Building on these limitations, we highlight new research areas for residential and enterprise PLC networks.
This study introduces automation into a Schumpeterian growth model to explore the effects of R&D and automation subsidies. An R&D subsidy increases innovation and growth but decreases the share of automated industries and the degree of capital intensity in the aggregate production function; an automation subsidy has the opposite effects on these macroeconomic variables. Calibrating the model to US data, we find that raising the R&D subsidy increases the welfare of high-skill workers but decreases the welfare of low-skill workers and capital owners, whereas increasing the automation subsidy increases the welfare of high-skill workers and capital owners but decreases the welfare of low-skill workers. Therefore, whether the government should subsidize innovation or automation depends on how it evaluates the welfare gains and losses of different agents in the economy.
This chapter takes up the ways Marxist cultural theory has explored the intensified mechanization or automation of labor. It suggests that the relationship between labor-saving industrial technology and cultural transformation has been central to twentieth-century Marxist thought, from Frankfurt School theorists Walter Benjamin and Theodor Adorno; to mid-century forebears of cultural studies like Antonio Gramsci and Herbert Marcuse; to activist thinkers C. L. R. James, Raya Dunayevskaya, and James Boggs; to the Italian autonomists. Despite and often alongside their persistent interest in consumption and the commodity, these thinkers have also explored the ways transformations in the “instruments of labor” affect productive workers themselves. The chapter concludes by drawing attention to the genre of the workers’ inquiry, which yokes structural analysis of capital accumulation to a careful rendering of workers’ own experiences, and by calling for future workers’ inquiries exploring technology and the exploitation of university labor.
The opening drops the reader into the story of Captain "Sully" Sullenberger and Co-pilot Jeffrey Skiles as they take off on their fateful flight in February of 2009. What is likely a familiar story is given new life via quotes from the flight recordings interspersed with a vivid retelling of the scene and their lightning-fast decisions. The first section ends with a cliffhanger: Captain Sully’s famous words: “Brace for impact.” The remainder of the chapter is a journey through vivid examples of technology paired with human stories: from the effects on aviation safety after terrain detectors were mandated to the unexpected tragedies when the detectors "cried wolf" too often.
The rate of innovation in Information Technology (IT) has slowed down over time. The slowdown is evident both in the data on quality-adjusted prices of computers, and performance of microprocessors used in computers. The model in this paper shows that an IT–labor elasticity of substitution that is greater than 1 can explain the slowdown. With an elasticity of substitution greater than 1, however, slowing innovation can result in sustained labor productivity and output growth. Sustained growth is possible because an IT–labor elasticity of substitution greater than 1 results in a continuously increasing share of IT in production costs, which counteracts the effect of slowing innovation on labor productivity and output growth. In this environment of slowing innovation, increasing IT share and sustained growth, employment can increase or decrease, depending on the values of the IT–labor elasticity of substitution and the price elasticity of demand for IT-enabled consumption goods.
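The mechanism behind the rising IT cost share can be made concrete with a standard CES aggregate (the notation here is mine, not necessarily the paper's). With IT input $K$ and labour $L$,

$$ Y \;=\; \Bigl(a\,K^{\frac{\sigma-1}{\sigma}} + (1-a)\,L^{\frac{\sigma-1}{\sigma}}\Bigr)^{\frac{\sigma}{\sigma-1}}, $$

where $\sigma$ is the IT–labor elasticity of substitution. Under cost minimisation with IT price $p$ and wage $w$, the IT cost share is

$$ s_K \;=\; \frac{a^{\sigma} p^{1-\sigma}}{a^{\sigma} p^{1-\sigma} + (1-a)^{\sigma} w^{1-\sigma}}. $$

When $\sigma > 1$, the exponent $1-\sigma$ is negative, so a falling quality-adjusted IT price $p$ raises $p^{1-\sigma}$ and hence $s_K$: even as innovation slows, the ever-growing IT share keeps contributing to labor productivity and output growth, which is the counteracting force the abstract describes.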
Intended to simplify the benefit system and ‘make work pay’, Universal Credit (UC) is the UK’s first ‘digital by design’ benefit. Proponents of UC highlight the greater efficiency and effectiveness of digitalisation, while critics point to costly IT write-offs and the ‘digital divide’ between people with the skills and resources to access digital technologies, and those without. Less attention has been paid to automation in UC and its effects on the people subject to these rapidly developing technologies. Findings from research exploring couples’ experiences of claiming UC suggest that automated processes for assessing entitlement and calculating payment may be creating additional administrative burdens for some claimants. Rigid design parameters built into UC’s digital architecture may also restrict options for policy reform. The article calls for a broadening of thinking and research about digitalisation in welfare systems to include questions of administrative burden and the wider effects and impacts on claimants.
The justice system is infamously slow in adopting technology.1 Although recent years have seen an exponential increase in the role played by technology within the justice system,2 the legal industry has not kept pace with technical advancements to the same extent as other sectors. As former Australian High Court Justice Michael Kirby put it, a Dickensian lawyer would still feel at home in the court halls of the 1990s, while a Dickensian doctor would not comprehend a contemporaneous hospital, owing to the immense modernisation that had taken place over the same period.3 However, in the COVID-19 era, courts and tribunals have been forced to conduct remote hearings, which imposes a degree of technological awareness and proficiency on the justice system.
This chapter introduces automation techniques for intracytoplasmic sperm injection (ICSI). These techniques are used for sperm motility and morphology quantification, sperm immobilization, sperm aspiration, oocyte orientation control, and sperm injection. Emerging techniques are also described, such as computer-assisted sperm analysis, laser immobilization, deep-learning-based polar body detection, and piezo drilling for reducing cell deformation during penetration. Finally, outlooks for future development of automation techniques in ICSI are provided.
As the practice of law is heavily impacted by recent and ongoing advances in technology, it is difficult to provide an accurate picture of either the status quo or the future of the profession whilst focusing on details. Therefore, this chapter focuses on general principles as well as underlying developments and mechanics, rather than detailed accounts of which legal department, agency or law firm is using which techniques or technologies.
Legal tech (LT) companies operating in litigation predominantly address B2C relationships, which is odd against the overall LT backdrop where B2B solutions prevail.1 A ‘no win no fee’ policy, whereby consumers are only charged for success, is popular among LT companies that manage claims.2 Even though their contingency fees tend to be significant,3 they attract consumers who would otherwise have abandoned a claim as a result of rational apathy due to its small value. The automated management of claims has impacted consumer access to justice as the activity of LT companies has led to an increase in redress for small value claims.4
We introduce a novel amortised resource analysis couched in a type-and-effect system. Our analysis is formulated in terms of the physicist’s method of amortised analysis and is potential-based. The type system makes use of logarithmic potential functions and is the first such system to exhibit logarithmic amortised complexity. With our approach, we target the automated analysis of self-adjusting data structures, like splay trees, which so far have been analysed only manually in the literature. In particular, we have implemented a semi-automated prototype, which successfully analyses the zig-zig case of splaying, once the type annotations are fixed.
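For readers unfamiliar with the physicist's method, the classic logarithmic potential for splay trees (due to Sleator and Tarjan; the notation below is mine, and the paper's type system generalises this idea) assigns to a tree $T$ the potential

$$ \Phi(T) \;=\; \sum_{v \in T} \log_2 |T_v|, $$

where $|T_v|$ is the number of nodes in the subtree rooted at $v$. The amortised cost of the $i$-th operation is then defined as

$$ a_i \;=\; c_i + \Phi(T_i) - \Phi(T_{i-1}), $$

with $c_i$ the actual cost, so the sum telescopes:

$$ \sum_{i=1}^{n} c_i \;=\; \sum_{i=1}^{n} a_i + \Phi(T_0) - \Phi(T_n). $$

Since each $a_i$ is $O(\log n)$ for splaying and the potential difference is bounded, the total actual cost is $O(n \log n)$. Embedding such logarithmic potentials into type annotations is what allows the analysis of cases like zig-zig to be carried out semi-automatically.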