Book contents
- Frontmatter
- Contents
- List of contributors
- Acknowledgements
- PART I Introduction
- PART II Meanings of autonomy and human cognition under automation
- PART III Autonomous weapons systems and human dignity
- PART IV Risk, transparency and legal compliance in the regulation of autonomous weapons systems
- PART V New frameworks for collective responsibility
- 11 The obligation to exercise discretion in warfare: why autonomous weapons systems are unlawful
- 12 Autonomy and uncertainty: increasingly autonomous weapons systems and the international legal regulation of risk
- PART VI New frameworks for individual responsibility
- PART VII Conclusion
- Index
12 - Autonomy and uncertainty: increasingly autonomous weapons systems and the international legal regulation of risk
from PART V - New frameworks for collective responsibility
Published online by Cambridge University Press: 05 August 2016
Summary
Uncertainty and its problems
The debate concerning the law, ethics and policy of autonomous weapons systems (AWS) remains at an early stage, but one of the consistent emergent themes is that of uncertainty. Uncertainty presents itself as a problem in several different registers: first, there is the conceptual uncertainty surrounding how to define and debate the nature of autonomy in AWS. Contributions to this volume from roboticists, sociologists of science and philosophers of science demonstrate that within and without the field of computer science, no stable consensus exists concerning the meaning of autonomy or of autonomy in weapons systems. Indeed, a review of definitions invoked during a recent expert meeting convened by states parties to the Convention on Certain Conventional Weapons shows substantially different definitions in use among military experts, computer scientists and international humanitarian lawyers.
At stake in the debate over definitions are regulatory preoccupations and negotiating postures over a potential pre-emptive ban. A weapons system capable of identifying, tracking and firing on a target without human intervention, and in a manner consistent with the humanitarian law obligations of precaution, proportionality and distinction, is a fantastic ideal type. Defining AWS in such a way truncates the regulatory issues to a simple question of whether such a system is somehow inconsistent with human dignity – a question about which states, ethicists and lawyers can reasonably be expected to disagree. However, the definition formulated in this manner also begs important legal questions in respect of the design, development and deployment of AWS. Defining autonomous weapons in terms of this pure type reduces almost all questions of legality to questions of technological capacity, to which a humanitarian lawyer's response can only be: ‘If what the programmers and engineers claim is true, then … ’ The temporally prior question of whether international law generally, and international humanitarian law (IHL) in particular, prescribes any standards or processes that should be applied to the design, testing, verification and authorization of the use of AWS is not addressed. Yet these ex ante considerations are urgently in need of legal analysis and may prove to generate deeper legal problems.
Type: Chapter
In: Autonomous Weapons Systems: Law, Ethics, Policy, pp. 284–300
Publisher: Cambridge University Press
Print publication year: 2016