
Chapter Nine - Dealing With Daemons: Trust in Autonomous Systems

Published online by Cambridge University Press:  28 February 2024

Philippe Sormani (Université de Lausanne, Switzerland)
Dirk vom Lehn (King's College London)

Summary

Introduction: Trust and artificial intelligence (AI)

Daemons, or demons, may signify diverse entities. In multitasking computer systems, daemons are programs that run as background processes without the supervision of a user. In the entirely different context of thought experiments, philosophers and scientists occasionally imagine demons as agents that act in ways that pose intellectual challenges or highlight apparent paradoxes. In this chapter, both senses apply to the operations of a set of technologies with automated features. By extension, this argument may be relevant to the contemporary discussion of AI and algorithmic systems.

For the past two decades, the field of AI has undergone significant developments. What was once a marginal strand of research in computer science has now been implemented in a wide range of practices, from highly technical expert systems to common and mundane applications. The neurosurgeon segmenting her scan of a brain tumor and the teenager applying a beauty filter to his Snapchat Story can both draw on the power of convolutional neural networks. As another example, although the data differs, uniquely tailored recommendations for medical treatments and suggestions for music may build on similar clustering techniques. The content-agnostic nature of machine learning methods allows for their application across the board.

The profusion of algorithmic systems has also been accompanied by concerns about their trustworthiness. Systems that sort, score, recommend or otherwise inform or make decisions affecting human experience have been identified as posing many risks (European Commission 2020). Trustworthy AI has become a research field of its own, with dedicated venues. For example, the ACM FAccT conference brings together academics and practitioners interested in fairness, accountability and transparency in socio-technical systems. One goal of this research is to make AI trustworthy and explainable (i.e., understandable and predictable by humans). As topics for computer science, these concerns may have a certain novelty to them. However, the notions of trust and accountability have a long history in ethnomethodology.

Garfinkel developed his ideas about trust most notably in “A Conception of, and Experiments with, ‘Trust’ as a Condition of Stable Concerted Actions” (Garfinkel 1963), in which he argues that trust is a necessary condition for understanding the events of daily life. According to Watson, though, this study of trust belongs to Garfinkel's early work, and not everyone regards it as “fully fledged EM analysis” (2009, 489).

Publisher: Anthem Press
Print publication year: 2023
