
The Ethics of Algorithms in Healthcare

Published online by Cambridge University Press: 20 January 2022

Christina Oxholm
Affiliation:
Department for the Study of Culture, Faculty of Humanities, University of Southern Denmark, 5230 Odense, Denmark
Anne-Marie S. Christensen*
Affiliation:
Department for the Study of Culture, Faculty of Humanities, University of Southern Denmark, 5230 Odense, Denmark
Anette S. Nielsen
Affiliation:
Clinical Alcohol Research Unit, Department of Clinical Research, Faculty of Health, University of Southern Denmark, 5230 Odense, Denmark
*Corresponding author. Email: amsc@sdu.dk

Abstract

The amount of patient data available to healthcare practitioners is growing rapidly, and this increase is becoming a problem, as practitioners are often unable to fully survey and process the data relevant to the treatment or care of a patient. Consequently, there are currently several efforts to develop systems that aid healthcare practitioners in reading and processing patient data and thereby provide them with a better foundation for decision-making about the treatment and care of patients, as well as efforts to develop algorithms that suggest such decisions. However, the development of these systems and algorithms raises several concerns related to the privacy of patients, the patient–practitioner relationship, and the autonomy of healthcare practitioners. The aim of this article is to provide a foundation for understanding the ethical challenges related to the development of a specific form of data-processing system, namely clinical algorithms.

Type
Departments and Columns
Copyright
© The Author(s), 2022. Published by Cambridge University Press


References

Notes

1. Mittelstadt BD, Allo P, Taddeo M, Wachter S, Floridi L. The ethics of algorithms: Mapping the debate. Big Data & Society 2016;3(2):1–21.

2. Green G, Defoe EC. What is a clinical algorithm? Clinical Pediatrics 1978;17(5):457–63.

3. Kimmel SE, French B, Kasner SE, Johnson JA, Anderson JL, Gage BF, et al. A pharmacogenetic versus a clinical algorithm for warfarin dosing. The New England Journal of Medicine 2013;369:2283–93.

4. Jacobson TA. Toward ‘pain-free’ statin prescribing: Clinical algorithm for diagnosis and management of myalgia. Mayo Clinic Proceedings 2008;83(6):687–700.

5. De Jager PL, Chibnik LB, Cui J, Reischl J, Lehr S, Simon KC, et al. Integration of genetic risk factors into a clinical algorithm for multiple sclerosis susceptibility: A weighted genetic risk score. The Lancet Neurology 2009;8(12):1111–19.

6. Berner ES. Clinical decision support systems: State of the art. Agency for Healthcare Research and Quality. AHRQ Publication No. 09–0069; 2009 June:4.

7. See note 6, Berner 2009, at 4.

8. Silber MH, Ehrenberg BL, Allen RP, Buchfuhrer MJ, Earley CJ, Hening WA, et al. An algorithm for the management of restless legs syndrome. Mayo Clinic Proceedings 2004;79(7):916–22.

9. Bousquet J, Schünemann HJ, Hellings PW, Arnavielhe S, Bachert C, Bedbrook A, et al. MACVIA clinical decision algorithm in adolescents and adults with allergic rhinitis. Journal of Allergy and Clinical Immunology 2016;138(2):367–74.

10. Hughes J. An algorithm for choosing among smoking cessation treatments. Journal of Substance Abuse Treatment 2008;34(4):426–32.

11. Schurink C, Lucas PJF, Hoepelman IM, Bonten MJM. Computer-assisted decision support for diagnosis and treatment of infectious diseases in intensive care units. The Lancet Infectious Diseases 2005;5(5):305–12.

12. Council on Children with Disabilities, Section on Developmental Behavioral Pediatrics, Bright Futures Steering Committee, Medical Home Initiatives for Children with Special Needs Project Advisory Committee. Identifying infants and young children with developmental disorders in the medical home: An algorithm for developmental surveillance and screening. Pediatrics 2006;118:405–20.

13. Berger JS, Jordan CO, Lloyd D, Blumenthal RS. Screening for cardiovascular risk in asymptomatic patients. Journal of the American College of Cardiology 2010;55(12):1169–77.

14. Nelson SJ, Blois MS, Tuttle MS, Erlbaum M, Harrison P, Kim H, et al. Evaluating RECONSIDER – A computer program for diagnostic prompting. Journal of Medical Systems 1985;9(5–6):379–88.

15. Uzoka FME, Osuji J, Obot O. Clinical decision support systems (DSS) in the diagnosis of malaria: A case comparison of two soft computing methodologies. Expert Systems with Applications 2011;38(3):1537–53.

16. Graber ML, Mathew A. Performance of a web-based clinical diagnosis support system for internists. Journal of General Internal Medicine 2008;23(1):37–40.

17. Goldhahn J, Rampton V, Spinas GA. Could artificial intelligence make doctors obsolete? BMJ 2018;363:k4563.

18. From here on and for the sake of brevity, we use the term “clinical algorithm” to refer to both CDS and CDM algorithms. We will address the two types of algorithms separately when their different implications are relevant to the discussion.

19. Nuffield Council on Bioethics. The Collection, Linking and Use of Data in Biomedical Research and Healthcare: Ethical Issues. London: Nuffield Council on Bioethics; 2015, at Chap. 1.

20. See note 19, Nuffield Council on Bioethics 2015, at Chap. 1.

21. Regulation (EU) 2016/679 of the European Parliament, General Data Protection Regulation (GDPR); available at https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016R0679&from=EN#d1e1797-1-1 (last accessed 15 Nov 2019).

22. Beauchamp T, Childress J. Principles of Biomedical Ethics. New York: Oxford University Press; 2013.

23. The distinction between CDS and CDM algorithms is not relevant in the context of the challenge of patient privacy because the implications of CDS and CDM algorithms both concern data flow.

24. Fairweather NB, Rogerson S. A moral approach to electronic patient records. Medical Informatics and the Internet in Medicine 2001;26(3):219–34.

25. Llandres N. Ethical problems caused by the use of informatics in medicine. In: Collste G, ed. Ethics and Information Technology. New Delhi, India: New Academic Publishers; 1998:76.

26. Cato KD, Bockting W, Larson E. Did I tell you that? Ethical issues related to using computational methods to discover non-disclosed patient characteristics. Journal of Empirical Research on Human Research Ethics 2016;11(3):214–19.

27. DeCew J. Privacy. The Stanford Encyclopedia of Philosophy (spring 2018 edition). Zalta EN, ed.; available at https://plato.stanford.edu/entries/privacy/ (last accessed 4 Nov 2019).

28. See also Westin A. Privacy and Freedom. New York: Atheneum; 1967.

29. This is also the fundamental understanding of privacy motivating the discussion of Mittelstadt BD, Allo P, Taddeo M, Wachter S, Floridi L. The ethics of algorithms: Mapping the debate. Big Data & Society 2016;3(2):1–21.

30. See also Winkelstein PS. Ethical and social challenges of electronic health information. In: Chen H, Fuller SS, Friedman C, Hersh W, eds. Medical Informatics. Integrated Series in Information Systems. Boston, MA: Springer Science; 2005:144–5.

31. Beyond the HIPAA Privacy Rule: Enhancing Privacy, Improving Health Through Research; available at http://www.ncbi.nlm.nih.gov/books/NBK9579/.

32. See note 26, Cato et al. 2016, at 215.

33. See note 24, Fairweather, Rogerson 2001. See also Kluge EHW. Health information, the fair information principles and ethics. Methods of Information in Medicine 1994;33:336–45.

34. See note 24, Fairweather, Rogerson 2001, at 224.

35. See note 27, DeCew 2018.

36. See note 1, Mittelstadt et al. 2016, at 10.

37. See note 19, Nuffield Council on Bioethics 2015, at 1–198.

38. See note 21, GDPR, chap. II, article 9(2) g.

39. See note 19, Nuffield Council on Bioethics 2015, at 46–56.

40. For a contextually sensitive understanding of norms of privacy, see Nissenbaum H. Privacy as contextual integrity. Washington Law Review 2004;79(1):119–58. Nissenbaum suggests two types of informational norms: norms of appropriateness and norms of distribution.

41. See note 26, Cato et al. 2016.

42. Christensen AMS. The institutional framework of professional virtue. In: Carr D, ed. Cultivating Moral Character and Virtue in Professional Practice. London: Routledge; 2018:124–34.

43. Walter Z, Lopez MS. Physician acceptance of information technologies: Role of perceived threat to professional autonomy. Decision Support Systems 2008;46:206–15.

44. Maddox TM, Rumsfeld JS, Payne PRO. Questions for artificial intelligence in health care. JAMA 2019;321(1):31–2.

45. High-Level Expert Group on Artificial Intelligence (AI HLEG). Ethics guidelines for trustworthy AI. Brussels: European Commission; 2019. Compare Shahriari K, Shahriari M. IEEE Standard Review—Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Artificial Intelligence and Autonomous Systems. 2017 IEEE Canada International Humanitarian Technology Conference (IHTC); 2017:197–201.

46. See note 1, Mittelstadt et al. 2016, at 6.

47. Turek M. Explainable Artificial Intelligence (XAI). US Department of Defense Advanced Research Projects Agency; available at http://www.darpa.mil/program/explainable-artificial-intelligence (last accessed 6 Aug 2019).

48. Watson D, Krutzinna J, Bruce IN, Griffiths CE, McInnes IB, Barnes MR, et al. Clinical applications of machine learning algorithms: Beyond the black box. BMJ 2019;364:1886–90, at 1886.

49. We want to thank an anonymous reviewer for this important point.

50. Kim JT. Application of machine and deep learning algorithms in intelligent clinical decision support systems in healthcare. Health & Medical Informatics 2018;9(5):321–6.

51. See note 48, Watson et al. 2019. For a discussion of validation and regulation of black box algorithms in healthcare, see Price WN. Big data and black-box medical algorithms. Science Translational Medicine 2018;10(471):eaao5333.

52. Kaba R, Sooriakumaran P. The evolution of the doctor–patient relationship. International Journal of Surgery 2007;5(1):57–65.

53. For another way of investigating the influence of algorithms and AI on the patient–practitioner relationship, see LaRosa E, Danks D. Impact on trust of healthcare AI. AIES’18, Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society; 2018:210–15.

54. Mead N, Bower P. Patient-centeredness: A conceptual framework and review of the empirical literature. Social Science & Medicine 2000;51:1087–110.

55. See note 54, Mead, Bower 2000.

56. Hershey PT. A definition for paternalism. The Journal of Medicine and Philosophy: A Forum for Bioethics and Philosophy of Medicine 1985;10(2):171–82.

57. See note 54, Mead, Bower 2000.

58. See note 52, Kaba, Sooriakumaran 2007, at 61.

59. See note 54, Mead, Bower 2000, at 1090.

60. See note 54, Mead, Bower 2000, at 1091.