
Some Reflections on Dignity as an Alternative Legal Concept in Data Protection Regulation

Published online by Cambridge University Press: 06 March 2019

Anne de Hingh*
Affiliation:
Internet Law, Department of Transnational Legal Studies, Faculty of Law, VU University Amsterdam. E-mail: a.e.de.hingh@vu.nl

Abstract

As the use of the Internet and online platforms grows, the scale of personal data collection and processing, and the turnover generated with it, has increased correspondingly.1 At the same time, public awareness is growing that the Internet has turned into a genuine profiling and advertisement machine, as well as a powerful surveillance instrument. More people today are concerned about the ways in which public and private actors store and use private information. Many individuals find that they lose sight of the consequences once they have consented to the collection of their personal data, which sometimes includes their most intimate information. The Snowden revelations and the recent Facebook and Cambridge Analytica scandal have only reinforced this public awareness.

Objections to these data processing practices cannot be explained as breaches of data protection or privacy regulation alone. This Article argues that recently adopted regulations fail to resolve the unease of data subjects because other, more fundamental, values are at stake. A different or complementary ethical and legal framework is needed, first to interpret this widely felt unease about current data practices, and second to confront future developments on the data market. The concept of human dignity may offer a helpful perspective in this respect. In the context of data processing, human dignity is generally interpreted in a quite specific manner, namely as contributing to the empowerment and self-determination of autonomous individuals. It can be argued, however, that human dignity, in the context of the commodification and commoditization of online personal data, should be seen in a different, quite opposite, light. In sum, in the future regulation of privacy and data protection, attention should shift towards the more constraining dimensions of human dignity.

Type
Articles
Copyright
Copyright © 2018 by German Law Journal GbR 

References

1 The growth of Dutch internet use is reflected in the results of a recent survey: of the 17 million Dutch citizens, 11.5 million use WhatsApp, 10.8 million are on Facebook, 8 million use YouTube, 4.4 million are members of LinkedIn, and 4.1 million use Instagram. See Newcom, Nationale Social Media Onderzoek (Jan. 29, 2018), https://www.newcom.nl/socialmedia2018.

2 “Personal data” means any information relating to an identified or identifiable natural person (“data subject”); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person. See Art. 4(1) of Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) [hereinafter GDPR].

3 See Schneier, Bruce, Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World (2015).

4 See Mayer-Schönberger, Viktor & Ramge, Thomas, Reinventing Capitalism in the Age of Big Data (2018).

5 Harari, Yuval Noah, Homo Deus: A Brief History of Tomorrow (2015); Lohr, Steve, Data-ism: The Revolution Transforming Decision Making, Consumer Behavior, and Almost Everything Else (2015).

6 See Surveillance Studies Centre at Queen's University, The Big Data Surveillance Project, Surveillance Stud. Centre, http://www.sscqueens.org/projects/big-data-surveillance.

7 Morozov, Evgeny, Digital Technologies and the Future of Data Capitalism, Soc. Eur. (Jun. 23, 2015), https://www.socialeurope.eu/digital-technologies-and-the-future-of-data-capitalism.

8 See Zuboff, Shoshana, The Secrets of Surveillance Capitalism, Frankfurter Allgemeine Zeitung (Mar. 5, 2016), www.shoshanazuboff.com.

9 Cambridge Analytica approached Facebook users through the Amazon Mechanical Turk platform (mturk.com) and paid them one to two dollars to download and use a personality quiz app (thisisyourdigitallife). The quiz “scraped” the information from the profiles of 320,000 Facebook users as well as detailed information from the profiles of their friends. See Tufekci, Zeynep, Facebook's Surveillance Machine, N.Y. Times (Mar. 19, 2018), https://www.nytimes.com/2018/03/19/opinion/facebook-cambridge-analytica.html.

10 The Dutch oversight committee (CTIVD) concluded that one of the data sets was obtained unlawfully, namely without the permission of the Minister of the Interior. See CTIVD, Toezichtsrapport nr. 55, Over het verwerven van door derden op internet aangeboden bulkdatasets door de AIVD en de MIVD [On the Acquisition by the AIVD and the MIVD of Bulk Data Sets Offered by Third Parties on the Internet] (2017), https://www.ctivd.nl/documenten/rapporten/2018/02/13/index.

11 See, e.g., Coudert, Fanny, The Europol Regulation and Purpose Limitation: From the ‘Silo-Based Approach’ to … What Exactly?, 3 EDPL 313–24 (2017); Purtova, N., Between the GDPR and the Police Directive: Navigating Through the Maze of Information Sharing in Public-Private Partnerships, 8 IDPL 1, 13 (2018).

12 Roessler, Beate, Should Personal Data Be a Tradable Good? On the Moral Limits of Markets in Privacy, in Social Dimensions of Privacy: Interdisciplinary Perspectives 141–61 (Beate Roessler & Dorota Mokrosinska eds., 2015); Sandel, Michael J., What Money Can't Buy: The Moral Limits of Markets (2012).

13 Gunnarson, Martin & Svenaeus, Fredrik, The Body as Gift, Resource, and Commodity: Exchanging Organs, Tissues, and Cells in the 21st Century 9–30 (Martin Gunnarson & Fredrik Svenaeus eds., 2012).

14 Coudert, supra note 11, at 313.

15 See Zwitter, Andrej, Big Data Ethics, Big Data & Soc., 1, 12 (2014) (“[T]he very nature of Big Data has an underestimated impact on the individual's ability to understand its potential and make informed decisions.”).

16 Berners-Lee, Tim, The Web Can Be Weaponised – and We Can't Count on Big Tech to Stop It, Guardian (Mar. 12, 2018), https://www.theguardian.com/commentisfree/2018/mar/12/tim-berners-lee-web-weapon-regulation-open-letter.

17 Zuiderveen Borgesius, Frederik J., Improving Privacy Protection in the Area of Behavioural Targeting 187 (2015).

18 See Hart, Kim & Fried, Ina, Exclusive Poll: Facebook Favorability Plunges, Axios (Mar. 26, 2018), https://www.axios.com/exclusive-poll-facebook-favorability-plunges-1522057235-b1fa31db-e646-4413-a273-95d3387da4f2.html.

19 See Newcom, supra note 1.

20 See KPMG, Crossing the Line: Staying on the Right Side of Consumer Privacy (2017), https://assets.kpmg.com/content/dam/kpmg/xx/pdf/2016/11/crossing-the-line.pdf.

21 Regulation (EU) 2016/679 of the European Parliament and of the Council of April 27, 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119).

22 See Carr, Nicholas, The Glass Cage: Automation and Us (2014); Schneier, supra note 3; Schnitzler, Hans, Het digitale proletariaat (2015); Harari, supra note 5; Foer, Franklin, World Without Mind (2017).

23 European Data Protection Supervisor, Opinion 4/2015, Towards a New Digital Ethics: Data, Dignity and Technology (2015), https://edps.europa.eu/sites/edp/files/publication/15-09-11_data_ethics_en.pdf.

24 Id. at 6.

25 Id. at 12.

27 Charter of Fundamental Rights of the European Union, art. 1 (recognizing human dignity as an inviolable right that must be respected and protected).

28 See Explanations Relating to the Charter of Fundamental Rights (2007/C 303/02). In its judgment in Case C-377/98, Netherlands v. Parliament and Council, 2001 E.C.R. I-7079, paras. 70–77, the Court of Justice confirmed that a fundamental right to human dignity is part of Union law. It follows that none of the rights laid down in this Charter may be used to harm the dignity of another person, and that the dignity of the human person is part of the substance of the rights laid down in this Charter. It must therefore be respected, even where a right is restricted.

29 Floridi, Luciano, On Human Dignity as a Foundation for the Right to Privacy, 29 Philos. & Tech. 308 (2016).

30 GDPR, supra note 2, art. 88.

31 Floridi, supra note 29.

32 In the Opinion, it was proposed to set up an advisory group to investigate the relationships between human rights, data technology, markets and business models in the 21st century and “to assess the ethical dimension beyond data protection rules.” The EDPS Ethics Advisory Group is composed of six experts: J. Peter Burgess, Luciano Floridi, Jaron Zepel Lanier, Aurélie Pols, Antoinette Rouvroy, and Jeroen van den Hoven. See EDPS Ethics Advisory Group, Towards a Digital Ethics (2018).

33 European Data Protection Supervisor, supra note 23.

34 See, e.g., Brownsword, Roger, Human Dignity, Ethical Pluralism, and the Regulation of Modern Biotechnologies, in New Technologies and Human Rights (T. Murphy ed., 2009).

35 European Commission Press Release 09/156, The Roundtable on Online Data Collection, Targeting and Profiling (Mar. 31, 2009) (“Personal data is the new oil of the Internet and the new currency of the digital world.”).

36 Lodder, Arno R. & de Hingh, Anne E., An Analogy Between Data Processing and the Organ Trade Prohibition (forthcoming).

37 OECD, Exploring the Economics of Personal Data: A Survey of Methodologies for Measuring Monetary Value, 220 OECD Digital Economy Papers (2013), https://www.oecd-ilibrary.org/docserver/5k486qtxldmq-en.pdf.

38 See van der Hoeven, Marco, Data Will Become Central to Every Revenue Model, Executive-People (Apr. 5, 2018), https://executive-people.nl/597065/lsquo-data-komt-centraal-te-staan-in-elk-verdienmodel-rsquo.html.

39 GDPR, supra note 2, art. 4(11).

40 GDPR, supra note 2, recital 70: “Where personal data are processed for the purposes of direct marketing, the data subject should have the right to object to such processing, including profiling to the extent that it is related to such direct marketing, whether with regard to initial or further processing, at any time and free of charge. That right should be explicitly brought to the attention of the data subject and presented clearly and separately from any other information.” See also GDPR, supra note 2, art. 4(4), where profiling means “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability or behaviour, location or movements.”

41 Sandel distinguishes two objections to extending the reach of market valuation and exchange: corruption, i.e. the degrading effect of market valuation and exchange on certain goods, and coercion, i.e. the creation of coercive bargaining conditions, or tainted consent. See Sandel, supra note 12.

42 Zuiderveen Borgesius, supra note 17, at 223.

43 GDPR, supra note 2, art. 4.

44 Schermer, Bart W., Custers, Bart & van der Hof, Simone, The Crisis of Consent: How Stronger Legal Protection May Lead to Weaker Consent in Data Protection, 16 Ethics & Info. Tech. 171, 171–82 (2014).

45 In practical terms, this would imply that these companies would be forced to offer an opt-out enabling customers to declare that they do not want to be profiled or receive targeted information. Baarslag, T. et al., Negotiating Mobile App Permissions (2015), https://eprints.soton.ac.uk/377378/1/NegotiatingMobileAppPermissions.pdf.

46 Berners-Lee, Tim, I Invented the Web: Here Are Three Things We Need to Change to Save It, Guardian (Mar. 11, 2017), https://www.theguardian.com/technology/2017/mar/11/tim-berners-lee-web-inventor-save-internet.

47 Sandel, supra note 12.

48 See id.

49 See id.

50 See id.

51 See De Busser, Els, EU-US Digital Data Exchange to Combat Financial Crime: Fast Is the New Slow, 19 German L.J. (2018).

52 Lodder, Arno R. & Loui, Ronald, Data Algorithms and Privacy in Surveillance: On Stages, Number and the Human Factor, in Research Handbook of Law and Artificial Intelligence (W. Barfield & U. Pagallo eds., forthcoming).

53 Eijkman, Quirine, van Eijk, Nico & van Schaik, Robert, Dutch National Security Reform Under Review: Sufficient Checks and Balances in the Intelligence and Security Services Act, Institute for Information Law (2018).

54 More specific information on the period concerned is not available due to the secret nature of the operation.

55 CTIVD, supra note 10.

56 Schneier, Bruce, Data Is a Toxic Asset, So Why Not Throw It Out?, CNN (Mar. 1, 2016), https://edition.cnn.com/2016/03/01/opinions/data-is-a-toxic-asset-opinion-schneier/index.html.

57 Schneier, supra note 3; Schneier, Bruce, ‘Stalker Economy’ Here to Stay, CNN (Nov. 26, 2013), https://edition.cnn.com/2013/11/20/opinion/schneier-stalker-economy/index.html.

58 See Peacock, Sylvia E., How Web Tracking Changes User Agency in the Age of Big Data: The Used User, Big Data & Soc., 1, 8 (2014); see also Schneier, supra note 3, at 94 (“The NSA didn't build a massive eavesdropping system from scratch. It noticed that the corporate world was already building one, and tapped into it … . This leads to a situation in which governments do not really want to limit their own access to data by crippling the corporate hand that feeds them.”); Austin, Lisa M., Enough About Me: Why Privacy Is About Power, Not Consent (Or Harm), in A World Without Privacy: What Law Can and Should Do 3 (2014).

59 CTIVD, supra note 10.

60 GDPR, supra note 2, art. 2(2)(d): “This Regulation does not apply to the processing of personal data … by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security.” See also Dutch Data Protection Act art. 2(2)(b).

61 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data of the Council of Europe art. 3, Jan. 28, 1981, E.T.S. 108.

62 See Eijkman, supra note 53.

63 Moerel, Lokke, Big Data Protection: How to Make the Draft EU Regulation on Data Protection Future Proof 58 (2014) (inaugural lecture as Professor of Global ICT Law, Tilburg University). See also Moerel, Lokke & Prins, Corien, On the Death of Purpose Limitation, IAPP (Jun. 2, 2015), https://iapp.org/news/a/on-the-death-of-purpose-limitation/.

64 For this metaphor and other work by Marc Schuilenburg (VU Amsterdam), see http://marcschuilenburg.nl/.

65 Lodder & de Hingh, supra note 36.

66 Peacock, supra note 58, at 12.

67 See Austin, supra note 58, at 3.

68 Sandel, supra note 12 (suggesting we could “begin with moral intuitions we have about certain practices and to see whether or not the practices in question are relevantly similar.”).

69 Beyleveld, Deryck & Brownsword, Roger, Human Dignity in Bioethics and Biolaw (2001).

70 van Beers, Britta, Persoon en Lichaam in het Recht: Menselijke waardigheid en zelfbeschikking in het tijdperk van de medische biotechnologie [Person and Body in the Law: Human Dignity and Self-Determination in the Era of Medical Biotechnology] (Ph.D. dissertation, Vrije Universiteit Amsterdam, 2009); Gunnarson, Martin & Svenaeus, Fredrik, The Body as Gift, Resource, and Commodity: Exchanging Organs, Tissues, and Cells in the 21st Century (Martin Gunnarson & Fredrik Svenaeus eds., 2012); see also Manuel Wackenheim v. France, Communication No. 854/1999, U.N. Doc. CCPR/C/75/D/854/1999 (2002).

71 See Convention on Human Rights and Biomedicine of the Council of Europe art. 1, Apr. 4, 1997, E.T.S. 164; Universal Declaration on the Human Genome and Human Rights, art. 1, 2(a); International Declaration on Human Genetic Data, art. 1; Universal Declaration on Bioethics and Human Rights, art. 2(c), 3(1). See also Lodder & de Hingh, supra note 36, for an elaboration of the analogy between (parts of) the human body and data related to the human individual.

72 See, e.g., Jacobson, Nora, Dignity and Health 186–88 (2012) (noting objections against the use of the concept of dignity in the field of bioethics).

73 Floridi, supra note 29.

75 EDPS Ethics Advisory Group, supra note 32.

76 See Ethics Advisory Group, Ethics, European Data Prot. Supervisor (2015), https://edps.europa.eu/data-protection/our-work/ethics_en.

77 EDPS Ethics Advisory Group, supra note 32, at 16.

78 See id. at 7.

79 See id.

80 See id. at 16.

81 See id. at 9.

82 See id. at 17.

83 Berners-Lee, supra note 16.

84 Moerel, supra note 63, at 58.

85 Roessler, supra note 12.

86 See Austin, supra note 58; see also Richards, Neil M. & King, Jonathan H., Big Data Ethics, 49 Wake Forest L. Rev. 411 (2014) (claiming “if we were designing things from scratch we would almost certainly want to use a word other than ‘privacy’”).

87 Schneier, supra note 56; see also Conly, Sarah, Against Autonomy: Justifying Coercive Paternalism (2013).