Inside the information industry, management leverages the tools of coercive bureaucracies to routinize work that serves data-extractive ends. Corporate bureaucracies work subtly to set privacy discourses among company employees, inculcating corporate-friendly understandings of privacy as frontline workers approach their work. Organizations hobble privacy offices and amplify voices that interpret privacy law in ways that serve corporate interests. Management also constrains designers in the design process, feeding software engineers’ ambivalence toward privacy and using organizational structures to make it difficult for anyone to build better privacy protections into the designs of new technologies.
Privacy’s performances begin with discourse. This is an important step. By influencing how we think about privacy, and by inculcating definitions of privacy that are narrow, outdated, and corporate-friendly, tech companies can ensure that even those employees who consider themselves privacy advocates will nevertheless end up serving data-extractive business models.
The information industry’s discursive performances have influenced everyone, including policy makers, privacy professionals, and ordinary users of technology. The campaign has seen its greatest success in the United States, where tech companies and their allies are actively undermining the push for a comprehensive national information privacy law, where studies show many people have given up on the hope that they can adequately protect their privacy online, and where many privacy professionals see corporate-friendly privacy discourses as ordinary, routine, and common sense.1
This book has been about the tools the information industry uses to routinize an antiprivacy ethos and practice through its organizations. Of course, not all tech companies use all of these tactics: some use a few, some use more, some use none. But these strategies are in use, and they have the effect of marginalizing privacy throughout the everyday work of law and design. More than just a collection of strategies, they are features of informational capitalism. They help explain how data-extractive capitalism persists.
Through a long campaign to inculcate corporate-friendly discourses about privacy, the information industry tilted our legal consciousness away from privacy and enlisted even those employees who see themselves as privacy advocates in their data-extractive missions. This softened the discursive ground on which we think and talk about privacy and weakened the privacy laws we manage to pass. Technology companies then took advantage of public-private partnerships explicitly built into those privacy laws to undermine their effectiveness. They used coercive bureaucracies and took advantage of power asymmetries to develop compliance programs that reoriented and recast privacy laws in ways that served their surveillant interests. As a result, the information industry undermined the institutions that are supposed to protect our privacy.
How can privacy law and corporate commitments to privacy be on the rise without it having a significant effect on the designs of new technologies? Tech companies use the tools of coercive bureaucracies to routinize antiprivacy norms and practices in privacy discourse, compliance, and design. Those bureaucracies constrain workers directly by focusing their work on corporate-friendly approaches to privacy. As information industry workers perform these antiprivacy routines and practices, those practices become habituated, inuring employees to corporate surveillance even as they earnestly profess to be privacy advocates. The result is a system in which the rank and file have been conscripted into serving the information industry’s surveillant interests, and in which the meaning of privacy has been subtly changed, often without them even realizing what’s going on.
We are told that accepting widespread corporate surveillance is a natural progression for human civilization. Peter Schwartz, a senior vice president at Salesforce: “Gradually, we will accept much, much greater surveillance. And in the end we won’t be too bothered by it.” Thomas Friedman in 2014: “Privacy is over.” Mark Zuckerberg in 2010: “The age of privacy is over.” Sun Microsystems’ former CEO Scott McNealy in 1999: We “have zero privacy anyway. Get over it.”1
In Industry Unbound, Ari Ezra Waldman exposes precisely how the tech industry conducts its ongoing crusade to undermine our privacy. With research based on interviews with scores of tech employees and internal documents outlining corporate strategies, Waldman reveals that companies don't just lobby against privacy law; they also manipulate how we think about privacy, how their employees approach their work, and how they weaken the law to make data-extractive products the norm. In contrast to those who claim that privacy law is getting stronger, Waldman shows why recent shifts in privacy law are precisely the kinds of changes that corporations want and how even those who think of themselves as privacy advocates often unwittingly facilitate corporate malfeasance. This powerful account should be read by anyone who wants to understand why privacy laws are not working and how corporations trap us into giving up our personal information.
Chapter 7 unpacks the conditions of place on Facebook. It highlights contingencies in convening in Facebook-mediated spaces tied to physical infrastructure, group administration, algorithmic and corporate controls, and state oversight. This chapter makes the case that on Facebook, experiences have not matched the actual – and potential – conditions of ownership over digital infrastructure and space. Public discussion on social media obscured, rather than removed, unequal ownership structures and physical dependencies. Participants experienced discussion on Facebook as open and unpredictable, but this was premised upon a limited view of underlying structures and controls. This raises a critical question about the nature of publics online: if spatial configurations of publics mask their constraints, resulting in experiences that seem unpredictable but are not necessarily so, what does this mean for the actual scope of unpredictability of the interactions?
The mythology of the Market is strongly evident, indicated by the corporate camouflage of existential desire by the wide range of constructed desires. This mythology has materialised in the personalisation of the idea of the corporation. Its functioning is revealed by the commodification of individuals within models of regulatory capitalism and by the structural embedding of debt as credit. These trends have been promoted by the digitisation of corporate function, by algorithmic profiling of individuals as consumers and by the exploitation of Big Data. This has morphed into surveillance capitalism. The non-mythological way forward would start with focusing on all stakeholders, including all citizens on whom the corporation has an impact. This is the reimagining of corporations on purpose-based, fiduciary principles. This in turn would require the redrafting of competition and consumer protection law, as well as shifting the control of personal data to the individual. It would also require changes to employee relations strategies.
Starr describes how we have become so vulnerable to disinformation in this digital era. He argues that, like analyses of democratization, which have turned in recent years to the reverse processes of democratic backsliding and breakdown, analyses of contemporary communication need to attend to the related processes of backsliding and breakdown in the media – or what he refers to as “media degradation.” After defining that term in relation to democratic theory, Starr focuses on three developments that have contributed to the increased vulnerability to disinformation: 1) the attrition of journalistic capacities; 2) the degradation of standards in both the viral and broadcast streams of the new media ecology; and 3) the rising power of digital platforms with incentives to prioritize growth and profits and no legal accountability for user-generated content. Neoliberal policies of limited government and reduced regulation of business and partisan politics contributed to these developments, but while demands are growing for regulation, it remains uncertain whether government can act effectively.
With the rise of surveillance capitalism, digital platforms hunt and capture ever more dimensions of once private experience as raw material for datafication, production, and sales. Under these unprecedented conditions, elemental epistemic rights can no longer be taken for granted. Unlike twentieth-century totalitarianism, however, surveillance capitalism does not employ soldiers and henchmen to threaten terror and murder. It is a new instrumentarian power that works through ubiquitous digital instrumentation to manipulate subliminal cues, psychologically target communications, impose choice architectures, trigger social comparison dynamics, and levy rewards and punishments – all aimed at remotely tuning, herding, and modifying human behavior in the direction of profitable outcomes and always engineered to preserve users’ ignorance. The result is best understood as the unauthorized privatization of the division of learning in society. Just as Émile Durkheim warned of the subversion of the division of labor by the powerful forces of industrial capital a century ago, today’s surveillance capital exerts private power over the definitive principle of social order in our time. We must promote individual epistemic sovereignty, law, and democracy as the only possible grounds for human freedom, a functional democratic society, and an information civilization founded on equality and justice.