
Index

Published online by Cambridge University Press:  24 August 2020

Nathaniel Persily, Stanford University, California
Joshua A. Tucker, New York University

Social Media and Democracy: The State of the Field, Prospects for Reform, pp. 332–346
Publisher: Cambridge University Press
Print publication year: 2020
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 (https://creativecommons.org/cclicenses/).


accountability. see transparency, platform society
accuracy vs. directional goals, and adjustment to misinformation correction, 172–173
accurate information vs. misinformation, spread and impact, 22
ACLU (American Civil Liberties Union), 225
ad exchanges, 125
Adamic, Lada A., 40, 43
administrative data, user data as, 319
advertising. see also political advertising
CDA 230 liability immunity issue, 264–265, 270
as motivation to use misinformation, 18
online platform advantage in, 144
advisory opinions, FEC, 117
affective polarization, 46–47
Africa, misinformation effects, 26
African Americans, effects of online discrimination, 68
age factor
in fake news sharing, 21
in responses to misinformation and its correction, 182
agenda-setting power of misinformation, 23–24
aggregate-level political polarization from social media, 46
Aiello, Luca Maria, 38
algorithmic bias, social media platforms’ priorities and, 21
algorithmic systems. see also ranking algorithms
content takedown and, 273–274
for corrections to misinformation, 184–185
curation of feed content and platform liability, 265
lack of transparency, 293, 302–303
Allcott, Hunt, 18, 45
Allport, Gordon Willard, 44
alt-right communities, code-word stand-ins for racial slurs, 57
Alvarez-Benjumea, Amalia, 75
American Civil Liberties Union (ACLU), 225
The American Voter (Campbell), 13
analytical thinking, in responses to misinformation and its correction, 182
Ananny, Mike, 302–303, 305
Andrews, C., 21
Ang, L. C., 172, 173
Angwin, Julia, 238
anti-refugee hate crimes, 70
antitrust law, as media regulation tool, 211–212, 215–217
Application Programming Interfaces (APIs), defined, 316
astroturf content. see political bots
asymmetric polarization, 47–48
attention cascades, misinformation effects in Brazil, 26
audit reports, disclosure by platforms during content takedown, 235236
authoritarian regimes, social media influence campaigns in, 25
automated hate speech detection, 5960, 72
automated serendipity, 152–153
automatic vs. deliberative belief echoes, 175
backfire effects, 187, see also worldview backfire effects
banning of hate speech producers, 73
from correction of misinformation, 45
defined, 163
familiarity effect, 175–178
from psychological reactance, 183
from transparency attempts, 302
backlash against worldview backfire effects, 172–173
Badaan, Vivianne, 74
bag-of-communities technique, hate speech detection, 60
bag-of-words method, hate speech detection, 59
Bail, Christopher A., 45, 48
Bakshy, Eytan, 43
Balkin, Jack, 323
Ballard, Andrew O., 132–133
banning of content by social media platforms, 71–73, see also content takedown
Barberá, Pablo, 38, 40
Barlow, John Perry, 200
Barnes v. Yahoo!, 260
Barnidge, Matthew, 40
Batzel v. Smith, 260
BCRA (Bipartisan Campaign Reform Act) of 2002, 113–115
belief echoes, 175
Bello-Pardo, Emily, 45
Benkler, Y., 17
Berinsky, A. J., 179
Berry, Jeffrey M., 37
Bessi, Alessandro, 99
biased information processing, 16, 46
Bipartisan Campaign Reform Act (BCRA) of 2002, 113–115
blockchain-based registry, 292
Blunt, Christopher, 134–135
BMJV (Ministry of Justice and Consumer Protection), 205
boomerang effects, from correction of misinformation, 45
Bork, Robert, 212, 216
bot detection systems, 97
bots. see also political bots, social bots
defined, 91–93
in counter-speech against hate speech, 74
influence potential, 89
limiting prevalence of, 278
types of, 94–95
Bowling Alone (Putnam), 36
Boxell, Levi, 44
Boyd, R. L., 15
Bramble, Nicholas, 274–275
Brandeis, Louis, 289, 290
Brazil, misinformation dissemination in, 26
“breaking news” sites, 21
broadcast media, regulation of, 210213
Broockman, D. E., 24
Brundidge, Jennifer, 39
Brynjolfsson, Erik, 36
Cambridge Analytica, 316, 317
campaign committees, 112
campaign contact and advertising, lack of impact from, 24
Campbell, A., 13
Capitalism, Socialism, and Democracy (Schumpeter), 140
censorship
automatic hate speech detection errors, 72
combating bots and, 94
as risk of content moderation, 73
as social control over hate speech, 75
vs. indirect support for preferred media content, 203
Chadwick, A., 98
challenge avoidance vs. reinforcement seeking, 38–41
Chan, Jason, 70
Chandrasekharan, Eshwar, 72
chat bots. see social bots
Cheung, C. S. C., 166
Chicago School, 215, 217
Christchurch terrorist incident, 231
CIE (continued influence effect), correction of misinformation, 163, 165, 187
Citron, Danielle Keats, 278
Civettini, A. J. W., 172
Clapper, James R. Jr., 89
Clinton, Hillary, 17
Cloudflare, 231
Code of Practice on Disinformation, European Commission, 300
cognitive bias, diffusion of misinformation and, 21
Collier, J., 25
Communications Decency Act (CDA) of 1996
costs and benefits of modifying, 272–275
court-driven regulation using, 263–266
disinformation challenge sources, 253–258
failure of “wisdom of the crowds” concept, 279–280
history of, 259–261
introduction, 252–280
legislative actions beyond amendment of, 266–268
modification or removal of to address political disinformation, 269–278
platform liability issue for, 208–209, 214, 224, 261–262
practical modifications, 275–278
regulatory role of, 258–259
“communications placed for a fee on another person’s Web site,” 119
communities of like-minded individuals, 34, 36–38
Community Guidelines, 221, 225–226, 238
Community Standards, Facebook’s, 296–298, 304
community-driven filtration of disinformation, 274
competition, increasing internet platform, 215
complex realism, in media impact on democracy, 157
computational propaganda, 25, 90, 98–102, see also political bots
Computational Propaganda (Woolley and Howard), 90
The Computerization of Society (Nora and Minc), 207
confirmation bias, 170, 179
conflict detection, in processing corrections to misinformation, 177
connective use of bots, for democratic good or control, 96
Consent of the Networked (MacKinnon), 238
conservatism, predictors of, 180
conservatives. see liberals vs. conservatives
conspiracy theories, 10, 166, 182
consumer welfare standard, leaving the price model of, 216
consumption of misinformation, 17–20
contact hypothesis, 44
content and advertisements, transparency of, 296299
content moderation
calls for legal mandates, 220
Facebook’s opaqueness about, 294
hate speech reduction, 58, 71–74
intermediary liability laws, 223–227
regulation of internet platforms and, 214–215
content takedown
consequences of, 240–242
empirical research questions, 242–243
internet platform practices, 226–227
legally mandated vs. voluntary, 220–222
sources of information, 227–240
US court support for platforms’ right, 225
voluntary transparency during, 295–296
content-based factors, as moderators of misinformation receptivity, 183–184
contextual factors, as moderators of misinformation receptivity, 183–186
continued influence effect (CIE), correction of misinformation, 163, 165, 187
Cook, Timothy, 142
Coppock, Alexander, 172
Copyright Directive, EU, 224
Corporate Social Responsibility (CSR) movement, 291
corporate social responsibility, and transparency, 290293
correction of misinformation. see also backfire effects
consequences of, 163–164
content-based factors in receptivity to, 183–184
contextual factors in receptivity to, 183
continued influence effect (CIE), 163, 165, 187
disadvantage of arguments in favor of original, 170
environmental factors in receptivity to, 184–186
exposure to fact checking, 186
moderators of influence of, 178–186
personal and psychological factors in receptivity to, 181–183
review of literature, 164–165
Costello, Matthew, 64
counter-arguing, and worldview backfire effects of misinformation correction, 170, 183
counter-attitudinal messages, acceptance of, 38–41, 172
counter-notices to content takedown, 227
counter-speech approach to reducing hate speech, 73–75
Counter-Terrorist Information Referral Unit (CTIRU), UK, 235
Crawford, Kate, 302303, 305
creative destruction
impact on democracy, 140, 155–158
individual-level changes in news engagement, 148–149
internet platforms’ role in, 139–141, 142–143, 144–148
cross-cutting content, effects of exposure to, 35, 43, 44
CSA (Higher Audiovisual Council), 205
CSR (Corporate Social Responsibility) movement, 291
CTIRU (Counter-Terrorist Information Referral Unit), UK, 235
culture of connectivity, 147
cyberbalkanization, 49
cynicism and apathy, misinformation’s effects on, 25
Dara, Rishabh, 239
data portability, 216
data stewards, internet platforms as, 323
data tax on internet platforms, 323
“Declaration of the Independence of Cyberspace” (Barlow), 200
defamation tort as basis for platform liability, 275
Del Vicario, M., 22, 38
deliberative vs. automatic belief echoes, 175
democracy. see also transparency, platform society
ambiguous nature of disinformation impact, 257–258
bots for the good of, 96–97
creative destruction’s impact on, 140, 155–158
definitional issues, 155–156
disinformation targeting in democracies, 25
institutional shifts in, 156
losses from news media changes, 148–149
media regulation vs. free speech rights in, 199–201
Democratic Corporatist model, political system and media, 202
“democratic creative destruction,” 139–141, 155–158
Diakopoulos, N., 96–97
dictionary-based methods, hate speech detection, 59
difference-in-differences strategy, belief analysis for new rumors, 24
diffusion of misinformation. see dissemination and spread of misinformation
digital advertising, decline of traditional news institutions and, 144, see also political advertising
digital media, democratic implications of, 151–153, see also news media
Digital Millennium Copyright Act (DMCA), US, 224, 227, 274
digital trace data, 8
direct vs. distributed discovery of news, 150–152
directional vs. accuracy goals, and adjustment to misinformation correction, 172–173
directionally motivated reasoning, 169–170, 171
disclaimers on advertisements, 115–120
disconfirmation bias, 170, 179
disinformation. see also propaganda
ambiguity of impact on democracy, 257–258
challenge of measuring societal impact, 271
community-driven filtration of, 274
computational propaganda, 90
defined, 11
global reach of, 25
intentionality in, 168
judicial action to combat, 263–266, 269–270
legislative interventions to combat, 270, 271–272
misinformation vs., 166
in online news participation, 155
production of, 13–16
Section 230, Communications Decency Act (CDA 230), 252–280
as unfair competition in marketplace of ideas, 277–278
Disinformation Code of Practice, EU, 225
dispute flags for contested stories, 266
dissemination and spread of misinformation, 20–23
distributed vs. direct discovery of news, 150–152
diverse deliberation, importance of, 44
diverse groups, problem-solving abilities, 37
diversity of political ideas
individual-level exposure to, 39–41
political polarization effects, 35
DMCA (Digital Millennium Copyright Act), US, 224, 227, 274
Doe v. Backpage, 264
Doe v. MySpace, 260
dual-process theory, continued influence effect, 174–175
echo chambers, 241
avoidance of opinion challenges as path to, 38
conservatives’ vulnerability to, 180
diffusion patterns of information and, 22
political polarization and, 36–38
social bias in diffusion of misinformation and, 21
social media’s fostering of, 34, 44–46
Ecker, U. K. H., 166, 172, 173
eCommerce Directive, EU, 224
EDRi (European Digital Rights), 225
Elkin-Koren, Niva, 239
Ellinas, Antonis, 206
email lists, buying or selling, 114
Emmerson, K. M., 172
enclave deliberation, 37
encryption
consequences for research access, 327–328
role in digital transparency, 292
Enforcement Report, Facebook Community Standards, 296–298
engagement metrics, mismatch with traffic statistics and consumption data, 26
Engstrom, Evan, 239
environmental factors, as moderators of misinformation receptivity, 184–186
EU elections of May 2019, internet platform content takedown reporting, 234
Europe
free speech rights vs. media regulation in, 199
media regulation in, 199, 201–207, 208, 211, 213, 215
misinformation consumption patterns, 20
misinformation impact in, 25
selective exposure to news, 151
European Digital Rights (EDRi), 225
express advocacy, 115
Facebook
algorithmic bias vs. individual choice, 43
challenges of making competitive, 216
changes in compensation options, 126
Community Standards, 296298, 304
content takedown processes, 231
cross-cutting friendships and viewpoint diversity, 40
culture of openness, 294
data access challenges, 22–23, 125–126, 315–317
education of users on information skepticism, 185
FEC rules for ads, 118–119
individual-level fake-news sharing behavior, 21
lack of ranking algorithm effect on ideological balance of news consumption, 35
leaked information during content takedowns, 236
negotiation of ad policies, 124
opaqueness to outside world, 294
Oversight Body proposal for content policy, 304
polarization effect of deactivation of account, 45
political advertising and, 123, 134, 209, 298
privacy audits of, 299
reduction in misinformation sharing, 23
resistance to third-party investigations, 300–301
rules on content to remove hate speech, 71
social bias in diffusion of information, 21
social bots and fake accounts, 91
targeted advertising information, 127128
transparency issue for, 229, 286, 292–293, 295, 298, 301
worldwide application of GDPR by, 317
fact checks, exposure to, 186
Fair Housing Council of San Fernando Valley v. Roommates.com, 263
Fairness Doctrine, FCC, 212213, 215
fake news. see also misinformation
age factor in being more willing to share, 21
consumption research on exposure to, 18
defined, 11
density of ecosystem, 17
format criterion for misinformation, 167
political bias in 2016 presidential election, 17
profit motive for producers of, 14
proliferation of, 163
Republicans vs. Democrats in reading and sharing, 180
as synonym for misinformation, 253
familiarity backfire effects, correction of misinformation, 175–178
FCC (Federal Communications Commission), 210–211
Feamster, Nick, 239
Federal Communications Commission (FCC), 210–211
Federal Election Campaign Act (FECA), 277
Federal Election Commission (FEC), 112
advertising regulation, 112–120
digital political ad spending information, 128–129
Federal Trade Commission (FTC), 299
Ferrara, Emilio, 98, 99
filter bubbles, 41–44, 152
The Filter Bubble (Pariser), 42–43
financial incentives for disinformation, 255–256
First Amendment
challenge of combating disinformation in environment of, 276
challenging CDA 230 and, 268
private vs. public speech and, 209
Flaxman, Seth, 39
Fletcher, Richard, 20, 40
Flynn, D. J., 164
Flyverbom, Mikkel, 294
focus criterion for misinformation, 167
FOI or FOIA (Freedom of Information Access), 289
foreign interests, investing in elections, 115
format criterion for misinformation, 167
for-profit entities, news media as, 142143
4chan’s /pol/ board, 62, 63, 64
Fowler, Erika Franklin, 134
France
content takedown transparency rules, 230
media regulation in, 201207
newspaper markets, 204
Franz, Michael M., 134
Freedom of Information Access (FOI or FOIA), 289
freedom of speech and press. see media regulation
Frenemies: How Social Media Polarizes America (Settle), 46–47
FTC (Federal Trade Commission), 299
FTC v. Accusearch, 263
funding of social media research, 325326
Fung, Archon, 290291
Gab, 64
Gayo-Avello, D., 95
General Data Protection Regulation (GDPR), EU, 199, 208, 317–318
Gentzkow, Matthew, 18, 39, 44
Germany
content takedown transparency rules, 230
media regulation in, 201207
newspaper markets, 204
Gillespie, Tarleton, 238
Gladwell, Malcolm, 41
Glaser, J., 180
global considerations
censorship danger of content moderation, 73
hate speech detection in languages other than English, 61
hate speech legal definitions, 58
political bot usage, 93–94
political polarization, 49
scope of misinformation and, 25–26
social media as transnational communication mechanism, 99
Global Network Initiative (GNI), 229–230, 295–296, 299
global South, misinformation effects in, 25–26
Goel, Sharad, 39
Google
advertising data access challenge, 125–126, 302
content takedown processes, 226–227, 231
content takedown resources, 233
culture of openness, 294
lack of ranking algorithm effect on ideological balance of news consumption, 35
negotiation of ad policies, 124
political advertising, 123, 299
privacy issue, 296, 299
search engine contribution to filter bubble, 42
targeted advertising information, 127
transparency issue for, 295, 299, 301
YouTube bans by, 209
Gorwa, R., 94
government. see state, the
Grinberg, N., 19, 22
group-level considerations, hate speech effects, 69–70
Guess, A., 17, 18, 19, 21, 172
Guilbeault, D., 94
Gulati, J. “Jeff,” 128–129
Hallin, Daniel C., 201–202, 206
harassment, online, 154, see also trolling
hate crimes, online hate speech and, 70–71
hate groups, use of online hate speech, 61–64
hate speech. see also Network Enforcement Law (NetzDG)
combating, 71–75, 77
defining, 56–59, 75
detecting, 59–61, 76
introduction, 56
offline consequences of, 68–71, 77
prevalence of, 66–68
producers of, 61–64, 76
targets of, 64–66, 68–69, 76
Hate Speech Code of Conduct, European Commission, 225, 234
Hawdon, James, 64
Higher Audiovisual Council (CSA), 205
Hillygus, D. Sunshine, 132–133
Holz, T., 94
homogeneous discussion groups, polarization in, 37
homophily, 37–38, 41
Honest Ads Act, 122
honey-pot bots, 95
Howard, Philip N., 90
A Human Rights Audit of the Internet Watch Foundation (“IWF Audit”), 236
Hwang, Tim, 96
hyperpartisan publishers, 17
ideological polarization, 46–47, 48
ideological segregation
individual- vs. audience-level, 39–40
ranking algorithms’ impact on, 43
ideology
ranking algorithms’ impact on, 35
responses to misinformation correction and, 180–181
illegal content, variations in knowledge definition and platform liability, 224
illusory truth effect, 176, 177
inadvertency thesis, 39
incidental exposure to news, 152
independent research, on platforms’ takedown practices, 237, 239–240
in-depth public statements, disclosure by platforms during content takedown, 230–232
India, misinformation effects in, 26
individual-level exposure to social media, lack of echo chamber effect, 44
individual-level factors
cross-cutting discussions and polarization effects, 45
diversity of opinion in news exposure, 39–41
hate speech impacts, 68–69
as moderators of misinformation receptivity, 178–183
news media changes, 148–155
influence, online
authoritarian regimes’ influence campaigns, 25
distortion of democratic processes by, 240
of political bots, 96
information falsehood and quality, challenge of regulating, 275–276
information fiduciaries, internet platforms as, 323
information monopolies, dangers of, 320–321
ingroup superiority, identifying hate speech, 60
intentionality criterion for misinformation, 167–168
Intermediary Liability laws, 220, 223227
Internet
early utopianism, 1
facilitation of like-minded group communication, 36–37
hate groups’ use of, 61–62
impracticalities of regulating content on, 200
internet platforms. see also transparency, platform society, social media platforms
content moderation by, 71–74, 295–296
distributed discovery of news, 151–152
hate speech definitions, 58
impact on democracy, 157, 209–210
institutionalization of, 147–148
lack of regulation and restriction in United States, 208–210
regulatory issues for, 214–217
size factor in content takedown practices, 226
speech rule enforcement problems, 221
transparency issue for, 293–295, 301–303
Internet Referral Unit (IRU), Europol, 234–235
Internet Research Agency (IRA). see Russian (IRA) disinformation
Internet Watch Foundation (IWF), UK, 236
IRU (Internet Referral Unit), Europol, 234–235
Jost, John T., 38, 180
journalism bots, contribution to democratic discourse, 96–97
journalism, as major contributor to research, 14–15, see also news media
journalistic professionalism, 201
judicial action
to address internet platform content liability, 263–266
to combat disinformation campaigns, 269–271
Kaakinen, Markus, 67
Kahan, D. M., 181
Kalla, J. L., 24
Karpf, D., 98
Kim, E., 24
Kim, J. W., 24
Kim, Young Mie, 127
Klonick, Kate, 231, 238
Kollanyi, Bence, 99
Kolo, Castulus, 204
Konitzer, Tobias, 132–133
Kosack, Stephen, 290
Kreiss, Daniel, 124, 238
Kruglanski, A. W., 180
Kuhn, Raymond, 203
Kuklinski, J. H., 12
Lardeau, Matthieu, 204, 206
Le Floch, Patrick, 204
leaked information, disclosure by platforms during content takedown, 236
legacy broadcast media, regulation of, 213–214
legacy media channels
countering hate speech, 74
regulation of, 210–213
legal definitions of hate speech, ambiguities in, 58
legislative interventions to combat disinformation online, 270, 271–272
legitimate news providers, vulnerability to agenda setting by misinformation sources, 23
Leonard, A., 91
Lewandowsky, S., 164
Lewis, Rebecca, 16, 256
liability of internet platforms. see Communications Decency Act (CDA) of 1996, Intermediary Liability laws
liberal bias accusations against California-based internet platforms, 240–241
Liberal model, political system and media, 202
liberals vs. conservatives
asymmetry in ideological valence, 17
biased information processing, 48
misinformation consumption levels, 19
misinformation from, 180
responses to misinformation correction, 180–181
“liberation technology,” 1
like-minded individuals, polarizing views through communities of, 36
Lipset, Seymour Martin, 208
listener bots, 95
Lodge, Milton, 47
Lokot, T., 96–97
Luceri, L., 90
Lumen Database, 229, 237
Macedonia, disinformation source from, 13, 14
MacGregor, Sharon, 238
machine learning, 92
Magdy, Walid, 64
Magrini, 206
Mancini, Paolo, 201–202
mandated transparency regimes, 299–300
manufacturing consensus, computational propaganda for, 100
Marchal, N., 20
Maréchal, N., 94
marketplace of ideas, disinformation as unfair competition in, 277–278
Martin, Gregory J., 134
Marwick, Alice, 16, 255
Mathew, Binny, 75
Maurer, Brianna, 45
McGregor, Shannon C., 124
MacKinnon, Rebecca, 237–238
media ecology, studying the media as ecosystem, 16–17
media environment. see news media
media pluralism, 201–203
media regulation, 199–217, see also Communications Decency Act (CDA) of 1996
Europe, 201–207
free speech rights vs., 199–201
legacy broadcast media, 210–213
United States, 207–210
media tracking firms, as source for digital ad spending data, 129–130
mental model theory, and continued influence effect, 174
Merkel, Angela, 89
message presentation, and backfire effects from misinformation correction, 170
Messing, Solomon, 39, 40, 43
Metaxas, Panagiotis T., 95, 98
microtargeting, establishing limitations on, 277–278
Minc, Alain, 207
Ministry of Justice and Consumer Protection (BMJV), 205
Minitel, 206–207
minority groups, online speech as marginalizer of, 241
misinformation. see also disinformation, fake news
ambiguity of impact on democracy, 257258
consumption of, 1720
defining, 10–11, 165–168
disinformation vs., 166
effects on democratic process, 23–25
effects on political activity, 27
European media regulation response to, 205
global scope of, 25–26
lack of research on effects of political, 24–25
misperceptions and, 12–20, 166
moderators of influence of, 178–186
from online news participation, 154–155
proliferation of, 163–164
research progress on social media, 26–27
Section 230, Communications Decency Act (CDA 230), 252–280
spread and dissemination of, 19–23
misperceptions and misinformation, 12–20, 166
Monaco, Nicholas, 100
motivated reasoning framework, 13, 169–170, 177, 181
Muller, Karsten, 70
Munger, Kevin, 74
Murphy, S. T., 184
Muslims, fears generated by hate speech, 68
Mustafaraj, E., 95, 98
Mutz, Diana C., 50
Nagler, Jonathan, 21
Narayanan, V., 26
National Security Agency (NSA), US, 296
need for closure, in responses to misinformation and its correction, 182–183
negativity of digital advertising, collecting data on, 133
Nelson, J. L., 19
net neutrality, 210, 267
Network Enforcement Law (NetzDG), 199, 205, 230, 232–234, 299–300
neutrality of internet platforms in relationship to users’ speech, 223–224
The New Governors (Klonick), 238
New York Times Co. v. Sullivan, 262
Newell, Edward, 72
news bots, 96–97
news media
attention shift away from news, 144
consequences of changes in, 157
expansion of news sources to individuals and organizations, 146–147
impact on democracy, 139–158
individual-level changes in, 148–155
institutional changes in, 142–148
loss of trust in, 153
online harassment, 154
operational changes, 146
structural changes and impact on democracy, 139–141
newspapers, 143, 204
n-grams method, hate speech detection, 59
Nielsen, Rasmus Kleis, 40
Nimmo, B., 99
Nora, Simon, 207
notice and takedown systems, 222, 226–227, 230, see also content takedown
novelty, as main driver of misinformation, 22
NSA (National Security Agency), US, 296
Nyhan, B., 17, 18, 19, 20, 164, 169, 172, 180
Nyss, C., 100
Obama, Barack, 35
offline and online social ties, 39
offline consequences of online speech, 67–71, 241
offline vs. online information exposure, 40
Olteanu, Alexandra, 67
online panels of individuals, advertising data from, 129–130
Onlinecensorship.org, 238
online platforms. see internet platforms
opinion challenges, avoiding, or not, 38–41
The Outrage Industry (Berry and Sobieraj), 37
overcorrection of misinformation, and psychological reactance, 183
oversight issue, democratic transparency, 303–305
PACs (Political Action Committees), 112, 113
paid vs. unpaid communications to voters, content of, 133
Panagopoulos, C., 180
Pariser, Eli, 42–43
participatory media, manipulation of mainstream media by, 16
partisan bias, 13, 24
partisan identification
as moderator of misinformation receptivity, 180–181
responses to misinformation and its correction, 180–181
partisan motivated reasoning, 46
Perel, Maayan, 239
personal and psychological factors, as moderators of misinformation receptivity, 181–183
personalization, news, 153
platformed sociality, 147
platform-independent data from media tracking firms, as source for digital ad spending data, 129–130
polarization, variations in definition, 35, see also political polarization
Polarized Pluralist model, political system and media, 202
policies
anti-political bot, 94
social media platforms’ self-generated ad, 124
social media research impact on, 9
Political Action Committees (PACs), 112, 113
political advertising
campaign contact and impact from, 24
campaign finance rules, 112–123
classification of ads, 114–115
congressional proposals, 122–123
content of, 132–134, 209
decentralized purchasing structure, 124–126
disclaimer requirements, 115–120
effectiveness question for, 134–135
introduction, 111–135
negotiation of ad policies by platforms, 124
reporting requirements, 112–117
social media platforms as media consultants, 126
spending on, 128–132
state laws, 121–122
targeting of, 127–128
transparency practices of internet platforms, 298–299
political bots
as computational propaganda tools, 98–102
defeating, 99
for democratic good, 96–97
fears around, 89
influence debate, 96
role of, 93–96
separation from traditional propaganda, 99
political campaigns, as new media, 147
political communication, 320, see social media research
political factors, as moderators of misinformation receptivity, 179–180
political ideology, as moderator of misinformation receptivity, 180–181
political interest factor in asymmetric polarization, 47–48
political parallelism, 201
political polarization
asymmetric, 47–48
avoiding opinion challenges, 38–41
communities of like-minded individuals, 36–38
echo chamber effects, 34, 36–38, 44–46
filter bubble, 41–44
ideological, 46–47
interventions to reduce, 50
political rumors, defined, 166
political sophistication, as moderator of misinformation and its correction, 179–180
Porter, E., 172
power-law pattern, fake news sharing on Twitter, 22
Preuss, Mike, 74
“prevent harm” goal of Intermediary Liability laws, 223
privacy issue
Europe vs. United States, 208
FTC regulation of internet platforms’ practices, 299
future prospects for balance with research access, 327–329
impact on research access to data, 317–320
user vs. owner approach to data rights, 323
private broadcasting, development of, 211–213
processing fluency, correction of misinformation, 176, 178
programmatic purchasing of advertising, 125
“promote innovation” goal of Intermediary Liability laws, 223
propaganda. see also political bots
computational, 25, 90, 98–102
creation of misperceptions through, 13
defined, 11
Russian, 14–16, 25, 89, 120, 254–255
social bots and, 92–93
ProPublica, 300301
“protect lawful online speech and information” goal of Intermediary Liability laws, 223
psychological polarization. see affective polarization
psychological reactance, in responses to misinformation and its correction, 183
public broadcasting, 203–204, 211
public communication, defined, 114
public filings, disclosure by platforms during content takedown, 232235
public interest standard, US broadcasting, 210–211
public’s right to data, rethinking, 322–323
Putnam, Robert, 36
radicalization and recruitment to terrorist groups, 241
raiding, in online hate speech, 66
ranking algorithms
cross-cutting content reduction by, 43
filter bubble generation by, 34, 42
ideological impact of, 35
ideological segregation and, 43
political polarization role of, 41–44
as shapers of discourse, 239–240
Rao, Justin M., 39
Ratkiewicz, J., 98
Red Lion Broadcasting v. FCC, 212
Reddit, 72, 256
Redlawsk, D. P., 172, 173
refugees, hate speech and hate crimes against, 70
regulation of media. see media regulation
regulatory oversight, as necessary backing for transparency, 291
Reifler, Jason, 17, 18, 19, 164, 169, 172, 180
reinforcement seeking vs. challenge avoidance, 3839
repeating misinformation, strengthening in memory, 176
retweet networks, prevalence among hate speech users, 63
Ribeiro, Manoel Horta, 63
Ridout, Travis N., 129, 133
Rivero, Gonzalo, 38
Roberts, Chris, 133
robonotices, 226
robot journalists, limitations of, 97
Roche, C., 170
Roommates.com case, 259, 261, 263–266, 269, 270, 271
Russian (IRA) disinformation, 14–16, 89, 120, 254–255
Russian model for propaganda, wearing down of political participation interest from, 25
Ruths, Derek, 72
Saleem, Haji Mohammad, 72
Scandinavian government transparency initiatives, 289
Schaffner, B. F., 170
Schieb, Carla, 74
Schumpeter, Joseph, 140, 156, 158
Schwarz, N., 70
Seaborn, Brent, 134135
search engines, ranking algorithms and the filter bubble, 42, 43–44
Section 230, Communications Decency Act. see Communications Decency Act (CDA) of 1996
Seifert, C. M., 164
self-reported survey measures, unreliability of, 18
self-selection of individuals in social media networks, 49–50
Settle, Jaime E., 46
Shao, C., 20
Shapiro, Jesse M., 39, 44
Shaw, Daron R., 134135
Shin, J., 23
Siegel, Alexandra A., 67, 74
Silva, Leandro, 64
sleeper bots, 95
Snowden, Edward, 296
Sobieraj, Sarah, 37
social bias, diffusion of misinformation and, 21
social bots, 20, 91–93
social correction by users on social media, 185
social media and democracy
as instrument for political control, 98
introduction, 19
social media platforms. see also internet platforms, content moderation, media regulation
algorithmic bias and, 42
hate speech forms and patterns on, 62–64
inadvertency thesis and exposure to diverse views, 39–41
long-term changes in interaction on, 49
as major resources for media consumption, 20
as media consultants, 126
negotiation of ad policies, 124
political polarization effect from using, 44–46
response to proposed new advertising rules, 123
risks of influence over research, 324325
role in misinformation correction, 184–186
as transnational communication mechanisms, 99
as US based and inflexible in other countries, 156
social media research
challenges and opportunities, 237, 313–320
data access challenge, 130, 314
data sharing paradigm, 320–326
difficulty of studying persuasive effects of misinformation, 24
funding of, 325–326
future prospects, 326–330
general challenges of, 11
hate speech detection limitations, 61
importance of, 89, 323–324
misinformation analysis challenge, 27
need for multiple-platform study, 23
overview, 12
privacy practices vs. data access, 317–320, 327–329
Social Science One project, Facebook, 240, 315–316, 323
social unacceptability, as deterrent to hate speech, 75
source credibility, 171
source cues, backfire effects of misinformation correction, 170
spambots, 95
spending on digital political advertising, 128–132, 134
state actors, disinformation from, 254255
state laws, US
campaign finance, 121
online political advertising, 121–122
state, the
content takedown reporting by, 234–235
disinformation from, 25
role in shaping media system, 201–210
state-sponsored trolling campaigns, 93, 100
steganography as hate speech symbol, 65
Stephens-Davidowitz, Seth, 70
stereotype subtyping theory, 172
Stratton Oakmont v. Prodigy Services, 259–260
structuration, 139
Suhay, Elizabeth, 45
Sunstein, Cass, 37, 275
supervised text classification tasks, hate speech detection, 59–60
Taber, C. S., 47
takedown requests, disclosure by platforms during content takedown, 230
Taneja, H., 19
targeting of advertising, 127–128, 133, 144
technology
media regulation and telecommunications development, 206–207, 208–210
role in political communication, 12, 36
temporal distance issue for corrected misinformation, 175, 176–177
terrorism, radicalization, and recruitment by online influencers, 241
text message ads, FEC disclaimer requirements, 117–118
third-party investigations to ensure transparency, 300–301
Thorson, E., 175
Three Years of the Right to Be Forgotten (Google), 231
topic modeling approach, hate speech detection, 60
transnational communication mechanism, social media as, 99
transparency reports, disclosure by platforms during content takedown, 228–229
transparency, platform society
as academic study topic, 287
as accountability mechanism, 286287
content and advertisements, 296299
corporate social responsibility, 290293
future of, 273, 301–303
historical evolution of, 288290
mandated transparency regimes, 266–267, 299–300
mixed results of, 291–292
oversight issue, 303–305
purpose of, 305
social media platforms’ policies to increase ad, 123
theory and reality of, 292, 304–305
third-party investigation to ensure, 300–301
transparency in practice, 293295
user disclosure requirements by platforms, 267
voluntary internet platform, 222, 295296
troll bots, 95
trolling
digital news media and, 154
as disinformation source, 256257
state-sponsored, 100
Trump, Donald, 17
trust in source, 21, 153, 171
truth value criterion, misinformation, 166, 167
Tucker, J. A., 11, 21, 165
Twitter
advertising research challenge, 126
asymmetric polarization on, 48
“breaking news” sites, 21
cross-cutting friendships and viewpoint diversity, 40
effects of banning extremist social networks, 72
hate speech on, 66, 71
partisan sharing of ideological views, 38
political advertising, 123, 298–299
political moderation impact of cross-cutting content exposure, 45
power-law pattern in fake news sharing, 22
public availability of data, 19, 22–23, 314
retweet analysis for spread data, 20
Russian IRA operations, 15–16
small numbers of activists as responsible for majority of partisan content, 38
social bots on, 91
spread and dissemination of misinformation, 22
transparency issue for, 229, 295
Ukraine, disinformation campaigns in, 25
United Kingdom, media regulation in, 211
United States
media regulation in, 199, 207210, 213216
misinformation consumption patterns, 19–20
political polarization effects vs. other countries, 49
selective exposure to news, 151
transparency developments for government in, 289290
United States Congress, political advertising rules, 122123
United States presidential election of 2012, 23, 38
United States presidential election of 2016
bots in, 89
computational propaganda in, 99–100
disinformation impact, 257
disinformation suppliers, 14, 16–17
internet platform content takedown reporting, 233–234
lack of regulation of online ads during, 115, 120, 209
rate of hate speech during and after, 67–68
trolling during, 256
United States v. Alvarez, 276
Urban study of content notice and takedown, 226227
user characteristics, in hate speech detection, 60, 63
user level liability, pros and cons to applying to platforms, 272273
“us vs. them,” to identify othering in hate speech, 60
vaccine debate, bot manipulation of, 100
Van Alstyne, Marshall, 36
van der Linden, S., 180
van Dijck, José, 147
VanDuyn, E., 25
Vargo, C. J., 23
Viacom v. YouTube, 232
violence
hate speech and, 56, 57, 58, 67, 69, 70–71
online speech connection to, 241
polarization on social media and, 49
virtual bystander effect, corrections of misinformation on social media, 185
Volokh, Eugene, 237
Vosoughi, S., 22
Walter, N., 184
Warner, Mark, 199
weak ties
sharing of news by, 35
as sources of counter-attitudinal information, 41
web crawlers, limitations in gathering advertising data, 130. see also bots
Webster, James, 139
Weichart, Stephan, 204
Westwood, Sean J., 39
WhatsApp, 25–26, 328
WhoTargetsMe, 300
wikiedits bots, 95
Williams, Christine B., 128–129
Williams, Ev, 279
Winter, Fabian, 75
“wisdom of the crowds,” failure of, 279–280
Wittes, Benjamin, 278
Wood, T. J., 172
Woolley, Samuel C., 90
worldview backfire effects, 169–173
avoiding, 171
backlash against, 172–173
identifying, 173
Republicans vs. Democrats, 180
worldviews, tailoring correction of misinformation to antagonists’, 171–172
Wu, Tim, 268
Yates, Sally, 89
Yin, L., 15
York, Jillian, 298
YouTube, content takedown resources for, 233
Zaller, John R., 47
Zeran v. America Online, 260, 263
Zuckerberg, Mark, 97, 209, 293, 298, 304