
How Digital Platforms Organize Immaturity: A Sociosymbolic Framework of Platform Power

Published online by Cambridge University Press:  16 March 2023

Martín Harracá, University of Surrey, UK
Itziar Castelló, City University of London, UK
Annabelle Gawer, University of Surrey, UK

Abstract

The power of the digital platforms and the increasing scope of their control over individuals and institutions have begun to generate societal concern. However, the ways in which digital platforms exercise power and organize immaturity—defined as the erosion of the individual’s capacity for public use of reason—have not yet been theorized sufficiently. Drawing on Bourdieu’s concepts of field, capitals, and habitus, we take a sociosymbolic perspective on platforms’ power dynamics, characterizing the digital habitus and identifying specific forms of platform power and counterpower accumulation. We make two main contributions. First, we expand the concept of organized immaturity by adopting a sociological perspective, from which we develop a novel sociosymbolic view of platforms’ power dynamics. Our framework explains fundamental aspects of immaturity, such as self-infliction and emergence. Second, we contribute to the platform literature by developing a three-phase model of platform power dynamics over time.

Type: Article
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the Society for Business Ethics

Organized immaturity, defined as the erosion of the individual’s capacity for the public use of reason (Scherer and Neesham 2020), differs from other forms of control in that it is a self-inflicted and emergent (as opposed to orchestrated) collective phenomenon in which autonomy-eroding mechanisms mutually reinforce each other (Scherer and Neesham 2020, 9).

The phenomenon of autonomy erosion and increasing user control has been discussed in the context of the dark side of digitalization (Flyverbom, Deibert, and Matten 2019; Trittin-Ulbrich et al. 2021). Scholars have looked at how the automation of interactions through algorithms can lead to an emergent manipulation of choice and autonomy erosion (Alaimo and Kallinikos 2017; Beer 2017; Just and Latzer 2017; Orlikowski and Scott 2015), but there is still little exploration of the organizing role of platforms in this process.

Digital platforms have been described as organizational forms that orchestrate activities between independent users through the use of digital interfaces (Gawer 2014, 2021; Constantinides et al. 2018; Cusumano et al. 2019; McIntyre et al. 2021). Increasingly, scholars denounce the negative effects of power accumulation by digital platforms and platform owners. For example, studies of the structural constitution of markets criticize gatekeeping positions that impose discriminatory clauses or limit content access and creation, with consequences for users’ choices (Crémer et al. 2019; Jacobides 2021; Khan 2018). Other researchers, such as Kelkar (2018), Stark and Pais (2020), and Flyverbom et al. (2019), discuss sociomaterial perspectives on platforms and show how platform owners design the interfaces, prescribing what is accessible to users and what choices they may enjoy on the digital platform; this, again, restricts choice and creates negative psychological effects on users (Seymour 2019; Wu et al. 2019). Lanier (2018) and Zuboff (2019) present systems of surveillance promoted by the power of digital platforms that explain how the datafication of human experience leads to increasing forms of domination.

These studies provide valuable explanations of how the increasing power of platforms hinders freedom of choice and individual autonomy. However, their explanations are partial, focusing either on the market mechanisms that limit consumer choice or on the specific role of digital objects, such as algorithms, that constrain platform users’ autonomy. Fundamental aspects of the organizing of immaturity, however, remain unexplored: the tension between organizing and emergence, and the relationship between self-infliction and the power accumulation strategies of key agents, such as platform owners. These tensions are essential to explaining how organized immaturity is created and reproduced. We claim that there is a need to explain the power accumulation of the different agents of the platforms and its relation to the mechanisms that lead to the delegation of autonomous decision-making. Therefore, in this article, we ask, How do digital platforms organize immaturity?

To tackle this issue, we build a sociosymbolic perspective of power accumulation in digital platforms inspired by Bourdieu’s writings (Bourdieu and Wacquant 2007; Bourdieu 1977, 1979, 1984, 1987, 1989, 1990, 1991, 2005, 2011, 2014). A sociosymbolic perspective supports building a dynamic conceptualization of power accumulation based on agents’ practices, positions, and strategies. The concepts of field evolution and habitus allow further explanation of the emergence of immaturity and the mechanisms of self-infliction. By situating the concepts of fields, capitals, and habitus in the context of digital platforms, we describe digital platforms as organizations mediated by a digital infrastructure and a digital habitus in which agents accumulate capitals by operating in a field. We explain the role of the digital habitus in organizing immaturity, complementing prior literature on materiality and affordances. We propose a framework of power accumulation in which the dynamics of platform owner power accumulation and counterpower accumulation coexist. The platform owner accumulates power in five forms: constitutional, juridical, discursive, distinction, and crowd. There are two forms of counterpower: crowd and hacking. We also explain the evolution over time of the power dynamics and propose a three-phase model in which the forms of power operate. These phases are platform formation, platform domination within the original field, and platform cross-field expansion.

This framework makes two significant contributions. First, we build a theoretical apparatus that explains the organizing dynamics of immaturity by explaining the relations between the structure, the digital objects, and the platform owner’s power accumulation strategies. From these, we can explain the tension of emergence and self-infliction. With this framework, we draw on sociological perspectives to expand the understanding of organized immaturity in digital spaces by focusing on describing the practices that constitute the webs of relations that configure the digital habitus and the processes of power accumulation. Second, we contribute to the platform literature by developing a three-phase model of platform power dynamics over time. This model expands current views on platform power, providing a more holistic scheme in which power is both accumulated and contested and highlighting how agents other than the platform owner play a role in producing and exercising forms of power. This article concludes by providing policy recommendations on how to understand and tackle organized immaturity and highlighting potential avenues for further research.

ORGANIZED IMMATURITY

Organized immaturity has been defined as a collective, albeit not necessarily orchestrated, phenomenon where independent reasoning is delegated to another’s guidance (Scherer and Neesham 2020). It is inspired by the Kantian principle that humans should have intellectual maturity involving autonomy of judgment, choice, and decision-making without the guidance of an external authority. It also relates to the ability to use experience to reason and reflect critically and ethically on complex or problematic situations and to challenge norms and institutions (Scherer and Neesham 2020). The concept of organized immaturity differs from other forms of control in two ways. First, it is a “self-inflicted” (Kant, as cited in Scherer and Neesham 2020, 8) process, referring to harm done by humans to themselves, often in a nonconscious manner. From this perspective, “immaturity” is therefore a condition of the human being that arises when an individual defers or delegates their own autonomous reasoning to external authorities (Dewey 1939). The second way in which organized immaturity differs from other forms of control is that it is an emergent (as opposed to orchestrated) collective phenomenon in which autonomy-eroding mechanisms mutually reinforce each other (Scherer and Neesham 2020, 9).

According to Scherer and Neesham (2020), the study of immaturity also relates to its organizing elements. The perpetuation of modern forms of immaturity has been associated with organizations and institutions that create the conditions for self-inflicted immaturity. Organized forms of immaturity have been addressed in the critical analysis of bureaucratic organizations, where the individual is subject to various forms of domination and control (Clegg 1989; Hilferding 2005).

The Fourth Industrial Revolution (Schwab 2017; Philbeck and Davis 2018) has ushered in a consolidation of the globalized information and communication technologies that are driving the organization of economic life. However, the infrastructures and mechanisms behind these sociotechnological systems curb individual liberties and impact people’s autonomy (O’Connor and Weatherall 2019; McCoy, Rahman, and Somer 2018).

The term organized immaturity is not explicitly used in most of the literature studying forms of control related to digitalization (with the exception of Scherer and Neesham [2020] and Scherer et al. [2020]), but scholars are increasingly analyzing the “dark side of digitalization” (Flyverbom et al. 2019; Trittin-Ulbrich et al. 2021). In particular, attention has been directed to the use of big data and systems based on artificial intelligence and to how the automation of interactions through algorithms can lead to an emergent manipulation of choice. Even the basic algorithmic function of search and match creates power asymmetries, since the inspection or control of its guiding principles presents technical challenges for both users and regulators (Beer 2017; Just and Latzer 2017). Biases might be found in the criteria for how results are limited, displayed, and sorted (Faraj, Pachidi, and Sayegh 2018) and may even amplify properties of the data used as input, as has been observed in the context of racial biases (Noble 2018). Researchers are increasingly pointing to the importance of unpacking the consequences of algorithms in conjunction with a socially structured analysis of the device (e.g., Beer 2017; Introna 2016; Orlikowski and Scott 2015). Through this, they show how the “black box of algorithmic culture” (Orlikowski and Scott 2015; Pasquale 2015; Striphas 2010) creates a world of secrecy that eschews questioning and abrogates responsibility (Introna 2016), eroding autonomous decision-making.

However, this emphasis on the artificial intelligence tools, algorithms, and coding processes that hinder autonomy in decision-making must be complemented by research into the organizing structures of immaturity, that is, the key organizing agents. Studying digital platforms can improve understanding of how organized immaturity happens, as these platforms organize social interactions and transform the power relations of the different agents who participate in the digital exchanges.

PLATFORMS AND THE ACCUMULATION OF POWER

Platforms as Organizing Agents

In the platform literature, digital platforms have been described as new organizational forms that orchestrate activities between independent users through the use of digital interfaces (Gawer 2014; Kretschmer et al. 2022; McIntyre et al. 2021). Platforms can be considered a “particular kind of technology of organizing” (Gulati, Puranam, and Tushman 2012, 573) or “hybrid structures between organizations and markets” (Kretschmer et al. 2022, 4), as they use a mixture of market and hierarchical incentives to coordinate autonomous agents. Platform organizations are distinct from hierarchies, markets, and networks (Gawer 2014) because, as Kornberger et al. (2017, 81) argued, “platform organizations question not only extant organization designs but also, quite fundamentally, the [Coasian] idea of the firm … and … of value creation processes.”

Two fundamental characteristics define the digital platform as an organizing agent: how its digital architecture is structured and how it coordinates interactions. From an organizational perspective, platforms can be described by the common set of design rules that define their technological architecture. This system is characterized by a “core” or center component with low variety and a complementary set of “peripheral” components with high variety (Tiwana, Konsynski, and Bush 2010). The rules governing interactions among the parts are the interfaces (Baldwin and Woodard 2009). Interfaces help reduce a system’s complexity by greatly simplifying the scope of information required to develop each component (Gawer 2014). Together, the center, the periphery, and the interfaces define a platform’s architecture (Baldwin and Woodard 2009). The center–periphery structure therefore defines an asymmetric framework in which the participants collaborate and compete (Adner and Kapoor 2010), under conditions set by the platform owners on two elements: openness and governance rules (Gawer and Henderson 2007; Boudreau 2010).

Platforms coordinate transactions by creating “multisided markets,” in which their owners act as intermediaries to bring together (match) and facilitate exchanges between different groups of users by aligning market incentives (Rochet and Tirole 2003). Interactions occur in a networked structure, implying that the value derived from platform usage increases with each additional user (Katz and Shapiro 1985). As the value for participants grows with the size of the platform, it is optimal for them to converge on the same platform, leading to the prediction that platforms will tend to create concentrated markets organized by increasingly powerful owners (Caillaud and Jullien 2003; Evans 2003).

Platforms’ Accumulation of Power and the Consequences for Individuals’ Autonomy Erosion

The characteristics of platforms described in the preceding section have facilitated the accumulation of power by platform owners, leading to “new forms of domination and competition” (Fuchs 2007, 7) that are increasingly eroding people’s capacity to make independent decisions. The consequences of the platforms’ power accumulation for manipulation of choice and autonomy delegation have been analyzed from two perspectives: first, in relation to the structural constitution of markets and how this structure can lead to manipulation of users’ choices, and second, from a sociomaterial perspective that looks at the interaction of digital objects (e.g., algorithms) and the platform users.

From the perspective of the structural constitution of markets, the accumulation of power and the manipulation of choice are associated with the growing centrality of large platforms in the economy. Consumers and business partners can have their choices manipulated because of the specific intermediary role that platforms play. Once the market has been tipped, this role provides the platform owner with a position from which they can charge supramonopoly prices and define the rules of the market, including who can access it and how the transactions occur (Busch et al. 2021; Khan 2018; Jacobides 2021). In this way, platforms are increasingly operating as gatekeepers, imposing discriminatory clauses or limiting content access and creation (Stigler Committee on Digital Platforms [SCDP] 2019; Furman 2019). Choice making can also be limited due to market concentration driven by platforms, in that a platform enhances its owner’s opportunities to leverage its assets (Khan 2018). Thus the owner can entrench their platform’s position in a market and enter an adjacent one by creating economies of scale and scope (Khan 2018; Jacobides 2021). This brings the possibility of creating a dominant position in apparently unrelated markets through practices like vertical integrations, killer buys, predatory pricing, and self-preferencing (Crémer et al. 2019; Furman 2019). In addition, the capture and control of transactional data may be used to improve platform services, while also enabling the creation of entry barriers that fend off competition (Khan 2018).

Market-based analyses provide a view of power accumulation based on asset control and market position. However, they have been criticized for overlooking the impact of other noneconomic dimensions and for portraying power as relatively unidirectional (Margetts et al. 2021; Lynskey 2017, 2019). Such critiques recognize that the deep social impact of platform power cannot be tackled from a market perspective alone (Margetts et al. 2021; Lianos and Carballa-Smichowski 2022).

Sociomaterial perspectives place affordances and the materiality of digital objects at the center of platform interactions (Fayard and Weeks 2014; Kornberger 2017; Curchod et al. 2019). In this perspective, digital objects, such as code, interfaces, and algorithms, are described as central objects that can hinder autonomy. For example, when platform owners design the interfaces, they define the category of user, prescribing what is accessible to users and what choices they enjoy in the digital platform (Kelkar 2018). Encoding, which comprises the rules for how offline objects and actions are translated into a digital language (Alaimo and Kallinikos 2017), is also defined by platform owners. Once codified, actions must be performed in accordance with the rules established by the platform. Thus the affordances of technology shape and mold the interactions of the users with the platforms (Alaimo and Kallinikos 2017). Furthermore, algorithms and codes have been denounced for their opacity (Etter and Albu 2021). The inspection and control of a platform’s guiding principles present technical challenges for both users and regulators (Beer 2017), which enables manipulation. For example, Seymour (2019) and Wu et al. (2019) describe how the manipulative design techniques employed by platform firms like Facebook and Twitter are worrying not only because they affect an individual’s freedom of choice but also because they can cause users to experience harmful psychological effects, such as addiction.

Yet the aforementioned studies of affordances and materiality offer a limited understanding of how the emergence and self-infliction of organized immaturity are patterned by the strategic choices of platform owners and other agents. To further understand the organized immaturity of digital platforms, it is important to look at how practices are shaped and organized by the relations between the technological objects, the different users’ strategies, and the structural elements that underpin the power accumulation of the platform.

Some scholars have begun to offer holistic models that explain the accumulation of power by platform firms and its consequences for the autonomy erosion of different agents. Lanier (2018) and Zuboff (2019) describe digital platforms’ datafication of human experience, which leads to increasing forms of domination in what they term “surveillance capitalism.” Surveillance is enabled by the asymmetric positions of platform owners and users, defined by technological architecture, and executed through monetization strategies based on user data. Zuboff (2019) argues that despite the explicit narrative of platforms as both positive and objectively inevitable, their strategies and business models—based on voluntary data sharing—are fundamentally connected to the extraction of economic rents. Surveillance reduces human experience to free raw material for translation into behavioral data and prediction products (Zuboff 2019), eroding individual autonomy and disrupting intellectual privacy (Richards 2012). Surveillance has become a naturalized practice that we all—willingly or not—perform (Lyon 2018). Surveillance theories therefore contribute to this debate by offering an understanding of the instrumental connection between the business model and technological objects that constitute the platform and the self-infliction aspects of immaturity processes.

Yet, we argue that further work is needed to understand not only the expansion of immaturity through a system of economic surveillance but also how the everyday practices of leading and participating in the platform relate to immaturity emergence. Moreover, we argue that these views should be enriched with a theory of how agency is constituted and transformed by platform power dynamics, how these dynamics have an organizing role in producing and reproducing the delegation of autonomous decision-making, and how the emergence of immaturity and the strategic power accumulation by platform owners are connected.

A SOCIOSYMBOLIC PERSPECTIVE OF DIGITAL PLATFORMS

To further explain how platforms organize immaturity, we draw on Bourdieu’s sociosymbolic theory and the concepts of field, capitals, and habitus. A sociosymbolic perspective situates the agents in a field and explores the power accumulation dynamics of each agent. It takes materiality into consideration, but, through the concept of habitus, it is able to explain how interactions are also mediated by previous history and the networks of relations in a way that complements the notion of affordances and its connotations for the perception of physical artifacts and technology (Fayard and Weeks 2014). Furthermore, a sociosymbolic approach allows us to build an integrative conceptualization of power accumulation and its dynamics based on agents’ practices, positions, and strategies. It shows how multiple types of powers can coexist and accounts for how the relative positions of agents shape their motivations and actions, explaining the practices of immaturity and its relation to self-infliction. We explain this further, first by providing an overview of how a sociosymbolic perspective generally explains power and its dynamics through the concepts of field, capital, and habitus; we thus show how digital platforms can be understood through these lenses. Second, we describe the dynamics that lead to specific forms of power accumulation and explain how they can evolve over time.

Fields, Capitals, and Habitus in Digital Platforms

Bourdieu’s sociosymbolic theory was developed to explain social stratification and dynamics in (offline) societies by focusing on how agents (people, groups, or institutions) produce, reproduce, and transform social structures through practice (i.e., what they do in everyday life). Through practice, agents produce particular social spaces with specific boundaries demarcated by shared interests and power relations; these social spaces are termed fields of practice (Bourdieu and Wacquant 2007).

Fields

A field (champ) is a key spatial metaphor in Bourdieu’s work. It represents “a network, or a configuration, of objective relations between positions” (Bourdieu and Wacquant 2007, 97). These positions are objectively defined “to field occupants, agents or institutions … by their present and potential position (situs) in the structure of the distribution of species of power (or capital)” (Bourdieu and Wacquant 2007, 97). Individuals, groups, or organizations can be agents in a given field, and one individual may have different agencies (or “roles”), depending on their situation in the field.

The concept of “field” can be related to digital platforms in the sense that the organization and production of practices situates the platform in relation to an existing field. This may be the field of cultural production (e.g., Facebook) or the field of goods exchange (e.g., Amazon). The fields have specific logics and structures that define them. Different agents can have multiple roles; for example, an Instagram user may be both a contributor and a consumer of content. The relational aspects of the fields are also highly compatible with network-based perspectives (Portes 1998) because the field in which the platform is embedded functions on the basis of relations created during the practice of exchanges that constitute the field. The technological infrastructure creates a center–periphery structure, which provides the foundation on which the practices occur, both enabling and regulating them. This approach to platforms highlights the practice of the agent and its position but also simultaneously shows how the platform’s constitutive elements are deeply interconnected. Taking Twitter as an example, the extent to which specific content generated by a user is reproduced depends on the user’s social position in the network but also on the priorities defined by the platform’s algorithms, which create the structure in which the content is shared.

Multiple nested and overlapping fields can be found on any platform, just as they are in any (offline) social context. For example, YouTube constitutes a huge field of people broadly interested in sharing and viewing online video content. However, YouTube also hosts a variety of other, more focused subfields, for instance, a field centered on cryptocurrency videos. At the same time, platforms do not necessarily constitute a field in its entirety: while some online fields exist mostly on a single platform, like the field of video content sharing on YouTube, competing platforms have entered some subfields, such as gaming videos on Twitch. Other online fields are embedded in larger fields of practice. For example, job seekers would look at job opportunities on LinkedIn while engaging offline with the job-offering companies.

Yet the creation of a digital platform can also be conceptualized as an attempt to “enclose” part of a field: an agent (the platform creator) designs a value creation model for users (the specific practices to be performed by them within the field) and develops the digital infrastructure that makes interactions possible. Digital platforms enclose the field because they attempt to create “exclusive control rights” (Boyle 2003) over dimensions of practices that were previously in the public domain. Consider Google’s Street View, launched in 2007, which permits users to view the fronts of buildings from a pedestrian’s viewpoint. The service utilizes photographs taken by Google of objects that are not covered by intellectual property rights, even though the photographs were taken without the authorization or agreement of the communities, and their use is monetized (Zuboff 2019). In this case, Google Street View becomes not only a new service for users but also a new way of exploiting value through dispossession of public goods and private data (Zuboff 2019).

A field enclosure by a platform also involves encoding social interactions that were previously defined by more or less variable practices (e.g., hailing a taxi on the street) into a precisely defined process in a controlled space (e.g., using a ride-hailing app). This appropriation is produced through the codification of social interactions, control over the digital space, and the data generated by these interactions. Moreover, by enclosing a field, digital platforms modify both the practices and the agents’ relative positions. For example, drivers and passengers are inscribed into a database owned by the platform owner and are organized into groups from which they are picked and matched.

Furthermore, the creation of the platform can transform the scope of the field. Digitalized practices often involve connecting with deeply intimate aspects of users’ lives (Lupton 2016), such as private data in the form of photos, comments, or information about consumption habits. While typically regarded as private, the encoding of these portions of experience puts them within the potential reach of a field and exposes them to its specific field logic. Furthermore, because of the new ways of performing certain practices, platforms collide with the established scopes of the field, changing the agents and institutions involved in it. This is the so-called disruptive nature (SCDP 2019) of the platform. Examples can be found in conflicts around regulatory frameworks triggered by the introduction of platforms to some industries, such as Uber’s entry into the field of transportation and Airbnb’s into hospitality.

Capitals

Fields are dynamic spaces defined by the relations of power between players that constitute the structure of the field (Bourdieu and Wacquant Reference Bourdieu and Wacquant2007). These relations result from “the possession and activation of resources that are both materially and symbolically produced and perceived” (Bourdieu Reference Bourdieu1989, 16). These resources are the capitals.

The accumulation of capitals gives access to “the specific profits that are at stake in the field, as well as by their objective relation to other positions (domination, subordination, homology, etc.)” (Bourdieu and Wacquant Reference Bourdieu and Wacquant2007, 97). In each specific field, the space of objective relations is the site of a logic particular to those who regulate the field. This logic need not follow purely economic rationalities (Sandberg and Alvesson Reference Sandberg and Alvesson2011). For example, TikTok users who copy their nearest higher-status digital neighbors in a particular contest or “dance” might not be guided by economic rationality, but they do follow the logic of the platform.

Capitals are therefore the resources—scarce and socially valued stocks of internalized abilities and externalized resources—that each agent has. Bourdieu defines three fundamental forms of capital through which power is accumulated: economic capital (money and other assets), cultural capital (knowledge and familiarity with accepted norms), and social capital (reflected in the actor’s creation of connections and social networks) (Bourdieu Reference Bourdieu, Granovetter and Swedberg2011). To these, Bourdieu adds symbolic capital, “which is the form that one or another of these species takes when it is grasped through categories of perception that recognize its specific logic, … [that] misrecognize the arbitrariness of its possession and accumulation” (Bourdieu and Wacquant Reference Bourdieu and Wacquant2007, 118), that is, the reflection in the relations of the field of accumulated prestige, consecration, or honor (Bourdieu Reference Bourdieu1993). For Bourdieu, power struggles are mainly symbolic, and agents who are willing to increase their power will ultimately exercise the symbolic capital that will help them to be “perceived and recognized as legitimate” (Bourdieu Reference Bourdieu1989, 17) in what Bourdieu (Reference Bourdieu1984) also calls “distinction.”

Social dynamics in fields are centered on the generation of distinction(s) by agents, who “constantly work to differentiate themselves from their closest rivals” (Bourdieu and Wacquant Reference Bourdieu and Wacquant2007, 100), although actors’ participation in these games is typically no more than “unconscious or semi-conscious strategies” (Bourdieu Reference Bourdieu1969, 118). Distinction operates through the accumulation of the capital that matters to the field. Thus fields are spaces of conflict and competition in which the hierarchy is continually contested. However, agents can attempt to convert one form of capital into another or transfer it to a different space, depending on the specific logic of the field (Levina and Arriaga Reference Levina and Arriaga2014).

The concept of distinction can be assimilated to the concept of “status” as it is used to explain the means of interaction on digital platforms (Levina and Arriaga Reference Levina and Arriaga2014). For example, on digital platforms like YouTube, a user’s social network position and cultural skills (e.g., their offline knowledge about a particular topic) combine with their taste and the time and money they invest into the field. Together, these shape which content gets noticed and which is ignored (Levina and Arriaga Reference Levina and Arriaga2014) and therefore which agents become “influencers” or agents with high status in the network.

Habitus

Beyond describing how agents, through their collective actions, shape emergent field structures, and beyond establishing which capital matters and how, Bourdieu also looks at how structure shapes agency. He uses the notion of habitus to describe the socially learned schemata of perception and inclinations to action (Bourdieu and Wacquant Reference Bourdieu and Wacquant2007). Habitus is the internalization of the logic of the field. It is a set of historical relations incorporated within individual bodies in the form of mental and corporeal schemata (Ignatow and Robinson Reference Ignatow and Robinson2017). These relations, or the “system of schemes of perception and appreciation of practices, cognitive and evaluative structures,” are “acquired through the lasting experience of a social position” (Bourdieu Reference Bourdieu1989, 19); that is, they are acquired through interaction with other social agents. The habitus includes related comportment (posture and gait), aesthetic likes and dislikes, habitual linguistic practices, and ways of evaluating oneself and others via categories. It forges not only actions but also desires and aspirations (Ignatow and Robinson Reference Ignatow and Robinson2017). While cognitively embedded, it is also embodied in gestures, postures, movements, and accents (Ignatow and Robinson Reference Ignatow and Robinson2017). Its reproduction depends mainly on institutions like family and school. Mastery of the habitus tends to guarantee distinction and constancy of practice over time (Bourdieu Reference Bourdieu1990).

Crucially, the constitution of the habitus is recursive: while agents can reshape social distance and the ways it may be perceived, their own perception is likewise framed by their own position in the social structure. This recursive cycle is the process of constitution of the sociosymbolic space, where changes in position can be understood as the outcome of symbolic struggle. Habitus is therefore a way of conceptualizing how social structures influence practice without reifying those structures (Costa Reference Costa2006).

In his studies of class, taste, and lifestyles, Bourdieu (Reference Bourdieu1984) illustrates how habitus shapes taste in ways that make a virtue out of necessity. For example, working-class people develop a taste for sensible, plain food, furnishings, and clothes, and they shun fancy extravagances (Bourdieu Reference Bourdieu1984). Hence habitus leads to the “choice of the necessary,” and in so doing, it tends to generate practices that ultimately reproduce the original objective conditions, through which it functions as structure (Costa Reference Costa2006). Thus, given a set of conditions, “habitus affords an actor some thoughts and behaviors and not others, making those thoughts and behaviors seem more appropriate, attractive, and authentic than others” (Fayard and Weeks Reference Fayard and Weeks2014, 245). Ultimately, however, it is the actor who decides what to do. Often the decision occupies no conscious thought, but, as Bourdieu (Reference Bourdieu1990, 53) argues, it is “never ruled out that the responses of the habitus may be accompanied by strategic calculation tending to perform in a conscious mode.”

The concept of digital habitus has been used in the analysis of digital spaces (e.g., Levina and Arriaga Reference Levina and Arriaga2014; Julien Reference Julien2015; Ignatow and Robinson Reference Ignatow and Robinson2017; Romele and Rodighiero Reference Romele and Rodighiero2020) to explain ways of acting, namely, the socially and technologically ingrained habits, skills, and dispositions that define the practices in the digital field. Ignatow and Robinson (Reference Ignatow and Robinson2017) argue that digital machines are not only the crystallized parts of habitus but also habitus producers and reproducers. This is because practices performed in digital platforms have technological and symbolic mediations: they are digitized—coded—and they are performed through a constant interaction with algorithms and the data that feed the learning of the algorithms. For algorithms to constitute the habitus, the platform must be able to extract increasingly large amounts of data and transform them into capital. In this context, the data work as the culture that informs knowledge about the social space. The norms of the platform are constantly shaped by the interaction between the data, the algorithm, and the agents. The capital created by this interaction can be appropriated by certain agents who know how to use these results to their advantage.

The mechanism of the digital habitus has two consequences. First, as socialization increasingly takes place through digital platforms, the algorithmic logic becomes a norm that everyone needs to learn to play by or with (Beer Reference Beer2017), and thus it becomes part of the habitus. It becomes the representation of the current taste of a social class or group, so that their decisions resemble each other. However, unlike the offline habitus, it derives from code as well as from action; thus it is to some extent defined behind closed doors by the platform owners. Second, as Ignatow and Robinson (Reference Ignatow and Robinson2017) argued, the digital habitus becomes a (re)generator of the social group because it is mediated by the property of algorithmic practice that relates to aggregation for prediction. The singularities of social agents are reduced to aggregates of decisions, actions, desires, and tastes. This phenomenon has been called “personalization without personality” (Ignatow and Robinson Reference Ignatow and Robinson2017, 100), personality being the principle that gives unique style to each human process of individualization.

Having set the theoretical apparatus to explain how digital platforms can be understood from a sociosymbolic perspective, we turn now to defining how digital platforms accumulate power and how power accumulation increases the problem of organized immaturity.

A Sociosymbolic Perspective of Power Accumulation and Its Consequences for Organized Immaturity

Building on Bourdieu’s later writings on the State and its forms of power (Bourdieu Reference Bourdieu1989, Reference Bourdieu and Champagne2014) and in light of the latest developments of digital platforms and their accumulation of power, we direct our analytic attention to the platform owner and its relations with the other platform agents and sociodigital objects. Thus, we go beyond the extant analysis of distinction in digital platforms done by scholars of digital sociology (e.g., Julien Reference Julien2015; Ignatow and Robinson Reference Ignatow and Robinson2017), which focuses on users, to capture the mechanisms of field transformation led by platform owners in their relationship with the other platform agents. We follow Bourdieu (Reference Bourdieu and Champagne2014) in terming these mechanisms “forms of power” and showing how these contribute to explaining organized immaturity.

Drawing on Bourdieu’s writings (Bourdieu Reference Bourdieu1984, Reference Bourdieu1989, Reference Bourdieu1991), we define the forms of power, distinguishing between two general dynamics. We first define five forms of power (constitutional, juridical, discursive, distinction, and crowd) that drive the accumulation of power within the platform. Second, inspired by recent literature on platforms (Ziccardi Reference Ziccardi2012; Eaton et al. Reference Eaton, Elaluf-Calderwood, Sørensen and Yoo2015; Krona Reference Krona, Montereo and Sierra2015; Bucher et al. Reference Bucher, Schou and Waldkirch2021), we show how counterpower can also be exercised by end users and other peripheral agents through crowd and hacking power. Crowd and hacking power are not concepts derived directly from Bourdieu’s theory, but they provide a more comprehensive view of power accumulation dynamics.

We then articulate the platform power dynamics through three phases of platform evolution, which are derived from an interpretation of platform innovation research (Cutolo and Kenney Reference Cutolo and Kenney2020; Kolagar, Parida, and Sjödin Reference Kolagar, Parida and Sjödin2022; Rodon, Modol, and Eaton Reference Rodon Modol and Eaton2021; Teece Reference Teece2017): formation, where the platform is launched and starts to be used by agents; domination, where the platform has been widely adopted and operates under a relatively stable design within the original field; and cross-field expansion, where the platform expands to other fields, leveraging its accumulated power. Although we describe for each stage the dominant forms of power and counterpower accumulation that enable the transformation of the field, we acknowledge that several forms of power coexist in these phases, that the evolution of platforms is often nonlinear, and that not all platforms will become dominant.

Forms of Platform Power

Constitutional Power

Constitutional power is the ability to “transform the objective principles of union and separation, … the power to conserve or to transform current classifications” (Bourdieu Reference Bourdieu1989, 23). Within the platform, this power comprises both the architectural design (platform layers and modularity, design of user interfaces and experiences) and the capacity to define the rules, norms, categories, and languages that make up the digital interactions. Constitutional power shapes the digital medium for interactions and defines what may and may not be accessed by each type of agent within the platform.

Constitutional power is exercised mainly by the platform owner. As the provider of the digital infrastructure upon which other agents collaborate, the owner defines the symbolic space through code. Code symbolically creates the objects that constitute the relations, being a neat, unified, and unambiguous language with no openings for interpretation (Lessig Reference Lessig2009). In the digital realm, the actor who manages the code can increase its symbolic imposition and therefore its legitimization. As the legitimation process is unified, creation and transformation are delegated. This legitimation is “world-making” (Bourdieu Reference Bourdieu1989), as it explicitly prescribes the possible realities and actions. The platform owner is therefore able to hold a monopoly over legitimate symbolic violence (Bourdieu Reference Bourdieu1989), having a differential capacity to influence and settle symbolic struggle. The possibility of obtaining and activating this symbolic capital is associated with complex technological competences, which are scarce and highly concentrated (Srnicek Reference Srnicek2016; Zuboff Reference Zuboff2019).

The coherent body of code adopted by the symbolic space through constitutional power is not a neutral technical medium (Beer Reference Beer2017; Gillespie Reference Gillespie2010), and it can erode users’ autonomy. Code is created and transformed in accordance with the objectives of the platform owner and correspondingly managed toward these goals. For example, Kitchens et al. (Reference Kitchens, Johnson and Gray2020) show how the differences in platform design for Facebook, Twitter, and Reddit create a differentiated impact on the diversity of news and the type of content their users consume. Calo and Rosenblat (Reference Calo and Rosenblat2017) and Walker et al. (Reference Walker, Fleming and Berti2021) find that the algorithmic design in Uber reduces drivers’ insights about their working conditions and the competition they face, hindering their autonomy. Even without assuming strategic manipulation, the limited symbolic and repetitive action of users implies a delegation of users’ own independent reasoning to the platform and the emergent coordination of their actions by it.

Juridical Power

Along with the definition of its architecture, a second feature critical to the thriving of a platform is its governance. While constitutional power concerns the design of governance, juridical power is the capacity to sanction via the created rules and the authority to arbitrate in disputes (Bourdieu Reference Bourdieu1987, Reference Bourdieu2005). Typically, it can take a variety of forms, such as sanctioning rule infringement, reporting abuses, or managing access to the platform (Adner and Kapoor Reference Adner and Kapoor2010).

Digital technologies can enable increased participation and distribution of roles among agents, which is why studies of governance in these contexts have favored the idea that digitalization processes are highly democratizing (von Hippel Reference von Hippel2006; Zittrain Reference Zittrain2009). However, the hierarchical structure of digital platforms facilitates the creation of governance layers in which consequential decisions can be packaged and kept under the owner’s control, resulting in a limited distribution of power in the field. For example, transaction-oriented platforms like Amazon, eBay, and Uber rely on user-based rating systems to ensure good quality and sanction inadequate behavior; however, the platform owner designs the rankings and retains control of other actions, such as account activation and suspension (Gawer and Srnicek Reference Gawer and Srnicek2021).

This role division effectively creates and redistributes power and therefore restricts the capacity of some agents to interact without the intervention of the digital platform owner. Hence the definition and distribution of roles will interact with (and eventually transform) the authority structure and the conflict management mechanisms that preexist in the field, including regulation. For example, Valdez (Reference Valdez2023) explores how Uber uses what she calls “infrastructural” power to deploy a strategy of “contentious compliance,” both adapting to and challenging existing regulation. This strategy allows the company to exploit differences in regulation and regulatory scrutiny to reduce users’ access to information and acquired rights.

Discursive Power

A third distinctive form of power that characterizes agents’ strategic interplay is discursive power. Discursive power is the power exercised in linguistic exchanges, which are embodied and learned but also generative of the habitus (Bourdieu Reference Bourdieu1991). The ways agents talk about platforms and the words they use to explain them configure the collective narrative of what is possible and valuable on a platform.

Platforms are narrated as part of a broader, already-institutionalized rational-technological narrative in which customer-centrism, effectiveness, and rationality of the exchanges are dominant values (Gillespie Reference Gillespie2010; Garud et al. Reference Garud, Kumaraswamy, Roberts and Xu2022). Technological determinism discourses promoted by platform owners reinforce the idea that platforms’ algorithms are inscrutable and of a complexity unfathomable to the public or the regulator (Martin Reference Martin2022; Pasquale Reference Pasquale2015). These discourses have led to a broader narrative of a “Manifest Destiny” (Maddox and Malson Reference Maddox and Malson2020) of digital platforms, where the user is explicitly asked to delegate their own reasoning to the platform. This, alongside user dispersion, is a fundamental element in enabling the platform to prescribe actions. Critical to maintaining user dispersion is the narrative that users are directly connected through the platform, which is presented as an agora of exchanges. In actuality, platforms mediate that interaction, formatting it, regulating it, or even suspending it.

Distinction Power

Distinction power is the creation of categories and the mechanisms of categorization that drive choice in the platform. It builds on the concept of distinction proposed by Bourdieu (Reference Bourdieu1984). It defines the rules and practices that inhabit the habitus and designates which of them are legitimated and considered by society to be natural. The purpose of this type of power is to produce a behavioral response that serves some agents’ specific accumulation of capital. The platform owner can influence user behavior by modifying the interfaces, the encoding, and the algorithms, thereby manipulating the user’s decision-making. At the same time, users can access and activate this power through their digital habitus, allowing them to influence and drive other users’ choices.

On platforms, distinction power is often exercised through what Kornberger, Pflueger, and Mouritsen (Reference Kornberger, Pflueger and Mouritsen2017) call evaluative infrastructures. Evaluative infrastructures are the different interactive devices, such as rankings, ratings, or reviews, that establish an order of worth among the users of the platform, driving the “attention” (Goldhaber Reference Goldhaber1997) of other users. They relate agents and their contributions with each other, but they are also instruments of power. They define not only how agents are perceived and ranked in the community but also how the hierarchy is monetized by the platform’s owners (Kornberger et al. Reference Kornberger, Pflueger and Mouritsen2017). Status markers are examples of how distinction power is exercised. As they define how user activity and loyalty to the platform are rewarded, they become a fundamental element in guiding agents’ accumulation strategies. For example, YouTube and Wikipedia changed their strategy for recognizing content to stimulate newcomers (Kornberger et al. Reference Kornberger, Pflueger and Mouritsen2017). Ignatow and Robinson (Reference Ignatow and Robinson2017) refer to this process as “übercapital.” Übercapital emphasizes the position and trajectory of users according to their scores, grades, and rankings and is mobilized as an index of superiority that can have strong reactive or performative effects on behavior (Ignatow and Robinson Reference Ignatow and Robinson2017).

A key feature of distinction power is that it is exercised heterogeneously over different users through differences created by constitutional and juridical power. Different types of users are granted different forms of agency, not only by the platform designers but also by their own intervention on the platform (Levina and Arriaga Reference Levina and Arriaga2014). For instance, passive users may be granted agency through technological features. For example, YouTube gives agency to passive users by displaying the number of views. Merely by viewing a piece of content, individuals cast a vote on its value, which has significant consequences for the content producers. Other users become judges or “raters” and producers of information at the same time. For example, retweeting on Twitter is both a contribution to the platform and an act of evaluation. Alongside raters, there are often users who act as “expert evaluators” (users who have accumulated significant cultural capital). One such example is the “superdonor” on crowdfunding platforms like Kickstarter, whose expert evaluations influence which projects are funded. Expert evaluators tend to form a tight-knit group within a field (Vaast, Davidson, and Mattson Reference Vaast, Davidson and Mattson2013; Aral and Walker Reference Aral and Walker2012). Other users might have what Bourdieu called “institutionalized consecration” (Levina and Arriaga Reference Levina and Arriaga2014), which is the formal authority to evaluate content given by the platform designers. These are typically site moderators and community managers, who have more power than others to judge contributions (Levina and Arriaga Reference Levina and Arriaga2014). In sum, these different types of agencies are designed by the platform owners to orient users’ actions and to promote and demote content (Ghosh and Hummel Reference Ghosh and Hummel2014). They are typically linked to how the platform owner designs revenue models (Zuboff Reference Zuboff2019).

The forms of power presented so far tend to reinforce the power position of the platform owner, but there are other forms of power that create opposing tensions, that is, counterpower accumulation. These are crowd power and hacking power.

Crowd Power

In the accumulation process, users are in a unique position in that they are the agents who produce the platform’s activity. Crowd power results from the influence that users can exert on the platform by the sheer mass of their actions, which may or may not be coordinated (Bennett, Segerberg, and Walker Reference Bennett, Segerberg and Walker2014; Culpepper and Thelen Reference Culpepper and Thelen2020). These practices are, in essence, the exercise of the digital habitus. The exercise of the habitus can have a long-lasting effect on the platform’s structure. Practices can both inspire new functionalities and generate unexpected transformations of the value proposition, which the platform owner can recapture by redesigning the code. For example, this has been observed in the sharing and creator economies: because the provider side of the platform is the main value creator (e.g., graphic designers, programmers), the platform owner periodically changes the design to facilitate the delivery of that value (Bucher et al. Reference Bucher, Fieseler, Fleck and Lutz2018; Bhargava Reference Bhargava2022).

As Bourdieu (Reference Bourdieu1990) argued, the agents ultimately decide what they do, and the digital habitus may be accompanied by strategic calculation, even if most of the practices are bound by parameters defined by the platform owners and managed automatically by algorithms. This creates the opportunity for practices not aligned with the value proposition to go viral, eventually posing challenges to the balance envisioned in the platform design. For example, Krona (Reference Krona, Montereo and Sierra2015) uses the notion of “sousveillance”—an inverted surveillance “from the bottom” or “from many to a few”—to describe the novel use of an audiovisual sharing platform by social movements during the Arab Spring uprising. This emergent use emphasizes the emancipatory potential of users to create collective capabilities and decision-making (Ziccardi Reference Ziccardi2012), which we designate as the platform power–challenging form of crowd power.

Yet platform owners can attempt to use crowd power in their favor, in what we call the platform power–enhancing form of crowd power, through constitutional power (architecture design, limiting the possibility of contact between users), juridical power (policing and sanctioning users), and distinction power (shaping the evaluative infrastructure). For example, Thelen (Reference Thelen2018) shows how Uber “weaponized” the volume of its users in a regulatory dispute by introducing a button on its interface that would send a templated complaint email to local government on the user’s behalf.

Hacking Power

Hacking power is the ability to identify the features and categories of digital spaces, such as overlooked programming errors and ungoverned areas, that may be used for a different purpose than the one originally intended (Jordan Reference Jordan2009; Hunsinger and Schrock Reference Hunsinger and Schrock2016). There are numerous examples in the literature of expressions of this type of power in digital platforms. Eaton et al. (Reference Eaton, Elaluf-Calderwood, Sørensen and Yoo2015) have described the continuous cycles of resistance and accommodation performed by groups of hackers and Apple that surround the jailbreaking of each new release of the iOS. Bucher et al. (Reference Bucher, Schou and Waldkirch2021) and Calo and Rosenblat (Reference Calo and Rosenblat2017) have shown how workers learn to anticipate patterns in algorithms that control their work processes and use this knowledge to defend themselves from abuses.

Hacking power is the antithesis of individual immaturity, as it requires not only the exercise of independent reasoning but also a degree of understanding of the specific system in which the power is exercised. It is deliberate and purposeful, unlike crowd power, which is independent of users’ understanding because it stems from the combined volume of their actions. At the same time, hacking power necessarily operates in the margins or interstices of the platform. Furthermore, hacking power can be thought of as opposed to the constitutional and juridical powers; as such, it is dispersed, under the radar, and often considered illegal (Castells Reference Castells2011). This makes it difficult to accumulate this power in the field and, consequently, to use it to challenge other forms of power. Table 1 summarizes the different forms of power and provides further examples.

Table 1: Forms of Platform Power and the Organization of Immaturity

Platform Power Dynamics

By discussing platforms in the context of fields, we have shown how the relations between the different key agents can be understood through dynamics of power accumulation. On the one hand, users activate their capitals through the production of the practices that configure the digital habitus, which enhances their understanding of the ways of participating on the platform. On the other hand, it is mainly the platform owner who captures most of the value creation process through constitutional, juridical, discursive, and distinction power. This uneven distribution facilitates the consolidation of the agents’ relative positions in the field while, at the same time, enlarging the distance between agents and thereby eroding their capacity to decide in an autonomous way.

We articulate these power dynamics through the phases of platform evolution to explain how platforms transform the agents’ relative positions over time and how this transformation contributes to organizing immaturity. Figure 1 depicts the evolution in three phases.

Figure 1: Platform Power Dynamics over Time

Phase 1: Platform Formation and Field Enclosure

In platform formation, the primary objective for the platform owner is to get users to adopt the platform and regularly perform their practices on it. Constitutional, juridical, and discursive power are the three forms of power through which platform owners attempt to enclose the field, for example, by designing a value creation model that creates “exclusive control rights” (Boyle Reference Boyle2003) over dimensions of practices that were previously in the public domain. At the same time, these forms of power organize the emergence of immaturity. First, constitutional power (in the form of rules, norms, categories, and languages) defines how and under what conditions interactions are performed and how the different agents can express their preferences. Second, through juridical power, platform owners have the capacity to define the sanctions that will promote or restrict an agent’s capacity to operate on the platform, for example, who can and cannot exercise their voice on the platform and what sanctions are applied to misbehavior. Finally, discursive power creates a common narrative about the value of the platform, restricting the capacity of agents to think beyond discourses that are presented as truths.

Phase 2: Platform Domination within Original Field

Platform adoption and sustained use create the conditions for the platform to increasingly occupy the field. The increasing participation of agents on the platform can change the predominant accumulation logics of the different agents in the field, shaping the digital habitus. The process of capital accumulation, leveraged by distinction power, further defines how immaturity is organized by promoting its self-infliction. For the platform owner, capital accumulation is expressed as more data, the levying of fees, and an influx of users that is repurposed as capital to develop the platform. In turn, users invest different combinations of capitals (data about their practices, social networks, money, and other assets) with logics of both consumption (purchasing, sharing digital content) and profit and accumulation (influencer, merchant, driver). To thrive in the accumulation dynamics, agents must increasingly invest themselves in the platform, adapting their strategies to align with those that are relevant to the platform and embedded in the digital habitus. Users adapt their practices as they learn the specific logics. This brings user practices closer to their archetypical category, and because such practices are better rewarded, it further legitimizes the practices of the digital habitus. When users grasp the critical elements of the digital habitus that correspond to their type, their practices enjoy the viral thrust that characterizes the platform logic (e.g., they may become social media superusers or influencers). Such success in accumulating capital, leveraged by the mechanisms of distinction power such as platform ratings, calls for higher investment, increasing the users’ dependence on the platform and thus contributing to the self-inflictive process of immaturity.

At the same time, the processes that reinforce platform power accumulation coexist with other processes that create tensions, calling for change and adjustment. Misalignments between users’ practices and their expected behavior can accumulate quickly, destabilizing the platform’s operation or posing challenges for its governance. In addition, platforms with massive user bases and innumerable interactions can become difficult for the platform owner to police, creating space for agents to exercise their hacking power. These counterpower accumulation forces can therefore create an emergent enlightenment, as opposed to immaturity, for the agents.

Phase 3: Platform Cross-Field Expansion

In a third phase, the platform’s domination of the field opens the possibility of integrating new fields, further contributing to the accumulation of power and the organizing of immaturity. Once a platform has become the dominant agent, a position in which the structure itself acts on the owner’s behalf, it can expand the scope of the field to new geographies and users and even enter and integrate previously separate fields. For example, Uber launched Uber Eats by drawing on the platform’s extant base of users (drivers and passengers, now also viewed as commensals).

From this dominant position, the owner can operate in the various fields with great freedom, changing the exchange rate of the capitals at play and accumulating power. Highly dominant expressions of constitutional power include interoperability lock-ins, the use of dark patterns and biased information that impede sovereignty of choice, and digital workplace design and control. Juridical power can be commanded from a position of gatekeeping, permitting the arbitrary suspension of users’ accounts, biased arbitration in disputes, the imposition of discriminatory clauses, restriction of access to the platform, or limits on freedom of speech. Abuses of power are typically supported by the discursive power that enacts the discourse of Manifest Destiny and uses opaque arguments to justify the increased accumulation of power and the need to enforce the platform owner’s juridical power measures. Also in this phase, the full deployment of distinction power relates to the platform owner’s ability to monopolize the capture and processing of data through control of the technological architecture. This capacity can be used to steer user choice in multiple ways, for example, through information asymmetries about the activities of a market or participant and through political influence on social media platforms.

The activation of powers in the cross-field expansion phase depicts the dynamics within a field at a given moment, but it does not mean that the dominion of the platform owner is absolute or that the platform becomes a “total institution” (Bourdieu and Wacquant 2007). What we highlight is how the field’s structural homology with the platform facilitates a rapid concentration of power and creates considerable obstacles to modifying this situation, whether from within (due to users’ habituation) or from outside the platform (because of network and lock-in effects, barriers to entry, and technical complexity). In addition, the form of dominion that the platform’s specific logic enables is very effective because, by creating multisided businesses, it renders invisible the specific accumulation and struggle dynamics surrounding the core practices users perform. For example, Amazon is “a place to buy and sell online,” and the fact that the company accumulates capital from the capture of user data and from the use of its sellers’ capitals is not evident to the platform’s users. Thus the platform’s “rules of the game” may appear objective and relatively neutral, but they are in fact part of the organization of immaturity.

DISCUSSION

In this article, we present a sociosymbolic approach to power dynamics in digital platforms and to how these dynamics relate to the organizing of immaturity. A sociosymbolic approach explains the structural and agentic dynamics of power accumulation that lead to organized immaturity. We contrast the power asymmetries between the platform owner, as the central coordinating agent, and the rest of the agents directly participating in the platform to present four main forms of power enacted by the platform owner: constitutional, juridical, discursive, and distinction. We also present two forms of power that explain how users counteract the platform owner’s power accumulation: crowd and hacking. We explain how these forms of power are fundamental for understanding the different ways in which immaturity is organized. We show that constitutional power limits the symbolic world of the users and therefore their capacity to influence the new rules and vocabularies that orchestrate participation. We explain how, through juridical power, platform owners can define the sanctions that restrict the voice and participation of users. We show how the logic of the field is constituted through the digital habitus, explaining the emergence of immaturity and its self-infliction. We further argue that distinction power enacts the platform owner’s capability to shape behavior by creating evaluative infrastructures that mediate the emergence of immaturity. Furthermore, we argue that the construction of a narrative of omniscience, through discursive power, explicitly asks users to delegate their own reasoning to the platform. We also highlight the existence of forms of power (hacking and crowd) that help users to accumulate power and resist the central authority of the platform owners.

Finally, we describe power dynamics and their relation to organized immaturity through three phases: first, platform formation, in which forms of power (mainly constitutional, juridical, and discursive) operate to promote field enclosure and lay the basis for immaturity to occur; second, platform domination within the field, in which distinction power promotes field reproduction and processes of self-infliction of immaturity, while hacking and crowd power create resistance to the central authority; and third, platform cross-field expansion, in which power accumulation dynamics lead to the integration of new fields and intensify the dynamics of immaturity. In defining the power accumulation dynamics, we explain the emergent character of immaturity and its relation to agents’ strategies.

By focusing on the digital platform and its power dynamics, we contribute to the current literature in two ways. First, we build a framework that explains the organizing dynamics of immaturity, based on the relations between the platform structure, the digital objects, and the agents’ strategies. Through this, we expand the understanding of organized immaturity in the light of sociological perspectives. Our framework analyzes how immaturity is constituted in practice and explains and nuances the emergence and self-infliction dimensions of immaturity. Second, we provide a dynamic framework of platform power accumulation, contributing to the platform literature. Finally, we also provide policy recommendations on how to tackle immaturity, and we highlight potential avenues for further research.

Rethinking Organized Immaturity from a Sociosymbolic Perspective

A sociosymbolic perspective on digital platforms and their power dynamics can push the boundaries of current concepts of organized immaturity toward a post-Kantian and more sociologically grounded view (Scherer and Neesham 2020). This contributes to the understanding of organized immaturity in three ways. First, it explains the different components of the emergence of immaturity through power struggles. We show how struggles result from agents’ different strategies, heterogeneously shaped by their positions on the platform and their practices, but also by their discourses and by the history of experiences of each individual that shapes the digital habitus. By showing the dynamics of these struggles, we contribute to explaining the process through which immaturity emerges as a nonorchestrated phenomenon.

Second, we explain self-infliction by moving away from the more political understandings of autonomy erosion. Political perspectives on immaturity look at the individual and their “(in)capacity for public use of reason” (Scherer and Neesham 2020, 1) and consider the “delegation of decision making to impersonal authorities they cannot comprehend or control” (Scherer and Neesham 2020, 4) as a condition of the individual. We, however, adopt a sociological view that focuses on the generation of practices and places the individual in a space of sociosymbolic power struggles. We complement previous literature exploring the symbolic aspects of technology and its impacts on society and, more concretely, on autonomy erosion (Zuboff 2019; Stark and Pais 2020; Fayard and Weeks 2014) by providing a set of forms of power that articulate how self-infliction is embedded in the digital habitus and thus how immaturity is organized. Our sociosymbolic perspective explains how the conditions of agency are shaped by the specific structure of the platform and its power dynamics.

Last, looking at fields through the power dynamics between the different agents sheds explanatory light on the formation process of organized immaturity. The relationship between habitus and field operates in two ways: while the field structures the habitus as the embodiment of the immanent necessity of a field, the habitus makes the field a meaningful space in which agents may invest their capitals and themselves (Bourdieu and Wacquant 2007). By defining the stages through which this relationship unfolds, we contribute to showing the emergent, dynamic, and accumulative nature of organized immaturity.

Contribution to the Understanding of Platform Power Accumulation

We have approached organized immaturity by analyzing platforms as spaces of coordination and production of practices, shaped by relations engrained in a digital habitus and the logic of the field. By better understanding the forms of power and the role they play in field transformation, we have identified more clearly the different forms of power accumulation through which digital platforms can become vehicles for organized immaturity, as well as the dynamics of that accumulation. This contributes to the platform literature in the following ways. First, our description of the structural process of power accumulation on the platform expands market and network approaches (Jacobides 2021; Khan 2018; Eaton et al. 2015) by showing the importance of the social, cultural, and symbolic dimensions of capital. This lays the foundations for fundamentally reconceptualizing platform power and further explaining how power is exercised by the platform owner (Cutolo and Kenney 2020; Kenney, Bearson, and Zysman 2019).

Second, we enrich structural approaches to platforms by showing how fields can be transformed through dynamics of power accumulation that extend beyond the consequences of an asymmetric structure (Curchod et al. 2019; Hurni, Huber, and Dibbern 2022). Furthermore, our framework shows how platforms can be reshaped by the interaction of agents’ strategies and the reconfiguration of fields. By introducing a field view, we provide a more holistic scheme in which power is both accumulated and contested. We also highlight how agents other than the platform owner play a role in producing and exercising forms of power. This nuances our understanding of field dynamics and agent interaction in the context of platform power dynamics.

Third, our model complements sociomaterial studies on platform power (e.g., Beer 2017; Kornberger 2017; Stark and Pais 2020) with the notion of the digital habitus and its relation to organized immaturity. Other authors have analyzed technological affordances as social designations of a space and have examined the social and cultural factors that give a space meaning and set a generative principle of governance (Jung and Lyytinen 2014). Although these authors do not talk explicitly about habitus or social capital, they reflect on the generative reproduction of norms by individuals in contact with their social spaces, which closely resembles Bourdieu’s definition of habitus in social spaces. We complement the sociomaterial view of platforms by showing how the digital habitus works and by emphasizing the role of the platform as an organizing agent with a privileged capacity for capital accumulation. We present the platform as a space of symbolically mediated power relationships in which digital objects and structural elements interplay to constitute the logic of the field. We provide an understanding of the multifaceted nature of power as a process resulting from agents’ practices and strategies, the habitus, and capital accumulation in a field. We argue that this conceptualization defines power in platforms not only as an “instrument” (Zuboff 2019) at the service of the platform owners but as a web of relations utilized by the agents who can better exploit the different forms of capital. We also contribute to the debate about the coordinating role of platforms and how they create generative forms of distributed control while power remains centralized, in an interplay between hierarchical and heterarchical power relations (Kornberger et al. 2017).

Bourdieu’s (1977, 1990) concepts of capitals, habitus, and distinction have been used before in the study of the social consequences of digitalization and of platforms’ increasing power (e.g., Levina and Arriaga 2014; Fayard and Weeks 2014; Romele and Rodighiero 2020). We complement that research with a view of platforms’ accumulation of power and its role in the organizing of immaturity. We go beyond the explanation of distinction power to define constitutional, juridical, discursive, crowd, and hacking forms of power, thereby offering a more complete view of how platforms accumulate power and organize immaturity.

Contributions to Practice and Avenues for Future Research

Our article provides a conceptual framework that can enable platform owners, users, and policy makers to fundamentally rethink how they might address platforms’ negative consequences for society. First, it highlights immaturity as a relevant concept for addressing social issues on platforms. Our detailed understanding of the mechanisms leading to immaturity and to the manipulation of individuals’ decisions can help policy makers identify and set limits on these forms of power, especially in the light of platform domination. By explaining the organizing dynamics of immaturity, we direct attention to more holistic assessments of the social consequences of platforms. Concretely, we emphasize that these consequences are not just a matter of concentration in specific industries (such as retailing or advertising) but also involve constraints on human rights (such as freedom of speech). Furthermore, we show how the consequences of organizing our practices through platforms are embedded in social structures and expressed in the transformation of fields. We believe that this line of thought is fundamental if we are to collectively rethink the social role of platforms.

Our article also has limitations that open up avenues for further research. We have identified not only forms of platform power accumulation but also forms of platform counterpower accumulation. As our focus in this article has been on how platforms organize immaturity, we have devoted more attention to the forms of power accumulation. However, future work is needed to deepen our understanding of how platforms lose power. For example, in recent years, we have witnessed an increasing backlash against big tech platforms, fueled by reputational scandals and vigorous societal complaints (Joyce and Stuart 2021; Gawer and Srnicek 2021). We have also observed a new wave of regulatory initiatives that intend to curb platforms’ power by forcing interoperability and limiting self-preferencing and acquisitions (Cusumano et al. 2021; Jacobides, Bruncko, and Langen 2020), even as the effectiveness of these policies is debated (Rikap and Lundvall 2021). For example, in Europe, the new Digital Markets Act (European Commission 2022a) and Digital Services Act (European Commission 2022b) are intended, respectively, to create more contestability in digital platform markets and to make platforms more accountable for the content they host. In the United States, there has been intense debate around the possible revocation of Section 230, which has so far provided a shield for platforms’ activities in social networks (SCDP 2019), leading to abuses of power and increasing immaturity. In parallel to regulatory or external counterpower mechanisms, research into power dynamics could also analyze the flows of affects and the affective intensification (Just 2019) that occur with the abuse of the digital habitus.
Incipient research (e.g., Just 2019; Castelló and Lopez-Berzosa 2023) has shown how these flows of affects not only shape collective meanings but can also lead to increasing forms of hate speech and the renaissance of populist politics. More research is needed on what forms of counterpower may emerge in society to reduce populism and hate speech. We believe that our framework lays the grounds for studying the more concrete practices of immaturity on platforms but also new forms of resistance.

CONCLUSION

Building on the concepts of fields, capitals, and habitus, we propose a sociosymbolic framework to explain organized immaturity in digital platforms. We articulate six forms of power that characterize the different ways in which platforms organize immaturity. We suggest that a more precise understanding of digital platforms’ role in driving organized immaturity can become the basis for fundamentally rethinking the role of the digital platform in society. Can the processes that lead to organized immaturity be reoriented toward organized enlightenment? We argue that a first step in this direction is to better understand how power is performed in digital platforms, and this is what our framework helps to explain.

Acknowledgments

We express our gratitude to this special issue’s editorial team, and particularly to Dennis Schoeneborn, for their extraordinary work in helping us develop this article into its current form. We also express our appreciation to the three anonymous reviewers who provided valuable guidance during the editorial process. We are thankful to the Research Council of Norway, project “Algorithmic Accountability: Designing Governance for Responsible Digital Transformations” (grant 299178), and the British Academy, project “Fighting Fake News in Italy, France and Ireland: COVID-19” (grant COVG7210059), for supporting this research.

Martín Harracá (corresponding author) is a postgraduate researcher and PhD candidate at Surrey Business School, University of Surrey. He holds an MA (analyse politique et économique; Hons) from Paris 13–Sorbonne Paris Cité and a Licentiate degree (Hons) in economics from Universidad de Buenos Aires. He is interested in society’s transformation through digitalization, with a focus on strategy and competition in digital platforms.

Itziar Castelló is a reader at Bayes Business School (formerly Cass) at City University of London. She holds an executive MBA and a PhD from ESADE, Ramon Llull University, and an MSc from the College of Europe in Belgium. She is interested in social change in digital contexts. Her research uses corporate social responsibility, deliberation, and social movement theories to understand social and environmental challenges like climate change, plastic pollution, and social polarization.

Annabelle Gawer is chaired professor in digital economy and director of the Centre of Digital Economy at the University of Surrey and a visiting professor of strategy and innovation at Oxford University Saïd Business School. A pioneering scholar of digital platforms and innovation ecosystems, she is a highly cited author or coauthor of more than forty articles and four books, including The Business of Platforms: Strategy in the Age of Digital Competition, Innovation, and Power (2019). Gawer is a digital expert for the UK Competition and Markets Authority, and she has advised the European Parliament and the European Commission on digital platform regulation as an expert in the EU Observatory of the Online Platform Economy.

REFERENCES

Adner, Ron, and Kapoor, Rahul. 2010. “Value Creation in Innovation Ecosystems: How the Structure of Technological Interdependence Affects Firm Performance in New Technology Generations.” Strategic Management Journal 31 (3): 306–33.
Alaimo, Cristina, and Kallinikos, Jannis. 2017. “Computing the Everyday: Social Media as Data Platforms.” Information Society 33 (4): 175–91.
Aral, Sinan, and Walker, Dylan. 2012. “Identifying Influential and Susceptible Members of Social Networks.” Science 337 (6092): 337–41.
Baldwin, Carliss Y., and Woodard, C. Jason. 2009. “The Architecture of Platforms: A Unified View.” In Platforms, Markets and Innovation, edited by Gawer, Annabelle, 32. Cheltenham, UK: Edward Elgar.
Beer, David. 2017. “The Social Power of Algorithms.” Information, Communication, and Society 20 (1): 1–13.
Bennett, W. Lance, Segerberg, Alexandra, and Walker, Shawn. 2014. “Organization in the Crowd: Peer Production in Large-Scale Networked Protests.” Information, Communication, and Society 17 (2): 232–60.
Bhargava, Hemant K. 2022. “The Creator Economy: Managing Ecosystem Supply, Revenue-Sharing, and Platform Design.” Management Science 68 (7): 5233–51.
Boudreau, Kevin. 2010. “Open Platform Strategies and Innovation: Granting Access vs. Devolving Control.” Management Science 56 (10): 1849–72.
Bourdieu, Pierre. 1969. “Intellectual Field and Creative Project.” Social Science Information 8 (2): 89–119.
Bourdieu, Pierre. 1977. Outline of a Theory of Practice. Cambridge: Cambridge University Press.
Bourdieu, Pierre. 1979. “Symbolic Power.” Critique of Anthropology 4 (13–14): 77–85.
Bourdieu, Pierre. 1984. Distinction: A Social Critique of the Judgement of Taste. Cambridge, MA: Harvard University Press.
Bourdieu, Pierre. 1987. “The Force of Law: Toward a Sociology of the Juridical Field.” Hastings Law Journal 38 (5): 814–53.
Bourdieu, Pierre. 1989. “Social Space and Symbolic Power.” Sociological Theory 7 (1): 14–25.
Bourdieu, Pierre. 1990. The Logic of Practice. Redwood City, CA: Stanford University Press.
Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, MA: Harvard University Press.
Bourdieu, Pierre. 1993. “Génesis y estructura del campo burocrático.” Actes de la Recherche en Sciences Sociales 96–97: 49–62.
Bourdieu, Pierre. 2005. “Principles of an Economic Anthropology.” In The Handbook of Economic Sociology, 2nd ed., 75–89. Princeton, NJ: Princeton University Press.
Bourdieu, Pierre. 2011. “The Forms of Capital.” In The Sociology of Economic Life, 3rd ed., edited by Granovetter, Mark and Swedberg, Richard, 241–58. New York: Routledge.
Bourdieu, Pierre. 2014. On the State: Lectures at the Collège de France, 1989–1992. Edited by Champagne, Patrick. Cambridge: Polity Press.
Bourdieu, Pierre, and Wacquant, Loïc, eds. 2007. An Invitation to Reflexive Sociology. Malden, MA: Polity Press.
Boyle, James. 2003. “The Second Enclosure Movement and the Construction of the Public Domain.” Law and Contemporary Problems 66 (1): 42.
Bucher, Eliane, Fieseler, Christian, Fleck, Matthes, and Lutz, Christoph. 2018. “Authenticity and the Sharing Economy.” Academy of Management Discoveries 4 (3): 294–313.
Bucher, Eliane Léontine, Schou, Peter Kalum, and Waldkirch, Matthias. 2021. “Pacifying the Algorithm: Anticipatory Compliance in the Face of Algorithmic Management in the Gig Economy.” Organization 28 (1): 44–67.
Busch, Christoph, Graef, Inge, Hofmann, Jeanette, and Gawer, Annabelle. 2021. Uncovering Blindspots in the Policy Debate on Platform Power: Final Report. Luxembourg: Publications Office of the European Union. https://platformobservatory.eu/app/uploads/2021/03/05Platformpower.pdf.
Caillaud, Bernard, and Jullien, Bruno. 2003. “Chicken and Egg: Competition among Intermediation Service Providers.” RAND Journal of Economics 34 (2): 309–28.
Calo, Ryan, and Rosenblat, Alex. 2017. “The Taking Economy: Uber, Information, and Power.” SSRN Electronic Journal. https://doi.org/10/gfvmg3.
Castelló, Itziar, and Lopez-Berzosa, David. 2023. “Affects in Online Stakeholder Engagement: A Dissensus Perspective.” Business Ethics Quarterly 33 (1): 180–215.
Castells, Manuel. 2011. “Network Theory: A Network Theory of Power.” International Journal of Communication 5: 773–87.
Clegg, Stewart R. 1989. Organization Theory and Class Analysis: New Approaches and New Issues. New York: De Gruyter.
Constantinides, Panos, Henfridsson, Ola, and Parker, Geoffrey G. 2018. “Introduction: Platforms and Infrastructures in the Digital Age.” Information Systems Research 29 (2): 381–400.
Costa, Ricardo L. 2006. “The Logic of Practices in Pierre Bourdieu.” Current Sociology 54 (6): 873–95.
Crémer, Jacques, de Montjoye, Yves-Alexandre, and Schweitzer, Heike. 2019. Competition Policy for the Digital Era. Luxembourg: Publications Office of the European Union. https://ec.europa.eu/competition/publications/reports/kd0419345enn.pdf.
Culpepper, Pepper D., and Thelen, Kathleen. 2020. “Are We All Amazon Primed? Consumers and the Politics of Platform Power.” Comparative Political Studies 53 (2): 288–318.
Curchod, Corentin, Patriotta, Gerardo, Cohen, Laurie, and Neysen, Nicolas. 2019. “Working for an Algorithm: Power Asymmetries and Agency in Online Work Settings.” Administrative Science Quarterly 65 (3): 644–76.
Cusumano, Michael A., Gawer, Annabelle, and Yoffie, David B. 2019. The Business of Platforms: Strategy in the Age of Digital Competition, Innovation, and Power. New York: Harper Business.
Cusumano, Michael, Gawer, Annabelle, and Yoffie, David. 2021. “Can Self-Regulation Save Digital Platforms?” Industrial and Corporate Change 30 (5): 1259–85.
Cutolo, Donato, and Kenney, Martin. 2020. “Platform-Dependent Entrepreneurs: Power Asymmetries, Risks, and Strategies in the Platform Economy.” Academy of Management Perspectives 35 (4): 584–605.
Dewey, John. 1939. Freedom and Culture. New York: Putnam.Google Scholar
Eaton, Ben, Elaluf-Calderwood, Silvia, Sørensen, Carsten, and Yoo, Youngjin. 2015. “Distributed Tuning of Boundary Resources: The Case of Apple’s IOS Service System.” MIS Quarterly 39 (1): 217–43.CrossRefGoogle Scholar
Etter, Michael, and Albu, Oana Brindusa. 2021. “Activists in the Dark: Social Media Algorithms and Collective Action in Two Social Movement Organizations.” Organization 28 (1): 6891.CrossRefGoogle Scholar
European Commission. 2022b. “The Digital Services Act Package.” https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.Google Scholar
Evans, David. 2003. “Some Empirical Aspects of Multi-sided Platform Industries.” Review of Network Economics 2 (3): 191209.CrossRefGoogle Scholar
Faraj, Samer, Pachidi, Stella, and Sayegh, Karla. 2018. “Working and Organizing in the Age of the Learning Algorithm.” Information and Organization 28 (1): 6270.CrossRefGoogle Scholar
Fayard, Anne-Laure, and Weeks, John. 2014. “Affordances for Practice.” Information and Organization 24 (4): 236–49.CrossRefGoogle Scholar
Flyverbom, Mikkel, Deibert, Ronald, and Matten, Dirk. 2019. “The Governance of Digital Technology, Big Data, and the Internet: New Roles and Responsibilities for Business.” Business and Society 58 (1): 319.CrossRefGoogle Scholar
Fuchs, Christian. 2007. Internet and Society: Social Theory in the Information Age. New York: Routledge.CrossRefGoogle Scholar
Garud, Raghu, Kumaraswamy, Arun, Roberts, Anna, and Xu, Le. 2022. “Liminal Movement by Digital Platform-Based Sharing Economy Ventures: The Case of Uber Technologies.” Strategic Management Journal 43 (3): 447–75.CrossRefGoogle Scholar
Gawer, Annabelle. 2014. “Bridging Differing Perspectives on Technological Platforms: Toward an Integrative Framework.” Research Policy 43 (7): 1239–49.CrossRefGoogle Scholar
Gawer, Annabelle. 2021. “Digital Platforms’ Boundaries: The Interplay of Firm Scope, Platform Sides, and Digital Interfaces.” Long Range Planning 54 (5): 102045.CrossRefGoogle Scholar
Gawer, Annabelle, and Henderson, Rebecca. 2007. “Platform Owner Entry and Innovation in Complementary Markets: Evidence from Intel.” Journal of Economics and Management Strategy 16 (1): 1–34.
Gawer, Annabelle, and Srnicek, Nick. 2021. Online Platforms: Economic and Societal Effects. Brussels: Panel for the Future of Science and Technology, European Parliament. https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2021)656336.
Ghosh, Arpita, and Hummel, Patrick. 2014. “A Game-Theoretic Analysis of Rank-Order Mechanisms for User-Generated Content.” Journal of Economic Theory 154 (November): 349–74.
Gillespie, Tarleton. 2010. “The Politics of ‘Platforms.’” New Media and Society 12 (3): 347–64.
Goldhaber, Michael H. 1997. “The Attention Economy and the Net.” First Monday 2 (4).
Gulati, Ranjay, Puranam, Phanish, and Tushman, Michael. 2012. “Meta-Organization Design: Rethinking Design in Interorganizational and Community Contexts.” Strategic Management Journal 33 (6): 571–86.
Hilferding, Rudolph. 2005. Finance Capital: A Study of the Latest Phase of Capitalist Development. London: Taylor and Francis.
Hunsinger, Jeremy, and Schrock, Andrew. 2016. “The Democratization of Hacking and Making.” New Media and Society 18 (4): 535–38.
Hurni, Thomas, Huber, Thomas L., and Dibbern, Jens. 2022. “Power Dynamics in Software Platform Ecosystems.” Information Systems Journal 32 (2): 310–43.
Ignatow, Gabe, and Robinson, Laura. 2017. “Pierre Bourdieu: Theorizing the Digital.” Information, Communication, and Society 20 (7): 950–66.
Introna, Lucas D. 2016. “Algorithms, Governance, and Governmentality: On Governing Academic Writing.” Science, Technology, and Human Values 41 (1): 17–49.
Jacobides, Michael G. 2021. “What Drives and Defines Digital Platform Power?” White paper, Evolution Ltd., April 18. https://www.evolutionltd.net/post/what-drives-and-defines-digital-platform-power.
Jacobides, Michael G., Bruncko, Martin, and Langen, Rene. 2020. “Regulating Big Tech in Europe: Why, So What, and How Understanding Their Business Models and Ecosystems Can Make a Difference.” White paper, Evolution Ltd., December 20. https://www.evolutionltd.net/post/regulating-big-tech-in-europe.
Jordan, Tim. 2009. “Hacking and Power: Social and Technological Determinism in the Digital Age.” First Monday 14 (7).
Joyce, S., and Stuart, M. 2021. “Trade Union Responses to Platform Work: An Evolving Tension between Mainstream and Grassroots Approaches.” In A Modern Guide to Labour and the Platform Economy, edited by Drahokoupil, Jan and Vandaele, Kurt, 177–92. Cheltenham, UK: Edward Elgar.
Julien, Chris. 2015. “Bourdieu, Social Capital and Online Interaction.” Sociology 49 (2): 356–73.
Jung, Yusun, and Lyytinen, Kalle. 2014. “Towards an Ecological Account of Media Choice: A Case Study on Pluralistic Reasoning While Choosing Email.” Information Systems Journal 24 (3): 271–93.
Just, Natascha, and Latzer, Michael. 2017. “Governance by Algorithms: Reality Construction by Algorithmic Selection on the Internet.” Media, Culture, and Society 39 (2): 238–58.
Just, Sine N. 2019. “An Assemblage of Avatars: Digital Organization as Affective Intensification in the GamerGate Controversy.” Organization 26 (5): 716–38.
Katz, Michael L., and Shapiro, Carl. 1985. “Network Externalities, Competition, and Compatibility.” American Economic Review 75 (3): 424–40.
Kelkar, Shreeharsh. 2018. “Engineering a Platform: The Construction of Interfaces, Users, Organizational Roles, and the Division of Labor.” New Media and Society 20 (7): 2629–46.
Kenney, Martin, Bearson, Dafna, and Zysman, John. 2019. “The Platform Economy Matures: Pervasive Power, Private Regulation, and Dependent Entrepreneurs.” SSRN Electronic Journal. DOI: 10.2139/ssrn.3497974.
Khan, Lina M. 2018. “Sources of Tech Platform Power.” Georgetown Law Technology Review 2 (2): 325–34.
Kitchens, Brent, Johnson, Steve L., and Gray, Peter. 2020. “Understanding Echo Chambers and Filter Bubbles: The Impact of Social Media on Diversification and Partisan Shifts in News Consumption.” MIS Quarterly 44 (4): 1619–49.
Kolagar, Milad, Parida, Vinit, and Sjödin, David. 2022. “Ecosystem Transformation for Digital Servitization: A Systematic Review, Integrative Framework, and Future Research Agenda.” Journal of Business Research 146 (July): 176–200.
Kornberger, Martin. 2017. “The Visible Hand and the Crowd: Analyzing Organization Design in Distributed Innovation Systems.” Strategic Organization 15 (2): 174–93.
Kornberger, Martin, Pflueger, Dane, and Mouritsen, Jan. 2017. “Evaluative Infrastructures: Accounting for Platform Organization.” Accounting, Organizations, and Society 60 (July): 79–95.
Kretschmer, Tobias, Leiponen, Aija, Schilling, Melissa, and Vasudeva, Gurneeta. 2022. “Platform Ecosystems as Metaorganizations: Implications for Platform Strategies.” Strategic Management Journal 43 (3): 405–24.
Krona, Michael. 2015. “Contravigilancia y videoactivismo desde la plaza Tahrir: Sobre las paradojas de la sociedad contravigilante.” In Videoactivismo y movimientos sociales, edited by Montero, David and Sierra, Francisco, 17. Barcelona: Gedisa.
Lanier, Jaron. 2018. Ten Arguments for Deleting Your Social Media Accounts Right Now. New York: Random House.
Lessig, Lawrence. 2009. El Código 2.0. Madrid: Traficantes de Sueños.
Levina, Natalia, and Arriaga, Manuel. 2014. “Distinction and Status Production on User-Generated Content Platforms: Using Bourdieu’s Theory of Cultural Production to Understand Social Dynamics in Online Fields.” Information Systems Research 25 (3): 443–666.
Lianos, Ioannis, and Carballa-Smichowski, Bruno. 2022. “A Coat of Many Colours: New Concepts and Metrics of Economic Power in Competition Law and Economics.” Journal of Competition Law and Economics 18 (4): 795–831.
Lupton, Deborah. 2016. The Quantified Self. Hoboken, NJ: John Wiley.
Lynskey, Orla. 2017. “Regulating ‘Platform Power.’” LSE Legal Studies Working Paper 1/2017, London School of Economics.
Lynskey, Orla. 2019. “Grappling with ‘Data Power’: Normative Nudges from Data Protection and Privacy.” Theoretical Inquiries in Law 20 (1): 189–220.
Lyon, David. 2018. The Culture of Surveillance: Watching as a Way of Life. 1st ed. Medford, MA: Polity Press.
Maddox, Jessica, and Malson, Jennifer. 2020. “Guidelines without Lines, Communities without Borders: The Marketplace of Ideas and Digital Manifest Destiny in Social Media Platform Policies.” Social Media + Society 6 (2).
Margetts, Helen, Lehdonvirta, Vili, González-Bailón, Sandra, Hutchinson, Jonathon, Bright, Jonathan, Nash, Vicki, and Sutcliffe, David. 2021. “The Internet and Public Policy: Future Directions.” Policy and Internet. DOI: 10.1002/poi3.263.
Martin, Kirsten E. 2022. “Algorithmic Bias and Corporate Responsibility: How Companies Hide behind the False Veil of the Technological Imperative.” In The Ethics of Data and Analytics: Concepts and Cases, 36–50. New York: Auerbach.
McCoy, Jennifer, Rahman, Tahmina, and Somer, Murat. 2018. “Polarization and the Global Crisis of Democracy: Common Patterns, Dynamics, and Pernicious Consequences for Democratic Polities.” American Behavioral Scientist 62 (1): 16–42.
McIntyre, David, Srinivasan, Arati, Afuah, Allan, Gawer, Annabelle, and Kretschmer, Tobias. 2021. “Multi-sided Platforms as New Organizational Forms.” Academy of Management Perspectives 35 (4): 566–83.
Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. Illustrated ed. New York: NYU Press.
O’Connor, Cailin, and Weatherall, James Owen. 2019. The Misinformation Age: How False Beliefs Spread. New Haven, CT: Yale University Press.
Orlikowski, Wanda J., and Scott, Susan V. 2015. “The Algorithm and the Crowd: Considering the Materiality of Service Innovation.” MIS Quarterly 39 (1): 201–16.
Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.
Philbeck, Thomas, and Davis, Nicholas. 2018. “The Fourth Industrial Revolution: Shaping a New Era.” Journal of International Affairs 72 (1): 17–22.
Portes, Alejandro. 1998. “Social Capital: Its Origins and Applications in Modern Sociology.” Annual Review of Sociology 24 (1): 1–24.
Richards, Neil M. 2012. “The Dangers of Surveillance.” Symposium: Privacy and Technology. Harvard Law Review 126 (7): 1934–65.
Rikap, Cecilia, and Lundvall, Bengt-Åke. 2021. The Digital Innovation Race: Conceptualizing the Emerging New World Order. Cham, Switzerland: Springer.
Rochet, Jean-Charles, and Tirole, Jean. 2003. “Platform Competition in Two-Sided Markets.” Journal of the European Economic Association 1 (4): 990–1029.
Rodon Modol, Joan, and Eaton, Ben. 2021. “Digital Infrastructure Evolution as Generative Entrenchment: The Formation of a Core–Periphery Structure.” Journal of Information Technology 36 (4): 342–64.
Romele, Alberto, and Rodighiero, Dario. 2020. “Digital Habitus or Personalization without Personality.” HUMANA.MENTE Journal of Philosophical Studies 13 (37): 98–126.
Sandberg, Jörgen, and Alvesson, Mats. 2011. “Ways of Constructing Research Questions: Gap-Spotting or Problematization?” Organization 18 (1): 23–44.
Scherer, Andreas Georg, and Neesham, Cristina. 2020. “New Challenges to Enlightenment: Why Socio-technological Conditions Lead to Organized Immaturity and What to Do about It.” SSRN Electronic Journal. DOI: https://doi.org/10/gj8mhq.
Scherer, Andreas Georg, Neesham, Cristina, Schoeneborn, Dennis, and Scholz, Markus. 2020. “Call for Submissions: Business Ethics Quarterly Special Issue on Socio-technological Conditions of Organized Immaturity in the Twenty-First Century.” Business Ethics Quarterly 30 (3): 440–44.
Schwab, Klaus. 2017. The Fourth Industrial Revolution. New York: Crown Business.
Seymour, Richard. 2019. “The Machine Always Wins: What Drives Our Addiction to Social Media.” Guardian, August 23, sec. Technology.
Srnicek, Nick. 2016. Platform Capitalism. Malden, MA: Polity Press.
Stark, David, and Pais, Ivana. 2020. “Algorithmic Management in the Platform Economy.” Sociologica 14 (3): 47–72.
Stigler Committee on Digital Platforms. 2019. “Final Report.” Stigler Center for the Study of the Economy and the State. https://www.chicagobooth.edu/-/media/research/stigler/pdfs/digital-platforms---committee-report---stigler-center.pdf.
Striphas, Ted. 2010. “How to Have Culture in an Algorithmic Age.” https://www.thelateageofprint.org/2010/06/14/how-to-have-culture-in-an-algorithmic-age/.
Teece, David J. 2017. “Dynamic Capabilities and (Digital) Platform Lifecycles.” Advances in Strategic Management 37: 211–25.
Thelen, Kathleen. 2018. “Regulating Uber: The Politics of the Platform Economy in Europe and the United States.” Perspectives on Politics 16 (4): 938–53.
Tiwana, Amrit, Konsynski, Benn, and Bush, Ashley A. 2010. “Platform Evolution: Coevolution of Platform Architecture, Governance, and Environmental Dynamics.” Information Systems Research 21 (4): 675–87.
Trittin-Ulbrich, Hannah, Scherer, Andreas Georg, Munro, Iain, and Whelan, Glen. 2021. “Exploring the Dark and Unexpected Sides of Digitalization: Toward a Critical Agenda.” Organization 28 (1): 8–25.
Vaast, Emmanuelle, Davidson, Elizabeth J., and Mattson, Thomas. 2013. “Talking about Technology: The Emergence of a New Actor Category through New Media.” MIS Quarterly 37 (4): 1069–92.
Valdez, Jimena. 2023. “The Politics of Uber: Infrastructural Power in the United States and Europe.” Regulation and Governance 17 (1): 177–94.
von Hippel, Eric. 2006. Democratizing Innovation. Cambridge, MA: MIT Press.
Walker, Michael, Fleming, Peter, and Berti, Marco. 2021. “‘You Can’t Pick Up a Phone and Talk to Someone’: How Algorithms Function as Biopower in the Gig Economy.” Organization 28 (1): 26–43.
Wu, Liang, Morstatter, Fred, Carley, Kathleen M., and Liu, Huan. 2019. “Misinformation in Social Media: Definition, Manipulation, and Detection.” ACM SIGKDD Explorations Newsletter 21 (2): 80–90.
Ziccardi, G. 2012. Resistance, Liberation Technology and Human Rights in the Digital Age. Dordrecht, Netherlands: Springer.
Zittrain, Jonathan. 2009. “Law and Technology: The End of the Generative Internet.” Communications of the ACM 52 (1): 18–20.
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books.
Table 1: Forms of Platform Power and the Organization of Immaturity

Figure 1: Platform Power Dynamics over Time