
The Interplay between the Digital Services Act and Sector Regulation: How Special Is Copyright?

Published online by Cambridge University Press:  10 March 2022

João Pedro Quintais*
Affiliation:
Assistant Professor, Institute for Information Law (IViR), University of Amsterdam, Amsterdam, The Netherlands
Sebastian Felix Schwemer
Affiliation:
Associate Professor, Centre for Information and Innovation Law (CIIR), University of Copenhagen, Copenhagen, Denmark; Adjunct Associate Professor, Norwegian Research Center for Computers and Law (NRCCL), University of Oslo, Oslo, Norway
*Corresponding author. Email: j.p.quintais@uva.nl

Abstract

On 15 December 2020, the European Commission published its proposal for the Digital Services Act, which is expected to be adopted before summer 2022. It carries out a regulatory overhaul of the twenty-one-year-old horizontal rules on intermediary liability in the e-Commerce Directive and introduces new due diligence obligations for intermediary services. Our analysis illuminates an important point that has so far received little attention: how would the Digital Services Act’s rules interact with existing sector-specific lex specialis rules? In this article, we look specifically at the intersection of the Digital Services Act with the regime for online content-sharing service providers (OCSSPs) set forth in Article 17 of Directive (EU) 2019/790 on Copyright in the Digital Single Market (CDSM Directive). At first glance, these regimes do not appear to overlap, as the rules on copyright are lex specialis to the Digital Services Act. A closer look shows a more complex and nuanced picture. Our analysis concludes that the Digital Services Act will apply to OCSSPs insofar as it contains rules that regulate matters not covered by Article 17 CDSM Directive, as well as specific rules on matters where Article 17 leaves a margin of discretion to Member States. This includes, to varying degrees, rules in the Digital Services Act relating to the liability of intermediary providers and to due diligence obligations for online platforms of different sizes. Importantly, we consider that such rules apply even where Article 17 CDSM Directive contains specific (but less precise) regulation on the matter. From a normative perspective, this might be a desirable outcome, to the extent that the Digital Services Act aims to establish “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”. Based on our analysis, we suggest a number of clarifications that might help us to achieve that goal.

Type: Articles
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2022. Published by Cambridge University Press

I. Introduction

Online platforms provide the main points of access to information and other content in the digital age, whether through “search engines, social networks, micro-blogging sites or video-sharing platforms”.Footnote 1 Although these platforms bring economic and social benefits, they also enable the unprecedented spread of illegal content, including incitement to terrorism, hate speech and copyright infringement.Footnote 2 This is particularly true for so-called Big Tech platforms – such as Facebook and YouTube – that have amassed significant power over online speech and commerce in the past decade.Footnote 3

Against this background, the European Union (EU) and its Member States have proposed and adopted a growing number of laws and policies to regulate online content, with a focus on enhancing the responsibility of hosting platforms for user-uploaded illegal content.Footnote 4 The centrepiece of the European Commission’s (EC) strategy in this respect was published on 15 December 2020: the proposal for a Regulation on a Single Market for Digital Services (Digital Services Act; DSA),Footnote 5 which amends the e-Commerce DirectiveFootnote 6 for certain Internet intermediaries. The DSA carries out a regulatory overhaul of the twenty-one-year-old horizontal rules on intermediary liability in the e-Commerce Directive.

In this article, we use doctrinal legal analysis to examine how the DSA’s rules interplay with sector-specific, lex specialis rules. This question is relevant both for specific EU legislation, such as that on copyrightFootnote 7 and terrorist content,Footnote 8 and for national sector regulation. The focus of our legal analysis, however, is on online platforms and copyright-protected material.Footnote 9

With regard to copyright-protected material, Article 17 of the Directive on Copyright in the Digital Single Market (CDSM Directive), which preceded the DSA, establishes a new liability regime for online content-sharing service providers (OCSSPs). These rules had to be implemented by EU Member States by 7 June 2021.Footnote 10 Both Article 17 CDSM Directive and multiple provisions of the DSA impose obligations on how online platforms deal with illegal information. Whereas Article 17 CDSM Directive targets copyright-infringing content, the DSA proposal targets illegal content in general, including content that infringes copyright.

This raises the question of how the two frameworks will interact once both enter into force. Besides the different natures of the legal instruments (Regulation versus Directive), this question is of high relevance, first and foremost where the frameworks differ. At first sight, these regimes may not appear to overlap since Article 17 CDSM Directive is lex specialis to the DSA. A closer look, however, reveals a much more complex picture. The proposed DSA regulation is complementary to Article 17 CDSM Directive and imposes a number of additional obligations on online platforms that qualify as OCSSPs. But the extent to which these obligations apply – and in some cases whether they do apply – is unclear. This article examines and maps this underexplored intersection between the CDSM Directive and the DSA.Footnote 11 Our legal analysis illuminates a point that has so far received little attention: the extent to which the DSA provides a new regulatory approach to online platforms through horizontal rules that extend to most corners of EU law, even where that reach appeared precluded or limited by specific legislation to be implemented at the national level. In the case of copyright, the issue is especially complex due to its territorial nature, leading to a multi-layered enforcement problem.Footnote 12 Varying national implementations of Article 17 CDSM Directive, which are outside the scope of this article, would further complicate matters.

This article carries out a doctrinal analysis of this particular legal question. To be sure, there are additional legal and empirical angles from which to address the underlying objective of the legal instruments under analysis. As noted, both the CDSM Directive (in a more targeted, sector-specific manner) and the DSA (in a general horizontal approach) aim to curb the increasing power and “digital dominance” of Big Tech companies, primarily by subjecting them to additional liability and obligations for the illegal (and even the harmful) content they host.Footnote 13 Our analysis only captures a small part of the regulatory and normative complexity involved in this task.

On the one hand, some of the legal solutions to curb platform power are found elsewhere in proposals that attempt to address the anti-competitive practices of Big Tech as so-called “gatekeepers”, such as the Digital Markets Act.Footnote 14 On the other hand, much of the power enjoyed by these platforms results from content-moderation rules, technologies and processes adopted by the platforms themselves (ie a form of private ordering). This type of regulation can fit into two broad categories. First are terms of service and similar documents (community guidelines, etc.) adopted by platforms, referred to by some authors as “platform law”.Footnote 15 In EU law, this would include, for instance, what is covered by the definition of “terms and conditions” in the proposed DSA.Footnote 16 Second, regulation of platforms can be carried out through technological devices or code, such as in the case of algorithmic moderation systems (eg for filtering of illegal content).Footnote 17 Big Tech platforms have long developed complex terms and conditions and content-recognition systems or tools that de facto govern their treatment of the illegal and harmful content they host.Footnote 18 In the particular case of copyright-protected content, these systems are perhaps at their most developed, and they include well-known examples such as YouTube’s suite of copyright management tools – most notably ContentID – and Facebook’s Rights Manager.Footnote 19 This is not surprising, since “[e]mpirically, copyright law accounts for most content removal from online platforms, by an order of magnitude”.Footnote 20 Outside the sphere of copyright, platforms mostly use different content-recognition tools for separate types of illegal or harmful/undesirable content (eg terrorism, violence, “toxic speech”, child abuse, sexual content, spam) that, in simple terms, “match content to known images, text or video in a database and classification tools which can classify new images as part of pre-defined categories”.Footnote 21 Although our analysis makes reference to some of these aspects, it focuses on the legal question above and can therefore only offer a modest contribution to this debate.

The article proceeds as follows. After this introduction, we provide a snapshot of the complex regime set out in Article 17 CDSM Directive, establishing a baseline understanding for the subsequent analysis (Section II). We then move to the heart of our analysis, explaining why and how the DSA liability regime and especially its asymmetric due diligence obligations apply to online platforms that host and provide access to copyright-protected content, despite – and in addition to – the specific rules in Article 17 CDSM Directive, to be implemented into national law in light of Guidance issued by the Commission (Section III).Footnote 22 We conclude with the key findings of our analysis and suggestions for clarifications in the further legislative process (Section IV).

II. OCSSPs and Article 17 CDSM Directive

1. Overview

OCSSPs are a novel concept defined in Article 2(6) CDSM Directive, with further guidance in Recitals 62 and 63. An OCSSP is a provider of an information society service whose main purpose is to store and give the public access to a large amount of protected content uploaded by its users, provided that it organises and promotes that content for profit-making purposes. The definition also contains a number of exclusions covering services that are either not primarily aimed at giving access to copyright-protected content and/or are primarily not-for-profit (eg service providers such as Skype, Dropbox, eBay, Wikipedia, ArXiv.org and GitHub).Footnote 23

While this concept is new to the copyright acquis, OCSSPs do not appear to constitute a wholly new category of service providers in a technological or business sense. Rather, this is a new legal category covering a type of provider of hosting services whose activities or functions were previously regulated in different legal instruments, such as the e-Commerce Directive,Footnote 24 the InfoSoc DirectiveFootnote 25 and the Enforcement Directive.Footnote 26 Figure 1 represents this relationship.

Figure 1. Online content-sharing service providers (OCSSPs) in the context of the e-Commerce Directive. CDSM Directive = Directive (EU) 2019/790 on Copyright in the Digital Single Market.

Article 17 is an extremely complex legal provision. As Dusollier notes, it is the “monster provision” of the Directive, “both by its size and hazardousness”.Footnote 27 There is perhaps no better testament to this than the wealth of legal scholarship that already exists on Article 17, even before its national implementation deadline.Footnote 28

In simple terms, Article 17 states that OCSSPs carry out acts of communication to the public when they give access to works/subject matter uploaded by their users. As a result, these providers become directly liable for their users’ uploads. They are also expressly excluded in paragraph (3) from the hosting safe harbour for copyright-relevant acts, which was previously available to many of them under Article 14(1) e-Commerce Directive. Arguably, this makes Article 17 lex specialis to the e-Commerce Directive.Footnote 29

The provision then introduces a complex set of rules to regulate OCSSPs, including a liability-exemption mechanism in paragraph (4) and a number of what can be referred to as mitigation measures and safeguards.

The liability-exemption mechanism is composed of best-efforts obligations for preventative measures, including measures aimed at ex ante content filtering, notice-and-takedown and notice-and-stay-down.Footnote 30 In particular, Article 17(4) establishes three cumulative conditions for this liability-exemption mechanism. The first condition is that OCSSPs must demonstrate that they have made best efforts to obtain an authorisation.Footnote 31 If this obligation is met, then OCSSPs are subject to two further cumulative conditions in paragraphs (b) and (c). Namely, they must demonstrate that they have: (1) made best efforts to ensure the unavailability of specific works for which the rights holders have provided them with the relevant and necessary information; and (2) acted expeditiously, subsequent to notice from rights holders, to take down infringing content and made best efforts to prevent its future upload. Condition (1) appears to impose what critics label an upload-filtering obligation, whereas Condition (2) introduces both a notice-and-takedown mechanism (similar to that of Article 14 e-Commerce Directive) and a notice-and-stay-down (or re-upload filtering) obligation.Footnote 32
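
Schematically, the structure of Article 17(4) can be expressed as a cumulative test (a minimal sketch in Python of our own devising, which assumes a simplified yes/no model of each condition; actual compliance assessments are of course far more granular and context-dependent):

```python
def liability_exempt(best_efforts_authorisation: bool,
                     best_efforts_unavailability: bool,
                     expeditious_takedown: bool,
                     best_efforts_staydown: bool) -> bool:
    """Simplified model of the cumulative conditions in Article 17(4)
    CDSM Directive; an illustration, not a statement of the law."""
    condition_a = best_efforts_authorisation                      # Art. 17(4)(a)
    condition_b = best_efforts_unavailability                     # Art. 17(4)(b)
    condition_c = expeditious_takedown and best_efforts_staydown  # Art. 17(4)(c)
    # Failing any single condition forfeits the exemption.
    return condition_a and condition_b and condition_c
```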

Among the mitigation measures and safeguards that Article 17 includes, we find the following: first, the requirements for a proportionality assessment and the identification of relevant factors for preventative measuresFootnote 33; second, a special regime for small and new OCSSPsFootnote 34; third, a set of mandatory exceptions akin to user rights or freedoms that are designed as obligations of result expressly based on fundamental rightsFootnote 35; fourth, a clarification that Article 17 does not entail general monitoring, although without providing much insight into its relation to the prohibition contained in Article 15 e-Commerce DirectiveFootnote 36; and fifth, a set of procedural safeguards, including an in-platform complaint and redress mechanism and rules on out-of-court redress mechanisms.Footnote 37

Finally, Article 17(10) tasks the EC with organising stakeholder dialogues to ensure uniform application of the obligation of cooperation between OCSSPs and rights holders and to establish best practices regarding the appropriate industry standards of professional diligence, as well as with issuing guidance on the application of Article 17. After much delay, the Guidance from the EC was finally published as a Communication on 4 June 2021, a mere working day before the transposition deadline of the CDSM Directive on 7 June 2021. Having been adopted as a Communication, the Guidance is not binding.Footnote 38 Furthermore, as the Guidance itself states, it might have to be reviewed in light of the Court of Justice of the European Union (CJEU) judgment in Case C-401/19.Footnote 39 In fact, the Opinion of the Advocate General (AG) in that case suggests that key aspects of the Guidance might not be in conformity with fundamental rights.Footnote 40 Still, the Guidance is a rich document that is bound to influence national implementations. This Guidance, we note, refers to the DSA only once.Footnote 41

To be sure, Big Tech platforms such as YouTube (ContentID, Copyright Match Tool and Web FormFootnote 42) and Facebook (Rights Manager) already deploy content-recognition tools, including the type of filtering and blocking measures required by Article 17’s liability-exemption mechanism. But this is not necessarily true for the majority of other smaller-scale platforms, who will be required to implement tools obtained from private third-party providers, such as Audible Magic and Pex.Footnote 43 In this sense, an unintended consequence of Article 17 is that it translates into a competitive advantage for bigger OCSSPs over smaller providers.Footnote 44 It is also clear that the most advanced of current filtering technologies – based on matching, fingerprinting or hashing algorithms – are incapable of recognising the types of uses covered by mandatory copyright exceptions or limitations,Footnote 45 leading to a conflict between different obligations within Article 17.Footnote 46 The sketch below illustrates this limitation; the resulting conflict is explored further in the next section.
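
To make this limitation concrete, consider the following sketch of a match-based filter (a hypothetical illustration of our own, not any vendor’s actual system; real tools use robust perceptual or audio fingerprints rather than the exact hash used here):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Real systems use perceptual fingerprints that also match near-duplicates;
    # an exact cryptographic hash stands in here purely for illustration.
    return hashlib.sha256(data).hexdigest()

# Hypothetical reference database built from rightsholder-supplied files.
reference_db = {fingerprint(work) for work in (b"protected song", b"protected clip")}

def moderate(upload: bytes) -> str:
    """Return 'block' on a database match and 'allow' otherwise.

    Note what is absent: any contextual input. A verbatim infringing copy and
    a lawful quotation or parody reusing the same material can produce the
    same match, so a match-based filter treats both identically.
    """
    return "block" if fingerprint(upload) in reference_db else "allow"
```

The decision function takes only the uploaded bytes as input; nothing in such a pipeline can register whether a matching upload is a quotation, criticism, parody or pastiche.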

2. Normative hierarchy of obligations and safeguards

In light of the above, it is important to further explain the normative hierarchy embedded in Article 17 as well as to provide additional detail on its complaint and redress rules.

Article 17(7) includes a general and a specific clause on exceptions and limitations to copyright. The general clause is contained in the first subparagraph, which states that the obligations in Article 17(4)(b) and (c) should not prevent content uploaded by users from being available on OCSSPs if such an upload does not infringe copyright, including if it is covered by an exception.Footnote 47 The second subparagraph of Article 17(7) CDSM Directive includes a special regime for certain exceptions and limitations: (1) quotation, criticism or review; and (2) use for the purpose of caricature, parody or pastiche.Footnote 48 Additionally, Article 17(9) requires that OCSSPs inform users in their terms and conditions of the user’s right to use works under exceptions.Footnote 49

One key feature of the legal design of Article 17 is that paragraph (7) translates into an obligation of result. That is to say, Member States must ensure that these exceptions are respected despite the preventative measures in Article 17(4). This point matters because paragraph (4) merely imposes “best-efforts” obligations. The different natures of the obligations, underscored by the fundamental rights basis of paragraph (7),Footnote 50 indicate a normative hierarchy between the higher-level obligation in paragraph (7) and the lower-level obligation in paragraph (4). This matters not only for the legal interpretation of Article 17 in general, but also for the assessment of content-moderation obligations in this legal regime. For instance, this legal understanding justifies the view that, in order to comply with Article 17, it is insufficient to rely on the ex post complaint and redress mechanisms in Article 17(9). Compliance also requires ex ante safeguards against the over-blocking of uploaded content by OCSSPs’ content-filtering technologies, which are incapable of carrying out the type of contextual assessment required under Article 17(7).Footnote 51

It is on this basis that Poland filed an action for annulment against Article 17 for failure to sufficiently safeguard the right to freedom of expression of users.Footnote 52 In his Opinion, AG Saugmandsgaard Øe delineated the scope of permissible filtering of users’ uploads.Footnote 53 While acknowledging that OCSSPs will have to deploy filtering and content-recognition systems to comply with their best-efforts obligations, the AG relies on the judgment in Eva Glawischnig-Piesczek to argue that any filtering must be “specific” to the content and information at issue, so as not to run afoul of the prohibition of general monitoring obligations in Article 15 e-Commerce Directive (and Article 17(8) CDSM Directive).Footnote 54 However, such filtering must be proportionate and avoid the risk of chilling effects on freedom of expression through over-blocking; in order to do so, it must be applied only to manifestly infringing or “equivalent” content.Footnote 55 All other uploads should benefit from a “presumption of lawfulness” and be subject to the ex ante and ex post safeguards embedded in Article 17, notably judicial review.Footnote 56

In this respect, Article 17(9) further includes certain ex post or procedural safeguards at: (1) the platform level; (2) the out-of-court level; and (3) the judicial or court level.Footnote 57 A few additional remarks are warranted on the first two levels.

At the platform level, Member States are mandated to provide that OCSSPs “put in place an effective and expeditious complaint mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them”.Footnote 58 These mechanisms are further circumscribed insofar as complaints “shall be processed without undue delay, and decisions to disable access to or remove uploaded content shall be subject to human review”.Footnote 59 The latter human-review criterion implies that everything leading up to a dispute can be processed by the platform in an automated fashion.Footnote 60 It is further specified that these mechanisms should allow “users to complain about the steps taken with regard to their uploads, in particular where they could benefit from an exception or limitation to copyright in relation to an upload to which access has been disabled or that has been removed”.Footnote 61

Furthermore, the provision stipulates a justification duty for rights holders: the reasons for a rights holder’s request to make content unavailable need to be “duly justified”.Footnote 62 The decision at this level remains with the platform, but as Senftleben notes, “The underlying legal assessment, however, is likely to be cautious and defensive … [and] a generous interpretation of copyright limitations serving freedom of expression seems unlikely, even though a broad application of the right of quotation and the parody exemption would be in line with CJEU jurisprudence”.Footnote 63 In other words, there is a risk of over-enforcement.Footnote 64

In addition to the platform-based procedural safeguards, out-of-court redress mechanisms for the impartial settlement of disputes are also to be put in place by Member States.Footnote 65 This mechanism is “without prejudice to the rights of users to have recourse to efficient judicial remedies …”.Footnote 66 Specifically in relation to exceptions, “Member States shall ensure that users have access to a court or another relevant judicial authority to assert the use of” the same.Footnote 67 Member States enjoy a considerable amount of discretion when implementing these procedural safeguards, and such mechanisms might also be informed by the EC Guidance on Article 17, which provides significant detail on how Member States may implement the safeguards in paragraphs (7) and (9).Footnote 68

III. The interplay between the DSA and the CDSM Directive

Against this background, the DSA proposal was published on 15 December 2020. The DSA is a Regulation that is meant inter alia as a “REFIT”Footnote 69 of certain parts of the e-Commerce Directive. Apart from the different legal nature of the proposed instrument – a Regulation rather than a Directive – the DSA has a broader scope than the e-Commerce DirectiveFootnote 70 and sets up a much more detailed procedural framework, which is further explored below.Footnote 71

The proposed DSA is divided into five chapters: general provisions (Chapter I); liability of providers of intermediary services (Chapter II); due diligence obligations for a transparent and safe online environment (Chapter III); implementation, cooperation, sanctions and enforcement (Chapter IV); and final provisions (Chapter V). For the purposes of this article, we are mostly concerned with Chapters I–III.

The liability exemptions in Chapter II largely resemble the system set forth twenty-one years ago in the e-Commerce Directive,Footnote 72 with notable adjustments such as a “Good Samaritan”-like rule,Footnote 73 clarifications on scope in recitalsFootnote 74 and provisions on orders to act against illegal content and to provide information.Footnote 75 Separate from this, the proposal suggests the introduction of asymmetric due diligence obligations in Chapter III, which is a novelty compared to the e-Commerce Directive.

1. Are rules on copyright excluded from the DSA?

In this article, we are interested in the potential overlap between the proposed DSA and Article 17 CDSM Directive. This is visualised in the illustration in Figure 2, which represents the overlaps between the concepts of online platforms, very large online platforms (VLOPs) and OCSSPs. Similar overlaps could be envisaged as regards the relationship between the DSA proposal’s concepts and those used in other sector-specific areas, such as “video-sharing platform services” in the Audiovisual Media Services Directive (AVMSD)Footnote 76 and “hosting services” used for the dissemination to the public of terrorist content online in the Terrorist Content Regulation.Footnote 77

Figure 2. Overlap between the Digital Services Act (DSA) and Directive (EU) 2019/790 on Copyright in the Digital Single Market (CDSM Directive). OCSSP = online content-sharing service provider; VLOP = very large online platform.

A preliminary question for our purposes is whether the DSA applies to OCSSPs in the first place. Importantly, the special “copyright” regime for OCSSPs only relates to the copyright-relevant portion of an online platform that qualifies as an OCSSP. Article 17(3) subparagraph 2 CDSM Directive states clearly that the hosting safe harbour of Article 14 e-Commerce Directive – and correspondingly that in Article 5 DSA – still applies to OCSSPs “for purposes falling outside the scope of this Directive”. Consider the example of YouTube, which qualifies as an OCSSP. If the relevant information or content it hosts relates to copyright, Article 17 CDSM Directive applies. If the relevant information, however, relates to hate speech or child sexual abuse material or any other illegal information or content,Footnote 78 the e-Commerce Directive’s – and correspondingly the DSA’s – hosting liability exemption is the place to look. In other words, YouTube would be considered an OCSSP (in the context of copyright) and also a VLOP (in the context of other information; see Figure 3).Footnote 79

Figure 3. An example of overlap between regulatory regimes in the case of online content-sharing service providers (OCSSPs). CDSMD = Directive (EU) 2019/790 on Copyright in the Digital Single Market; DSA = Digital Services Act; VLOP = very large online platform.

In the following, we focus on the copyright aspects. Article 1(5)(c) DSA states that the proposed Regulation is “without prejudice to the rules laid down by … Union law on copyright and related rights”. Supporting Recital 11 adds that the “Regulation is without prejudice to the rules of Union law on copyright and related rights, which establish specific rules and procedures that should remain unaffected”. Read alone, this Recital could be understood as the Commission’s view that Article 17 CDSM Directive, in our example, indeed contains the answers to all questions regarding obligations of OCSSPs. In our view, however, “unaffected”Footnote 80 can only relate to aspects that indeed are specifically covered by those rules.

Recital 11 (like Recital 10), however, merely exemplifies the general principle contained in Recital 9, which aims to provide further clarity on the interplay between the horizontal rules of the DSA and sector-specific rules. Recital 9 states that the DSA

… should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services … Therefore, this Regulation leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected. However, the rules of this Regulation apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level.Footnote 81

The Explanatory Memorandum repeats this text and provides as one example the obligations set out in the AVMSD on video-sharing platform providers as regards audiovisual content and audiovisual commercial communications. It continues that such rules “will continue to apply”, but that the DSA “applies to those providers to the extent that the AVMSD or other Union legal acts, such as the proposal for a Regulation on addressing the dissemination of terrorist content online, do not contain more specific provisions applicable to them”.Footnote 82

Applying this logic to the CDSM Directive, this means that the specific rules and procedures contained in Article 17 for OCSSPs are likely to be considered lex specialis to the DSA.Footnote 83 Conversely, the DSA will apply to OCSSPs insofar as it contains: (1) rules that regulate matters not covered by Article 17 CDSM Directive; and (2) specific rules on matters where Article 17 leaves a margin of discretionFootnote 84 to Member States. As we demonstrate below, whereas Category (1) is more or less straightforward, Category (2) is more challenging. In our view, the changes to the DSA provisions examined above on the relationship between the copyright acquis and the DSA that were proposed during the legislative process (by the Council and different European Parliament committees) do not affect the validity of our conclusions.Footnote 85 A similar conclusion appears valid for other types of illegal content.

2. Potentially applicable rules

At this stage, it is important to note that the DSA contains a bifurcated approach to regulation. On the one hand, Chapter II sets out a regime for the liability of providers of intermediary services.Footnote 86 This regime distinguishes between functions, namely “mere conduit”, “caching” and hosting. It is in essence a revamped version of the existing rules on liability exemption (also known as safe harbours) and bans on general monitoring in Articles 12–15 e-Commerce Directive.Footnote 87 As noted, the main differences are the addition of a “Good Samaritan”-like rule in Article 6Footnote 88 and provisions on orders to act against illegal content (Article 8) and to provide information (Article 9). On the other hand, Chapter III sets out “horizontal”Footnote 89 due diligence obligations for a transparent and safe online environment.Footnote 90 This regime distinguishes between categories of providers by setting out asymmetric obligations that apply in a tiered way to different categories of providers of information society services. As a starting point, the liability-exemption regime on the one hand and the due diligence obligations on the other are separate from each other. In other words, the availability of a liability exemption is not dependent on compliance with due diligence obligations and vice versa.Footnote 91

In this respect, the DSA retains in Article 2(a) the definition of “information society services” of the e-Commerce Directive that underpins the notion of an information society service provider. For the purposes of due diligence obligations, it then proposes a distinction between four categories of services, from the general to increasingly more specific: (1) intermediary services; (2) hosting services; (3) online platforms; and (4) VLOPs.Footnote 92 These are visualised in Figure 4.

Figure 4. Digital Services Act typology of information society service providers and the placement of online content-sharing service providers (OCSSPs).Footnote 100 VLOP = very large online platform.

Intermediary services – the broadest category – comprise “mere conduit”, “caching” or hosting services.Footnote 93 Hosting services consist of “the storage of information provided by, and at the request of, a recipient of the service”.Footnote 94 Online platforms are defined as providers of “a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.Footnote 95 In simple terms, VLOPs are those online platforms that provide their services to an average of 45 million or more monthly active recipients in the EU (ie roughly 10% of the EU population).Footnote 96 In practical terms, only the major user-upload Big Tech platforms operating in the current digital ecosystem – such as YouTube, Facebook or Instagram – would qualify as VLOPs.Footnote 97 Under the asymmetric obligations approach of Chapter III DSA, VLOPs are subject to the highest number of cumulative obligations.Footnote 98 This is justified by the “systemic role” played by such platforms in “amplifying and shaping information flows online” and by the fact that “their design choices have a strong influence on user safety online, the shaping of public opinion and discourse, as well as on online trade”.Footnote 99
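
This tiered typology can be summarised schematically (a simplified model of our own; the qualification tests in the proposal are more involved, and the 45-million threshold may be adjusted in line with changes to the EU population):

```python
def dsa_tier(stores_user_content: bool,
             disseminates_to_public: bool,
             avg_monthly_eu_recipients: int) -> str:
    """Simplified model of the DSA's tiered service categories.

    Each tier presupposes the previous one, and the due diligence
    obligations of Chapter III accumulate from tier to tier.
    """
    if not stores_user_content:
        return "intermediary service (mere conduit / caching)"
    if not disseminates_to_public:
        return "hosting service"
    if avg_monthly_eu_recipients < 45_000_000:  # threshold in the proposal
        return "online platform"
    return "very large online platform (VLOP)"
```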

In our view, when contrasting the definitions in the DSA and the CDSM Directive, it is clear that the notion of OCSSP covers at least (certain) online platforms and VLOPs, as represented in Figure 4.

In light of this overlap, the legal question that arises is the extent to which the proposed DSA’s liability rules (in Chapter II) and the asymmetric obligations (in Chapter III) apply to OCSSPs as online platforms or VLOPs. Although the analysis below focuses on copyright, it provides a blueprint for a similar examination of the DSA liability regime and obligations that would apply to other sector-specific instruments. For instance, it could help shed light on the articulation of the DSA with the AVMSD, which already imposes certain obligations on video-sharing platform services to protect minors and EU citizens from certain categories of illegal and harmful content,Footnote 101 while attaching “cooperative responsibility to [those] platforms’ organisational control”.Footnote 102

a. DSA liability regime and OCSSPs

In our view, the liability regime in the DSA is partly excluded for OCSSPs. First, the hosting safe harbour (in Article 5 DSA) is meant to replace Article 14 e-Commerce Directive.Footnote 103 As such, its application is set aside by the express reference to it in Article 17(3) CDSM Directive, to the extent that the activities at issue fall within the scope of Article 17 CDSM Directive.Footnote 104

On the other hand, the general monitoring ban in Article 7 DSA, which aims to replace the similar prohibitionFootnote 105 in Article 15 e-Commerce Directive, appears not to be affected by the CDSM Directive. Article 17(8) CDSM Directive merely states that “the application of this Article shall not lead to any general monitoring obligation”. It does not set aside the application of Article 15 e-Commerce Directive, meaning that it can be understood as being of a merely declaratory nature.Footnote 106

Things are, however, less clear for the “Good Samaritan” rule in Article 6 DSA on “voluntary own-initiative investigations and legal compliance”. Given the direct reference to the liability exemptions in the DSA, its application appears to be directly connected (for our purposes) to the specific hosting safe harbour, which does not apply to OCSSPs as per Article 17(3) CDSM Directive. In a narrow reading, this direct connection could be interpreted as precluding Article 6 DSA’s application in the context of OCSSPs. This exact issue will resurface below when we examine due diligence obligations. There may, however, exist good arguments for not taking direct references to the DSA’s liability exemptions as evidence for precluding their applicability, which we explore in detail below.Footnote 107

In any case, in the specific context of Article 6 DSA, the provision’s applicability to OCSSPs is further complicated: Article 6 DSA is meant to enable “activities aimed at detecting, identifying and removing, or disabling of access to, illegal content, or take the necessary measures to comply with the requirements of Union law, including those set out in this Regulation”. But Articles 17(4)(b) and (c) CDSM Directive already set forth a liability-exemption mechanism requiring OCSSPs to make best efforts to apply preventative measures to ensure the unavailability or removal of copyright-infringing content. These specific rules for OCSSPs would appear to leave little space for voluntary own-initiative investigations by online platforms and consequently for the application of Article 6 DSA. As a result, there may be no need to look for interpretations that would include voluntary activities by OCSSPs.Footnote 108

Yet it is conceivable that certain voluntary measures by OCSSPs could go beyond the required “best efforts” and would therefore not trigger liability, provided they are within the limits imposed by Articles 17(7)–(9) CDSM Directive. This is particularly true in light of the different natures of the instruments at issue (Regulation versus Directive) and the potential for diverging national implementations and interpretations of the concept of “best efforts” in Article 17(4), as already manifested during the implementation process.Footnote 109 This problem of multi-layered, geographically dispersed enforcement is not necessarily solved by the EC’s Guidance either. For instance, when discussing best efforts to obtain an authorisation (a precondition for the liability exemption) in Article 17(4)(a), the EC identifies scenarios, sectors and players in relation to which OCSSPs must proactively seek a licence or react to offered licences. But despite copyright being a territorial right, the geographical scope of a platform’s obligation is far from clear. What is more, the Commission falls back on the conclusion that this obligation should be assessed on a “case-by-case” basis.Footnote 110 In other words, there is a likelihood of legal uncertainty arising from the divergent national implementations and practices of preventative measures by platforms for copyright-infringing content pursuant to the Directive, as compared with the horizontal EU-wide obligations stemming from the DSA. Given the potential complexity arising out of this legal puzzle, it would be important to clarify this relationship during the legislative process. This point has practical consequences for how platforms can and should design their content-moderation systems in light of both the DSA and the CDSM Directive.

Finally, the rules on orders against illegal content and orders to provide information in Articles 8 and 9 DSA may apply to OCSSPs. Article 8 DSA, in particular, sets out a detailed regime not available elsewhere to OCSSPs. To be sure, one could argue that Article 8(3) InfoSoc Directive, as interpreted by the CJEU, already provides specific rules on injunctions. But the latter provision applies only to “intermediaries whose services are used by a third party to infringe a copyright or related right”, a rule that is consistent with Article 14(3) e-Commerce Directive.Footnote 111 In other words, Article 8(3) InfoSoc Directive applies to intermediaries that are not directly liable for the content they host. This is not the case for OCSSPs, who by virtue of the legal regime in Article 17(1) CDSM Directive are directly liable for the publicly available content they host. If this is the case, then it would seem that Article 8 DSA applies to OCSSPs, opening the thorny question of whether the extensive CJEU case law on the intersection between fundamental rights, copyright enforcement against intermediaries and the prohibition on general monitoring applies to this new reality.Footnote 112

b. What are the due diligence obligations for OCSSPs?

It is beyond the scope of this article to discuss in depth all potential obligations that apply to online platforms and VLOPs in the proposed DSA. Instead, we will focus on selected key obligations that apply to both categories and might be relevant for OCSSPs. This includes certain due diligence obligations for all providers of intermediary services (Articles 10–13), online platforms (Articles 14–24) and VLOPs (Articles 25–33).Footnote 113 In our view, these are also likely to be in practice some of the provisions that may impose additional obligations on providers subject to other sector-specific rules, such as video-sharing platforms in the AVMSD and hosting service providers in the Terrorist Content Regulation.

As a preliminary remark, we see no obstacle to applying to OCSSPs the general obligationsFootnote 114 that extend to all intermediary services on points of contact, legal representatives, terms and conditionsFootnote 115 and transparency reporting. This includes the obligations set out in Articles 10–13 (with more stringent counterparts in Articles 23 and 33 DSA). Furthermore, since Article 17 CDSM Directive focuses on the disabling of illegal information and not on the recommendation or promotion of information, the relevant rules in the DSA on recommender systems (Article 29) should also fully apply to OCSSPs.Footnote 116 This conclusion, we note, would be valid for other sector-specific legislative instruments in EU law that regulate only certain content-moderation activities by service providers (eg concrete aspects of notice-and-action) but not recommender systems.

As noted in Section I, private ordering via terms and conditions and automated content-moderation systems is a crucial component of platform power, especially as concerns Big Tech platforms. In this regard, Article 12 DSA is particularly noteworthy.Footnote 117 This provision applies to all intermediary service providers: it aims to increase the transparency of intermediaries’ terms and conditions and to bring their enforcement into direct relation with fundamental rights. In the EC’s proposal, Article 12(1) DSA imposes an information obligation regarding restrictions imposed on users of intermediary services, and this obligation extends to algorithmic decision-making. Article 12(2) DSA then introduces an apparently broad obligation for providers to act in a diligent, objective and proportionate manner when applying and enforcing such restrictions, explicitly linked to the respect of fundamental rights. Furthermore, the provision expands the scope of the obligations beyond illegal content, applying also to content that intermediaries consider harmful or undesirable in their terms and conditions. These horizontal obligations for all providers of intermediary services are welcome, especially as a means to curtail the private ordering power of Big Tech platforms (particularly VLOPs) as well as less visible intermediaries.Footnote 118 However, the obligations appear too vague to be effective, raising doubts as to whether this provision will be relevant to curtailing the power of platforms in defining the terms of their relationships with users, including how they operationalise their algorithmic content-moderation systems.Footnote 119 Despite these shortcomings, there is undoubtedly some value added in the application of Article 12 DSA to OCSSPs. The reason for this is that Article 17 CDSM Directive is remarkably thin in this respect. In fact, Article 17(9) merely requires that OCSSPs inform users in their terms and conditions of the user’s right to use works under exceptions, with the EC Guidance adding precious little on this point.Footnote 120

i. Notice-and-action and statement of reasonsFootnote 121

A trickier question is whether the detailed regimes on notice-and-action (Article 14 DSA) and statements of reasons (Article 15 DSA) are meant to apply to OCSSPs.

As explained above, Articles 17(4)(b) and (c) CDSM Directive set out a specific notice-and-action regime, which includes in paragraph (c) obligations regarding notice-and-takedown as well as notice-and-stay-down.Footnote 122 This could point in the direction of the DSA being excluded here, since the copyright-sector regulation contains rules on the matter. At the same time, however, Article 17 CDSM Directive remains vague on the concrete notice-and-action setup: it merely mentions “a sufficiently substantiated notice” that must originate from rights holders.Footnote 123 In a vacuum, this would, for instance, allow Member States a margin of discretion in regulating the details of such notices. Along those lines, the recent EC Guidance on Article 17 advances concrete recommendations on the content of such notices, most notably that they follow the 2018 Recommendation on Measures to Effectively Tackle Illegal Content Online.Footnote 124

Thus, it is also arguable that some components of the notice-and-action regime, such as the minimum elements that should be contained in a notice to a platform,Footnote 125 add a level of specificity not found in the lex specialis rules of the CDSM Directive.Footnote 126 Then again, already today the European landscape for notices varies, since some Member States chose to supplement the implementation of the hosting liability exemption in Article 14 e-Commerce Directive with procedural rules whereas others did not. On this point, it is important to remember that the very choice of instrument for the DSA – a Regulation rather than a Directive – was considered necessary to provide legal certainty, transparency and consistent monitoring.Footnote 127 Furthermore, the accompanying Explanatory Memorandum points out that sector-specific instruments do not cover all regulatory gaps, especially with regard to “fully-fledged rules on the procedural obligations related to illegal content and they only include basic rules on transparency and accountability of service providers and limited oversight mechanisms”.Footnote 128 Similarly, Article 1(2)(b) DSA notes that the aim of the Regulation is to set out uniform rules. All of these considerations suggest the application of DSA rules to OCSSPs.

The strongest argument against this application lies in the nature of the legal instrument and the consideration that the rationale for the vaguer regime of Article 17 CDSM Directive in this regard was precisely to allow platforms and rights holders some margin of discretion on how to define the content of notices for the specific subject matter of copyright. Along those lines, such a margin would be more adequately filled by national implementations pursuant to the recommendations of the EC Guidance rather than by application of the DSA.Footnote 129

But there is an inherent tension (if not a contradiction) in this argument between allowing for the margin of discretion at the national level inherent to the nature of a Directive and the desire to claw back much of that discretion via the EC’s extensive Guidance on Article 17. In fact, the Guidance aims not only at a legally “coherent and consistent” transposition of the provisions across the EU, but also at assisting “market players” in complying with national laws in this area.Footnote 130 To this effect, for instance, the EC identifies standards for content-recognition tools for different types of providers, incentivises the standardisation of reporting information, encourages the development of registries of rights holders and protected content and establishes rules and thresholds for what types of content may and may not be subject to filtering measures.Footnote 131 In other words, the Guidance both enables a much more uniform implementation of Article 17 obligations by Member States and allows OCSSPs – especially those larger platforms that qualify as VLOPs – to provide identical offers across the EU in compliance with Article 17. For instance, in the case of YouTube, it would be more sensible to adjust its EU-wide services and systems (eg ContentID, Copyright Match Tool and Web Form) to apply consistently on a cross-border basis and to ensure compliance with the most developed and sophisticated national implementation of Article 17, which would likely be German law.Footnote 132 The important consequence of this development, for our purposes, is that it facilitates an alignment of the horizontal DSA rules, particularly those applicable to VLOPs, with sector-specific copyright rules, going some way towards addressing the multi-layered enforcement problem arising from the overlapping obligations for platforms stemming from a Directive versus a Regulation.

In any case, the definitive answer to the question of the application of Article 14 DSA to OCSSPs also depends on the legal nature of the provision: is it to be understood as a supplement to the specific hosting liability exemption in Article 5 DSA or as a due diligence obligation applicable to hosting services more broadly? On the one hand, it is clear that due diligence obligations are to be seen as separate from liability exemptions: (non-)compliance with due diligence obligations does not affect the hosting safe harbour and vice versa. On the other hand, this distinction between safe harbours and due diligence obligations is blurred by the effect – in our view problematic and probably unintended – that a notice has, under the Commission’s proposal, on the actual knowledge of a hosting service.Footnote 133 Since Article 14(3) DSA makes direct reference to the hosting liability exemption in Article 5 DSA, at least paragraph (3) of Article 14 DSA may not directly apply to OCSSPs.

A similar reasoning applies to the rules on statements of reasons (Article 15 DSA), which apply to the justification provided by platforms to users regarding decisions to remove or disable access to specific items of information. In the scheme of Article 17 CDSM Directive, users appear to be informed about these reasons only through a complaint and redress mechanism. Under Article 17(9), rights holders “shall duly justify the reasons for their [removal] requests” to OCSSPs, who will then make a decision on removal or disabling. There are no explicit rules on whether, when and how these decisions are communicated to users, which suggests that there is ample margin for the application of the specific rules set out in Article 15 DSA. In practice, this would mean that platforms would have to extend to copyright-infringing content the reporting systems they use for other types of illegal content.

ii. Internal complaint mechanism and out-of-court dispute settlement process

In the context of online platforms, Articles 17 and 18 DSA set forth a detailed internal complaint mechanism as well as an out-of-court dispute settlement process. Article 17 CDSM Directive also mandates such mechanisms in paragraph (9) for the specific genus of OCSSPs, but in a much less detailed fashion. In various forms, both the DSA and Article 17 CDSM Directive stipulate that such internal complaint mechanisms need to be effective, to be processed within a reasonable timeframe (without undue delay/in a timely manner) and to involve some form of human review. The DSA, however, is more detailed, and it includes, for instance, a requirement of user-friendliness and a minimum period of six months following the takedown decision for filing such complaints.

Thus, the question is again whether the DSA is able or intended to “fill” the holes that the lex specialis regulation in the CDSM Directive left open. First, even if this question were answered in the negative, it could be argued that Articles 17 and 18 DSA – in the view of the EU lawmaker – represent the archetypes of “effective and expeditious” mechanisms. Complaint and redress mechanisms should therefore be modelled after the horizontal DSA example where the CDSM Directive falls short. In our view, this is a normatively desirable outcome in line with the aims of the DSA.Footnote 134

Second, we should not forget that OCSSPs are not relevant from a copyright perspective only. If a video on YouTube contains illegal hate speech, the notice-and-action mechanism (and the subsequent redress mechanisms) would not fall under the regime of Article 17 CDSM Directive, but rather under that of the e-Commerce Directive and the future DSA.Footnote 135 Having various similar but different redress mechanisms for the very same platform depending only on the legal regime governing the content at issue (copyright, personal data, hate speech, etc.) can hardly be in the interest of the lawmaker,Footnote 136 OCSSPs, Internet users or other stakeholders. This strongly argues in favour of applying the DSA rules consistently to all platforms.

This is especially true for Big Tech platforms, who have developed complex complaint and redress mechanisms as part of their content-moderation systems for different types of illegal and harmful content, with the result of obscuring users’ ability to obtain effective redress for their complaints.Footnote 137 In particular for copyright-infringing content, the systems put in place by larger platforms such as YouTube have meant in practice that complaint and redress mechanisms are rarely used by users.Footnote 138 In this regard, it is noteworthy that Article 17 CDSM Directive lacks some of the DSA safeguards, such as the presence of a body like the Digital Services Coordinator and a clear obligation, in Article 17(3) DSA, to reinstate content as a countermeasure to over-blocking.Footnote 139 Combined with a limitation of filtering measures to “manifestly infringing content”, endorsed by the EC Guidance on Article 17 and the AG in Case C-401/19,Footnote 140 these overlapping obligations of the DSA and the CDSM Directive would better protect the freedom of expression of users by influencing platforms’ design of these mechanisms, adding external oversight and increasing users’ in-platform redress avenues.Footnote 141

A counterargument would be that such a differentiated approach is justified in light of the specific character of the rights concerned. The question then is: what part of substantive copyright law would prescribe a different treatment for the complaint handling of copyright-related content takedowns? The immediate starting point for such a special place for copyright at the heart of the EU acquis would be its protection in Article 17(2) Charter of Fundamental Rights of the European UnionFootnote 142 and the high level of protection set out in the recitals of the InfoSoc Directive and emphasised time and again by the CJEU.Footnote 143 In our view, however, such high-level protection can hardly be undermined by safeguards in complaint mechanisms. These complaint mechanisms only become relevant once content has been taken down and a potential infringement of the protected rights has thereby been prevented. Instead, redress mechanisms relate inter alia to users’ fundamental rights (vis-à-vis a platform’s right to conduct a business). Consequently, we argue that Articles 17 and 18 DSA should apply to OCSSPs to fill the gaps left open by the vaguer rules on complaint and redress in Article 17(9) CDSM Directive. As noted, this would have the result of forcing platforms that qualify as OCSSPs and VLOPs to align their copyright redress mechanisms with their remaining illegal content-moderation systems covered by the DSA, thereby increasing their level of procedural ex post safeguards in this area.

iii. Trusted flaggers/notifiers and measures against misuse

Another noteworthy novelty relates to the obligation for online platforms to collaborate with certain trusted flaggers/notifiers in Article 19 DSA. A trusted notifier is “an individual or entity which is considered by a hosting service provider to have particular expertise and responsibilities for the purposes of tackling illegal content online”.Footnote 144 Despite the regime in Article 17 CDSM Directive, we expect trusted flaggers to play an important role on OCSSPs for the flagging of copyright-protected material in the foreseeable future. In fact, in the context of larger OCSSPs, trusted flaggers/notifiers already play a crucial but often opaque role in the privatisation of online content (copyright and other) moderation and enforcement.Footnote 145

Recital 46 DSA, for example, notes that for “intellectual property rights, organisations of industry and of right-holders could be awarded trusted flagger status, where they have demonstrated that they meet the applicable conditions”. Once again, however, Article 19(1) DSA is tied directly to the notice-and-action mechanism in Article 14 DSA, which could mean that the regime relates only to those online platforms that are not OCSSPs. Thus, the applicability of Article 19 to OCSSPs depends, at least to some extent, on whether the notice-and-action mechanism applies to OCSSPs, as discussed above.Footnote 146

In the field of copyright and OCSSPs, rights holders may have an interest in online platforms being obliged to collaborate with certain trusted notifiers. Trusted flagger arrangements are, however, already common today, at least on larger online platforms such as YouTube or Facebook.Footnote 147 The notable twist of the DSA is that trusted flagger status is awarded by the relevant Digital Services Coordinator of the Member State if certain requirements are met.Footnote 148 Furthermore, the platform is obliged to inform the Coordinator if a trusted flagger submits “a significant number of insufficiently precise or inadequately substantiated notices”.Footnote 149 Ultimately, trusted flagger status can be revoked.Footnote 150 In light of the uncertainty around the data quality of copyright notices, such oversight could be of particular importance in the context of OCSSPs.Footnote 151

Even if Article 19 DSA were indeed not applicable to OCSSPs, it is important to note that the non-binding Recommendation (EU) 2018/334 on measures to effectively tackle illegal content online already encourages platforms to collaborate voluntarily with trusted flaggers.Footnote 152 Similarly, nothing in the DSA prevents “voluntary” trusted notifier arrangements. These would, however, fall outside the scope of Article 19 and therefore outside the supervision of the Digital Services Coordinator.Footnote 153 This apparent gap is at least partly addressed by Article 20 DSA.

Article 20 DSA on measures and protection against misuse contains two main obligations: first, to suspend the accounts of users who “frequently provide manifestly illegal content”Footnote 154; and second, to suspend the processing of notices and complaints submitted by individuals, entities or complainants who “frequently submit notices or complaints that are manifestly unfounded”.Footnote 155 In our view, the proposed Article 20 is central to mitigating misuse both by users and by any type of flagger – probably with the at least partial exception of “trusted flaggers” (regulated by Article 19), but including flaggers covered by “voluntary” trusted notifier arrangements with platforms.

Article 20(2) DSA, however, directly references Articles 14 and 17 DSA. For the application of Article 20 to OCSSPs, the central question is thus once again whether Article 14 and (at least part of) Article 17 DSA apply despite the lex specialis of Article 17 CDSM Directive.

The issue of users repeatedly uploading illegal content is as relevant for OCSSPs as it is for other online platforms. Likewise, the misuse of notices and complaints is a concern on OCSSPs. Articles 17(7) and (9) subparagraph 3 CDSM Directive require that the copyright regime not lead to the unavailability of non-infringing works, without, however, explicitly putting in place any protection against misuse. In the absence of such specific regulation, we argue that Article 20 DSA should be fully applicable to copyright misuse. The provision is also central for voluntary arrangements (eg trusted notifiers falling outside the regime set forth in Article 19 DSA), to which we equally argue it is fully applicable. For reasons of legal certainty, it is desirable that the wording of Article 20 DSA be clarified during the legislative process to state this unequivocally.Footnote 156

iv. Additional obligations on VLOPs

Finally, VLOPs are subject to certain specific due diligence obligations, inter alia risk assessment (Article 26 DSA) and risk mitigation (Article 27 DSA).Footnote 157 The functioning and use of the services of very large OCSSPs (eg YouTube, Facebook and Instagram) might come with systemic risks, such as the “dissemination of illegal content” (including copyright infringement) or “negative effects for the exercise” of fundamental rights (including freedom of expression). Since the CDSM Directive in no way addresses these issues, we see no argument that precludes the application of Articles 26 and 27 DSA to VLOPs that are also OCSSPs.Footnote 158 The same reasoning holds for other relevant obligations, such as data access and transparency.Footnote 159

IV. Conclusions

In this article, we have looked at the (potential) relationship between the horizontal DSA rules and the sector-specific rules for OCSSPs in Article 17 CDSM Directive from a legal doctrinal perspective. Rules on copyright – vis-à-vis other forms of information (or content) – appear to have a special place in the EU legal order. The EC has, meanwhile, provided (internally) some insight into its view of this relationship in a presentation to the Council Working Party on Intellectual Property (Copyright).Footnote 160 In that presentation, the Commission reminds us that the “DSA is not an IPR enforcement tool” given its general and horizontal nature, but that it “includes a full toolbox which can be very useful for the enforcement of IPR [intellectual property rights]”, which would apply “without prejudice to existing IPR rules”. Notably, however, the Commission considers that Article 17 CDSM Directive remains “unaffected; i.e., DSA rules on limited liability, notice and action, redress and out of court mechanism [are] not applicable for [OCSSPs]”. Our analysis of the DSA proposal leads to a different conclusion and paints a more complex picture.

In our view, the reference to Article 17 CDSM Directive remaining “unaffected” does not mean that the DSA’s horizontal rules would not supplement those in Article 17, especially as regards notice-and-action or redress mechanisms.Footnote 161 Rather, on the basis of the available proposal and the amendments thus far, we argue that the DSA will probably apply to OCSSPs insofar as it contains: (1) rules that regulate matters not covered by Article 17 CDSM Directive; and (2) specific rules on matters where Article 17 leaves a margin of discretion to Member States.

Category (1) applies to some provisions in the DSA’s liability frameworkFootnote 162 and most clearly to its procedural obligations. This makes sense since, in our view, the special role of copyright noted above can only relate to substantive copyright law. The DSA’s due diligence obligations we have examined, by contrast, relate to information requirements, quality assurances regarding notices and procedural safeguards for ex post control with a view to, for instance, reinstating non-infringing content. In this light, we find no strong argument why EU copyright law would require a full exemption from the procedural obligations the DSA sets out for online platforms. In fact, the very character of the proposed DSA (and of the e-Commerce Directive that precedes it) is to provide broad and horizontal rules for a level playing field. Where no more specific regulation in Article 17 CDSM Directive applies, the asymmetric due diligence obligations of the DSA should apply.

The situation is trickier for Category (2), which relates to areas where Article 17 CDSM Directive does in fact provide some degree of regulation and where the extent to which it pre-empts more detailed rules in the DSA is uncertain. The situation is further complicated by the Commission’s Guidance on Article 17, despite its non-binding character. In any case, the logical approach appears to be to treat the CDSM Directive’s regulation as lex specialis. Where this lex specialis does not contain specific or more detailed regulation (or an explicit exemption from the general rules), however, the horizontal rules of the DSA would apply once it comes into force.Footnote 163 This is despite the different natures of the legal instruments at issue (Directive versus Regulation), the territorial nature of copyright and the potential issues arising therefrom from the perspective of multi-layered enforcement. These problems may be attenuated by the harmonising effect of the EC Guidance on Article 17 on Member States’ laws and OCSSP practices.

From a normative standpoint, we understand the DSA’s due diligence obligations as “first principles” of how Internet intermediaries – and most notably platforms and VLOPs – must “behave”, and of how the competing fundamental rights of the parties involved can be balanced. In other words, the DSA’s due diligence obligations should be viewed as the horizontal fall-back regime, altered only by more specific lex specialis rules. That is to say, as a horizontal framework, the DSA sets out the default legal regulation for the intertwined relations of platforms, users and rights holders.Footnote 164 As such, even in the presence of specific non-exhaustive sector regulation, the DSA rules should remain applicable unless they are clearly set aside by the lex specialis.

In this light, our analysis identifies several rules in the DSA proposal that should apply to OCSSPs despite the regime in Article 17 and the accompanying Guidance: those on notice and action, internal complaint-handling and out-of-court dispute settlement, trusted flaggers/notifiers and measures against misuse. But we have also identified a number of grey areas in the overlaps between the DSA and Article 17 CDSM Directive. To avoid legal uncertainty, it would be important to clarify these during the legislative process, thereby mitigating the risks associated with multi-layered enforcement on OCSSPs. This could be achieved, for instance, by stating that Chapter III DSA (Articles 10–37) applies mutatis mutandis, as a horizontal framework, also to those intermediary services covered by other secondary legislation, to the extent that no more specific rules are laid out there. Further precise clarifications could be introduced in the specific grey areas identified in our analysis in order to ensure the applicability of the DSA’s safeguards to OCSSPs, where justified. After all, although we can all agree that copyright is special, it should not be a barrier to setting “uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected”.Footnote 165 Although our analysis focuses on the intersection of the DSA with copyright law, the analytical framework we have developed could, as noted throughout, prove useful for further research and for clarifying the overlap between the DSA and other sector-specific rules on different types of online platforms – such as those in the Terrorist Content Regulation and the AVMSD. It could also serve as a reminder and blueprint for future national and EU legislative endeavours in the area of platform regulation to consider their interplay with the DSA carefully.

Acknowledgements

The authors wish to thank Alexander Peukert, Felix Reda, Christoph Schmon, Nuno Sousa e Silva and Jens Schovsbo for their valuable comments. All errors remain ours.

Financial support

This research is part of the reCreating Europe project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 870626. João Pedro Quintais’s research in this article is also part of the VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).

Competing interests

The authors declare none.

References

1 Commission, “Tackling Illegal Content Online – Towards an Enhanced Responsibility of Online Platforms”, COM/2017/0555, 2.

2 Commission, Recommendation of 1 March 2018 on measures to effectively tackle illegal content online, C/2018/1177.

3 N Suzor, Lawless: The Secret Rules That Govern Our Digital Lives (Cambridge, Cambridge University Press 2019).

4 M Vermeulen, “Online Content: To Regulate or Not to Regulate – Is That the Question?” (Association for Progressive Communications 2019) Issue Paper <https://www.apc.org/en/pubs/online-content-regulate-or-not-regulate-question> (last accessed 25 November 2020); Commission, supra, note 1; Commission, supra, note 2.

5 Proposal for a Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC COM/2020/825 final. For ease of reference, we refer to this legislative proposal as “DSA” in the main body of the text and “DSA proposal” in the footnotes. Unless otherwise specified, our analysis refers to the text of the original proposals and not to the amendments advanced so far in the legislative process. For further details on this process, see European Parliament, Legislative Train Schedule, DSA proposal <https://www.europarl.europa.eu/legislative-train/theme-a-europe-fit-for-the-digital-age/file-digital-services-act> (last accessed 8 February 2022).

6 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market [2000] OJ L178/1 (e-Commerce Directive).

7 Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market (CDSM Directive).

8 See European Parliament, Press Room, New rules adopted for quick and smooth removal of terrorist content online (28 April 2021) <https://www.europarl.europa.eu/news/en/press-room/20210422IPR02621/new-rules-adopted-for-quick-and-smooth-removal-of-terrorist-content-online> (noting the approval in the European Parliament of the post-trilogue version of the new Regulation) (last accessed 8 February 2022).

9 For the purposes of simplicity, we use the term copyright-protected “material” to cover both works protected by copyright and other subject matter protected by related rights.

10 Most Member States failed to implement the directive by the deadline, leading the European Commission to start infringement proceedings. See European Commission, Press Release, “Copyright: Commission calls on Member States to comply with EU rules on copyright in the Digital Single Market” (26 July 2021) <https://ec.europa.eu/commission/presscorner/detail/en/MEX_21_3902> (last accessed 8 February 2022).

11 In addition to our work, existing research on this topic includes, eg, A Peukert et al, “European Copyright Society – Comment on Copyright and the Digital Services Act Proposal” (European Copyright Society 2022) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4016208> (last accessed 8 February 2022); E Rosati, “The Digital Services Act and Copyright Enforcement: The Case of Article 17 of the DSM Directive” in Unravelling the Digital Services Act Package (Strasbourg, European Audiovisual Observatory 2021).

12 On the issue of copyright territoriality and the Internet, see, eg, T Dreier, “Copyright in the Times of the Internet – Overcoming the Principle of Territoriality within the EU” (2017) 18 ERA Forum 7. For a broader perspective on the territoriality of EU law with reference inter alia to copyright, see M Szpunar, “Territoriality of Union Law in the Era of Globalisation”, in Évolution des rapports entre les ordres juridiques de l’Union européenne, international et nationaux. Liber Amicorum Jiří Malenovský (Brussels, Éditions Bruylant 2020).

13 I Buri and J van Hoboken, “The DSA Proposal’s Impact on Digital Dominance” (Verfassungsblog, 30 August 2021) <https://verfassungsblog.de/power-dsa-dma-01/> (last accessed 7 October 2021).

14 Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on contestable and fair markets in the digital sector (Digital Markets Act) COM/2020/842 final.

15 D Kaye, Speech Police: The Global Struggle to Govern the Internet (New York, Columbia Global Reports 2019). More broadly, see, eg, L Belli and J Venturini, “Private Ordering and the Rise of Terms of Service as Cyber-Regulation” (2016) 5 Internet Policy Review 4.

16 Art 2(q) proposed DSA: “terms and conditions” means all terms and conditions or specifications, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services. See also the more detailed definition of “terms and conditions” in Art 2(10) of Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services. For an analysis of Art 12 DSA proposal, see N Appelman, JP Quintais and R Fahy, “Article 12 DSA: Will Platforms Be Required to Apply EU Fundamental Rights in Content Moderation Decisions?” (DSA Observatory, 2021) <https://dsa-observatory.eu/2021/05/31/article-12-dsa-will-platforms-be-required-to-apply-eu-fundamental-rights-in-content-moderation-decisions/> (last accessed 28 August 2021); N Appelman, JP Quintais and R Fahy, “Using Terms and Conditions to Apply Fundamental Rights to Content Moderation” (Verfassungsblog, 2021) <https://verfassungsblog.de/power-dsa-dma-06/> (last accessed 7 October 2021).

17 R Gorwa, R Binns and C Katzenbach, “Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance” (2020) 7 Big Data & Society 2053951719897945.

18 For an overview, see DSA Impact Assessment – Part 2/2 (Brussels, 15 December 2020) SWD(2020) 348 Final, Annex 11, Content Recognition Tools, p 192ff.

19 See YouTube <https://support.google.com/youtube/answer/9245819?hl=en&ref_topic=9282364> and Facebook <https://rightsmanager.fb.com/>. Popular third-party tools include those provided by Audible Magic and Pex; see Audible Magic <https://www.audiblemagic.com/> and Pex <https://pex.com/>. For an overview of content-recognition systems for copyright at the EU level, see J-P Mochon and S Humbert, “CSPLA Mission on the Tools for the Recognition of Content Protected by Online Sharing Platforms: State of the Art and Proposals” (CLSPA – Superior Council for Literary and Artistic Property (France) 2020) <https://www.culture.gouv.fr/en/Thematiques/Propriete-litteraire-et-artistique/Conseil-superieur-de-la-propriete-litteraire-et-artistique/Travaux/Missions/Mission-du-CSPLA-sur-les-outils-de-reconnaissance-des-contenus-et-des-oeuvres-sur-les-plateformes-de-partage-en-ligne-II> (last accessed 8 February 2022); EUIPO, “Automated Content Recognition. Discussion Paper – Phase 1: Existing Technologies and Their Impact on IP” (EUIPO 2020) Discussion Paper <https://euipo.europa.eu/ohimportal/en/web/observatory/news/-/action/view/8365301> (last accessed 8 February 2022). For a critical analysis of the rhetoric surrounding the adoption of these tools in the legislative process of the CDSM Directive, see A Bridy, “The Price of Closing the ‘Value Gap’: How the Music Industry Hacked EU Copyright Reform” (2020) 22 Vanderbilt Journal of Entertainment & Technology Law 323.

20 Peukert et al, supra, note 11, 2. A good illustration is provided in YouTube’s first ever transparency report. For instance, during the first half of 2021, there were over 730 million unique claims or copyright removal requests made through the platform’s ContentID system. See YouTube, “YouTube Copyright Transparency Report” (YouTube 2021) 5 <https://blog.youtube/news-and-events/access-all-balanced-ecosystem-and-powerful-tools/> (last accessed 20 January 2022).

21 DSA Impact Assessment – Part 2/2 (Brussels, 15 December 2020) SWD(2020) 348 Final, Annex 11, Content Recognition Tools, p 192ff.

22 Communication from the Commission to the European Parliament and the Council, Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market, COM/2021/288 final (hereafter “Guidance Art 17 CDSM Directive”).

23 See Art 2(6) para 2 CDSM Directive. See also Art 17(6) CDSM Directive on start-up OCCSPs.

24 See Art 14 e-Commerce Directive.

25 See especially Art 3 (right of communication to the public) and Art 8(3) (injunctions against intermediaries whose services are used by a third party to infringe a copyright or related right) InfoSoc Directive.

26 See especially Arts 5 and 11 Directive 2004/48/EC of the European Parliament and of the Council of 29 April 2004 on the enforcement of intellectual property rights (OJ L 157, 30 April 2004) (Enforcement Directive).

27 S Dusollier, “The 2019 Directive on Copyright in the Digital Single Market: Some Progress, a Few Bad Choices, and an Overall Failed Ambition” (2020) 57 Common Market Law Review 979.

28 There is already significant scholarship on Art 17 CDSM Directive. See, eg, M Leistner, “European Copyright Licensing and Infringement Liability Under Art. 17 DSM-Directive Compared to Secondary Liability of Content Platforms in the U.S. – Can We Make the New European System a Global Opportunity Instead of a Local Challenge?” (2020) Zeitschrift für Geistiges Eigentum/Intellectual Property Journal (ZGE/IPJ) <https://papers.ssrn.com/abstract=3572040> (last accessed 17 April 2020); A Metzger et al, “Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market into National Law – Comment of the European Copyright Society” (European Copyright Society 2020) European Copyright Society Opinion ID 3589323 <https://papers.ssrn.com/abstract=3589323> (last accessed 4 July 2020); SF Schwemer, “Article 17 at the Intersection of EU Copyright Law and Platform Regulation” (2020) 3/2020 Nordic Intellectual Property Law Review <https://papers.ssrn.com/abstract=3627446> (last accessed 4 July 2020); T Spoerri, “On Upload-Filters and Other Competitive Advantages for Big Tech Companies under Article 17 of the Directive on Copyright in the Digital Single Market” (2019) 10 JIPITEC <https://www.jipitec.eu/issues/jipitec-10-2-2019/4914> (last accessed 8 February 2022); G Frosio, “Reforming the C-DSM Reform: A User-Based Copyright Theory for Commonplace Creativity” (2020) IIC – International Review of Intellectual Property and Competition Law <https://doi.org/10.1007/s40319-020-00931-0> (last accessed 4 July 2020); M Lambrecht, “Free Speech by Design – Algorithmic Protection of Exceptions and Limitations in the Copyright DSM Directive” (2020) 11 JIPITEC <https://www.jipitec.eu/issues/jipitec-11-1-2020/5080> (last accessed 8 February 2022); G Spindler, “The Liability System of Art. 17 DSMD and National Implementation – Contravening Prohibition of General Monitoring Duties?” 10 JIPITEC 334; K Garstka, “Guiding the Blind Bloodhounds: How to Mitigate the Risks Art. 17 of Directive 2019/790 Poses to the Freedom of Expression” in Intellectual Property and Human Rights (4th edn, Alphen aan den Rijn, Kluwer Law International 2019) <https://papers.ssrn.com/abstract=3471791> (last accessed 8 April 2020); Dusollier, supra, note 27; JB Nordemann and J Wiblinger, “Art. 17 DSM-RL – Spannungsverhältnis Zum Bisherigen Recht?” (2020) 122 GRUR 569; M Husovec and JP Quintais, “How to License Article 17? Exploring the Implementation Options for the New EU Rules on Content-Sharing Platforms under the Copyright in the Digital Single Market Directive” (2021) 70 GRUR International 325; M Husovec and J Quintais, “Too Small to Matter? On the Copyright Directive’s Bias in Favour of Big Right-Holders” in T Mylly and J Griffiths (eds.), Global Intellectual Property Protection and New Constitutionalism. Hedging Exclusive Rights (Oxford, Oxford University Press 2021) <https://papers.ssrn.com/abstract=3835930> (last accessed 3 May 2021).

29 See, eg, M Peguera, “The New Copyright Directive: Online Content-Sharing Service Providers Lose ECommerce Directive Immunity and Are Forced to Monitor Content Uploaded by Users (Article 17)” (Kluwer Copyright Blog, 26 September 2019) <http://copyrightblog.kluweriplaw.com/2019/09/26/the-new-copyright-directive-online-content-sharing-service-providers-lose-ecommerce-directive-immunity-and-are-forced-to-monitor-content-uploaded-by-users-article-17/> (last accessed 13 April 2020).

30 Arts 17(4) (b) and (c) CDSM Directive.

31 On the interpretation of this condition, see, eg, Metzger et al, supra, note 28.

32 For an analysis of these preventive obligations, see M Husovec, “How Europe Wants to Redefine Global Online Copyright Enforcement” in TE Synodinou (ed.), Pluralism or Universalism in International Copyright Law (Alphen aan den Rijn, Kluwer Law International 2019).

33 Art 17 (5) CDSM Directive.

34 Art 17 (6) CDSM Directive.

35 Art 17 (7) CDSM Directive.

36 Art 17(8) CDSM Directive. See, on this topic, C Angelopoulos and M Senftleben, “An Endless Odyssey? Content Moderation without General Content Monitoring Obligations” (IViR; CIPIL 2021) <https://papers.ssrn.com/abstract=3871916> (last accessed 24 June 2021); Schwemer, supra, note 28, 428.

37 Art 17(9) CDSM Directive.

38 See Arts 288 and 290 of the Treaty on the Functioning of the European Union (TFEU).

39 Guidance Art 17 CDSM Directive, supra, note 22, p 1.

40 Opinion AG Øe in C-401/19, 15 July 2021, ECLI:EU:C:2021:613, para 223.

41 Guidance Art 17 CDSM Directive, supra, note 22, p 23 (at fn 36).

42 For an overview of YouTube’s copyright content-moderation technologies, see YouTube, supra, note 20.

43 See supra, note 19 and references cited therein.

44 Spoerri, supra, note 28.

45 Under Art 17(7) CDSM Directive, “Member States shall ensure that users in each Member State are able to rely on any of the following existing exceptions or limitations when uploading and making available content generated by users on online content-sharing services: (a) quotation, criticism, review; (b) use for the purpose of caricature, parody or pastiche”.

46 In this respect, see the excellent reporting by Paul Keller on the Commission Stakeholder Dialogues and their aftermath: P Keller, “Article 17: (Mis)Understanding the Intent of the Legislator” (Kluwer Copyright Blog, 28 January 2021) <http://copyrightblog.kluweriplaw.com/2021/01/28/article-17-misunderstanding-the-intent-of-the-legislator/> (last accessed 4 May 2021); P Keller, “Article 17 Stakeholder Dialogue: What We Have Learned so Far – Part 1” (Kluwer Copyright Blog, 13 January 2020) <http://copyrightblog.kluweriplaw.com/2020/01/13/article-17-stakeholder-dialogue-what-we-have-learned-so-far-part-1/> (last accessed 7 May 2021); P Keller, “Article 17 Stakeholder Dialogue: What We Have Learned so Far – Part 2” (Kluwer Copyright Blog, 14 January 2020) <http://copyrightblog.kluweriplaw.com/2020/01/14/article-17-stakeholder-dialogue-what-we-have-learned-so-far-part-2/> (last accessed 7 May 2021).

47 This should be read in combination with the statement in Art 17(9) to the effect that the CDSM Directive “shall in no way affect legitimate uses, such as uses under exceptions or limitations provided for in Union law”. In this respect, Recital 70 emphasises the need for the preventative obligations to be implemented without prejudice to the application of exceptions and limitations, “in particular those that guarantee the freedom of expression of users”. See JP Quintais et al, “Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations from European Academics” (2020) 10 JIPITEC <https://www.jipitec.eu/issues/jipitec-10-3-2019/5042> (last accessed 8 February 2022).

48 These were optional exceptions and limitations in Arts 5(3)(d) and (k) of the InfoSoc Directive, which have not been implemented in all Member States; where they have, the implementations differ.

49 Art 17(9) para 4 CDSM Directive.

50 See, eg, Recital 70 CDSM Directive.

51 See Quintais et al, supra, note 47; M Husovec, “(Ir)Responsible Legislature? Speech Risks under the EU’s Rules on Delegated Digital Enforcement” (LSE 2021) Working Paper <https://papers.ssrn.com/abstract=3784149> (last accessed 4 May 2021); C Geiger and B Justin Jütte, “Platform Liability under Art. 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match” (2021) GRUR International <https://academic.oup.com/grurint/advance-article-abstract/doi/10.1093/grurint/ikab037/6169057?redirectedFrom=fulltext> (last accessed 4 May 2021). See also, agreeing with this interpretation, Guidance Art 17 CDSM Directive, supra, note 22, pp 2–3. See also supra at Section II.1 and note 42.

52 Case C-401/19, Poland v Parliament and Council. Arguing that the Court should invalidate Art 17 on these grounds, see Geiger and Jütte, supra, note 51; Husovec, supra, note 51. See also Case C-401/19, Poland v Parliament and Council, Opinion of Advocate General Saugmandsgaard Øe delivered on 15 July 2021, ECLI:EU:C:2021:613 (hereafter AG Opinion C-401/19, Poland).

53 BJ Jutte and G Priora, “On the Necessity of Filtering Online Content and Its Limitations: AG Saugmandsgaard Øe Outlines the Borders of Article 17 CDSM Directive” (Kluwer Copyright Blog, 20 July 2021) <http://copyrightblog.kluweriplaw.com/2021/07/20/on-the-necessity-of-filtering-online-content-and-its-limitations-ag-saugmandsgaard-oe-outlines-the-borders-of-article-17-cdsm-directive/> (last accessed 6 October 2021).

54 AG Opinion C-401/19, Poland, paras 112–14 and 196, citing the Court’s judgment in Case C-18/18, Eva Glawischnig-Piesczek v Facebook Ireland Limited (3 October 2019), ECLI:EU:C:2019:821. On the topic of general monitoring obligations in the context of EU copyright law, see Angelopoulos and Senftleben, supra, note 36; C Angelopoulos, “YouTube and Cyando, Injunctions against Intermediaries and General Monitoring Obligations: Any Movement?” (Kluwer Copyright Blog, 9 August 2021) <http://copyrightblog.kluweriplaw.com/2021/08/09/youtube-and-cyando-injunctions-against-intermediaries-and-general-monitoring-obligations-any-movement/> (last accessed 5 October 2021).

55 AG Opinion C-401/19, Poland, para 196 ff. In this respect, the Opinion deviates from and strongly criticises the Commission’s Guidance, which suggests a separate category of “earmarked content” susceptible of filtering, with lesser safeguards. See ibid, para 223 (Postscriptum) and Guidance Art 17 CDSM Directive, supra, note 22, pp 22–24. For criticism, see J Reda and P Keller, “European Commission Back-Tracks on User Rights in Article 17 Guidance” (Kluwer Copyright Blog, 4 June 2021) <http://copyrightblog.kluweriplaw.com/2021/06/04/european-commission-back-tracks-on-user-rights-in-article-17-guidance/> (last accessed 24 June 2021); C Geiger and BJ Jütte, “Towards a Virtuous Legal Framework for Content Moderation by Digital Platforms in the EU? The Commission’s Guidance on Article 17 CDSM Directive in the Light of the YouTube/Cyando Judgement and the AG’s Opinion in C-401/19” (2021) European International Property Review <https://papers.ssrn.com/abstract=3889049> (last accessed 17 August 2021).

56 AG Opinion C-401/19, Poland, para 193.

57 In this article, we do not examine the third level of safeguards in Art 17(9) CDSM Directive, relating to judicial authority or court level. See, eg, Schwemer, supra, note 28; S Schwemer and J Schovsbo, “What Is Left of User Rights? – Algorithmic Copyright Enforcement and Free Speech in the Light of the Article 17 Regime” in P Torremans (ed.), Intellectual Property Law and Human Rights, 4th edition (Alphen aan den Rijn, Wolters Kluwer 2020) p 17.

58 Emphasis added. Note that this requirement falls on Member States, whereas the obligation to ensure unavailability falls on the platforms. This first aspect resembles the Commission’s original proposal from September 2016, where it suggested in Art 13(2) that “Member States shall ensure that the service providers … put in place complaints and redress mechanisms that are available to users in case of disputes over the application of the measures …”; see Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market – COM(2016)593.

59 Art 17(9) subpara 2 CDSM Directive (emphasis added). For a critique of the “elastic timeframe”, see M Senftleben, “Bermuda Triangle – Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market” (2019) 41 European Intellectual Property Review 480. In its Council vote, Germany suggests that the timeframe be understood as “as rapidly as possible”; see Draft Directive of the European Parliament and of the Council on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (first reading), Statements (Council of the European Union, 15 April 2019) <https://data.consilium.europa.eu/doc/document/ST-7986-2019-ADD-1-REV-2/en/pdf> (Statement by Germany, point 7) (last accessed 8 February 2022).

60 Similarly, see Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online, 6 March 2018, [2018] L 63/50, points 20 and 27 in relation to proactive measures on human oversight, and in the context of data protection, see, eg, Art 22(3) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). See also Schwemer, supra, note 28; Schwemer and Schovsbo, supra, note 57.

61 Recital 70 (emphasis added).

62 Article 17(9) subpara 2 CDSM Directive.

63 See Senftleben, supra, note 59.

64 For empirical work on over-enforcement, see, eg, K Erickson and M Kretschmer, “Empirical Approaches to Intermediary Liability”, CREATe Working Paper 2019/6 (2019), p 10 ff; J Urban, J Karaganis and B Schofield, “Notice and Takedown: Online Service Provider and Rightsholder Accounts of Everyday Practice” (2017) 64 Journal of the Copyright Society 371; S Bar-Ziv and N Elkin-Koren, “Behind the Scenes of Online Copyright Enforcement: Empirical Evidence on Notice & Takedown” (2017) 50 Connecticut Law Review; specifically in the context of YouTube and parodies, see K Erickson and M Kretschmer, “This Video Is Unavailable: Analyzing Copyright Takedown of User-Generated Content on YouTube” (2018) 9 Journal of Intellectual Property, Information Technology and Electronic Commerce Law 75; S Jacques et al, “An Empirical Study of the Use of Automated Anti-Piracy Systems and Their Consequences for Cultural Diversity” (2018) 15(2) SCRIPTed 277–312. For a recent overview of existing studies in this area, see also D Keller and P Leerssen, “Facts and Where to Find Them: Empirical Research on Internet Platforms and Content Moderation” in Social Media and Democracy: The State of the Field and Prospects for Reform (Cambridge, Cambridge University Press 2019) <https://papers.ssrn.com/abstract=3504930> (last accessed 4 May 2021).

65 Art 17(9) subpara 2 CDSM Directive.

66 ibid.

67 ibid.

68 Guidance Art 17 CDSM Directive, supra, note 22, pp 18–25.

69 European Commission, “REFIT – Making EU Law Simpler, Less Costly and Future Proof <https://ec.europa.eu/info/law/law-making-process/evaluating-and-improving-existing-laws/refit-making-eu-law-simpler-less-costly-and-future-proof_en> (last accessed 8 February 2022).

70 See Art 1 DSA proposal.

71 See infra, Section III.2.b.i.

72 In other words, the specific liability exemptions for “mere conduit”, “caching” and hosting remain largely unchanged.

73 See, eg, A Kuczerawy, “The Good Samaritan That Wasn’t: Voluntary Monitoring under the (Draft) Digital Services Act” (Verfassungsblog, 12 January 2021) <https://verfassungsblog.de/good-samaritan-dsa/> (last accessed 5 May 2021).

74 SF Schwemer, T Mahler and H Styri, “Liability Exemptions of Non-Hosting Intermediaries: Sideshow in the Digital Services Act?” (2021) 8 Oslo Law Review 4.

75 See Arts 8 and 9 DSA proposal.

76 Art 1(1)(aa) AVMSD Directive 2010/13/EC, as amended by Directive (EU) 2018/1808 (AVMSD).

77 Arts 1 and 2 Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (Text with EEA relevance) (Terrorist Content Regulation).

78 Art 2(g) DSA proposal defines “illegal content” as “any information, which, in itself or by its reference to an activity, including the sale of products or provision of services is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law”.

79 In the context of the AVMSD, YouTube would qualify as a video-sharing platform service.

80 Confusingly, the Explanatory Memorandum in one instance notes that “the proposal does not amend sector-specific legislation or the enforcement and governance mechanisms set thereunder, but provides for a horizontal framework to rely on, for aspects beyond specific content or subcategories of services regulated in sector-specific acts” (Explanatory Memorandum, p 6). The wording “amend” could suggest a broader exclusion than “unaffected”. Since recitals and articles of the proposal, however, do not take this up, we refrain from further analysis.

81 Recital 9 DSA proposal.

82 Unfortunately, the Explanatory Memorandum refrains from specifically addressing its relation to the CDSM Directive. Since the AVMSD explicitly only serves as one example, however, there is no indication that this general principle would not apply to other specific rules. In the specific context of the AVMSD, see also Art 28a(5) AVMSD.

83 One could also reflect upon the chronological order of the legislative acts in the vein of a lex posterior derogat (legi) priori interpretation. However, since the CDSM Directive was only adopted as late as April 2019 and since there is no indication of such an intention in the preparatory works, we refrain from further exploring this perspective.

84 A different set of questions, notably around the primacy of EU law, would arise if Member States lack that discretion.

85 Reaching the same conclusion, see Peukert et al, supra, note 11, 3–4. See, in particular, the IMCO Draft European Parliament Legislative Resolution, EP Document A9-0356/2021 (proposed Amendment 11 to Recital 11, according to which the DSA is without prejudice to the CDSM Directive, since the Directive “establish[es] specific rules and procedures that should remain unaffected”).

86 See Arts 3–9 DSA proposal.

87 See also Art 71 DSA proposal.

88 On which, see Kuczerawy, supra, note 73, noting that the provision “aims to eliminate existing disincentives towards voluntary own-investigations undertaken by internet intermediaries”, but that “it is questionable whether facilitating more voluntary removals is actually beneficial from the perspective of users and their right to freedom of expression”.

89 Cf. Recital 73 DSA proposal.

90 Arts 10–37 DSA proposal.

91 Note, however, eg Art 14(3) DSA proposal (“Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned”).

92 Arts 1(f) and 25 DSA proposal.

93 Art 2(f) DSA proposal.

94 Similar to the current wording of the e-Commerce Directive’s Art 14.

95 Art 2(h) DSA proposal.

96 Art 25 DSA proposal.

97 See, eg, the number of average users in the EU for these platforms reported in DSA Impact Assessment – Part 2/2 (Brussels, 15 December 2020) SWD(2020) 348 Final, pp 64–65. In other words, whereas many online platforms might qualify as OCSSPs under the CDSM Directive (eg TikTok, Twitter, Reddit or Pornhub), their qualification as VLOPs under the DSA depends on the number of recipients of the service.

98 Arts 25–33 DSA proposal. For a visual representation, see European Commission, “The Digital Services Act: Ensuring a Safe and Accountable Online Environment” <https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en> (last accessed 8 February 2022).

99 DSA Impact Assessment, Part 1/2 (Brussels, 15 December 2020) SWD(2020) 348 Final, p 12.

100 Figure 4 is an adjustment by the authors of a similar figure available at European Commission, “The Digital Services Act: Ensuring a Safe and Accountable Online Environment”, Which Providers Are Covered <https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en> (last accessed 8 February 2022).

101 Arts 28a and 29a AVMSD.

102 MZ van Drunen, “The Post-Editorial Control Era: How EU Media Law Matches Platforms’ Organisational Control with Cooperative Responsibility” (2020) 12 Journal of Media Law 166.

103 See Art 71 DSA proposal.

104 See supra, Section II. Note that activities of certain online platforms that host copyright-protected materials but do not qualify as OCSSPs will still potentially benefit from the safe harbour in Art 14 e-Commerce Directive/Art 5 DSA.

105 A further analysis of the differences between Art 7 DSA proposal and Art 15 e-Commerce Directive is beyond the scope of this paper.

106 See supra, Section II.1 and Schwemer, supra, note 28. Generally on the topic and with further interpretations, see Angelopoulos and Senftleben, supra, note 36.

107 See infra, Section III.2.b.i.

108 This may be different in other sector-specific legislation, which is beyond the scope of this paper.

109 A Larroyed, “When Translations Shape Legal Systems: How Misguided Translations Impact Users and Lead to Inaccurate Transposition – The Case of ‘Best Efforts’ Under Article 17 DCDSM” (Institute for Globalization and International Regulation Maastricht University 2020) <https://papers.ssrn.com/abstract=3740066> (last accessed 14 July 2021). See also Guidance Art 17 CDSM Directive, supra, note 22, advancing multiple possible interpretations of the concepts of best efforts in Art 17(4).

110 See Guidance Art 17 CDSM Directive, supra, note 22, pp 9–11 (Section V.1). For a critical analysis, see JP Quintais, “Commission’s Guidance on Art. 17 CDSM Directive: The Authorisation Dimension” (Kluwer Copyright Blog, 10 June 2021) <http://copyrightblog.kluweriplaw.com/2021/06/10/commissions-guidance-on-art-17-cdsm-directive-the-authorisation-dimension/> (last accessed 7 October 2021).

111 Art 14(3) e-Commerce Directive states the hosting safe harbour “shall not affect the possibility for a court or administrative authority, in accordance with Member States’ legal systems, of requiring the service provider to terminate or prevent an infringement, nor does it affect the possibility for Member States of establishing procedures governing the removal or disabling of access to information”.

112 On which, see M Husovec, Injunctions against Intermediaries in the European Union: Accountable but Not Liable? (Cambridge, Cambridge University Press 2017); C Angelopoulos, European Intermediary Liability in Copyright: A Tort-Based Analysis (Alphen aan den Rijn, Kluwer Law International 2016); Angelopoulos and Senftleben, supra, note 36; Angelopoulos, supra, note 54.

113 We do not discuss here the exclusion, in Art 16 DSA, of micro and small enterprises from the additional obligations imposed on online platforms in Section 3 of Chapter III DSA. In this connection, it is noteworthy that Art 17(6) CDSM Directive contains a special regime for small and new OCSSPs, with mitigated obligations but no exclusion. On this regime and its interpretation, see also Guidance Art 17 CDSM Directive, supra, note 22, pp 16–17.

114 Chapter III, Section 1 DSA proposal, with certain further adjustments of the obligations for specific intermediary services.

115 Art 12 DSA obliges intermediary services inter alia to provide information on content moderation, including algorithmic decision-making and human review. Art 17(9) subpara 4 CDSM Directive, too, stipulates a duty on OCSSPs to inform users in their terms and conditions, although only with respect to the possibility of using copyright-protected works under the copyright limitations and exceptions provided for in the copyright acquis.

116 See Art 29 DSA proposal (and subsequent amendments by Parliament). See also SF Schwemer, “Recommender Systems in the EU: From Responsibility to Regulation” (2022) 1 Morals & Machines 60–69.

117 We refer in this paragraph to the Commission’s proposal version of Art 12 DSA. Although the provision has been subject to amendments in both the Council and European Parliament versions, the core obligations we examine remain intact.

118 See Schwemer et al, supra, note 74.

119 For criticism, see Appelman et al, “Article 12 DSA”, supra, note 16; Appelman et al, “Using Terms and Conditions to Apply Fundamental Rights to Content Moderation”, supra, note 16; A Peukert, “Five Reasons to be Skeptical About the DSA” (Verfassungsblog, 2021) <https://verfassungsblog.de/power-dsa-dma-04/> (last accessed 8 October 2021). It is, for example, unclear whether Art 12(2) DSA would require some kind of fundamental rights impact assessment by the respective intermediaries.

120 On this point, the Commission’s Guidance merely suggests that “Member States could give recommendations on how service providers can increase users’ awareness of what may constitute legitimate uses”, such as through the provision of “accessible and concise information on the exceptions for users, containing as a minimum information on the mandatory exceptions provided for in Article 17”. See Guidance Art 17 CDSM Directive, supra, note 22, p 26, adding: “Besides providing this information in the general terms and conditions of the service providers, this information could be given in context of the redress mechanism, to raise the awareness of users of possible exceptions or limitations that can be applicable”.

121 Chapter III, Section 2 DSA proposal.

122 See supra, Section II.

123 See Art 17(4)(c) CDSM Directive. In this respect, Art 17 clearly requires that the notice must originate from the rights holder (or, presumably, its representative), which is a marked difference from Art 14(1) DSA proposal. The latter allows “any individual or entity to notify [hosting service providers] of the presence on their service of specific items of information that the individual or entity considers to be illegal content”. In our view, given the private right nature of copyright and the specific requirement in Art 17(4) CDSM Directive, only rights holders or those entitled to act on their behalf (eg pursuant to Art 5 Enforcement Directive) would be able to make notifications to service providers regarding copyright infringement, even if portions of Art 14 DSA proposal apply to OCSSPs.

124 Guidance Art 17 CDSM Directive, supra, note 22, pp 15–16 (referring to points 6–8 of the Recommendation).

125 Art 14(2) DSA proposal.

126 See also Peukert et al, supra, note 11, 4–5.

127 Explanatory Memorandum, DSA proposal p 7.

128 ibid, p 4.

129 See Guidance Art 17 CDSM Directive, supra, note 22, pp 15–16.

130 ibid, p 1.

131 ibid.

132 On YouTube’s general approach in this regard, see M Pancini, “YouTube’s Approach to Copyright” (Google, 31 August 2021) <https://blog.google/around-the-globe/google-europe/youtubes-approach-to-copyright/> (last accessed 14 October 2021). On the German implementation act, see BMJV, Aktuelle Gesetzgebungsverfahren, “Act on the Copyright Liability of Online Content Sharing Service Providers” (Urheberrechts-Diensteanbieter-Gesetz – UrhDaG, 14 June 2021) <https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/UrhDaG_ENG.html?nn=6712350> (last accessed 8 February 2022).

133 See Art 14(3) DSA proposal.

134 See, in this respect, the proposals in Peukert et al, supra, note 11, 14.

135 Another related question is what framework would apply if one and the same video is relevant from a copyright perspective and a non-copyright perspective (eg a parody of a copyright-protected work that also contains hate speech).

136 See, eg, Recitals 4 and 7 DSA proposal.

137 Compare also The Santa Clara Principles on Transparency and Accountability on Content Moderation (Santa Clara Principles 2.0) <https://santaclaraprinciples.org> (last accessed 8 February 2022).

138 This point is confirmed in YouTube’s 2021 copyright transparency report, where it emerges that only around 0.5% of copyright claims made via ContentID were contested by users in the first half of 2021 (ie 3,698,019 “disputed claims” out of 722,649,569 total ContentID claims). See Peukert et al, supra, note 11, 10. See generally on this topic JM Urban, J Karaganis and B Schofield, “Notice and Takedown in Everyday Practice” (Social Science Research Network 2017) SSRN Scholarly Paper ID 2755628 <https://papers.ssrn.com/abstract=2755628> (last accessed 8 October 2021); N Elkin-Koren and M Perel (Filmar), “Algorithmic Governance by Online Intermediaries” in E Brousseau, J-M Glachant and J Sgard (eds.), Oxford Handbook of International Economic Governance and Market Regulation (Oxford, Oxford University Press 2018) <https://papers.ssrn.com/abstract=3213355> (last accessed 20 January 2022).

139 In this respect, see the proposal for an “institutional intermediary” in this area by Geiger and Jütte, supra, note 51.

140 See Guidance Art 17 CDSM Directive, supra, note 22, pp 20, 21, 23 and 24; and AG Opinion C-401/19, Poland, para 201.

141 Arguably, they could also reduce costs for platforms, which could scale up their existing mechanisms for other illegal content to cover copyright-protected content as well. A conclusion on this point would, however, require access to data on the costs of switching systems and on the running costs of current copyright redress mechanisms as compared to future costs under a system with additional procedural safeguards.

142 Charter of Fundamental Rights of the European Union, OJ C 326, 26 October 2012, pp 391–407.

143 For a scholarly analysis of the use of the “high level of protection” justificatory argument by the CJEU in this context, see, eg, M Favale, M Kretschmer and PC Torremans, “Is There an EU Copyright Jurisprudence? An Empirical Analysis of the Workings of the European Court of Justice” (2016) 79 The Modern Law Review 31.

144 Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online, OJ L 63, 6.3.2018, pp 50–61, point 4(g).

145 See generally SF Schwemer, “Trusted Notifiers and the Privatization of Online Enforcement” (2019) 35 Computer Law & Security Review 105339.

146 See supra, Section III.2.b.i.

147 See, eg, Google Support, YouTube Help, YouTube Trusted Flagger program <https://support.google.com/youtube/answer/7554338?hl=en> (last accessed 8 February 2022), Facebook Response to EU Public Consultation on the Digital Services Act (DSA) <https://about.fb.com/de/wp-content/uploads/sites/10/2020/09/FINAL-FB-Response-to-DSA-Consultations.pdf> (last accessed 8 February 2022).

148 Art 19(2) DSA proposal.

149 Art 19(5) DSA proposal.

150 Art 19(6) DSA proposal.

151 See, eg, Keller and Leerssen, supra, note 64.

152 For an in-depth analysis, see Schwemer, supra, note 145.

153 See, eg, Recital 46, which states that “the rules of this Regulation on trusted flaggers should not be understood to prevent online platforms from giving similar treatment to notices submitted by entities or individuals that have not been awarded trusted flagger status …”.

154 Art 20(1) DSA proposal.

155 Art 20(2) DSA proposal.

156 Although outside the scope of this paper, it is also surprising that the misuse mechanism is foreseen only for online platforms and not for other intermediary services (eg hosting or even non-hosting intermediary services). Arguably, in light of the DSA’s goals, it would be desirable for such a misuse mechanism to be applicable to all voluntary notice-and-action mechanisms across all intermediary services.

157 For an analysis of these provisions, see, eg, Buri and van Hoboken, supra, note 13; J Barata, “The Digital Services Act and Its Impact on the Right to Freedom of Expression: Special Focus on Risk Mitigation Obligations” (PDLI 2021) <https://libertadinformacion.cc/wp-content/uploads/2021/06/DSA-AND-ITS-IMPACT-ON-FREEDOM-OF-EXPRESSION-JOAN-BARATA-PDLI.pdf> (last accessed 8 February 2022).

158 See, similarly, as regards risk mitigation, Peukert et al, supra, note 11, 10–11.

159 See, similarly, as regards transparency provisions, Peukert et al, supra, note 11, 5–6, 11–12. On the topic of transparency in the DSA proposal, see, eg, P Leerssen, “Platform Research Access in Article 31 of the Digital Services Act” (Verfassungsblog, 7 September 2021) <https://verfassungsblog.de/power-dsa-dma-14/> (last accessed 8 October 2021).

160 Council of the European Union, Working Paper, N° Cion doc.: 14124/20, Digital Services Act and EU copyright legislation - Information from the Commission, Brussels, 1 March 2021 WK 2824/2021 INIT (on file with the authors).

161 The Commission’s presentation itself goes on to lay out the possible “complementarity” of the DSA rules with regard to Art 17 CDSM Directive, namely for “e.g. transparency obligations with regard to action taken by the online platform; trusted flaggers”. See ibid.

162 See supra, Section III.2.a.

163 This understanding is also supported by Recital 11 DSA proposal, which clearly only relates to “specific rules and procedures” (emphasis added). Naturally, before the DSA is approved and comes into force, national laws can make use of the margin of discretion available to them in this respect under Art 17 CDSM Directive.

164 Which, in our example, relates to copyright rights holders, but this could also be the holder of another protected right.

165 Art 1(2)(b) DSA proposal (setting out the aims of the Regulation).

Figure 1. Online content-sharing service providers (OCSSPs) in the context of the e-Commerce Directive. CDSM Directive = Directive (EU) 2019/790 on Copyright in the Digital Single Market.

Figure 2. Overlap between the Digital Services Act (DSA) and Directive (EU) 2019/790 on Copyright in the Digital Single Market (CDSM Directive). OCSSP = online content-sharing service provider; VLOP = very large online platform.

Figure 3. An example of overlap between regulatory regimes in the case of online content-sharing service providers (OCSSPs). CDSMD = Directive (EU) 2019/790 on Copyright in the Digital Single Market; DSA = Digital Services Act; VLOP = very large online platform.

Figure 4. Digital Services Act typology of information society service providers and the placement of online content-sharing service providers (OCSSPs).Footnote 100 VLOP = very large online platform.