The risk that mandatory upload filters pose to freedom of expression and information online has been at the core of criticisms of Article 17 of the EU Directive on Copyright in the Digital Single Market (DSM Directive). This risk is evident from numerous examples of restrictions on legitimate speech resulting from the voluntary use of such systems by major online platforms. Recently, Turkey's national public broadcaster used YouTube's copyright filters to silence critical reporting on the Turkish government, having videos by independent journalists blocked and their accounts suspended for alleged copyright infringement. Given the heightened public attention to Article 17's freedom of expression implications, it is not surprising that the Polish government has based an action for annulment of certain central provisions of Article 17 before the European Court of Justice (Case C-401/19) on the assertion that they violate this fundamental right, enshrined in Article 11 of the EU Charter of Fundamental Rights.
A much less prominent fundamental rights issue may, however, play an important role in the CJEU's judgment on Article 17. On several occasions, the CJEU has thrown out blocking injunctions for violating the service providers' freedom to conduct a business. In a recently published study commissioned by the German fundamental rights litigation organization Gesellschaft für Freiheitsrechte e.V., the authors of this blog post argue that when ruling on the request for annulment of Article 17, the CJEU will have to balance all relevant fundamental rights, including the freedom to conduct a business. In this blog post, we will put the spotlight on this under-examined fundamental right. In part 1, we will discuss its relevance for the case pending before the CJEU and examine the ways in which Article 17 places new burdens on online platforms that are fundamentally different from the voluntary copyright enforcement schemes employed by some of the larger platforms today. In part 2, we analyse those new platform obligations in light of the CJEU case law on the freedom to conduct a business and discuss the role of the proportionality mechanism included in Article 17 (5). We find that the legislator may have grossly underestimated the impact of Article 17 on the freedom to conduct a business.
Can the CJEU take into account the platforms’ freedom to conduct a business?
Poland’s legal challenge before the CJEU is narrow in two respects: It is limited to the provisions of Article 17 which require the blocking of user uploads, Article 17 (4) (b) and (c), and it is solely based on a violation of Article 11 of the Charter. While Poland’s approach bars the CJEU from examining the legality of other provisions of Article 17, it does not limit the CJEU’s competence to take other affected fundamental rights into account. The standard of review in actions for annulment before the CJEU is the entirety of primary law. The rather narrow scope of the action filed by Poland therefore does not limit the CJEU’s competence to assess compatibility with the Charter as a whole. This is particularly the case in horizontal, multi-polar fundamental rights situations, since the CJEU bases its assessment on a balancing of all affected fundamental rights, as it explained in Scarlet, another case dealing with automated filtering of copyright-protected content.
The CJEU is not just entitled but required to comprehensively examine the challenged provisions and strike a fair balance between all affected fundamental rights, including the platforms' freedom to conduct a business, guaranteed by Article 16 of the Charter. Accordingly, the CJEU routinely takes into account fundamental rights that were not mentioned by the referring courts in preliminary ruling proceedings. The contested provisions of Article 17 concern the relationship between rightsholders, platforms and their users. In this context, the protection of intellectual property under Article 17 (2) of the Charter must be weighed not only against the fundamental rights of the users but also against those of the platform operators.
Article 17 affects small platforms
The charge that mandatory upload filters violate the freedom to conduct a business of platform operators has often been dismissed by proponents of Article 17 with reference to the fact that certain platforms already use upload filters on a voluntary basis (ch. 7.1). The additional burden placed on platform operators, it is argued, should therefore be limited. Indeed, at a time when major social media companies are the focus of public discussion because of their immense influence over public discourse, many commentators seem to be under the impression that platforms are capable of implementing any policy and that their economic capacities are limitless. Nevertheless, the argument that mandatory upload filters would not place a serious burden on platform companies is unconvincing on two grounds. First, although the public debate on Article 17 has largely focused on copyright enforcement by Internet giants such as YouTube and Facebook, its actual scope of application is much broader and includes small and medium-sized enterprises which do not have filtering systems in place and may not even experience frequent copyright infringement on their services.
The fact that Article 17 includes a lighter regime for start-up companies within their first three years of business, provided that their annual turnover remains below EUR 10 million and their average monthly audience stays below 5 million unique visitors, implies that, as a general rule, even platforms with annual turnover below EUR 10 million are intended to be covered by the obligations laid out in Article 17 if they fail to fulfil the other two requirements. By comparison, the annual revenues of Facebook and YouTube are both well over 10 billion, more than a thousand times larger than those of companies which may hope to benefit from the lighter start-up regime.
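For illustration only, the start-up carve-out just described boils down to a simple conjunction of thresholds. The following minimal Python sketch is ours, not the Directive's: the function name and parameters are invented, and Article 17 (6) itself is more granular than this three-condition test.

```python
def lighter_regime_applies(years_available: float,
                           annual_turnover_eur: float,
                           avg_monthly_unique_visitors: int) -> bool:
    """Sketch of the start-up thresholds described above: less than three
    years of operation, annual turnover below EUR 10 million, and fewer
    than 5 million average monthly unique visitors."""
    return (years_available < 3
            and annual_turnover_eur < 10_000_000
            and avg_monthly_unique_visitors < 5_000_000)

# A four-year-old hobby forum with EUR 50,000 in turnover already falls
# outside the carve-out, despite revenues a tiny fraction of YouTube's.
assert not lighter_regime_applies(4, 50_000, 20_000)
```

The point of the sketch is that failing any single criterion, most obviously the three-year limit, pushes even a very small service into the full Article 17 regime.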
While the European Commission's recently published proposal for a Digital Markets Act recognizes that global gatekeepers with tens of millions of users and billions in revenues merit stricter regulation than small discussion forums, the DSM Directive largely lacks this nuance. Although Recital 62 explains that the definition of affected platforms should be limited to those “that play an important role on the online content market by competing with other online content services, such as online audio and video streaming services, for the same audiences”, this restriction is not reflected in the legal definition of online content-sharing service providers in Article 2 (6). Instead, the legal definition relies on the criterion of a “large amount of copyright-protected works or other subject-matter”, which is a very poor indicator of the actual number of infringements and which almost every platform that hosts some sort of user-generated content fulfils. Protected content, after all, is not limited to professionally produced films and music by large media companies, but also includes every snapshot uploaded by any user. Even dating portals like Tinder therefore risk falling under the legal definition.
A chain reaction of fundamental rights restrictions
The other reason why the existence of voluntary upload filters for copyright infringement on some major platforms tells us little about the true impact of Article 17 on the freedom to conduct a business is that those companies have been applying upload filters selectively, often covering only a particular type of protected content, such as music recordings, and offering blocking functionality only to a small handful of major rightsholders. A requirement for platforms to introduce or extend the use of upload filters would multiply the costs associated with their operation, because platforms cannot technologically limit the types of copyright-protected works that may be uploaded (images may include embedded text, videos may include sound or images, text may include literary works or software code, etc.).
Giving all rightsholders the power to unilaterally block user uploads without human oversight, as only a small number of rightsholders for particular types of works are able to do today, would also drastically increase the cost of moderating disputes. This likely consequence of Article 17 should concern not only those who consider the freedom to conduct a business worth protecting in its own right, but also those primarily preoccupied with freedom of expression and information. In the above-mentioned example, the Turkish public broadcaster TRT was only able to use wrongful copyright claims to suppress critical reporting on the Turkish government because TRT is a major rightsholder in its own right and has been given access to YouTube's proprietary filtering tools. Under Article 17, YouTube would likely be required to give access to these tools to anyone who claims to be a rightsholder, not just to those that meet the requirements of a “trusted flagger”. Given the extremely low thresholds for originality and the absence of a public copyright register, that is virtually anyone.
In addition, the current practice of platforms that voluntarily filter differs from what may be required under Article 17 with regard to the length of extracts from protected works that can be detected and automatically blocked. The accuracy of filtering technologies based on fingerprinting, which are among the most sophisticated tools available today (though they remain completely incapable of detecting legal uses under copyright exceptions), correlates with the length of the extract to be matched: the shorter the extract, the higher the risk of false identifications. Should Article 17 lead to an obligation on platforms to detect even very short extracts of protected works, which are not covered by voluntarily used upload filters today, the number of false identifications, and thus the blocking of legal content, would increase drastically.
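The underlying statistical intuition can be made visible with a toy simulation. The following Python sketch is purely illustrative and does not model any real fingerprinting system such as Content ID: it treats a work as a random sequence of hash values and flags an unrelated clip whenever some run of `threshold` consecutive values also occurs in the reference. All lengths and the hash-space size are invented assumptions; the point is only that the false-match rate falls steeply as the required match length grows, and rises correspondingly when very short matches must be acted upon.

```python
import random

def shingles(seq, t):
    """All contiguous runs of length t in a sequence of hash values."""
    return {tuple(seq[i:i + t]) for i in range(len(seq) - t + 1)}

def false_match_rate(threshold, trials=500, ref_len=2000, clip_len=500, space=256):
    """Fraction of unrelated clips wrongly flagged because some run of
    `threshold` consecutive hash values also appears in the reference."""
    rng = random.Random(42)
    reference = [rng.randrange(space) for _ in range(ref_len)]
    ref_runs = shingles(reference, threshold)
    hits = sum(
        1 for _ in range(trials)
        if shingles([rng.randrange(space) for _ in range(clip_len)], threshold) & ref_runs
    )
    return hits / trials

for t in range(1, 6):
    print(f"required match length {t}: false-match rate {false_match_rate(t):.2f}")
```

In this toy model, requiring matches of only one or two consecutive values flags virtually every unrelated clip, while longer required matches drive the false-match rate towards zero. Real systems are far more sophisticated, but the direction of the trade-off is the same.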
The likely outcome of an obligation to act on ever shorter matches is an explosion both of the costs associated with the complaint and redress mechanisms that platforms will be required to offer and of the instances in which legitimate expression is silenced. Combined with the obligation to prevent the removal of legal content, which is also a central provision of Article 17, this leaves platforms with little to no guidance on what they must do in order to escape liability. In practice, platforms will probably judge the risk of being sued by rightsholders for leaving infringements online as higher than the risk of being sued by users for over-blocking, a clear incentive to throw user rights under the bus.
Part 2 of this blog post will build upon our assessment of these specific additional burdens facing platform operators once Article 17 has been implemented. Drawing on the CJEU case law on the freedom to conduct a business, we will discuss whether these burdens constitute a restriction of the fundamental rights of platform operators and whether such restrictions are permissible, particularly in light of the principle of proportionality included in Article 17 (5).
CC BY 4.0
_____________________________
The authors seem to assume that OCSSPs under Art 17 are obliged to block content. That is not the case. Under Art 17 no OCSSP is obliged to block content. Blocking becomes an obligation only after an OCSSP decides against obtaining licenses and instead wishes to make use of the process provided in the article to avoid liability under copyright law. The authors' argument has not much to do with fundamental rights, and everything to do with the belief that OCSSPs should be free to use copyright works. While that may be a valid point of view to hold, it does not reflect what is good law.
Thank you for your comment. It is correct that OCSSPs are only obliged to block content when they have been unable to obtain a license. We do not explicitly mention this obligation, enshrined in Article 17 (4) (a) CDSMD, in the blog post, because it is not the subject of the action for annulment of parts of Article 17 before the CJEU. We do, however, address the issue in our full-length study, where we come to the conclusion that despite the existence of fundamental rights issues with Article 17 (4) (a), the Court is unlikely to address them in this case, as it must limit its assessment to the provisions that have been challenged, namely Article 17 (4) (b) and (c) in fine.
However, obtaining a license is not solely a decision left to the OCSSP, as you suggest. On the contrary, while OCSSPs are obliged to make best efforts to obtain a license, rightsholders are not obliged to offer a license to the OCSSP. Recital 61 CDSMD states that “rightholders should not be obliged to give an authorisation or to conclude licensing agreements”. So unless Member States introduce some form of mandatory collective management of the right of communication to the public established in Article 17, which no Member State has so far proposed, OCSSPs will be obliged to make best efforts to block content as soon as a single rightsholder refuses to offer the OCSSP a license and instead provides them with the information necessary for blocking. Some sectors, most notably the film and broadcasting industries, have already indicated that they are not interested in offering licenses to OCSSPs, because they wish to continue exploiting their catalogues on an exclusive basis. It is therefore clear that OCSSPs will not be able to evade the blocking obligation by making best efforts to obtain licenses, because the obligation to make best efforts to block applies in addition to said obligation. Only if an OCSSP has actually obtained a license is it exempt from having to implement Article 17 (4) (b) and (c); it is not exempt if it has tried and failed to obtain a license because the rightsholder refused to offer one.
I also disagree with your assessment that this is an argument about whether OCSSPs should be free to use copyright-protected works. Even OCSSPs that have no interest in users uploading copyright-infringing works have no means of fully preventing it, unless they also remove the ability for users to upload legitimate content, which would obviously constitute a severe restriction of their freedom to conduct a business. For example, if an OCSSP allows the upload of text that users have written themselves, there is no means of technologically preventing users from also uploading text that includes literary works or software code. A balancing of all affected fundamental rights must leave open the possibility of running a legitimate business that includes the uploading of user-generated content without having to obtain licenses from all the rightsholders in the world, of which there are hundreds of millions, merely because their works may be uploaded to the platform at some point in the future.
All of these points are discussed in detail in our study, which is available here: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3732223