
With its landmark decision in Poland/Parliament and Council of 26 April 2022 (case C-401/19), the Grand Chamber of the Court of Justice of the European Union (CJEU) has clarified that the filtering obligations arising from Article 17(4)(b) and (c) of the Directive on Copyright in the Digital Single Market 2019/790 (“DSM Directive” or “DSMD”) are not unconstitutional per se. The decision has already been discussed by a number of commentators. For instance, see the posts by João Pedro Quintais, by Eleonora Rosati, and by Christophe Geiger and Natasha Mangal. Adding a nuance, the following analysis focuses on the fact that the Court qualified the complaint and redress mechanisms mandated by Article 17(9) DSMD as additional safeguards against content overblocking (para. 93). Hence, these ex post measures – allowing the correction of wrong filtering decisions after the harm has occurred – cannot be considered sufficient on their own. First and foremost, it is necessary to have ex ante mechanisms in place that allow permissible content uploads, such as quotations, parodies and pastiches, to survive algorithmic content scrutiny. When implementing Article 17 DSMD, EU Member States must ensure that such lawful content appears directly on the platform. But let’s explore this aspect of the decision step by step.

The Court held that the specific liability regime following from Article 17(4)(b) and (c) DSMD was not only appropriate but also appeared necessary to meet the need to protect intellectual property rights falling under the right to property recognized in Article 17(2) of the Charter of Fundamental Rights of the European Union (para. 83). It was satisfied that the filtering obligations imposed on online content-sharing service providers (“OCSSPs”, see the definition in Article 2(6) DSMD) did not “disproportionately restrict the right to freedom of expression and information of users of those services” (para. 84). The Court thus confirmed the legitimacy of content filtering systems in the light of the principle of proportionality.

For a content filtering system to satisfy the proportionality requirements formulated by the Court, however, it must meet several preconditions. First, the CJEU underlined that a filtering system could only be deemed permissible if it did not suppress lawful user uploads, such as uploads falling within the scope of a copyright limitation, uploads of public domain material, or uploads consisting of the uploader’s own creations:

“[a] filtering system which might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications, would be incompatible with the right to freedom of expression and information, guaranteed in Article 11 of the Charter, and would not respect the fair balance between that right and the right to intellectual property” (para. 86).

The CJEU recalled in this respect that copyright limitations, such as the quotation right and the exemption for parodies, caricatures and pastiches in Article 5(3)(d) and (k) InfoSoc Directive (see also Painer para. 132, Deckmyn para. 26, Spiegel Online para. 54 and Funke Medien para. 70), conferred “rights on the users of works or of other protected subject matter” (para. 87) and sought to ensure “a fair balance between the fundamental rights of those users and of rightholders” (para. 87). Article 17(7) DSMD left no doubt that the adoption of these copyright limitations at the national level was mandatory, and that the resulting user rights had to survive the introduction of automated content filtering systems. Each EU Member State was bound to ensure that users could upload and make available content generated by themselves for the specific purposes of quotation, criticism, review, caricature, parody or pastiche (para. 87). The Court also emphasized that Article 17(8) DSMD prohibited any general monitoring obligation. Therefore, OCSSPs could not be required to prevent the uploading and making available of content which, in order to be found unlawful, would require an independent content assessment in the light of information made available by right holders and relevant copyright limitations (para. 90) (cf. Glawischnig-Piesczek paras. 41-46). In particular, it could not be excluded that, in some cases, unauthorized content could only be blocked upon notification by right holders (para. 91).
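
The Court’s distinction lends itself to a simple decision rule: automated blocking is permissible only where infringement is apparent from the rightsholder’s reference information alone, without an independent legal assessment. The following Python sketch illustrates one possible reading of that rule; the MatchResult structure and the numeric thresholds are purely illustrative assumptions, not values found in the Directive or the ruling.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Decision(Enum):
    PUBLISH = "publish"   # no relevant match with rightsholder reference files
    BLOCK = "block"       # manifestly infringing: automated blocking permissible
    DEFER = "defer"       # ambiguous: requires independent (human) assessment


@dataclass
class MatchResult:
    similarity: float        # 0.0-1.0 score from a hypothetical fingerprinting tool
    matched_fraction: float  # share of the upload made up of matched material


def filter_decision(match: Optional[MatchResult]) -> Decision:
    """Block only uploads whose unlawfulness is apparent without an
    independent assessment (cf. para. 90); defer everything else rather
    than risk blocking lawful quotations or parodies (cf. para. 86)."""
    if match is None:
        return Decision.PUBLISH
    # Illustrative thresholds, not values from the Directive or the ruling:
    # a near-complete, near-identical match is treated as manifestly
    # infringing, while partial matches may be quotation, parody or
    # pastiche and must not be blocked automatically.
    if match.similarity >= 0.95 and match.matched_fraction >= 0.9:
        return Decision.BLOCK
    return Decision.DEFER
```

One defensible design keeps deferred uploads online pending human review, mirroring the presumption in favour of flagged uses in the German implementation discussed below; other readings of the ruling are possible.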

Referring also to the complaint and redress mechanism set forth in Article 17(9) DSMD, the Court highlighted that, under those provisions, users had to be able to submit a complaint where they considered that uploaded content had wrongly been blocked or removed. Any complaint had to be processed without undue delay and be subject to human review (para. 94). Importantly, the Court characterized the procedural safeguards following from Article 17(9) DSMD as additional safeguards:

“the first and second subparagraphs of Article 17(9) of Directive 2019/790 introduce several procedural safeguards, which are additional to those provided for in Article 17(7) and (8) of that directive, and which protect the right to freedom of expression and information of users of online content-sharing services in cases where, notwithstanding the safeguards laid down in those latter provisions, the providers of those services nonetheless erroneously or unjustifiably block lawful content” (para. 93).

The decision in Poland/Parliament and Council thus shows that the obligation to introduce content filtering systems arising from Article 17(4)(b) and (c) DSMD is only compatible with Article 11 of the Charter on the condition that OCSSPs offer two safeguards cumulatively:

  • in the first place, it follows from Article 17(7) and (8) DSMD that the OCSSP must have ex ante safeguards in place – in the form of flagging options allowing users to ensure that permissible quotations, parodies etc. are not filtered out and, instead, become directly available on the platform. For an example of national legislation providing for this option, see §§ 14(1), 11(1), no. 1 and 3, 9(1) and 5(1) of the German Copyright Service Provider Act (Urheberrechts-Diensteanbieter-Gesetz). As long as content filtering systems fail to distinguish reliably between piracy and parody, it is not possible to hide behind technology and simply assume that available algorithmic tools will safeguard freedom of expression;
  • in addition, it follows from Article 17(9) DSMD that, for cases in which the ex ante mechanism fails to ensure content availability, the OCSSP must have ex post safeguards in place – consisting of a well-functioning complaint and redress mechanism that allows users to bring the malfunctioning of the system to the attention of the platform and to obtain the correction of unjustified content blocking. The interplay of both safeguards is sketched in the code example after this list.
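
Read together, the two bullet points describe a pipeline in which an ex ante user flag can override an automated filter hit, while an ex post complaint can reverse a wrongful block. The minimal Python sketch below models that interplay; all names (Upload, ex_ante_safeguard, ex_post_safeguard) and the flag vocabulary are hypothetical, loosely inspired by the flagging mechanism of the German Urheberrechts-Diensteanbieter-Gesetz cited above.

```python
from dataclasses import dataclass, field
from typing import List, Optional

PERMITTED_USE_FLAGS = {"quotation", "criticism", "review",
                       "caricature", "parody", "pastiche"}  # cf. Art. 17(7) DSMD


@dataclass
class Upload:
    content_id: str
    user_flag: Optional[str] = None  # user's ex ante claim of a permitted use
    blocked: bool = False
    pending_human_review: bool = False
    complaints: List[str] = field(default_factory=list)


def ex_ante_safeguard(upload: Upload, filter_hit: bool) -> None:
    """Art. 17(7)/(8): a permitted-use flag keeps the upload online
    despite a filter match; only unflagged matches are blocked."""
    if not filter_hit:
        return
    if upload.user_flag in PERMITTED_USE_FLAGS:
        upload.pending_human_review = True  # stays online, flag verified later
    else:
        upload.blocked = True               # automated blocking proceeds


def ex_post_safeguard(upload: Upload, complaint: str,
                      human_upholds_block: bool) -> None:
    """Art. 17(9): a blocked user may complain; the complaint is decided
    by a human without undue delay, and a wrongful block is reversed."""
    upload.complaints.append(complaint)
    if not human_upholds_block:
        upload.blocked = False


# Example: a flagged parody survives the filter ex ante ...
parody = Upload("vid-001", user_flag="parody")
ex_ante_safeguard(parody, filter_hit=True)
assert not parody.blocked

# ... while an unflagged upload is blocked first and restored ex post.
other = Upload("vid-002")
ex_ante_safeguard(other, filter_hit=True)
ex_post_safeguard(other, "this is a permissible quotation",
                  human_upholds_block=False)
assert not other.blocked
```

The structural point of the ruling is visible in the example: without the ex ante branch, the parody in the first scenario would remain offline until a complaint had been decided – precisely the ex post-only approach that the Court treats as a merely additional safeguard.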

On balance, the OCSSP regulation following from the decision in Poland/Parliament and Council can be described as a double-edged sword. On the one hand, the CJEU refused to dismantle the doubtful edifice of licensing and filtering obligations in Article 17(1), (2) and (4) DSMD. On the other hand, the CJEU insisted on the introduction of appropriate safeguards against disproportionate, excessive content blocking.

As a result, the status quo with regard to OCSSP platforms is as follows: to police the borders of the licensing deals that OCSSPs manage to obtain, and to prevent illegal uploads in the absence of a license, it is legitimate to deploy algorithmic content filtering tools, even though these tools are likely to curtail the freedom of users to participate actively in the creation of online content. As a counterbalance, however, the CJEU requires that, where uploads are not covered by licensing efforts, all types of content uploads must survive automated content filtering when they constitute lawful use, including use that qualifies as a permissible quotation, criticism, review, caricature, parody or pastiche (para. 87). The need to create breathing space for lawful content uploads follows from primary EU law, namely the guarantee of freedom of expression and information in Article 11 of the Charter.

In practice, this means that national legislation must ensure that OCSSPs implement effective safeguards against excessive content blocking. More specifically, these safeguards must include flagging options that enable users to ensure content availability ex ante, and complaint and redress mechanisms that allow users to correct unjustified content blocking and ensure content availability ex post. National copyright law and practice that does not ensure the existence of these double safeguards – ex ante flagging as well as ex post complaint systems – fails to meet the requirements following from the ruling in Poland/Parliament and Council.

Luckily, Article 17 DSMD is part of an EU directive. Member States enjoy the freedom to integrate the new rules into their national copyright systems in the light of fundamental rights, such as Article 11 of the Charter. In contrast to provisions of an EU regulation, the final legal text that is applicable at the national level is not set in stone. With the guidance given in the Poland ruling, Member States can use this room for manoeuvre in a way that leads to effective ex ante and ex post safeguards. In any case, the safeguarding of freedom of expression and information must not be left to the “black box” of algorithmic content analysis that draws an obscure line between prohibited piracy and permissible parody.

