This morning the CJEU delivered its much-awaited judgment in Case C-401/19 – Poland v Parliament and Council. In simple terms, the main issue before the Court was the validity of the preventive measures required by Article 17(4)(b) and (c), in fine, of the CDSM Directive in light of the right to freedom of expression and information recognized in Article 11 of the Charter; in the alternative, should these provisions not be severable from Article 17 as a whole, the Republic of Poland asked the Court to annul Article 17 in its entirety.
The lead-up to this judgment has been covered extensively on the blog, where readers can find analysis of the hearing and the AG Opinion (see here and here). In the coming days, we will publish in-depth analysis of the judgment. In the meantime, this post provides a refresher on the contents of Article 17, followed by a brief overview of the main takeaways to kick off the discussion.
The regime of Article 17 and the issue at stake
Article 17 applies to online content sharing service providers (OCSSPs), as defined in Article 2(6) CDSM Directive, with further guidance in recitals 62 and 63.
In simple terms, Article 17 states that OCSSPs carry out acts of communication to the public when they give access to works/subject matter uploaded by their users. As a result, these providers become directly liable for their users’ uploads. They are also expressly excluded in paragraph (3) from the hosting safe harbour for copyright-relevant acts, previously available to many of them under Article 14(1) e-Commerce Directive.
The provision then introduces a complex set of rules to regulate OCSSPs, including a liability exemption mechanism in paragraph (4), and a number of what can be referred to as mitigation measures and safeguards.
The liability exemption mechanism in Article 17(4) CDSM Directive encompasses a series of cumulative “best efforts” obligations to: (a) obtain an authorisation; (b) ensure the unavailability of specific protected content; and (c) put in place notice-and-take-down and notice-and-stay-down mechanisms.
Assuming OCSSPs are able to demonstrate best efforts to obtain an authorisation, they must then comply with the additional requirements of Article 17(4)(b) and (c) to benefit from the liability exemption for the user-uploaded content they host. As noted, these obligations relate to preventive/proactive and reactive measures. As regards preventive or proactive measures, OCSSPs must first receive from rights holders the “relevant and necessary information”, upon which they must make “best efforts to ensure the unavailability of specific works” (4(b)) and ensure that works already taken down do not resurface on the platform (4(c) in fine).
But Article 17 also includes mitigation measures and safeguards against the potential negative effects of the preventive measures. First, a requirement of proportionality and the identification of relevant factors for the assessment of preventive measures (paragraph 5). Second, a special regime for small and new OCSSPs (paragraph 6). Third, a set of mandatory exceptions akin to user rights or freedoms, designed as obligations of result and expressly based on fundamental rights (paragraph 7). Fourth, a clarification that Article 17 does not entail general monitoring, a prohibition similar to that set out in Article 15 e-Commerce Directive (paragraph 8). Fifth, a set of procedural safeguards, including an in-platform complaint and redress mechanism and rules on out-of-court redress mechanisms (paragraph 9).
Finally, Article 17(10) tasks the European Commission (EC) with organising stakeholder dialogues to ensure uniform application of the obligation of cooperation between OCSSPs and rights holders and to establish best practices regarding the appropriate industry standards of professional diligence. These stakeholder dialogues have resulted in the publication of Commission Guidance on the interpretation of Article 17, which has been subject to detailed analysis on this blog (here and here).
As noted previously, a key feature of the legal design of Article 17 is that paragraph (7) translates into an obligation of result: Member States must ensure that these exceptions are respected despite the preventive measures in Article 17(4), which are mere “best efforts” obligations. The different nature of these obligations, underscored by the fundamental rights basis of paragraph (7), indicates a normative hierarchy between the higher-level obligation in paragraph (7) and the lower-level obligations in paragraph (4). It was up to the Court to assess whether and how the regime of Article 17, namely the mitigation measures and safeguards highlighted above, strikes the necessary balance.
The Judgment
As a preliminary remark, the Court for the most part followed the lead of the AG in his Opinion. Like the AG, it considered that Article 17 can only be assessed in its entirety, meaning that points (b) and (c), in fine, of Article 17(4) should not be assessed separately [para 21].
As a point of departure, the Court confirmed that Article 17 requires OCSSPs to carry out, de facto, a prior review of uploaded content in cases where rights holders have provided the “relevant and necessary information” required by paragraph (4)(b) [para 53].
Importantly, the Court recognizes that, depending on the scale of the task (i.e. “on the number of files uploaded and the type of protected subject matter in question, and within the limits set out in Article 17(5)”), review of uploads by OCSSPs requires automatic recognition and filtering tools. As the Court noted, “neither the defendant institutions nor the interveners were able, at the hearing before the Court, to designate possible alternatives to such tools.”
Therefore, in certain cases – and certainly for the largest platforms (e.g. YouTube and Meta) – automated content filtering is required to comply with the best efforts obligations in Article 17(4) CDSM Directive. In other words, at least where it matters most, Article 17 requires what critics have labelled “upload filters”.
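For readers less familiar with such tools, the following is a deliberately simplified sketch of fingerprint-based upload filtering. It is purely illustrative and not drawn from the judgment or the Directive: the names (`Reference`, `review_upload`) are hypothetical, and the exact-hash matching is a toy stand-in for the perceptual fingerprinting used by real systems such as YouTube’s Content ID.

```python
import hashlib
from dataclasses import dataclass


@dataclass
class Reference:
    """Rightsholder-supplied reference material: the 'relevant and
    necessary information' of Article 17(4)(b), reduced here to a
    work identifier plus a content fingerprint."""
    work_id: str
    fingerprint: str


def make_fingerprint(data: bytes) -> str:
    """Toy fingerprint: an exact cryptographic hash. Real tools use
    perceptual fingerprints that survive re-encoding and excerpting."""
    return hashlib.sha256(data).hexdigest()


def review_upload(upload: bytes, references: list[Reference]) -> str:
    """Prior review of a user upload against the reference database.
    A real deployment would route contested matches to the complaint
    and redress mechanism of Article 17(9) rather than decide
    lawfulness autonomously."""
    fp = make_fingerprint(upload)
    for ref in references:
        if fp == ref.fingerprint:
            return f"blocked: matches {ref.work_id}"
    return "published: no match found"


# Usage sketch
refs = [Reference("W-001", make_fingerprint(b"protected work bytes"))]
print(review_upload(b"protected work bytes", refs))     # blocked
print(review_upload(b"user's own parody video", refs))  # published
```

The simplification is telling: an exact hash only catches verbatim copies, whereas real fingerprinting also matches transformed content (excerpts, re-encodes), and it is precisely there that lawful uses such as quotation or parody risk being caught, a tension the Court addresses below.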
For the Court, such prior review and filtering is liable to restrict an important means of disseminating online content. Thus, the specific liability regime in Article 17, especially its paragraph (4), entails a limitation on the exercise of the right to freedom of expression and information of users of those content-sharing services, as guaranteed by Article 11 of the Charter [paras 55, 58, 82] (and Article 10 ECHR).
But the Court considers that such a limitation is justified in light of the test in Article 52(1) of the Charter, which requires that any limitation on the exercise of the rights and freedoms recognised by the Charter be provided for by law and respect the essence of those rights and freedoms [see paras 63 et seq., referring to the principle of proportionality].
In essence, the Court considers the limitation on freedom of expression imposed by Article 17(4) justified in relation to the legitimate objective pursued by Article 17, namely that of ensuring a high level of protection for rights holders under Article 17(2) of the Charter [para 69].
In a passage that to some extent summarizes a key part of the Court’s proportionality analysis, it is stated that
the liability mechanism referred to in Article 17(4)… is not only appropriate but also appears necessary to meet the need to protect intellectual property rights. In particular, although the alternative mechanism proposed by the Republic of Poland, under which only the obligations laid down in point (a) and the beginning of point (c) of Article 17(4) would be imposed on [OCSSPs], would indeed constitute a less restrictive measure with regard to exercising the right to freedom of expression and information, that alternative mechanism would, however, not be as effective in terms of protecting intellectual property rights as the mechanism adopted by the EU legislature [para 84].
The Court then advances six arguments as to why the limitation on freedom of expression imposed by Article 17(4) is justified and does not disproportionately restrict the right to freedom of expression and information of users of those services [paras 84 et seq.].
First, following the AG, it considers that the EU legislature laid down clear and precise limits for preventive measures by excluding, in particular, measures which filter and block lawful content upon upload. In this regard, a filtering system that cannot distinguish between lawful and unlawful content would not be consistent with the requirements of Article 17 and the fair balance between competing rights and interests [paras 85-86]. (This topic and its implications will be the subject of a separate blog post.)
Second, Article 17 provides that users will be authorised, by national law, to upload content generated by themselves for purposes such as parody or pastiche (paragraph 7), and that OCSSPs must inform them that they can use works under exceptions or limitations (paragraph 9) [paras 86-88]. In this context, it is noteworthy that the Court explicitly calls these exceptions “user rights” [para 88].
Third, the liability regime requires the provision by rights holders of “relevant and necessary information” (paragraph (4)(b)) or a “sufficiently substantiated notification” (paragraph (4)(c), in fine), a precondition which the Court believes “protects the exercise of the right to freedom of expression and information of users who lawfully use those services” [para 89].
Fourth, Article 17(8) clarifies that its application must not lead to any general monitoring obligation. This is “an additional safeguard for ensuring that the right to freedom of expression and information of users of [OCSSPs] is observed”, meaning that such providers “cannot be required to prevent the uploading and making available to the public of content which, in order to be found unlawful, would require an independent assessment of the content by them in the light of the information provided by the rightholders and of any exceptions and limitations to copyright”.
As such, OCSSPs must not be forced to make “an independent assessment of the content” in order to determine its lawfulness, e.g. by contrasting the information provided by rights holders with applicable exceptions [paras 90-92, applying by analogy, inter alia, C‑18/18 Glawischnig-Piesczek, paras 41–46].
Fifth, the various procedural safeguards introduced by Article 17(9) are adequate to address situations of over-blocking [paras 93-95].
Sixth, pursuant to Article 17(10), the Commission carried out stakeholder dialogues and produced Guidance to supplement the system of safeguards provided for in Article 17(7) to (9), which inter alia (i) took special account of the need to balance fundamental rights and of the use of exceptions and limitations, and (ii) provided users’ organisations with access to adequate information from OCSSPs on the functioning of their practices with regard to Article 17(4) of that directive [para 96].
In light of these considerations, the Court concludes that the design of Article 17 includes appropriate safeguards to ensure, in accordance with Article 52(1) Charter, the right to freedom of expression and information of the users of those services (Article 11 Charter), and a fair balance between that right of users and the right to intellectual property (Article 17(2) of the Charter) [para 98].
Still, the Court cautions Member States that when transposing Article 17 they must implement it in such a way as to strike a fair balance between the various fundamental rights. In addition, “the authorities and courts of the Member States must not only interpret their national law in a manner consistent with that provision but also make sure that they do not act on the basis of an interpretation of the provision which would be in conflict with those fundamental rights or with the other general principles of EU law, such as the principle of proportionality” [para 99].
Conclusion
In sum, from this first quick analysis, it can be said that although Article 17 survives, its validity is subject to the strict application of safeguards that ensure the right to freedom of expression and information of the users of OCSSPs. In particular, as regards permissible filtering measures, the Court mostly follows the AG Opinion, albeit in much less detail. And, as is well known, the devil is in the detail. For instance, the Court was careful to omit any concrete guidance on permissible filtering as regards categories such as “manifestly infringing” or “earmarked” content. However, reading the judgment side by side with the AG Opinion, it is likely that the Court endorses the AG’s rejection of filtering of “earmarked” content that is not also “manifestly infringing”.
Stay tuned for more detailed analysis in this blog and, if you want to join a discussion on the topic, be sure to register for this Communia Salon next Thursday.
This research is part of the following projects: the reCreating Europe project, which has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 870626; the author’s VENI Project “Responsible Algorithms: How to Safeguard Freedom of Expression Online” funded by the Dutch Research Council (grant number: VI.Veni.201R.036).