Article 17 is here to stay, but most national implementations fail to meet the fundamental rights standards developed by the Court in its judgment. Tuesday’s long-awaited ruling in Case C-401/19 finally brings some clarity to the almost three-year-long discussion about the implementation of Article 17 of the Copyright in the Digital Single Market Directive (DSM Directive).
The judgment comes almost a year and a half after the Court heard the parties to the case in November 2020. During this hearing three different lines of argument emerged:
The Polish government — which had initiated the case — argued that the de-facto filtering obligations in Article 17(4)(b) and the second part of point (c) violate the right to freedom of expression and information recognized in Article 11 of the EU Charter of Fundamental Rights and should therefore be annulled.
Obligation to protect legal uploads trumps obligation to block infringements
The defenders of the provision were divided into two camps. The representatives of the EU institutions led by the Commission argued that the disputed parts of Article 17 do not violate the right to freedom of expression because there are sufficient internal safeguards present in other parts of the Article, notably in Article 17(7), which contains an obligation not to prevent the availability of lawful uploads.
On the other hand the representatives of France, Spain and Portugal argued that Article 17 did not violate the Charter because any undue limitations of the freedom of expression would be temporary in nature and would be justified by the overarching regulatory objective of effectively protecting the rights of authors and other rightholders.
In this week’s judgment the Court rejected the argument brought forward by Poland and concluded that the design of Article 17 includes sufficient safeguards protecting the users’ right to freedom of expression and information. Just like the Advocate General in his opinion, the Court arrives at this conclusion largely along the lines of argument presented by the EU institutions. In its judgment [paras 78-79] the CJEU confirms the interpretation brought forward by the institutions that the contested filtering provisions in Article 17(4) are mere “obligations of best efforts” while the obligation not to prevent the availability of lawful uploads is an “obligation of result”. This hierarchy of obligations has been the central element of the Commission’s defence of the provision’s fundamental rights compliance and has been disputed by the three Member States led by France and by representatives of rightholders. The Court’s confirmation of the Commission’s interpretation forms the basis for much of the rest of the judgment and will be an important consideration for Member States when implementing the directive.
Court limits upload filters to obvious infringements
While the Court has rejected Poland’s overall claim, it does concur with Poland on one important point — that the disputed provisions of Article 17 will necessitate the use of automated upload filters [para 55] and that their use constitutes a form of prior restraint [para 68]. Drawing extensively on the case-law of the European Court of Human Rights, the Court underlines that such prior restraint is not categorically prohibited, but that it is a particularly severe restriction of the right to freedom of expression and information, one that requires strong safeguards laid down in law. Article 17 only meets this bar if the safeguards enshrined in its paragraphs (5) to (10) are interpreted in a fundamental-rights-preserving manner.
Firstly, the Advocate General argued that the application of upload filters must be limited to cases where the infringing nature of a use has been previously determined by a court or is manifest, i.e. evident without a need to consider the context of the use [AG opinion, para 198]. Otherwise, the obligation placed on platforms by Article 17 would constitute a prohibited general monitoring obligation, as clarified by the Court in its Glawischnig-Piesczek ruling.
The Court appears to concur, albeit without employing the term “manifest”, by pointing out that Article 17 may not oblige platforms to block uploads in cases where a finding of illegality “would require an independent assessment of the content by them in the light of the information provided by the rightholders and of any exceptions and limitations to copyright” [para.90].
At the same time, the Court holds the introduction of filtering obligations that lead to the blocking of lawful content to be incompatible with the right to freedom of expression [para. 86] – and indeed with Article 17 (7), which contains an obligation of result to prevent the blocking of lawful content, as explained above.
Member States have not done their homework
Read in conjunction, the requirement that Article 17 may not impose the use of filters that block legal content and the requirement that platforms may not be put in the position of having to independently assess the legality of user uploads mean that platforms cannot be expected to define the technical parameters under which upload filters may be used, either. The question of what constitutes a manifest infringement in the eyes of a filtering algorithm must be answered by the legislator, not by platforms or private-sector vendors of filtering technologies.
Without further qualification of the filtering obligation by the Member States, this means that automated blocking of user uploads is not permissible for uses that put a protected work or parts of it in a new context, as such uses may well be covered by an exception or limitation to copyright, which the Court has elevated to the status of user rights [para. 87]. Additionally, automated blocking may only take place where it is ruled out that works or other subject matter that are in the public domain, or for which platform users may have obtained a licence, will be blocked [para. 86]. The Court therefore places high requirements on the “necessary and relevant information” to be provided by rightholders, which is not further specified in Article 17 itself.
The Court outright rejects the claim made by the Polish, Spanish and French governments in the hearing that the complaint and redress mechanism included in Article 17(9) is by itself a sufficient safeguard. Rather, the Court characterises this ex-post mechanism as a procedural backstop for cases in which the ex-ante safeguards fail and legal content inadvertently gets blocked [para 93]. It points instead to the Advocate General’s argument that “[service providers and Member State authorities] must take into account, ex ante, respect for users’ rights” (AG opinion, para. 193; endorsed by the Court in para. 85). Clearly, Member States need to define the safeguards against overblocking if platforms are to be spared from making any independent assessments of legality on their own.
Do any national implementations of Article 17 meet the Court’s standards?
It is therefore no surprise that the Court points out that although Article 17 is upheld, this is without prejudice to any national implementations thereof [para. 71]. This bodes ill for the numerous verbatim implementations of Article 17 by Member States that have tried to kick the can down the road by simply copy-pasting Article 17 into their national copyright laws, leaving the difficult question of how it should be applied in practice to the Court of Justice.
It goes without saying that national implementations that diverge from the text of the Directive may be in violation of the Charter, but the statement by the Court could indicate that even the approach of simply re-stating the provisions of Article 17 in national law, employed by a large number of Member States, is insufficient, as it fails to include specific ex-ante safeguards that limit the application of upload filters to situations where the danger of over-blocking is minimal. This potential implication of the judgment is supported by the Court’s statement that “Member States must, when transposing Article 17… into their national law, take care to act on the basis of an interpretation of that provision which allows a fair balance to be struck between the various fundamental rights protected by the Charter” [para. 99].
While some of those Member States (notably the Netherlands) have kept open the possibility of including such elements via secondary legislation, most will need to revisit their transpositions unless they want to expose themselves to legal challenges. In the meantime, courts in those Member States are required to interpret national law in a manner that complies with the ruling.
When developing ex-ante safeguards to be included in national laws, legislators, including those of the fourteen Member States that have not yet transposed Article 17, should refer to the parts of the Advocate General’s opinion that expand on the nature of possible ex-ante safeguards – as well as to those elements of the Commission’s implementation guidance that have not been explicitly rejected by the Advocate General as incompatible with the right to freedom of expression. As a result of this week’s ruling, the Commission should revisit the controversial “earmarking” mechanism, which had been rejected by the Advocate General and clearly does not comply with the Court’s instruction that implementations must exclude “measures which filter and block lawful content when uploading” [para. 85].
As a consequence of the Court’s judgment, the Spanish and Italian transpositions, which contain elements that directly contradict the Court’s interpretation (Art. 102-decies (3) of the Italian implementation decree and Artículo 73 (10) of the Spanish Royal Decree) will need to be changed by the respective legislators. Both provisions require that blocked uploads that are subsequently disputed by the uploader remain unavailable until the resolution of the dispute. This requirement does not meet the standards developed by the Court. In addition, both Member States should introduce ex-ante safeguards that limit the use of automated content filters to manifestly infringing uploads.
Redemption of the “German Sonderweg”
The only two national transpositions so far that could in principle meet the Court’s standards are the German and Austrian laws, which both include ex-ante safeguards against overblocking in the form of quantitative minimum thresholds for the use of upload filters and allow users, under certain circumstances, to flag uploads of longer extracts of copyright-protected works as covered by an exception or limitation. Those ex-ante safeguards are backed up by procedural ex-post safeguards that go beyond the complaint and redress mechanism. These are more pronounced in the German implementation, which includes a right for researchers to access data about content moderation decisions by platforms and a collective redress option by organisations representing users’ rights, whereas the Austrian implementation merely empowers an administrative authority to sanction structural overblocking by platforms based in Austria.
However, upon closer examination, the Austrian implementation is at odds with the CJEU ruling in at least two respects. Firstly, Austria limits the application of the exception for caricature, parody and pastiche to uses made on online platforms covered by Article 17. This leads to the counter-intuitive result, one that strengthens the dominant position of large online companies, that a parody published on a creator’s personal website would violate copyright while the same parody posted on Facebook would be legal. It also disregards the special importance that the EU legislator has attached to this copyright exception for the exercise of freedom of expression. As the Court rightly points out, Article 17 makes this exception, as well as the exception for quotation, criticism and review, mandatory for all Member States [para. 87].
Secondly, the Austrian implementation includes the earmarking provision recommended by the European Commission in its guidance, which endorses the blocking of user uploads that are not manifestly infringing in scenarios where rightholders claim that the potential economic damage resulting from infringements is particularly high. While the German implementation includes a similar provision which may also fail to meet the Court’s standards, it has a higher likelihood of prevailing, as it is strictly limited to unpublished video content or works whose initial communication to the public is still ongoing – both scenarios during which user uploads are unlikely to be covered by a copyright exception. Austria, on the other hand, simply decides that the claimed economic risk of a use outweighs the user rights to make legal use of certain types of “high-value” content, a proposition that the Advocate General rightfully rejected as being incompatible with freedom of expression.
It appears therefore that thus far the German transposition of Article 17 is the only one that has a chance of meeting the Court’s requirements for the protection of freedom of expression. Specific elements of it still appear grossly insufficient, such as the very low quantitative threshold established for the filtering of texts, which is all but guaranteed to produce false positives, given that even the full name of the DSM Directive is longer than the threshold of 160 characters. The overall approach chosen by Germany has, however, been vindicated by the judgment, which is all the more remarkable given that throughout the national legislative process it was decried by representatives of the entertainment industry as a “Sonderweg” – a deviation from the requirements of EU law.
Further research will be necessary to test whether the German transposition will prevent over-blocking in practice, as platforms are slowly starting to implement its requirements. If so, it could become a model for a uniform implementation of Article 17 throughout the Union.