The judgement of the European Court of Justice in case C-401/19 has hardly laid to rest the debates over the use of upload filters in automated copyright enforcement. On the contrary, by declaring Article 17 of the Directive on Copyright in the Digital Single Market compatible with the Charter, while requiring Member States to ensure that filtering obligations on platforms remain limited to the use of filtering systems that leave legal uses unaffected, the Court has rekindled the academic debate over the correct approach to national implementation of the provision.
To support the dialogue on a fundamental rights-preserving application of Article 17, Gesellschaft für Freiheitsrechte and the COMMUNIA Association for the Public Domain will host the “Filtered Futures: Fundamental Rights Constraints of Upload Filters after the CJEU Ruling on Article 17 of the Copyright Directive” conference on September 19, 2022 at the Robert Bosch Stiftung in Berlin. Academics from all disciplines and interested stakeholders are invited to submit abstracts by July 10th.
Protecting Legal Uploads from Overblocking
The Article 17 judgement has raised as many new questions as it has provided answers. It is clear that the obligation to prevent legal content from being blocked is an obligation of result, whereas the obligation on platforms to block is merely an obligation of best effort.
When in doubt, user uploads must stay online. There is broad agreement among commentators that Member States must implement Article 17 using a combination of ex ante and ex post measures against overblocking. What is less clear is how, in the absence of such specific safeguards, the numerous verbatim implementations of Article 17 are to be interpreted by the courts and by platforms. Does the ruling open the door for an EU-wide uniform technical implementation of safeguards by online platforms? If so, what would such an implementation have to look like, and who is responsible for defining these parameters? What does this mean for the implementation guidance published by the European Commission in June 2021?
On substance, the CJEU does not require that the filtering systems used never make mistakes (otherwise ex post measures would not be needed), but it does require that they not be biased against lawful uses, and that the filtering obligation not force platforms to perform an independent assessment of legality. Some commentators highlight the importance of the specific ex ante safeguards already included in some national implementations, such as minimum quantitative thresholds for the application of automated filtering, or the possibility for users to pre-flag legal uses and thus exempt them from filtering in the German and Austrian implementations.
Others argue that platforms will have to use machine learning to detect lawful uses and reduce the error rates of filtering systems, raising the question of whose responsibility it would be to ensure the reliability of such systems. The immense regulatory challenge resulting from the obligation on Member States to ensure that automated filtering systems safeguard freedom of expression has led others to call for a dedicated EU copyright institution.
But is a quantitative approach to reducing error rates even suitable to achieve the obligation of result in Article 17? A filter can systematically ignore user rights such as quotation or parody, but still have very low error rates overall, provided that those exceptions are rarely used on a particular platform. In those cases, the restriction on the freedom of expression of the affected users would still be severe. Answering the question of how legal content can be protected from overblocking in practice will be crucial guidance for those Member States that are still in the process of implementing Article 17, and for the courts assessing the already existing implementations.
The Impact of the Article 17 Judgement beyond Copyright
Particular features of Article 17 implementations that the conference will discuss include the rights and obligations of rightsholders and users resulting from the judgement, for example as regards the “relevant and necessary information” to be provided by rightsholders, or the consequences of elevating copyright exceptions to enforceable user rights.
Aside from the pressing issue of the judgement’s impact on national implementations of Article 17, we also invite submissions on its consequences for the relationship of Article 17 to other norms, such as the Digital Services Act, and on the implications of the judgement for the use of automated content moderation in fields outside of copyright law, as well as the Court’s freedom of expression case-law more generally.
Submit your abstracts!
We are looking forward to submissions from a broad range of perspectives that push the discussion further and that can contribute to shaping the practical application of Article 17 in a fundamental rights-preserving way. The deadline for abstracts is the 10th of July (submissions via this form). Applicants will be notified by July 22nd, 2022. A limited budget to support travel and accommodation expenses for presenters is available.