Germany was the main battleground over last year’s adoption of the EU Directive on Copyright in the Digital Single Market (DSM Directive). After 200,000 people took to the streets against impending restrictions of their freedom of communication, the German government promised to avoid the use of upload filters in its national implementation. One of the governing parties, the Christian Democrats, even went as far as promising that there would be “no upload filters” as a consequence of Article 17 of the DSM Directive.
The recently published discussion draft for the implementation of Article 17 falls short of that promise: upload filters are part of the rationale of the German Ministry of Justice that penned the proposal. Nevertheless, the draft makes use of the considerable room for manoeuvre that Member States have in implementing the provision. It contains the most detailed proposals to date for at least somewhat limiting the negative impact of upload filters on freedom of expression and freedom of information, namely rules designed to prevent the inadvertent blocking of legal uploads, also known as over-blocking. Still, more fundamental concerns about the imposition of upload filters expressed by the CJEU in its Netlog and Scarlet judgments, notably their impact on privacy and the freedom to conduct a business, are not addressed by the German proposal. This two-part post will not recap the contents of Article 17, which have been extensively discussed elsewhere, but will instead take a close look at the German implementation proposal. Part 1 deals with the proposed rules on user rights and on pre-flagging as a means to counteract filtering obligations. Part 2 covers the proposal of a new, automatically enforceable, compensated exception designed to protect insignificant uses of third-party material, and takes a closer look at the proposal’s efforts to make Article 17 more predictable for platform operators.
The discussion draft (made available in English, indicating that the Justice Ministry is interested in engaging in the implementation debate in other Member States) proposes to implement Article 17 in an entirely new standalone law, the Copyright Service Provider Act, rather than incorporating the provisions into the existing German copyright act. According to the ministry, this choice is made partly to improve readability, but also because of the uniqueness of the legal regime introduced by Article 17, which it considers a sui generis extension of the right of communication to the public introduced by the 2001 InfoSoc Directive. This is not the only point on which the German approach differs markedly from that of other Member States, such as the Netherlands.
Taking user rights seriously
Article 17 was amended towards the end of the negotiations on the DSM Directive in an attempt to better protect users’ rights, most notably by including a provision that the application of the new rules “shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright”. So far, the governments of France and the Netherlands have ignored this essential users’ rights safeguard in their draft implementations. This omission is likely to violate the requirements of the DSM Directive, as the user rights safeguards enshrined in Article 17 were vital for striking the balance of interests that allowed the DSM Directive to be narrowly adopted in the European Parliament and the Council.
The German discussion draft, rather than treating the user rights safeguards merely as aspirational statements, tries to introduce a system that – while still being incapable of ensuring that all legal uses remain unaffected by upload filters – would at least drastically reduce the number of wrongful removals of content on copyright grounds. While serious concerns about the compatibility of Article 17 with the Charter remain, the Copyright Service Provider Act is the first implementation draft that may at least come close to fulfilling the user rights requirements imposed by Article 17.
Pre-flagging as a limit to automated blocking decisions
The intense opposition of the German public to upload filters has apparently created a high awareness at the Ministry of Justice of the different failure modes of content recognition tools when they are used to automatically enforce copyright law. The draft therefore focuses on limiting instances of over-blocking, most notably by requiring platforms to allow users, at the point of upload, to pre-flag third-party content included in their uploads as either legally authorized (including uses under copyright exceptions and of public domain material) or contractually authorized (covering licensed uses, including open licences such as Creative Commons) (§ 8).
As a general rule, such pre-flagged content must not be automatically blocked by upload filters, and platforms are not held liable for those uploads while the dispute resolution over the legality of such uses is pending (§ 16). However, the proposal also includes a problematic exception to this rule for instances of “obviously incorrect” pre-flagging by users. In those cases, the content must be blocked despite having been pre-flagged as legal (§ 12).
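To make the interplay between these rules easier to follow, here is a minimal sketch of the decision logic described above, assuming a simplified reading of §§ 8, 12 and 16. It is purely illustrative: the function and parameter names are hypothetical, and the draft law does not, of course, prescribe any particular technical implementation.

```python
# Illustrative sketch only: a simplified reading of how §§ 8, 12 and 16 of the
# draft Copyright Service Provider Act would interact at upload time.
# All names are hypothetical; the draft prescribes no implementation.

def decide_at_upload(matches_blocked_work: bool,
                     pre_flagged_as_legal: bool,
                     flag_obviously_incorrect: bool) -> str:
    if not matches_blocked_work:
        # No rightsholder has asked for this content to be blocked.
        return "publish"
    if pre_flagged_as_legal and not flag_obviously_incorrect:
        # § 16: pre-flagged content stays online and the platform is not
        # liable while the dispute over the legality of the use is pending.
        return "publish (dispute pending)"
    # § 12: block despite a pre-flag that is "obviously incorrect",
    # or block because the upload was not pre-flagged at all.
    return "block"
```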
The ministry justifies the pre-flagging approach by pointing to the importance of being able to rely on legal uses of third-party materials in public debates on current affairs, where retroactively complaining about the wrongful deletion of material and having it reinstated days after the fact would be insufficient to protect users’ rights. Indeed, the European Commission has previously underlined that the obligation to keep legal content online pursuant to Article 17(7) DSM Directive is not fulfilled simply by implementing the ex-post redress mechanism included in Article 17(9); the two provisions are to be implemented independently of one another. The German Justice Ministry also points to the impossibility of automatically detecting legal uses under most exceptions and limitations to copyright, which require an analysis of the context in which works are used.
The pre-flagging mechanism is complemented by a set of measures against abuse of the system (§ 19): by self-proclaimed rightsholders who try to have content blocked for which they do not hold the exclusive rights, by platforms that repeatedly block legal content despite the proposed safeguards, and by users who repeatedly flag infringing uploads as legally or contractually authorized. The recognition of false copyright claims by ostensible rightsholders is a particularly welcome addition, as experience with the US notice-and-takedown regime shows that such claims are a frequently used tool for suppressing all kinds of legitimate information from the Internet. How effective the proposed measures against abuse would be is open to debate, as it will be difficult for users to assert their rights in court and to demonstrate economic harm resulting from the wrongful blocking of their uploads. The proposal tries to empower users’ organizations to bring such claims against platforms, but not against self-proclaimed rightsholders. While sanctions against abuse of the pre-flagging system by users may be understandable, to prevent users from simply flagging all of their uploads as legal, the proposed exclusion of users from the pre-flagging system should be limited to cases of intentional abuse. Given the complexity of copyright rules, honest mistakes about the applicability of copyright exceptions are bound to happen.
The pitfalls of the pre-flagging approach
Of course, putting users in charge of asserting the legality of their uses of third-party materials as the main safeguard against over-blocking comes with its own set of problems. First of all, copyright law is complex, and ordinary users can hardly be expected to judge correctly whether their use falls under the German quotation exception, for example, which has been subject to complex interpretation by the courts. Secondly, the pre-flagging system would offer no protection for legal uses of material that is already online when the new law enters into force. Depending on how strictly courts interpret the obligation to make best efforts to block works that rightsholders have identified to the platform, platforms could become liable for continuing to provide access to materials uploaded before entry into force, and would likely block all content that matches the information provided by rightsholders, without any meaningful mechanism for checking whether the use may have been legal. Thirdly, particularly in the case of live streaming, users may not be able to flag the legal inclusion of third-party content in advance. A person live streaming from a protest, for example, may be unaware that somebody will start playing music in the background halfway through the live stream, a use that is legal under the German copyright exception for incidental inclusion.
The biggest weakness of the pre-flagging approach, however, is the caveat about “obviously incorrectly flagged” uploads (§ 12). This provision shows that the goal of the German legislator is not to prevent upload filters per se, but only to reduce the over-blocking they cause. An obligation to identify instances of “obviously incorrect” pre-flagging requires platforms to subject all user uploads to filtering technology, even those that have been pre-flagged. Given the cost of human review, platforms are likely to fully automate the identification of these “obviously incorrect” flags and block the affected uploads automatically, invariably blocking legal content in the process.
The ministry seems to encourage the automated detection of “obviously incorrect” pre-flagging by specifying that materials pre-flagged as legally authorized can categorically be considered “obviously” incorrectly flagged if they contain a 90% match to a work that the rightsholder has asked the platform to block. It is not difficult to imagine scenarios in which this blanket rule would come to the wrong conclusion, be it quotations of entire poems for legitimate purposes or live streams of religious services (protected by a dedicated copyright exception under German law), which have become much more frequent recently due to COVID-19. While the intention of the 90% rule is clearly to prevent users from uploading entire feature films and flagging them as legal, the way it is drafted is likely to lead to significant over-blocking of legal content.
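To see why the 90% rule invites over-blocking, consider a minimal sketch of how a platform might plausibly automate it. Only the threshold is taken from the discussion draft; the match_ratio value, the function name and the example cases are hypothetical stand-ins for a content recognition system, used for illustration only.

```python
# Illustrative sketch only: one plausible automation of the draft's 90% rule.
# The match ratio is a hypothetical output of a content recognition system
# reporting how much of a reference work appears in an upload (0.0 to 1.0).

MATCH_THRESHOLD = 0.9  # per the discussion draft: a 90% match to a blocked work

def pre_flag_is_obviously_incorrect(match_ratio: float) -> bool:
    return match_ratio >= MATCH_THRESHOLD

# A quotation of an entire short poem, or a live-streamed church service in
# which a protected hymn is performed in full, can reach a 100% match and
# would be blocked automatically even though the pre-flag may well be correct.
assert pre_flag_is_obviously_incorrect(1.0)      # full poem quoted: blocked anyway
assert not pre_flag_is_obviously_incorrect(0.3)  # short excerpt: pre-flag respected
```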
CC BY 4.0