Part I of this post discussed the current position of host providers and the changes that will be brought about by Article 17. Part II addresses the major problems with Article 17 and how it should be implemented to minimize them.
The host provider privilege as a safeguard for a diverse online culture
It is simply a fact that the internet is rich in content that is either illegal or has an unclear legal status under copyright law. Very often, the reason that such content is not removed is that authors or rights holders do not object to its publication. There can be many reasons for this, but a widespread reality is that many copyright infringements do not harm the rights owners but rather benefit them. Examples include fan content, remixes, tributes with small excerpts or karaoke videos. In some cases, these uses may be covered by exceptions or limitations; in many others they will not.
Even where uses are illegal, minor violations are usually tolerated, such as user videos in which protected material (e.g. a painting on the wall) can be seen, or podcasts in which someone recites a poem. Because such casual, minor copyright infringements generally cause no damage, the effort required to negotiate and conclude license agreements for these kinds of use would be completely disproportionate.
But there is more to it: even substantial subsequent uses are often useful rather than harmful to the rights holder. Remixes or mash-ups, sound collages or memes that spread widely because of their quality or sheer popularity, and garner many social-media “likes,” often exert a significant advertising effect on behalf of the original rights owners. A vibrant fan culture, for example, increases the awareness and popularity of music, films or video games immensely. Even if rights are technically infringed, such cultural expressions cause no disadvantage; on balance, they benefit authors and publishers.
Accordingly, this kind of online content is valued, tolerated and often even promoted. If extensive rights had to be cleared in every case, most of it would not exist, at least not on the internet.
The current liability regime for platform and host providers safeguards the resulting cultural diversity of the internet. It ensures that copyright-infringing content is not deleted or blocked as long as the rights holder does not object to it. Where rights owners tolerate such uses, the content remains available even if it is technically illegal. This serves the common good. Everybody benefits: users, authors and rights holders, and the general public, which has a genuine interest in a diverse online culture.
The effect of “private judges” on the availability of tolerated illegal content
The implementation of Article 17 could fundamentally change this situation if national legislators do not pay close attention to these circumstances. Platform providers are neither users nor rights holders. They cannot judge, nor are they entitled to decide, whether illegal content is desirable, tolerated or in the public interest. If they are threatened with cease-and-desist proceedings or lawsuits for any copyright infringements committed by their myriad users, they must do everything in their power to minimize their liability. They can hardly sit back and “hope” that infringements will be tolerated by the copyright holder.
The responsibility they have under Article 17 to prevent copyright infringements in the first place inevitably results in a much tighter and stricter control than could ever be exercised by all authors or producers combined.
This situation demonstrates the danger of the general tendency in platform and internet regulation to draw intermediaries into legal disputes between other actors. The approach may seem obvious at first glance: due to their technical sovereignty over their systems, providers are indeed best placed to eliminate conflicts and infringements. Because they are best placed to enforce regulation, so the argument seems to go, they should make the decisions themselves. And they should be liable for any mistakes.
Drawing this uninvolved third party into the conflict undermines two fundamental legal principles. First, it collides with the principle that “where there is no plaintiff, there is no judge.” This (German) idiom refers to the basic idea that legal violations are generally only prosecuted if the injured party takes action against them. This is especially true in civil law.
Because of their own responsibility and the resulting liability exposure, however, platform and host providers are now forced to intervene as third parties in the conflict of interest between rights holders and users. They must judge whether user content may be posted, and they must make that decision irrespective of whether the rights holder would even want to pursue the potentially illegal use. In short, in such cases there would be a judge without a plaintiff.
This is problematic because, among other things, the decisions made by service providers will often not be based on the same considerations that rights holders would have. After all, the interests of platform providers and rights holders are not the same.
Host and platform providers are not courts!
This circumstance points to the second fundamental problem with involving third parties in legal conflicts: resolving such conflicts is normally left to the courts. However, online service providers are not courts. They are neither democratically legitimized nor neutral, nor are they well suited for this role. Unlike the courts, they neither serve the public interest nor do they have a constitutional mandate.
It is obvious that very few disputes surrounding the application of Article 17 will ever be decided by a court. GEMA (a collective management organisation for music in Germany) or Sony Music may go to court against YouTube. But the final decision on the filtering or blocking of individual content – i.e. millions and millions of individual cases in total – will usually be made by the platform provider. The fact that a platform will in most cases leave this decision to an algorithm should make the explosive nature of this issue clear.
How national legislators can cure major flaws
In implementing Article 17, national legislators still have considerable influence over the final outcome. Through sensible and carefully designed transposition at the national level, they can mitigate its excessive and collateral effects. The drafters of the DSM Directive appear to have been at least aware that Article 17 entails very considerable risks, the scope of which nobody has yet been able to assess. This explains why back doors have been provided whose use may significantly reduce the negative impact of this new regulatory approach.
First and foremost, it would be useful and important to keep the scope of Article 17 as narrow as possible when transposing it. There are many indications in the recitals of the Directive that only very large host and platform providers should be affected. They imply that Article 17 should only apply to platforms that seriously compete with large licensed streaming services (such as Spotify or Netflix). Startups are partially excluded up to a certain age or size. The same applies to non-commercial sites and certain forms of services such as online marketplaces or scientific repositories.
In short, Article 17 is clearly designed as a special rule, as an exception. When it is implemented, the scope must be limited to ensure that the exception does not become the general rule.
Restrict licensing obligations based on feasibility
It is also important to ensure that the licensing obligations imposed on service providers are not overwhelming. Under Article 17, para 4, platforms are exempt from primary liability if they have made all reasonable efforts to obtain a license. A provider that has made this effort only has to block or remove content that has been reported by the rights holder.
In other words, once the licensing requirements have been met, liability rules apply that are broadly in line with the current liability regime. How these requirements are defined is therefore of fundamental importance to the scope of Article 17. Hopeless and inappropriate licensing efforts, which would amount to clearing thousands of individual rights one by one, should not be required. The obligations should also be graduated according to the capacity of service providers to meet them, and they should take into account the existence of central licensing bodies. Furthermore, the emergence of more, and more effective, central licensing bodies should be promoted.
Bottom line
Article 17 of the DSM Directive was designed with tunnel vision and without foresight. For the “problem” it primarily targets, it is already obsolete: YouTube and the music industry reached agreements without any Article 17. The vast majority of other situations affected by Article 17 are not comparable to this supposed standard scenario. In most cases, Article 17 is manageable neither for authors and rights holders nor for providers. If this fact is neglected again in the transposition process, the culture of the internet will suffer considerable damage.
Although it may seem grotesque, the primary goal in implementing Article 17 must be to keep its scope as narrow as possible. This is the only way to prevent massive collateral damage to cultural diversity online. Otherwise, everyone loses: creative people whose content is blocked, authors and rights holders who benefit from illegal but tolerated uses, and of course the users, who will constantly see the message: “This video is not available in your country.”