Last week, the Legal Affairs (JURI) Committee of the European Parliament voted in favour of Rapporteur MEP Axel Voss’s proposal on Article 13 of the draft Directive on Copyright in the Digital Single Market. The saga surrounding this infamous text is however far from over. A group of MEPs is currently challenging the JURI version of the proposal through the so-called Rule 69c procedure. This means that on 5 July at 12:00, the plenary of the Parliament will vote on whether the JURI report provides an acceptable basis to start “trilogue” negotiations with the Council. This will be a yes/no vote. If the EP votes against the JURI mandate, every MEP will be able to file an amendment to the JURI report in the next plenary session in September. This would allow for Article 13 (as well as the equally contested Article 11) to be reworded.

Should Article 13 be amended? As I explain below, the answer to this question was already given by the Court of Justice of the European Union in 2012.

 

[Image: an endangered species (no, not the frog)]

What websites does Article 13 target?

Article 13 in the Voss version of the draft directive focuses on so-called ‘online content sharing service providers’. Article 2 of the proposal defines these as internet service providers one of whose ‘main purposes’ is to give the public access to copyright-protected works and which ‘optimise’ those works. ‘Optimisation’ is further defined in Recital 37a as including the promotion, display, tagging, curating and sequencing of works. Until now, many have argued that such acts, when performed through automated technology, did not give a platform an ‘active role’ of the kind that would expose it to liability. The Voss draft, however, closes the door on this interpretation by unequivocally stating that such optimisation amounts to an active role ‘irrespective of the means used therefore’.

This exposes most modern User-Generated Content (UGC) websites to liability. A safety valve might be found in the concept of ‘main purpose’ – unfortunately however, this is not defined in the text. Is the ‘main purpose’ of UGC sites to give access to copyright-protected works? I would argue that they intend to promote the exchange of content created by users – but copyright-infringing content is often also posted on such services. Politically, it is clear that the reforms are targeted at such platforms. It is hard to predict how courts would react if cases involving UGC sites were to come before them. The legal uncertainty this would create would incentivise such websites to comply with the proposal’s requirements anyway, so as to avoid lengthy and expensive legal battles.

The proposal does include a list of carve-outs for online encyclopaedias, educational and scientific repositories, providers of private cloud services, open source software developing platforms and online marketplaces. These exceptions are helpful, but they would still leave video-sharing websites, such as Dailymotion, online forums, such as reddit, and social networking sites, such as Twitter, out in the cold.

What obligations does the law impose on providers?

Article 13 seeks to make the targeted platforms directly liable for copyright infringements performed by their users online. It offers platforms two basic options: they must either enter into licensing agreements with copyright holders or take measures to prevent copyright works from being posted on their websites. As Prof. Martin Senftleben explains here, the first of these options is practically impossible: the platforms would have to acquire licences from every copyright holder currently protected in the EU. Given the wide variety of works some platforms host, the rights clearance task would be enormous. The fragmented collecting society landscape in the EU raises further roadblocks.

This would leave providers with one option: to ensure that copyright-protected works and other subject matter are not posted on their platforms. Given the vast volumes of content that even smaller websites handle, the only practicable way of achieving this would be through filtering. The Voss draft appears aware of this. Although the wording tries to suggest that other options would be available, the only one named in the draft is the implementation of ‘effective technologies’ – i.e. filtering. It is therefore clear that Article 13 aims at imposing an obligation on platforms to filter.

What is wrong with that?

That the EU legislator would seek to oblige platforms to filter is perhaps somewhat surprising, given that this option has already been rejected by the CJEU in its case law as incompatible with: a) existing EU law and b) the Charter of Fundamental Rights of the European Union.

a) Incompatibility with the Prohibition on General Monitoring Obligations of Art. 15 E-Commerce Directive

Article 15 E-Commerce Directive (ECD) states that

‘Member States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14, to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.’

This is the so-called ‘general monitoring prohibition’ that has formed a cornerstone of the EU’s internet law for almost 20 years. The term ‘general monitoring’ is contrasted by Recital 47 ECD to monitoring ‘in a specific case’, which is permissible.

The prohibition has been further interpreted by the CJEU. The seminal cases in this area are SABAM v Scarlet and SABAM v Netlog (see an earlier post on SABAM v Scarlet here). In these, the CJEU examined whether an injunction imposed on, respectively, an internet access provider and a hosting service provider, requiring them to install filtering systems, would be permissible under EU law. It is worth noting that Netlog concerned precisely the type of provider considered by Article 13. The Court concluded that such an injunction would oblige the intermediaries to actively monitor almost all the data relating to all of their users in order to prevent any future infringement of intellectual-property rights. It would therefore involve general monitoring, bringing it into conflict with Article 15 ECD.

The same conclusion would hold true of Article 13’s ‘effective technologies’. Indeed, even if a platform were to eschew filtering in favour of, for example, manual moderation, it would still not be able to check whether each piece of content on its website is infringing or not without, well, checking each piece of content. General monitoring is a necessary property of filtering.

It is worth noting the counter-argument that, in the case of Article 13, the monitoring would not be truly ‘general’, as it would not be targeted at preventing any future infringement in an abstract sense. Proponents of Article 13 suggest that it only envisages an obligation for providers to remove works for which they have previously received ‘relevant information’ from the right-holders. As a result, they conclude, platforms will not be obliged to filter at large, but only in order to prevent those copyright infringements brought to their attention by right-holders. This interpretation is inspired by decisions of the German BGH, according to which, if a monitoring activity is targeted at preventing the infringement of a specific work, then it is itself also specific. Yet even this extreme position of the BGH at least requires a court order before filtering can be imposed. That basic procedural guarantee is absent from the proposed directive, which would instead write a duty to monitor directly into statutory law.

Regardless, this logic does not add up. Even with a limitation to pre-identified works, the monitoring will necessarily affect the totality of the content uploaded to the service by any user. If everybody’s content is being monitored, it is immaterial whether this is done to prevent any copyright infringement or only particular copyright infringements: the monitoring is still general. The CJEU made this clear in McFadden, where it defined ‘general monitoring’ as the monitoring of ‘all of the information transmitted’ by a provider, without any qualification based on the specificity of the protected works. This is also the approach favoured by the French Cour de cassation, which in 2012 rejected the ‘notice-and-stay-down’ model as involving general monitoring.

Instead, ‘monitoring in specific cases’ should be limited to what the natural meaning of the term suggests: the monitoring of the activity of specific persons online. It must be recalled that Article 15 ECD applies horizontally to a variety of different kinds of online unlawfulness. While monitoring in specific cases may be less useful for copyright holders, it is often imperative in the investigation of cases involving the posting of images of child sexual abuse or terrorist content. This has been recently acknowledged by the Commission’s Communication on tackling illegal content online, which indicates that, in certain cases, platforms should abstain from removing illegal content, so as to enable investigations by national law enforcement authorities and the prosecution of perpetrators.

b) Incompatibility with the Charter

Even if no problems arose with regard to Article 15 ECD, however, filtering obligations would still be incompatible with the EU’s most basic legal principles. This is because, according to the case law of the CJEU, in the enforcement of copyright a ‘fair balance’ must be struck between the protection of copyright on the one hand and, on the other hand, the protection of the fundamental rights of other persons. In the SABAM cases, it was found that filtering violates this requirement.

According to the Court, this is because filtering obligations would require the intermediary to install ‘a complicated, costly, permanent computer system at its own expense’. This would bring the obligation out of balance with Article 16 of the Charter and the intermediary’s freedom to conduct its business. The filtering would also have involved ‘the identification, systematic analysis and processing of information connected with the profiles created on the social network by its users’. This information is protected personal data because, in principle, it allows those users to be identified. As a result, the filtering would also have been incompatible with Article 8 of the Charter on the protection of personal data. Finally, the system ‘might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications’. This would make the filtering incompatible with a fair balance between the protection of copyright and Article 11 of the Charter on end-users’ freedom of expression and information.

The lack of a ‘fair balance’ would also plague the obligations to adopt effective content recognition technologies envisaged by Article 13 of the Proposal.

It should be noted that the cooperation with right-holders which the Proposal envisages could lessen some of the burden on the intermediaries’ freedom to conduct their business. As right-holders would take on the responsibility of indicating which copyrights should be protected, the intermediary would not have to filter for any infringement. Potentially, therefore, a greater balance – perhaps even a sufficient balance – between the protection of copyright and the platforms’ freedom to conduct their business may be injected into the system.

Nevertheless, the problems with end-users’ freedom of expression and data protection rights persist.

For one thing, in order to identify infringements among the totality of the content on the hosting service provider’s network, it would still be necessary to ‘identify, systematically analyse and process’ that totality of content. This would include end-users’ personal data. In this regard, it is worth considering the difference in nature between the right to privacy and the right to the protection of personal data. Where users upload non-infringing content onto the platform of a hosting service – e.g. a home video onto YouTube – it will almost inevitably contain personal data. That content would nevertheless have to be scanned and compared against the ‘fingerprint’ of the copyrighted work in the filter’s database to ensure that it is not infringing.

Finally, although in recent years the technology has improved in this area, it is still not capable of correctly identifying whether a given use of a work benefits from the protection of an exception or limitation to copyright. The same problem, therefore, that was noted by the CJEU in the SABAM cases with regard to freedom of expression continues to exist. Technology is simply not yet at the stage where it is capable of identifying a parody or deciding whether one work criticises or reviews another.

Moreover, it should be considered that exceptions and limitations to copyright are not currently properly harmonised at the EU level. Article 5 of the Information Society Directive instead only provides a closed list of options from among which the Member States may choose. As a result, even if the technology could develop the ability to correctly discern an exception or limitation in sensitive context-dependent situations, different filtering systems would have to be devised for each individual Member State. In the SABAM cases, the CJEU emphasised that whether a transmission is lawful also depends on the application of statutory exceptions to copyright which vary from one Member State to another. In addition, in some Member States certain works fall within the public domain or may be posted online free of charge by the authors concerned.

The severity of the problem has also been pointed out by David Kaye, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, in a recent communication to the European Commission.

What about safeguards?

Of course, the Voss report does attempt to introduce some safeguards for end-users’ rights. So, while Article 13(1) states that the measures adopted by platforms must lead to ‘the non-availability’ of infringing works, it also requires that non-infringing works remain available. Article 13(1b) further states that,

‘Member States shall ensure that the implementation of such measures shall be proportionate and strike a balance between the fundamental rights of users and rightholders and shall in accordance with Article 15 of Directive 2000/31/EC, where applicable, not impose a general obligation on online content sharing service providers to monitor the information which they transmit or store.’

Two big problems arise in this regard. First, as noted above, the text has already stated that the targeted platforms play an ‘active role’. This means that they are deprived of the protection of the defences of Articles 12, 13 and 14 ECD. Yet, Article 15 ECD states that it is only applicable to platforms ‘when providing the services covered by Articles 12, 13 and 14’. Whether this means that providers need only provide mere conduit, caching or hosting services to fall under Article 15, or must also abide by the conditions of Articles 12-14, is not entirely clear. The natural meaning of the sentence suggests the first – in which case (as explained above) Article 13 is incompatible with the ECD. Alternatively, Article 15 never applies to the targeted providers, making the reference to it as a safeguard misleading.

More importantly, these safeguards are entirely incompatible with the basic purpose of Article 13: to require platforms to proactively remove infringing content. To achieve this aim, platforms will have to examine each piece of content posted by end users. This will amount to general monitoring. Realistically, providers will turn to filters to accomplish this general monitoring. Filters by definition can neither avoid systematically processing the personal data of users nor reliably recognise defences against copyright infringement. They are therefore incapable of allowing for a ‘fair balance’ between the fundamental rights of users and right-holders. This makes the entire provision a contradiction in terms.

In any case, perhaps the most important guarantee for fundamental rights foreseen in the text is introduced by Article 13(2). Although this does nothing to preserve end-users’ Article 8 rights, it does attempt to protect exceptions and limitations to copyright. It requires that platforms put in place effective and expeditious complaints and redress mechanisms. These are intended to allow users to object in cases where content that is protected by an exception or limitation is incorrectly taken down. The trouble here is that, as research on existing notice-and-take-down systems has demonstrated, complaints and redress mechanisms are rarely used by end-users. Indeed, in the fast-paced world of the internet, the use of e.g. a meme in a discussion forum is rendered useless minutes after it has been posted: the conversation will have moved on, complaining takes time, and restoring the content will do nothing to rectify the effect on freedom of expression.

Furthermore, the complaints and redress mechanisms envisaged by the Voss report suffer from fundamental failings. The text only requires that rightholders reasonably justify their decisions *after* a complaint has been lodged. This indicates that content may be removed whenever a filter finds a match *without any justification at all*. Even more troubling, Recital 39c also suggests that it should fall to the rightholders themselves to decide whether the take-down was justified or not. As the text states,

‘[the mechanism] should prescribe minimum standards for complaints to ensure that rightholders are given sufficient information to assess and respond to complaints. Rightholders or a representative should reply to any complaints received within a reasonable amount of time.’

This means that copyright holders, i.e. one of the parties in the dispute, will be put in charge of deciding whether their own rights have been infringed or not!

How courts will approach these oxymoronic statements is unclear. It is possible that some courts might conclude that, if no filtering is available that can respect users’ rights, then no filtering needs to be implemented. Other courts may be tempted to enforce the main objective of the law anyway. And, again, incentives are important: well before they find themselves in court, providers will have a choice between either taking the risk that they may be held liable for infringing content or adopting filtering software, regardless of whether this means that lawful content will be taken down. It is likely that they will err on the side of caution to the detriment of users’ rights.

Although it is of course hard to predict the future, none of this looks very good, especially for smaller businesses and internet users. The larger, more established and richer players have already invested in developing filtering technologies and have incorporated these into their business models (see e.g. YouTube’s Content ID). Competitors will either have to develop their own content recognition systems (which is very difficult) or buy existing systems from others. It is hard to tell how expensive this might be, as filtering developers do not tend to make their price lists public, preferring instead to offer bespoke deals to platforms. What is clear is that unsophisticated filtering systems, which result in higher rates of false negatives and false positives, are much cheaper than the good ones.

The result will be bad for both smaller platforms and end-users. Smaller platforms will either be pushed out of the market or forced to pay large sums to filter providers. This benefits big, established companies by forcing their competition to close up shop or buy their products. End-users’ rights will also be affected, as everything they post online will be checked against a database of copyright-protected works to ensure there is no match. This will have obvious chilling effects on freedom of expression online. It will also mean that posters will have to fight for each protected use of a copyright work: first the filter will block the post, and only after using the complaints and redress mechanism and obtaining explicit permission from the right-holder will the content be allowed online – despite being entirely legal!

For all the above reasons, leading copyright academics across the EU have raised serious concerns about Article 13, arguing for its deletion or at least substantial re-wording. Thankfully, the rather dismal future the provision spells out can still be avoided – but only if the European Parliament decides to side with the fundamental rights of EU citizens next week.

