In Part 1 of this blog post, we explained the importance of the CJEU judgment in joined cases C-682/18 (YouTube) and C-683/18 (Cyando) for the application of copyright law, even after the introduction of a new copyright liability regime for certain online platforms through Art. 17 DSM Directive. In this Part 2, we turn to the aspects of the judgment that go beyond copyright law, namely the application of the hosting safe harbour of Art. 14(1) of the E-Commerce Directive (ECD). We pay particular attention to the different standards of knowledge set by the Court and to the role that the fundamental right to freedom of expression plays in the Court’s reasoning. We then draw conclusions for the upcoming Digital Services Act (DSA), which is set to reform the liability regime of the ECD.
Application of Hosting Safe Harbour
A hotly debated question regarding the EU intermediary liability framework has been whether the hosting safe harbour in Art. 14(1) ECD applies to primary liability. The referring court in the present cases does not seem to think so, as it only asked the CJEU to consider whether platforms like YouTube and Uploaded benefit from the safe harbour if they do not perform an act of communication to the public, i.e. when primary liability for copyright infringement is excluded. The Court nevertheless clarified that the safe harbour does in principle apply to primary liability, even if, in a copyright context, the practical result is the same either way.
According to the CJEU, in order to benefit from the hosting safe harbour, a platform must not play an active role that gives it knowledge or control over the contents of users’ communications. However, as suggested by the Advocate General in his opinion, the Court concludes that a platform operator that performs an act of communication to the public by intervening in a user’s copyright infringement in full knowledge of the consequences must also have knowledge or control over the user uploads and hence play an active role that precludes the application of the hosting safe harbour. Outside of the realm of copyright law, it is still an important clarification that the safe harbour does, in principle, apply to primary liability.
The Knowledge Standards Set by the Court
The CJEU does not clearly address whether the criteria for a platform to perform an act of communication to the public regarding user uploads are the same as those for playing an active role, but its reasoning points in that direction. The Court dismisses the notion that platforms play an active role merely by taking voluntary measures to counter copyright infringement. It would indeed be counter-productive if “Good Samaritan” actions by platform operators could lead to their exclusion from the hosting safe harbour. As explained above, voluntary measures against copyright infringements by users are considered a factor in excluding primary liability of platforms; it would be illogical if the same actions led to secondary liability. This indicates that the Court indeed wishes to fully align the criteria for communication to the public and for an active role.
On the other hand, however, the Court clarifies that an active role requires knowledge or control over the content of user uploads (para. 109), whereas mere awareness, in a general sense, that users are infringing copyright on the platform could be considered a sufficient level of knowledge for a platform to communicate to the public if combined with other factors that demonstrate that the platform operator was deliberately intervening in those infringements. This apparent contradiction can be read in two ways. One option is that the CJEU has created a legal fiction that platforms that communicate to the public are always considered active, even if they fail to meet the standard of knowledge otherwise necessary for an active role. Alternatively, the Court may have wished to express that the element of control over user uploads that is required for an active role is identical to the deliberate nature of the intervention necessary for communication to the public.
In any case, it is clear that general awareness of infringements alone is never sufficient to cause a platform to meet the standard of primary or secondary liability. At a minimum, it needs to be combined with other factors that indicate intent to facilitate those infringements. Consequently, it is also clear that Art. 17 DSM Directive is not a mere clarification of the existing liability framework, as at the very least it lacks the subjective elements of knowledge of or control over the contents of user uploads, as well as a deliberate intervention in the illegal communication thereof.
Notice and Action: Lessons for the Digital Services Act (DSA)
Even those platforms that meet the requirements of the safe harbour are only exempted from liability if, upon obtaining actual knowledge of illegal activities, they act expeditiously to remove or disable access to the information. Thankfully, the judgment is much clearer when it comes to the standard of actual knowledge. Following the detailed analysis of the Advocate General in his opinion, the Court clarifies that actual knowledge must refer to specific illegal acts. In a copyright context, this means that it is not sufficient for a rightholder to notify a platform that a work in which they hold exclusive rights has been uploaded; the notice must also establish that this particular use of the work is indeed infringing. According to the Advocate General, whose reasoning the Court affirms, that means “where the application of an exception is not automatically precluded, the notification must contain reasonable explanations why it should be” (para. 190).
According to the Court, a notice that includes insufficient or inaccurate information does not automatically create actual knowledge. The Court explicitly grounds this interpretation of Art. 14(1) ECD not just in the wording of the provision, but also in the Charter of Fundamental Rights. It notes that the provision’s application must reflect the balance that the directive seeks to strike between all competing interests, including the freedom of expression. This reference to the Charter is highly relevant, as it indicates that the draft DSA presented by the European Commission may need to be revised to protect the freedom of expression of platform users. According to Art. 14(3) of the draft DSA, notices that include the location of the information in question, the identity of the notifier, an explanation of why the notifier considers the information to be illegal and a statement of good faith automatically give rise to actual knowledge.
By contrast, the Court holds that platforms, when acting expeditiously to remove illegal content, “must do so with due regard to the principle of freedom of expression”. While a notice is certainly a factor indicating actual knowledge, it remains for a court to assess on a case-by-case basis whether the platform “was actually aware of facts or circumstances on the basis of which a diligent economic operator should have identified the illegality” before holding it liable for the illegal act. This interpretation of the notice and action system in light of the fundamental right to freedom of expression seems to preclude the automatic actual knowledge envisioned in the DSA.
Outlook
Whether platforms will be liable under Art. 3(1) InfoSoc Directive in the future depends on how the national courts apply the criteria set out by the CJEU. In this respect, the decision provides little guidance, so we may well see further preliminary references seeking clarity. We can also expect further discussion of the impact of the present judgment on the upcoming DSA. While it is not yet possible to estimate the exact consequences for the DSA, our preliminary analysis suggests that the European legislator may have to revise the concept of automatic actual knowledge in light of the Court’s emphasis on the importance of the freedom of expression and information on the internet.
The judgment also touches on a number of other issues that could not be addressed in this blog post but that deserve further elaboration, inter alia the Court’s interpretation of Art. 15(1) ECD and its clarifications on injunctions in a copyright context. These, together with the issues we have touched upon in this blog post, should provide plenty of food for discussion.