Forthcoming in the November 2018 issue of Communications of the ACM, a journal for computing professionals, is a column entitled “Legally Speaking: The EU’s Controversial Digital Single Market Directive” by Professor Pamela Samuelson of Berkeley Law School. The editors of Communications of the ACM have given permission for this column to be pre-published on the Kluwer Copyright Blog.
The stated goals of the EU’s proposed Digital Single Market (DSM) Directive are laudable: Who could object to modernizing the EU’s digital copyright rules, facilitating cross-border uses of in-copyright materials, promoting growth of the internal market of the EU, and clarifying and harmonizing copyright rules for digital networked environments?
The devil, as always, is in the details. The most controversial DSM proposal is its Article 13, which would require online content sharing services to use “effective and proportionate” measures to ensure that user uploads to their sites are non-infringing. Failure to achieve this objective would make those services directly liable for any infringements. This seemingly requires the services to employ monitoring and filtering technologies, which would fundamentally transform the rules of the road under which these firms have long operated.
This column explains the rationales for this new measure, specific terms of concern, and why critics have argued for changes to make the rules more balanced.
Article 13’s Changes to Online Service Liability Rules
For roughly the past two decades, the European Union’s E-Commerce Directive, like the U.S. Digital Millennium Copyright Act, has provided Internet service providers (ISPs) with “safe harbors” from copyright liability for infringing uses of their services about which the ISPs had neither knowledge nor control.
Under these rules, ISPs must take down infringing materials after copyright owners notify them of the existence and location of those materials. But they do not have to monitor for infringements or use filtering technologies to prevent infringing materials from being uploaded and stored on their sites.
Because online infringements have greatly proliferated, copyright industry groups have strongly urged policymakers in the EU (as well as the U.S.) to impose stronger obligations on ISPs to thwart infringements. Their goal has been the adoption of legal rules requiring ISPs to use monitoring technologies to detect in-copyright materials and filtering technologies to block infringing uploads.
In proposing the DSM Directive, the European Commission has responded to these calls by deciding that certain ISPs should take on greater responsibility to help prevent infringements. Article 13 is aimed at those ISPs that enable online content sharing (think YouTube).
While not directly requiring the use of monitoring or filtering technologies, Article 13 can reasonably be interpreted as intending to achieve this result.
Which Online Services Are Affected?
The DSM Directive says that Article 13 is intended to target only those online content sharing services that play an “important role” in the online content market by competing with other services, such as online audio or video streaming services, for the same customers.
If the “main purpose” (or “one of the main purposes”) of the service is to provide access to “large amounts” of copyrighted content uploaded by users and it organizes and promotes those uploads for profit-making purposes, that service will no longer be protected by the E-Commerce safe harbor. It will instead be subjected to the new liability rules.
Concerns about the overbreadth of Article 13 led the Commission to narrow the definition of the online content sharing services affected by the rules. It now specifically excludes online encyclopedias (think Wikipedia), repositories of scientific or educational materials uploaded by their authors, open source software repositories, cloud services, cyberlockers, and marketplaces engaged in online retail sales of digital copies.
Article 13’s New Liability Rules
The most significant regulation in Article 13 is its subsection (4):
Member States shall provide that an online content sharing service provider shall not be liable for acts of communication to the public or making available to the public within the meaning of this Article when:
(a) it demonstrates that it has made best efforts to prevent the availability of specific works or other subject matter by implementing effective and proportionate measures … to prevent the availability on its services of the specific works or other subject matter identified by rightholders and for which the rightholders have provided the service with relevant and necessary information for the application of these measures; and
(b) upon notification by rightholders of works or other subject matter, it has acted expeditiously to remove or disable access to these works or other subject matter and it demonstrates that it has made its best efforts to prevent their future availability through the measures referred to in point (a).
Terms in this provision such as “best efforts,” “effective and proportionate” measures, and “relevant and necessary information” are vague and open to varying interpretations, but they anticipate the use of technologies to demonstrate those “best efforts.”
Copyright industry groups can be expected to assert that it is necessary to use monitoring and filtering technologies to satisfy the requirements of Article 13(4). They will also point to an alternative way that online services can avoid liability: by licensing uploaded copyrighted content from their respective rights holders.
Affected online services will have an uphill battle to fend off efforts to interpret the ambiguous terms as imposing monitoring and filtering obligations. It is, of course, impossible for these services to obtain licenses for every copyrighted work that their users might upload to their sites. But the big media firms can use this new rule to extract more compensation from platforms.
Concerns About Article 13’s Liability Rules
Critics have raised two major concerns about this proposal. First, it will likely further entrench the market power of the leading platforms that can afford to develop filtering technologies such as YouTube’s ContentID, and deter new entry into the online content sharing market. Second, it will undermine user privacy and free speech interests, leading to the blocking of many parodies, remixes, works of fan fiction, and other creative reuses of copyrighted works that would, if examined by a neutral observer, be deemed non-infringing.
When the proposal was pending before the European Council in late May, several members, including representatives from Finland, Germany, and the Netherlands, opposed it and offered some compromise language, so it does not have consensus support. Since then, opponents have mounted a public relations campaign to urge EU residents to contact their Parliamentary representatives telling them to vote no in order to “save the Internet.”
Among the many critics of Article 13 has been David Kaye, the United Nations Special Rapporteur for Freedom of Expression. He wrote a nine-page letter explaining why Article 13 is inconsistent with the EU’s commitments under international human rights instruments.
In addition, Tim Berners-Lee, Vint Cerf, and 89 other Internet pioneers (plus me) signed an open letter urging the EU Parliament to drop Article 13:
By requiring Internet platforms to perform automatic filtering on all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.
More than 145 civil society organizations also came out against Article 13, urging European voters to contact members of Parliament to oppose it.
The protests about Article 13 were successful enough to induce a majority of the European Parliament to vote for giving further consideration to the DSM Directive.
Conclusion
While it is certainly good news that the EU Parliament decided against giving a rubber stamp to the DSM proposal in its current form, the battle over Article 13 is far from over. The EU Parliament will be taking up further proceedings about it in the fall of 2018, but its proponents can be expected to mount a new campaign for its retention.
Whether Article 13, if adopted as is, would “kill” the Internet as we know it, as some critics have charged, remains to be seen. Yet the prospect of bearing direct liability for the infringing activities of users will likely cause many sharing services to be overly cautious about what their users can upload, and it will chill new entry. In its current form, Article 13 gives copyright enforcement priority over the interests of users in information privacy and fundamental freedoms.
About the Author:
Pamela Samuelson is the Richard M. Sherman Distinguished Professor of Law, University of California, Berkeley. She can be reached at pam@law.berkeley.edu.
_____________________________
I do not agree with Article 13. What will happen to us, the musicians, singers, and other artists like me who are just starting our musical careers? Almost all singers start out by singing covers of songs by other artists. What will happen to us? Music is my life, and this would be cutting off my life.