Filter or Die: EU Copyright Directive Risks Chilling Global Content Creation and Access

Joe Rabinovitsj is a J.D. candidate, 2021 at NYU School of Law.

Article 17 of the European Union’s Digital Single Market Copyright Directive (DSMCD) fully exposes online platforms and service providers to liability for user-generated content (UGC) that infringes copyright. Article 17 reads, in relevant part:

An online content-sharing service provider shall…obtain an authorisation from…[Copyright] rightholders… for instance by concluding a licensing agreement, in order to communicate to the public or make available to the public works or other subject matter. […]

If no authorisation is granted, online content-sharing service providers shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have:

(a) made best efforts to obtain an authorisation, and

(b) made…best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event

(c) acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).

In essence, unless platforms make best efforts to obtain licenses for all copyrighted UGC they host, make best efforts to block UGC that infringes works rightsholders have identified to them, and act expeditiously to remove infringing content upon notice, platforms are liable for all infringing UGC they host.

Critics suggest that by dumping the lion’s share of copyright liability risk on platforms, Article 17 imposes on them a de facto copyright filtering requirement. Copyright filters are systems, either human or digital, that analyze UGC uploaded to a platform to determine the risk that it infringes copyright, and filter out likely infringing material. The idea that Article 17 imposes a de facto filtering requirement is supported first by legislative history: earlier drafts of Article 17 (formerly Article 13) included an express filtering requirement. Business and technological considerations lend further support to inferring a de facto filtering requirement. Specifically, the volume of UGC uploaded to platforms makes the threat of copyright infringement liability prohibitively high for many platforms hosting UGC, which could force those platforms to shutter. Platforms willing to risk this liability and continue hosting UGC (without entering licensing agreements with creators) would instead need to engage in systematic scanning for infringing content, i.e., copyright filters, to avoid Article 17 driving them into bankruptcy. In other words, Article 17 presents UGC-dependent platforms with an ultimatum: filter or die.
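
The filtering mechanism described above can be sketched, in heavily simplified form, as fingerprint matching against a rightsholder-supplied reference database. Everything in this sketch (the shingle fingerprint, the Jaccard similarity metric, the threshold, and all function names) is an illustrative assumption; real filters such as Content ID use far more sophisticated, proprietary audio and video matching.

```python
# Toy sketch of a copyright filter: fingerprint each upload, compare it
# against reference works supplied by rightsholders, and block matches
# above a similarity threshold. All names and metrics are illustrative.

def fingerprint(content: str) -> set[str]:
    """Reduce content to its set of overlapping 4-character shingles."""
    return {content[i:i + 4] for i in range(len(content) - 3)}

def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two fingerprints (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if a | b else 0.0

BLOCK_THRESHOLD = 0.3  # illustrative; a real system would tune this carefully

def filter_upload(upload: str, reference_db: list[str]) -> bool:
    """Return True if the upload should be blocked as likely infringing."""
    up = fingerprint(upload)
    return any(similarity(up, fingerprint(ref)) >= BLOCK_THRESHOLD
               for ref in reference_db)

# A reference work supplied by a rightsholder, and two hypothetical uploads:
refs = ["all along the watchtower there must be some way out of here"]
print(filter_upload("there must be some way out of here said the joker", refs))  # True
print(filter_upload("an entirely original composition about cats", refs))        # False
```

The point of the sketch is the mechanism: whether content is blocked turns entirely on a similarity score crossing a threshold, not on the context of the use.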

Article 17’s proponents justify its onerous imposition of liability on platforms by citing the additional protection it affords rightsholders. At first blush, it seems reasonable to infer that imposing liability on UGC-dependent platforms (where much or most copyright infringement occurs) would lead to a reduction in infringement and increased licensing revenues for rightsholders. One may even be tempted to extend this justification and argue that Article 17’s stronger copyright protections will encourage creation of copyrightable expression. But this policy argument crumbles under scrutiny of the filter-or-die ultimatum Article 17 forces upon UGC-dependent platforms.

In addition to Article 17’s adverse consequences for the survival of platforms and internet service providers relying on UGC, Article 17’s de facto copyright filter requirement also threatens to chill global creation and distribution of valuable copyrightable expression.

Article 17’s filter-or-die ultimatum will stymie the global creation and proliferation of copyrightable content by reducing the number of available platforms, i.e., the means by which users share and consume content. Although high proportions of infringement occur on platforms (a 2019 EUIPO report indicates that 75.3% of all consumption of infringing content occurs via streaming), a number of surveys show that the majority of content users consume online is not infringing (e.g., a 2018 Canadian Government survey indicates that 74% of internet users consume exclusively non-infringing content, and a 2016 IPO report indicates that only 15% of UK internet users consumed at least one infringing item over a three-month period). All this to say: the majority of content consumed on platforms is not infringing. The stakes for creation and distribution of expression are therefore high. If Article 17’s draconian threat of liability shutters platforms, users will lack outlets for their copyrightable expression, curbing distribution. And because viewership (e.g., tracked by “views” on YouTube and “likes” on Instagram) is a primary incentive for online content creators, fewer means of dissemination also stand to undermine incentives to create.

Further, the risk that platforms will shutter when faced with liability under Article 17 looms large. Beyond the bankruptcy-inducing damages that infringement liability threatens, the primary driver of this risk is that copyright filtering tools are prohibitively expensive for most companies. Google invested over $100 million developing Content ID, its proprietary copyright filtering tool for YouTube, and licensing Audible Magic (one of the few commercially available copyright filtering tools on the market) costs between $10,000 and $50,000 per month.

But even for companies that could afford to develop or license copyright filtering tools, these tools may not sufficiently mitigate the risk of liability under Article 17, because existing filtering technology is of dubious accuracy in identifying infringing content. For example, existing filters cannot distinguish between material shielded from liability by fair use doctrines and content that enjoys no such protection. Fair use in most jurisdictions, especially the US, turns on a finicky, highly fact-bound balancing inquiry that is largely opaque to current rule-dependent filtering algorithms. Filters also produce troubling false positives unrelated to fair use: Content ID, for example, once blocked a user from posting a video of chirping birds because it was too similar to a song that used the sound of chirping birds.
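
The fair-use blind spot can be made concrete with the same kind of surface-overlap matching (again an illustrative toy, not any real system): a critic quoting a lyric in order to discuss it shows high textual overlap with the original, and a rule-based matcher has no input for purpose, amount, or market effect, the factors a fair use inquiry weighs.

```python
# Toy illustration (assumed names and threshold) of why rule-based matching
# produces false positives: the matcher sees only surface overlap, never
# context such as criticism, commentary, or parody.

def shingles(text: str, k: int = 4) -> set[str]:
    """Overlapping k-character substrings of the text."""
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def overlap(a: str, b: str) -> float:
    """Jaccard overlap between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

reference = "imagine all the people living life in peace"
# A critic quoting the lyric in order to analyze it: plausibly fair use.
commentary = "the lyric 'imagine all the people living life in peace' frames the whole song"

print(overlap(reference, commentary) > 0.3)  # True: flagged despite its critical context
```

Nothing in the score tells the system that the quotation serves criticism rather than substitution; encoding that judgment would require the very fact-bound balancing the text describes.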

In short, with fewer outlets available for distributing and accessing content, creators will lose their means of distribution and see viewership fall, undermining their incentives to create.