Copyright Strikes as a Tool for Censorship, Bullying, and Extortion
With the advent and subsequent
explosion of consumer internet access in the 1990s came a concomitant threat of
massive, anonymous, and difficult-to-trace copyright infringement.
Congress faced significant pressure from two major industries: traditional
media producers and internet service providers. Traditional media producers feared
the practical death of their intellectual property through unstoppable and
untraceable internet file-sharing, while internet service providers feared
endless lawsuits for the infringing activities of their users.
Congress responded to these
concerns with the creation of the Digital Millennium Copyright Act of 1998
(DMCA). This act was designed to accommodate the needs of both content creators
and internet service providers. It did so by shielding ISPs from both direct
and indirect liability for the actions of their
users. In exchange for this protection, the act predicated ISPs’ safe harbor on
meeting certain conditions; namely, ISPs would have to expeditiously remove
purportedly infringing material upon receiving notice from copyright owners or
their agents. These takedown notices are commonly referred to as
“strikes,” and they have become a significant bugbear for many content
creators today.
YouTube is by far the largest
and most frequented video site on the internet, with over 1.8 billion
registered users accessing the platform every month and over 400 hours of
content uploaded every minute. Virtually anyone can upload a video to YouTube,
and the platform cannot vet each video for copyright infringement prior to
posting. Thus, in order to prevent rampant infringement, YouTube makes use of
its own “strikes” system.
Put simply, anyone who uploads videos to YouTube might receive a “strike” on one of their videos. These strikes are initiated by third parties, oftentimes large, established rights holders such as Sony, Universal, or Viacom. After receiving a strike, uploaders face a decision: they can wait for the strike to expire, seek a retraction by communicating with the striking party, or submit a counter-notification to YouTube. If an uploader accumulates three ongoing strikes in a three-month period without a successful retraction or counter-notification, YouTube may shut down their entire channel and remove all of their uploaded videos.
Despite several avenues for
defending against strikes, the system for their resolution can be opaque and confusing
to average uploaders. Users frequently fail to rebut illegitimate strikes
despite their attempts to do so. It is not uncommon for wholly original content
or content protected by fair use to be taken down. Perhaps most troublingly,
the strikes system is sometimes used for extortion and bullying by internet
trolls, and for censorship
by corporations.
Among the issues with
YouTube’s strikes system is the fact that uploaders must provide personal
information in order to file counter-notifications, which can then be used by
troublemakers to harass uploaders beyond cyberspace. Another issue is that
strikes can take over a month to remove, during which time YouTube prevents
further uploads from the offending channel. This means that uploaders who make
their livelihood on YouTube (an ever-growing class of content creators) face a
very real risk of extortion from those who would cripple their channels by way
of bad-faith strikes.
In one recent case, an uploader known as “ObbyRaidz” faced just
such an extortion attempt, and was only able to remove the strikes
against him once his fans on Reddit and Twitter made the issue a matter of
public outcry.
Despite the Ninth Circuit’s
ruling in Lenz v. Universal Music Corp.,
which held that copyright holders must consider fair use in good faith before
issuing takedown notices, YouTube has yet to hold copyright holders to this
standard, and small-time uploaders rarely have the means to contest strikes in
court. To its credit, YouTube has instituted what it calls Content ID. This
system allows copyright owners to upload content over which they hold exclusive
rights; YouTube then searches new and extant videos for potential
infringement, alerting the owner to suspect videos. Sadly, this system does
little to prevent the kinds of strike-system abuses noted above, and can even
abet them by making it easier to find instances
of fair use and flag them for strikes.
The DMCA’s takedown
provisions were written with an expectation that strikes would be made in good
faith, and YouTube’s system is a natural result of that optimism. Sadly,
reality has not unfolded in the way that Congress
envisioned 21 years ago. The term “troll” has leapt from a Tolkien-esque
monster to an everyday threat to those who make their livings on the internet.
Copyright is always locked in a struggle between two competing drives: to
protect creators’ works so that they are incentivized to make more and better creations,
and to ensure that creative works are spread far and wide for the betterment of
society. Video hosting sites and Congress need to come together to create a
more modern and far more humane system than the current one. For starters, they
could create serious penalties for bad-faith strikes and make remedies more
easily available to those who are hurt by them. In addition, making the system
less opaque and more responsive would be a great help; at the DMCA’s inception,
large conglomerates made the vast majority of video content. But more and more
lately, small-time, independent creators are taking on that role, and they are
doing so primarily on sites like YouTube. The law must be amended to recognize
their value as creators, and must account for their limited means of recourse.
So much about internet media has changed since 1998, and our copyright laws
sorely need to reflect that.
Stephen Gray is a JD candidate, 2020, at NYU School of Law.