Does Moderating Your Website Make You Liable for Copyright Infringement?


Online platform providers are increasingly looking for ways to curate user-generated content, both to promote good content and to filter out bad content. As a recent example, YouTube has taken steps to demonetize channels that may not align with the values of its advertisers. Nearly everyone running an online platform is working on removing internet troll comments.

Social media companies are under intense pressure to moderate their content. President Trump has condemned “fake news” on social media. Germany has enacted a law requiring social media companies to promptly remove certain content or face heavy fines, and government officials in the UK are considering similar measures.

While we would all like to see fewer internet trolls, a platform provider’s steps to moderate content may inadvertently create liability for copyright infringement. Social media companies and other platform providers must carefully craft both their internal moderation systems and the way those systems appear to users. By transitioning from passive hosts to deliverers of moderated or curated content, platforms can make themselves ineligible for the copyright infringement safe harbor in the Digital Millennium Copyright Act, and take on substantial liability as a result.

Title II of the Digital Millennium Copyright Act (DMCA) protects most social media platforms, forums, and other online content hosts from liability for the copyright infringement of their users. The DMCA’s safe harbors cover four categories of online service providers: (1) transitory digital network communications, (2) system caching, (3) information residing on systems or networks at the direction of users, and (4) information location tools. 17 U.S.C. § 512(a)–(d). Most social media sites, forums, and other online platform providers fall under the third category, “information residing on systems or networks at the direction of users.” To qualify under this category, a platform must show, as a threshold matter, that the infringing material was posted at the direction of users. If so, the platform must also show that (1) it lacked actual or “red flag” knowledge of the infringing material, and (2) it did not receive a “financial benefit directly attributable to the infringing activity, in a case in which the service provider has the right and ability to control such activity.” Id. § 512(c)(1).

If these qualifying conditions are met, platform providers are shielded from direct and indirect copyright infringement claims so long as they remove or disable access to infringing content in response to a DMCA notice from a copyright owner.

Most platform hosts meet these requirements and are protected. The traditional business model of a platform provider is to provide a forum for its users, and generally speaking, most platforms allow users to create and upload anything permitted by law. As platform providers take increasing control over their users’ content, however, their protection under the DMCA becomes questionable.

The Ninth Circuit recently called into question whether a forum moderated by its owner is a passive host protected by the DMCA in Mavrix Photographs, LLC v. LiveJournal, Inc. (case no. 14-56596). Mavrix is in the business of photographing celebrities. LiveJournal hosts online community journals in which community members can post photos and stories. LiveJournal’s most popular journal, ONTD, covers celebrity news. A member of the ONTD community posted Mavrix’s copyrighted celebrity photos on LiveJournal, and Mavrix sued. LiveJournal moved for summary judgment under the DMCA safe harbor, and the trial court granted the motion, holding that the infringing content qualified as “information residing on systems or networks at the direction of users.” Mavrix appealed.

This is where the case gets interesting: the Ninth Circuit reversed, holding that there was a genuine issue of material fact as to whether LiveJournal’s content was posted at the direction of users, given LiveJournal’s content moderation procedures and the way LiveJournal’s authority appeared to its users. The court held that a reasonable jury could conclude an agency relationship existed between LiveJournal and its moderators because (1) the moderators appeared to users to act on LiveJournal’s behalf and (2) LiveJournal actively directed the moderators’ work.

The specific facts the court cited as suggesting LiveJournal had actual or apparent authority over the moderators included the following: LiveJournal uses a hierarchical moderator system with varying levels of authority to review posts; volunteers fill each position except one, which is held by a LiveJournal employee; LiveJournal tasked that employee with developing the ONTD journal; the employee also removes other moderators based on their performance; LiveJournal provides guidelines on how to review content, both to block undesirable content and to select attractive content; and ONTD is an important source of revenue for LiveJournal.

This case provides a valuable lesson for platform providers: to avoid losing the DMCA safe harbor, they must monitor not only their internal content management policies but also how users perceive the effect of those policies. Even if a platform does not control its users’ content, its presentation and dissemination strategies may give users the impression that the provider is responsible for the content. A platform that promotes selected posts in regular email blasts may suggest that the platform, rather than its users, controls the content. A curated feed may likewise suggest that the platform has vetted the content.

Another interesting question that remains to be answered is the effect of algorithmic selection of content. Users may assume that a host’s algorithms can effectively filter and parse user-generated content, leading them to believe the content was systematically vetted by the platform. This could create copyright infringement liability even if the algorithm is not actually as sophisticated as users believe.
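To make the gap between perception and reality concrete, the “algorithm” behind a curated feed may be nothing more than a simple keyword filter. The following is a minimal, hypothetical Python sketch; the function name, keyword list, and sample posts are invented for illustration and are not drawn from any actual platform:

    # Hypothetical sketch of a naive "algorithmic" content filter.
    # A platform advertising algorithmic curation might run something
    # this simple, despite users assuming far more sophisticated vetting.

    BLOCKED_KEYWORDS = {"spam", "scam"}  # invented example list

    def passes_filter(post_text: str) -> bool:
        """Return True if the post contains none of the blocked keywords.

        Note what this does NOT do: it makes no attempt to detect
        copyrighted material, so an infringing photo or lyric would
        sail through while the platform still appears to users to be
        curating its feed.
        """
        words = post_text.lower().split()
        return not any(keyword in words for keyword in BLOCKED_KEYWORDS)

    posts = [
        "Check out this celebrity photo!",   # passes, even if infringing
        "Buy followers now, total scam",     # blocked by keyword match
    ]
    curated_feed = [p for p in posts if passes_filter(p)]
    print(curated_feed)

A filter like this blocks crude spam yet does nothing to screen for infringement, which is precisely the scenario in which users’ assumptions about “algorithmic vetting” could outrun what the platform actually does.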

ABOUT THE AUTHOR: Benton Patterson
Benton Patterson is an intellectual property attorney in Dallas, Texas. He is a graduate of Texas A&M University School of Law.

Copyright Patterson Legal

Disclaimer: While every effort has been made to ensure the accuracy of this publication, it is not intended to provide legal advice as individual situations will differ and should be discussed with an expert and/or lawyer. For specific technical or legal advice on the information provided and related topics, please contact the author.
