Today, many websites host defamatory, embarrassing, or otherwise harmful user-generated content. However, such sites are not generally liable for the content posted by their users. Under Section 230 of the Communications Decency Act, "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (47 U.S.C. § 230(c)(1).) Despite Section 230's broad protection of online intermediaries, the growing number of clients with concerns about their online reputations still have a number of options available.
Typically, when a website does not take down harmful content voluntarily, the victim can identify and go after the poster directly. These matters are often resolved by a settlement agreement that assigns copyright of the content to the victim, who can then seek removal. Unfortunately, the cost of identifying user-defendants and litigating against or settling with them can be prohibitive for victims of online defamation, harassment, invasion of privacy, and related torts.
In recent years, a new breed of website has emerged - one solely dedicated to hosting content designed to harm third parties. Among these sites, the most well-known category is probably the "involuntary porn" site, which invites members of the public to post pornographic images of people who have not consented to distribution. Although the immunity provided by Section 230 is very broad, it does not insulate online service providers from liability for intellectual property infringement. The particular facts of how an individual website operates are critical in determining the nature and extent of liability it may face.
In Fair Housing Council of San Fernando Valley v. Roommates.com, LLC (521 F.3d 1157 (9th Cir. 2008) (en banc)), the Ninth Circuit explored a potential limitation on the immunity provided by Section 230. Roommates.com allowed its members to search for roommates; the Fair Housing Council complained that the site was facilitating housing discrimination. Normally, the site would be entitled to the protections of Section 230, but Roommates.com specifically required participants to select certain preferences via a pull-down menu, including the sex and sexual orientation of their potential roommates and whether or not they had children. Section 230 immunity is not available to entities that are responsible, in whole or in part, for the creation or development of the information at issue, and the Roommates.com court found that the pull-down menus constituted "development" of the content.
Many involuntary porn sites - including the now defunct IsAnyoneUp.com, and the PinkMeth sites - associate the photos uploaded by users with the victims' Facebook or other social networking accounts. A court could find that this constitutes "development" if it were revealed through discovery that the sites themselves are going out and finding these social media profiles, then publishing them next to user-uploaded photos.
But even in the absence of such association, the original three-judge appellate panel in the Roommates.com case speculated in dicta that a site dedicated solely to harassing people - one that encourages users to upload harmful content - could lose Section 230 immunity.
In addition to the "developing" limitation on Section 230, intellectual property and criminal content also are exempt. At least one pending lawsuit, against PinkMeth.com, alleges distribution of child pornography and solicitation of computer hackers to gain unauthorized access to photos for the purpose of uploading them to the website. (Conklin v. Katz Global Media, LLC, No. 2012-61554-393 (Tex. 393rd Dist., Denton Cnty.) (filed Nov. 29, 2012).) If proven, these allegations of criminal conduct could take the site outside the ambit of Section 230, and indeed the possibility that users will post child pornography remains one of the most serious risks to those who run involuntary porn sites.
There are strong policy reasons for keeping the immunity of Section 230 broadly applicable. But if the immunity were ever pierced with regard to involuntary porn or similar sites, they could be liable under the same theories pursuant to which the posters of content are liable: torts like harassment, intrusion into seclusion, false light, defamation - and in some circumstances even criminal conduct.
Some lawyers have already begun to challenge these sites. In January, attorneys in Texas filed a class action complaint against the involuntary porn site Texxxan.com and the domain registrar GoDaddy, alleging state law torts including intrusion into seclusion and misappropriation of name or likeness. (Toups v. GoDaddy.com, No. D130018-C (Tex. 260th Dist., Orange Cnty.) (filed Jan. 18, 2013).)
Recently, some of these sites have attempted to exploit another revenue source in addition to advertising - charging fees for takedowns. For example, PotentialProstitutes.com charges $99.95 for photo removal. YouGotPosted.com does not explicitly charge for takedowns, but according to Seattle attorney Gary Marshall it has run an advertisement for a "reputation company" that for $199 will promptly remove a person's pictures from the site (and which, Marshall says, is owned by the same people who own YouGotPosted.com). Another site, IsAnybodyDown.com, attempted a similar strategy, advertising for the services of a purportedly independent third party that would remove images from the site for $250. But according to legal bloggers, the takedown company is not independent of the site. This type of deceptive conduct represents another avenue of legal risk.
Is it extortion to offer to take down harmful content in exchange for a payment from the victim? Although courts have not fully addressed the question, probably not. Extortion (in California, codified at Penal Code Section 518) requires force or threat to coerce a payment - in these cases, the harm that would have been threatened has already been done before payment is requested. Although it does not appear that any pay-for-takedown sites have been sued for extortion, similar claims made against the online review sites Yelp and the more aggressive RipoffReport.com have not met with success. (See, e.g., Levitt v. Yelp, No. 10-CV-1321 (N.D. Cal. Mar. 22, 2011); and Asia Economic Institute v. Xcentric Ventures LLC, No. 10-CV-1360 (C.D. Cal. July 19, 2010).)
Attorneys are likely to see a growing number of clients with reputation-related concerns about online content. In some cases, when the option is available, it may make sense for victims of online harassment to pay sites for removal; in others (for example, when multiple sites are involved), it may be preferable to take the more conventional approach of working directly with the content poster.
Chris Ridder, a partner at Ridder, Costa & Johnstone in San Francisco, represents clients on transactional and litigation matters related to intellectual property and all aspects of cyberlaw.