The Business Lawyer
American Bar Association
Developments in Intermediary Liability
DOI 10.928/ac.2021.03.39, Volume 76, Issue 1
Chase J. Edwards

I. Introduction

The Communications Decency Act (“CDA”)1 was crafted at the dawn of the Internet Age to provide immunity for any “interactive computer service”2 (“ICS”) from liability based on information provided by third parties.3 Colloquially known by its section number in Title 47 of the U.S. Code, Section 230 shielded the nascent industry by greatly limiting federal claims4 against an ICS and preempting conflicting state claims.5 Section 230’s immunity clause has generally been interpreted by courts to cover virtually all forms of third-party content published by platforms of all types (e.g., Facebook, Airbnb, Tinder, and Amazon), even if the information is patently inaccurate, illegal, or intended to deceive.

The cases included in this survey relate to a wide array of business law topics, including hosting a marketplace for products designed to inflict harm or for recruitment of—and communications by—like-minded individuals (Part II.A), liability for flagging a competitor’s products as a security risk (Part II.B), loss of immunity due to a role in compiling information (Part II.C), removing content or users from a platform (Part II.D), and the Snapchat speed filter (Part II.E). The future of Section 230 is uncertain: controversy over increased moderation by major platforms, including warning labels, user bans, and content removal, prompted an executive order from the President encouraging increased scrutiny of Section 230’s application, as well as multiple public statements calling for the repeal or amendment of the law itself (Part III). Given the wide-ranging application of Section 230’s immunity clause to companies that constitute a significant portion of the economy, all business lawyers should monitor its current applications, the impact of the recent executive order, and any future statutory amendments or new administrative rules.

II. Applications of Section 230

A. Crime Victims vs. Marketplaces

Crime victims with significant injuries who attempt to recover civil damages are often forced to look beyond the tortfeasor for responsible parties who can provide redress. Defendants in these civil suits can include insurers, co-conspirators, or entities whose own negligent conduct enabled the crime to occur. Armslist, a marketplace for firearms, is a frequent target of plaintiffs who fall victim to crimes committed by its customers.6 Unlawful use of firearms typically results in both extensive harm and a defendant who is judgment proof.

In Stokinger v. Armslist, LLC,7 the plaintiff was a Boston police officer shot by a convicted felon using a gun purchased from an arms trafficker who had used Armslist to buy dozens of firearms for resale. The trafficker’s customers were typically people who could not legally purchase a firearm or who wanted a firearm that could not be traced to the end user. The plaintiff argued that Armslist illegally facilitated the trafficking of firearms to prohibited owners.8

Armslist defended the suit by claiming immunity under Section 230. In arguing against the applicability of Section 230, Officer Stokinger put forth several arguments, including that the claim was based not on Armslist’s actions as a publisher but on its design and maintenance of the site, that Armslist knew or should have known that its website facilitates arms trafficking, and that there should be a presumption against preemption of state law.9 The court rejected each argument, holding that the plaintiff had not shown why his case should depart from precedent.10

In Force v. Facebook, Inc.,11 the Second Circuit held that Section 230 insulated Facebook from claims alleging that it provided material support to terrorists by doing too little to limit their use of the platform for recruitment and communication. The case is most noteworthy, however, for Chief Judge Katzmann’s opinion dissenting in part on the issue of whether algorithmic content matching deviates substantially from the fairly neutral purveyor or presenter of information envisioned in 1996 when Section 230 was enacted.

In his dissent, Judge Katzmann asserted that the pendulum has swung too far toward immunity, stating: “[W]e today extend a provision that was designed to encourage computer service providers to shield minors from obscene material so that it now immunizes those same providers for allegedly connecting terrorists to one another.”12 Furthermore, despite the fact that Facebook does publish content created by these groups, “it strains the English language to say that in targeting and recommending these writings to users—and thereby forging connections, developing new social networks—Facebook is acting as ‘the publisher . . . of . . . information provided by another information content provider.’”13

B. Anticompetitive Screening of Competitors

Two cases from courts within the Ninth Circuit addressed the applicability of Section 230 to screening decisions made by the designers of cybersecurity software, such as spam filters, malware screens, and antivirus protection. The courts analyzed Section 230’s immunity shield in light of the presence or absence of anticompetitive animus in the screening actions taken.

In Enigma Software Grp. USA, LLC v. Malwarebytes, Inc.,14 the Ninth Circuit heard the complaint of a software company claiming that the defendant, a directly competing malware-protection service, was flagging its software as a “potentially unwanted program.”15 The defendant, Malwarebytes, asserted that its actions were immunized by Section 230(c)(2)(A), which allows an ICS to filter “material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”16

The court ruled that Section 230 does not immunize Malwarebytes against claims alleging deceptive business practices and tortious interference. It reasoned that “[u]sers would not reasonably anticipate providers blocking valuable online content in order to stifle competition. Immunizing anticompetitive blocking would, therefore, be contrary to another of the statute’s express policies: ‘remov[ing] disincentives for the . . . utilization of blocking and filtering technologies.’”17

Soon after, the Northern District of California decided Asurvio LP v. Malwarebytes Inc.,18 in which the same defendant was sued for similarly flagging and blocking another company’s software. Although Asurvio’s software, like that of Malwarebytes, provided computer maintenance services, its product did not share Malwarebytes’ primary function. The court applied Enigma but reached the opposite conclusion, holding that the exception carved out in Enigma applies only when an ICS, like an antivirus or malware screening service, filters out software made by a company that is a “direct competitor.”19 Because Asurvio was not a direct competitor of Malwarebytes, Section 230 immunity remained intact.

C. Loss of Immunity Due to Involvement in Compiling Information

Stephanie Lukis brought a putative class action against Whitepages Inc. and Instant Checkmate LLC, both purveyors of online background reports.20 Advertisements for each company’s services included free previews of the reports, which were available for individual sale or through a subscription that allowed subscribers to run an unlimited number of reports each month.

Lukis alleged that the companies violated Illinois’ right of publicity law by using her name and other identifying information for a commercial purpose without her consent.21 Whitepages argued, inter alia, that Section 230 immunizes it as the mere publisher of information provided by another source.22 The court disagreed, holding that “Whitepages did not act as a mere passive transmitter or publisher of information that was ‘provided by another information content provider.’ . . . Rather, it is alleged to have actively compiled and collated, from several sources, information regarding Lukis.”23 Thus, Section 230 immunity was denied to the defendant companies.

D. Content Moderation

Courts continue to wrangle with various forms of content moderation on ICSs. The actions giving rise to these suits range from removal of a single post to temporary suspensions, permanent deletion of an account, and, in extreme cases, blocking the offending user from creating a new account on the platform. In Wilson v. Twitter,24 Twitter did all of the above to the plaintiff, who used his account to publish hate speech aimed at the LGBT community. Wilson asserted several claims, including one under the Civil Rights Act of 1964, which the court rejected on the merits.25 In addition, the court held that the claims were barred by Section 230, observing:

While this case does not represent the “typical” case envisioned by § 230 immunity, wherein a litigant seeks to hold an interactive computer service provider liable for publishing content from a third-party which the litigant finds objectionable, courts have readily found that the statutory immunity also applies to the factual scenario presented here, where the plaintiff objects to the removal of his or her own content.26

Similar conclusions were reached in other cases in which platforms removed the content, profiles, and/or platform access of users who violated the terms of service. In King v. Facebook, Inc.,27 the platform removed multiple posts by a Black activist and suspended his account several times. The plaintiff alleged that “Facebook’s treatment of black posters is not equivalent to its treatment of others, [in] that Facebook ‘has allowed whites to say the same thing that blacks have been banned for.’”28 The court held that, “[b]ecause each cause of action accuses Facebook of wrongful acts it took as a publisher, none survives the application of Section 230(c)(1) of the CDA.”29

In Federal Agency of News LLC v. Facebook, Inc.,30 the plaintiff brought several claims against Facebook for shutting down its account. The plaintiff—a company controlled by a Russian state-sponsored media operation—created 470 Facebook pages and produced 80,000 pieces of content that reached 126 million Americans in an effort to influence the 2016 U.S. presidential election.31 Facebook removed the account on the ground that it violated Facebook’s terms of service. The court held that Facebook could rely on Section 230 as a defense because the plaintiff’s “claims are based on Facebook’s decision not to publish [its] content.”32

In Domen v. Vimeo, Inc.,33 Vimeo successfully invoked Section 230 against claims based on its banning of videos promoting or showing conversion therapy. Also referred to as Sexual Orientation Change Efforts (“SOCE”),34 conversion therapy seeks to turn a homosexual or transsexual person into a heterosexual, gender-normative person through behavior modification. This controversial therapy has been strongly opposed by the American Psychological Association for more than a decade,35 and several states have enacted restrictions or bans on its practice.36 Vimeo followed suit by adopting an anti-SOCE policy that resulted in the removal of the plaintiff’s content, and the court agreed that the removal was protected by Section 230. The opinion significantly contributes to the jurisprudence on SOCE by adding a reported opinion from a district court in the Second Circuit, as few cases have been reported outside of the Ninth Circuit.

Finally, in a case not involving Section 230, a court upheld YouTube’s innovative, intermediate method of content moderation in a suit brought by PragerU.37 Instead of removing PragerU’s content from the platform, YouTube categorized its videos as containing potentially mature content; YouTube’s categorization would limit a user’s access to PragerU’s videos only if the user had previously activated YouTube’s Restricted Mode.38 Less than 2 percent of YouTube users activate the Restricted Mode of browsing and viewing videos.39 However, PragerU claimed that the classification of its videos as separate from other videos violated its First Amendment rights and that YouTube was engaged in false advertising in violation of the Lanham Act.40 The First Amendment claim failed because “[t]he Free Speech Clause of the First Amendment prohibits the government—not a private party—from abridging speech” and neither YouTube nor its parent company Google is a state actor.41 The false advertising claim likewise failed because YouTube’s statements about its content moderation policies are neither “commercial advertising or promotion” nor a “false or misleading representation of fact,” as required to establish a claim.42

E. Snapchat Filters

Courts have issued conflicting opinions regarding one of the most popular mobile phone applications in the world. Snap Inc., the maker of Snapchat, has faced several actions for injuries stemming from the use of filters for user-generated content. Filters overlay a variety of information onto a user’s pictures. Some are static, acting much like a picture frame or stickers placed onto the picture, while others are dynamic, incorporating information from the phone’s sensors; the speed filter, for example, uses the phone’s GPS data to determine the current speed of the person taking the photo.43

In Lemmon v. Snap Inc.,44 three young men were killed in an automobile accident caused by the trio’s attempt to log an entry in Snapchat at more than 100 miles per hour. Their Snapchat activity showed one entry logging the vehicle at 123 miles per hour. When the car crashed a few minutes later, it was estimated to have been traveling at 113 miles per hour.45 The plaintiffs alleged that:

Snap knew or should have known that . . . many of its users were drivers of, or passengers in, cars driven at speeds of 100 m.p.h. or more because they wanted to use Snapchat to capture a mobile photo or video showing them hitting 100 m.p.h. and then share the Snap with their friends.46

In Lemmon, the court referenced Maynard v. Snapchat, Inc.,47 a case in which the company’s Section 230 defense failed. In Maynard, a passenger in the speeding car described the wreck and the use of Snapchat’s speed filter feature as follows:

I looked up and noticed that we seemed to be accelerating. I looked in the front, and saw Christal McGee holding her phone. The screen had a speed on it, which was about 80 m.p.h. and climbing. I asked Christal if her phone was keeping up with the speed of the car. Christal said it was. I told her I was pregnant and asked her to slow down. Christal responded and said she was just trying to get the car to 100 m.p.h. to post it on Snapchat. She said “I’m about to post it.”48

Immediately following that statement, Maynard pulled out of his apartment complex, was struck by the speeding vehicle, and suffered permanent brain damage.49 The Maynard court held that Section 230 did not apply because the claim was not based on any third-party posts: The plaintiffs instead “seek to hold Snapchat liable for its own conduct, principally for the creation of the Speed Filter and its failure to warn users that the Speed Filter could encourage speeding and unsafe driving practices.”50

Acknowledging that its holding was inconsistent with that of Maynard—a state court decision out of Georgia—the Lemmon court explained that it was bound to apply Ninth Circuit precedent, under which the plaintiffs’ claims were barred because the speed filter was a “content-neutral tool,” meaning that users could have used the filter to post pictures of themselves not speeding.51

III. Fact Checkers Under Fire

The application of Section 230 has remained relatively stable for the past quarter century, despite being at the center of an evolving industry. Its future, however, may depend on the impact of the Executive Order on Preventing Online Censorship that President Trump issued on May 28, 2020.52

Public discourse in recent years has included relentless accusations from both left- and right-wing news sources that the other side is propagating “fake news.” In response, some ICSs have begun to flag articles with ratings provided by independent fact checkers, many of whom adhere to the Code of Principles set forth by the International Fact-Checking Network when determining the truthfulness of information shared online.53 A variety of media companies—ranging from small outlets to worldwide media titans, such as Reuters and the Associated Press—have pledged to follow the Code of Principles and agreed to be assessed by the organization.54

On May 27, 2020, Twitter flagged two of President Trump’s tweets about mail-in ballots in California. Twitter explained: “We added a label to two @realDonaldTrump Tweets about California’s vote-by-mail plans as part of our efforts to enforce our civic integrity policy. We believe those Tweets could confuse voters about what they need to do to receive a ballot and participate in the election process.”55

The executive order issued just one day later called for a revamp of laws governing liability of Internet intermediaries. It states: “When large, powerful social media companies censor opinions with which they disagree, they exercise a dangerous power. They cease functioning as passive bulletin boards, and ought to be viewed and treated as content creators.”56 President Trump followed up the order the next day with a more direct message on Twitter: “REVOKE 230!”57

Of course, Section 230 cannot be repealed by executive order because it is codified law. The U.S. Congress would have to draft and approve a bill to modify the law and, although there has been increasing attention from lawmakers on the inner workings of technology giants like Google and Facebook, there is no indication of political will or actual plans to upend this cornerstone of the Internet.

What remains to be seen is how federal officials and agencies named in the executive order will respond to its directives. The order directs the heads of departments and agencies to, among other things, review federal spending on advertisements with platforms that restrict ads “due to viewpoint discrimination”;58 consider whether to take action in response to complaints collected through a “Tech Bias Reporting tool” launched by the White House;59 and request that the Federal Communications Commission propose regulations to “clarify” which actions can result in the loss of Section 230’s protections.60 The Attorney General is ordered to “establish a working group regarding the potential enforcement of State statutes that prohibit online platforms from engaging in unfair or deceptive acts or practices,” and the order directs the working group to “develop model legislation for consideration by legislatures in States where existing statutes do not protect Americans from such unfair and deceptive acts and practices.”61 Each of these actions is subject to, and will likely face, judicial review in the coming year.

Notes

1 Pub. L. No. 104-104, § 509, 110 Stat. 56, 137 (1996) (codified as amended at 47 U.S.C. § 230 (2018)).
2 The CDA defines “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” 47 U.S.C. § 230(f)(2).
3 Id. § 230(c)(1) (“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”). An “information content provider” is “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” Id. § 230(f)(3).
4 Certain federal claims are excluded from the preemptive effect of Section 230. Id. § 230(e)(1), (2), (4) & (5) (excluding effect on criminal law, intellectual property law, communication-privacy law, and sex-trafficking law).
5 Id. § 230(e)(3) (“No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”).
6 See, e.g., Vesely v. Armslist LLC, 762 F.3d 661 (7th Cir. 2014); Daniel v. Armslist, LLC, 926 N.W.2d 710 (Wis. 2019).
7 No. 1884CV03236F, 2020 WL 2617168 (Mass. Super. Ct. Apr. 28, 2020).
8 Id. at *4.
9 Id. at *3–6.
10 Id. at *4–7 (citing, among other cases, Doe v. Backpage.com, LLC, 817 F.3d 12, 16–21 (1st Cir. 2016); Daniel, 926 N.W.2d at 722–27).
11 934 F.3d 53 (2d Cir. 2019).
12 Id. at 77 (Katzmann, C.J., dissenting in part).
13 Id. at 76–77 (Katzmann, C.J., dissenting in part) (quoting 47 U.S.C. § 230(c)(1)).
14 946 F.3d 1040 (9th Cir. 2019).
15 Id. at 1047.
16 Id. at 1045 (quoting 47 U.S.C. § 230(c)(2)(A)).
17 Id. at 1051 (quoting 47 U.S.C. § 230(b)(4)).
18 No. 5:18-CV-05409-EJD, 2020 WL 1478345 (N.D. Cal. Mar. 26, 2020).
19 Id. at *5 (quoting Enigma, 946 F.3d at 1047).
20 Lukis v. Whitepages Inc., No. 19 C 4871, 2020 WL 1888916 (N.D. Ill. Apr. 16, 2020).
21 Id. at *6–8 (interpreting 765 Ill. Comp. Stat. Ann. 1075/30(a) (West, Westlaw through P.A. 101-651) (“A person may not use an individual’s identity for commercial purposes . . . without having obtained previous written consent . . . .”)).
22 Id. at *9.
23 Id. (quoting 47 U.S.C. § 230(c)(1)).
24 No. 3:20-cv-00054, 2020 WL 3410349 (S.D. W. Va. May 1, 2020), magistrate’s report adopted, 2020 WL 3256820 (S.D. W. Va. June 16, 2020).
25 Id. at *6–9 (interpreting 42 U.S.C. § 2000a).
26 Id. at *12 (citing Domen v. Vimeo, Inc., 433 F. Supp. 3d 592 (S.D.N.Y. 2020), appeal docketed, No. 20-616 (2d Cir. Feb. 18, 2020)).
27 No. 19-CV-01987-WHO, 2019 WL 4221768 (N.D. Cal. Sept. 5, 2019).
28 Id. at *2 (quoting Second Amended Complaint).
29 Id. at *1.
30 395 F. Supp. 3d 1295 (N.D. Cal. 2019).
31 Id. at 1300–01; see Exposing Russia’s Effort to Sow Discord Online: The Internet Research Agency and Advertisements, U.S. House Permanent Select Comm. on Intelligence, https://intelligence.house.gov/social-media-content/ (last visited Sept. 1, 2020).
32 Facebook, 395 F. Supp. 3d at 1306.
33 433 F. Supp. 3d 592 (S.D.N.Y. 2020), appeal docketed, No. 20-616 (2d Cir. Feb. 18, 2020).
34 Id. at 598 (quoting email from Vimeo to plaintiff and his church as account holders).
35 Resolution on Appropriate Affirmative Responses to Sexual Orientation Distress and Change Efforts, Am. Psychol. Ass’n (Aug. 2009), https://www.apa.org/about/policy/sexual-orientation.
36 See Cal. Bus. & Prof. Code § 865.2 (West, Westlaw through ch. 33 of 2020 Reg. Sess.); N.J. Stat. Ann. § 45:1-55(a) (West, Westlaw through ch. 67 of L.2020). See generally Devinn Larsen, Striving for Change: California’s Attempt to Outlaw Conversion Therapy, 50 U. Pac. L. Rev. 285 (2019) (discussing A.B. 2943, 2017–18 Reg. Sess. (Cal. 2018)).
37 Prager Univ. v. Google LLC, 951 F.3d 991 (9th Cir. 2020). The plaintiff challenged the actions of YouTube, which was acquired by Google in 2006. See id. at 996; Andrew Ross Sorkin & Jeremy W. Peters, Google to Acquire YouTube for $1.65 Billion, N.Y. Times (Oct. 9, 2006), https://www.nytimes.com/2006/10/09/business/09cnd-deal.html.
38 Prager Univ., 951 F.3d at 996. Some institutions—libraries, for example—activate “Restricted Mode” on behalf of all of the institution’s users. See id.
39 Id.
40 Id. (citing U.S. Const. amend. I; 15 U.S.C. § 1125(a)(1)(B)).
41 Id.
42 Id. at 999–1000 (quoting 15 U.S.C. § 1125(a)(1)(B)).
43 Hilary Silvia & Nanci K. Carr, When Worlds Collide: Protecting Physical World Interests Against Virtual World Malfeasance, 26 Mich. Tech. L. Rev. 279, 312 (2020).
44 440 F. Supp. 3d 1103 (C.D. Cal. 2020), appeal docketed, No. 20-55295 (9th Cir. Mar. 19, 2020).
45 Id. at 1105.
46 Id. at 1106 (citing First Amended Complaint).
47 816 S.E.2d 77 (Ga. Ct. App. 2018).
48 Id. at 79 (quoting the passenger’s affidavit).
49 Id.
50 Id. at 81.
51 Lemmon, 440 F. Supp. 3d at 1109, 1113 (applying, among other cases, Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093 (9th Cir. 2019); Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc)).
52 Exec. Order No. 13925, 85 Fed. Reg. 34079 (June 2, 2020).
53 Commit to Transparency—Sign Up for the International Fact-Checking Network’s Code of Principles, Int’l Fact-Checking Network, https://ifcncodeofprinciples.poynter.org/ (last visited Sept. 1, 2020).
54 Verified Signatories of the IFCN Code of Principles, Int’l Fact-Checking Network, https://www.ifcncodeofprinciples.poynter.org/signatories (last visited Sept. 1, 2020).
55 Twitter Safety (@TwitterSafety), Twitter (May 27, 2020, 9:54 PM), https://twitter.com/TwitterSafety/status/1265838823663075341.
56 Exec. Order No. 13925, 85 Fed. Reg. at 34079.
57 Donald J. Trump (@realDonaldTrump), Twitter (May 29, 2020, 10:15 AM), https://twitter.com/realDonaldTrump/status/1266387743996870656.
58 Exec. Order No. 13925, 85 Fed. Reg. at 34081.
59 Id.
60 Id.
61 Id. at 34082.