TAWAINNA ANDERSON, INDIVIDUALLY AND AS ADMINISTRATRIX OF THE ESTATE OF N.A., A DECEASED MINOR v. TIKTOK, INC.; BYTEDANCE, INC.
No. 22-3061
UNITED STATES COURT OF APPEALS FOR THE THIRD CIRCUIT
August 27, 2024
PRECEDENTIAL
Before: SHWARTZ, MATEY, and PHIPPS, Circuit Judges.
Jeffrey P. Goodman [ARGUED]
Robert J. Mongeluzzi
Saltz Mongeluzzi & Bendesky
1650 Market Street
One Liberty Place, 52nd Floor
Philadelphia, PA 19103
Counsel for Appellant Tawainna Anderson
Geoffrey M. Drake
King & Spalding
1180 Peachtree Street NE
Suite 1600
Atlanta, GA 30309
Albert Giang
King & Spalding
633 W 5th Street
Suite 1600
Los Angeles, CA 90071
David Mattern
King & Spalding
1700 Pennsylvania Avenue NW
Suite 900
Washington, DC 20006
Joseph O’Neil
Katherine A. Wang
Campbell Conroy & O’Neil
1205 Westlakes Drive
Suite 330
Berwyn, PA 19312
Andrew J. Pincus [ARGUED]
Nicole A. Saharsky
Mayer Brown
1999 K Street NW
Washington, DC 20006
Mark J. Winebrenner
Faegre Drinker Biddle & Reath
90 S. Seventh Street
2200 Wells Fargo Center
Minneapolis, MN 55402
Counsel for Appellees TikTok, Inc. and ByteDance, Inc.
OPINION OF THE COURT
SHWARTZ, Circuit Judge.
TikTok, Inc., via its algorithm, recommended and promoted videos posted by third parties to ten-year-old Nylah Anderson on her uniquely curated “For You Page.” One video depicted the “Blackout Challenge,” which encourages viewers to record themselves engaging in acts of self-asphyxiation. After watching the video, Nylah attempted the conduct depicted in the challenge and unintentionally hanged herself. Nylah’s mother, Tawainna Anderson, sued TikTok and its corporate relative ByteDance, Inc. (collectively, “TikTok”) for violations of state law. The District Court dismissed her complaint, holding that the Communications Decency Act (“CDA”), 47 U.S.C. § 230, immunized TikTok. For the reasons set forth below, we will reverse in part, vacate in part, and remand.
I
A1
TikTok is a video-sharing social media platform that allows users to create, post, and view short videos. TikTok’s algorithm curates a stream of videos for each user’s individualized “For You Page” (“FYP”), recommending content the algorithm determines is likely to be of interest to that user.
Some videos that may appear on users’ FYPs are known as “challenges,” which urge users to post videos of themselves replicating the conduct depicted in the videos. The “Blackout Challenge ... encourages users to choke themselves with belts, purse strings, or anything similar until passing out.” App. 31 (Compl. ¶ 64). TikTok’s FYP algorithm recommended a Blackout Challenge video to Nylah, and after watching it, Nylah attempted to replicate what she saw and died of asphyxiation.
B
Anderson, as the administratrix of Nylah’s estate, sued TikTok in the United States District Court for the Eastern District of Pennsylvania, asserting claims for, among other things, strict products liability and negligence.3 She alleges that TikTok: (1) was aware of the Blackout Challenge; (2) allowed users to post videos of themselves participating in the Blackout Challenge; and (3) recommended and promoted Blackout Challenge videos to minors’ FYPs through its algorithm, including at least one such video to Nylah’s FYP, which resulted in her death. The District Court dismissed the complaint, holding that TikTok was immune under § 230 of the CDA.
Anderson appeals.4
II5
Congress enacted § 230 of the CDA, which provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1); see F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1195 (10th Cir. 2009).7 Section 230 thus immunizes interactive computer services (“ICSs”) only to the extent that they are sued for “information provided by another information content provider.”8
Anderson asserts that TikTok’s algorithm “amalgamat[es] [] third-party videos,” which results in “an expressive product” that “communicates to users ... that the curated stream of videos will be interesting to them[.]” ECF No. 50 at 5. The Supreme Court’s recent discussion about algorithms, albeit in the First Amendment context, supports this view.10 In Moody v. NetChoice, LLC, 144 S. Ct. 2383 (2024), the Court considered whether state laws that “restrict the ability of social-media platforms to control whether and how third-party posts are presented to other users” run afoul of the First Amendment. Id. at 2393. The Court held that a platform’s algorithm that reflects “editorial judgments” about “compiling the third-party speech it wants in the way it wants” is the platform’s own “expressive product” and is therefore protected by the First Amendment. Id. at 2394.
Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, id. at 2409, it follows that doing so amounts to first-party speech under § 230 as well.
Because TikTok concedes that Anderson’s complaint “describe[s] an algorithm indistinguishable from those addressed in NetChoice[,]” ECF No. 51 at 2, which the Supreme Court described as one that results in expressive speech, NetChoice, 144 S. Ct. at 2405 (holding that “social-media platforms are in the business, when curating their feeds, of combining multifarious voices to create a distinctive expressive offering” (internal quotation marks and citation omitted)), we need not weigh in on whether other algorithms result in expressive speech. Moreover, because TikTok’s “algorithm, as described in the complaint, does not” “‘respond solely to how users act online,’” ECF No. 51 at 2 (quoting NetChoice, 144 S. Ct. at 2404 n.5), TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech.
Here, as alleged, TikTok’s FYP algorithm “[d]ecid[es] on the third-party speech that will be included in or excluded from a compilation—and then organiz[es] and present[s] the included items” on users’ FYPs. NetChoice, 144 S. Ct. at 2402. Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own “expressive activity,” id., and thus its first-party speech. Such first-party speech is the basis for Anderson’s claims. See App. 39 (Compl. ¶¶ 107(k), (o)) (alleging, among other things, that TikTok’s FYP algorithm was defectively designed because it “recommended” and “promoted” the Blackout Challenge).11 Section 230 immunizes only information “provided by another[,]” and because Anderson’s claims rest on TikTok’s own expressive activity, § 230 does not bar them.
III
For the foregoing reasons, we will reverse in part, vacate in part, and remand.14
MATEY, Circuit Judge, concurring in the judgment in part and dissenting in part.
TikTok reads § 230 of the CDA to confer a nearly limitless immunity for the third-party content it chooses to serve.
But it is not found in the words Congress wrote in § 230.
I.
A.
Ten-year-old Nylah Anderson died after attempting to recreate the “Blackout Challenge” she watched on TikTok. The Blackout Challenge—performed in videos widely circulated on TikTok—involved individuals “chok[ing] themselves with belts, purse strings, or anything similar until passing out.” App. 31.3 The videos “encourage[d]” viewers to record themselves doing the same and post their videos for other TikTok users to watch. App. 31. Nylah, still in the first year of her adolescence, likely had no idea what she was doing or that following along with the images on her screen would kill her. But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her “For You Page”4 after it “determined that the Blackout Challenge was ‘tailored’ and ‘likely to be of interest’ to Nylah.” App. 31.
No one claims the videos Nylah viewed were created by TikTok; all agree they were produced and posted by other TikTok subscribers. But by the time Nylah viewed these videos, TikTok knew that: 1) “the deadly Blackout Challenge was spreading through its app,” 2) “its algorithm was specifically feeding the Blackout Challenge to children,” and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages. App. 31–32. Yet TikTok “took no and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent the Blackout Challenge from being shown to children on their [For You Pages].” App. 32–33. Instead, TikTok continued to recommend these videos to children like Nylah.
B.
Following her daughter’s death, Tawainna Anderson sued TikTok and its parent company, ByteDance, Inc. Anderson seeks to hold TikTok liable for 1) hosting the Blackout Challenge videos on its platform, 2) continuing to distribute the videos after it learned about the videos and the deaths that followed, and 3) recommending the videos to Nylah after TikTok knew the videos were likely to cause harm. TikTok moved to dismiss, arguing that Anderson sought to hold TikTok liable for acts completely immunized by § 230.
II.
TikTok maintains that Anderson’s claims are foreclosed by a nearly limitless interpretation of § 230.
A.
Like any man-made law, § 230 has a history, and that history informs its meaning.
1. Begin with the birth of long-distance communication. Like the chat rooms and bulletin boards provided by 1990s online service providers, telegraph companies long served as the conduit for communication for much of the late nineteenth and early twentieth centuries. Given the immense market power of the telegraph,5 the law regularly imposed access and nondiscrimination duties familiar to physical networks like railroads.6 That raised questions about liability, since state laws often held companies responsible for negligent deliveries. See Adam Candeub, The Common Carrier Privacy Model, 51 U.C. Davis L. Rev. 805, 810–15 (2018). Liability could also attach based on the content of third-party information. See Adam Candeub, Reading Section 230 as Written, 1 J. Free Speech L. 139, 145–47 & 146 n.26 (2021). While telegraph operators were ordinarily not responsible for the materials they transmitted, see O’Brien v. W. U. Tel. Co., 113 F.2d 539, 541–43 (1st Cir. 1940), liability could attach if the company knew the content was harmful, see Von Meysenbug v. W. U. Tel. Co., 54 F. Supp. 100, 101 (S.D. Fla. 1944); see also Biden v. Knight First Amend. Inst. at Columbia Univ., 141 S. Ct. 1220, 1223 & n.3 (2021) (Thomas, J., concurring).7 But that was the rare exception.
This was the common-sense system in place when online services first emerged.
2. The internet began infiltrating daily life in the early 1990s through large commercial service providers like CompuServe, Prodigy, and AOL.11 These emerging services “were born serving content of their own,”12 but, facing competition, they expanded to allow “users to post comments on bulletin boards, open to other members, and to communicate in chat rooms.”13 Those added functions resurrected the old legal question familiar to common carriers: Should online service providers be liable for the actions of third parties on their networks? Understanding how courts answered this question is essential to understanding the legal context in which § 230 was enacted.
Believed to be the first case in the United States “to decide whether an online service ... could be held liable for third-party content,”14 Cubby, Inc. v. CompuServe, Inc. involved a defamation claim arising out of an allegedly libelous statement appearing on one of CompuServe’s “special interest ‘forums.’” 776 F. Supp. 135, 137 (S.D.N.Y. 1991). These fora, “comprised of electronic bulletin boards, interactive online conferences, and topical databases,” allowed
subscribers to post their own messages and interact with other users. Id. Pivoting from the closed curation of the old networks, CompuServe did not review subscriber postings. Id. Inevitably, disagreements arose among the users, and a lawsuit followed seeking to hold CompuServe liable for a posting on its system.
The district court sketched two paths for determining CompuServe’s liability. Perhaps the company could be considered a “publisher,” someone strictly liable for repeating defamatory statements no matter the company’s knowledge of what was said and why it might be actionable. Id. at 139. Or perhaps the company was better understood as a “distributor,” like a library, bookstore, or newsstand, liable only if it knew or had reason to know of the defamatory content. Id. at 139–40. The court took the second path, concluding that CompuServe exercised no editorial control over the forum’s contents and so could not be held liable absent knowledge. Id. at 140–41.
CompuServe both won praise and stoked worry because the opinion turned on the amount and kind of editorial control exercised by the internet forum, a test that could vary in application from service to service. See, e.g., Jonathan M. Moses & Michael W. Miller, CompuServe Is Not Liable for Contents, Wall St. J. (Oct. 31, 1991). Prodigy, for example, sold subscribers on the rigor of its screening and the promise that families could enjoy online entertainment without offensive messages. That suggested Prodigy could be subject to strict liability because it was “the only major commercial [bulletin board] operator that monitor[ed] all public messages by screening them before they [were] posted.” David J. Conner, Note, Cubby v. CompuServe, Defamation Law on the Electronic Frontier, 2 Geo. Mason Indep. L. Rev. 227, 240 (1993).
These predictions proved prescient. Three years later, in Stratton Oakmont, Inc. v. Prodigy Services Company, No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995), Prodigy was sued for hosting allegedly defamatory statements posted on one of its electronic bulletin boards. Id. at *1. Following the reasoning of CompuServe, the Stratton Oakmont court found Prodigy “exercised sufficient editorial control over its computer bulletin boards to render it a publisher with the same responsibilities as a newspaper.” Id. at *3. That meant Prodigy was liable for any defamatory statements on its service. Id. at *3–5. Though it was a non-precedential opinion issued by a state trial court judge, Stratton Oakmont received significant attention, much of it negative.15 If Stratton Oakmont’s reasoning stood, online service providers acting to exclude offensive and obscene content would now risk liability for the rest of the material they hosted. See Adam Candeub, Bargaining For Free Speech: Common Carriage, Network Neutrality, and Section 230, 22 Yale J.L. & Tech. 391, 421 (2020).
B.
1. Congress responded vigorously, and a mere nine months after Stratton Oakmont, the President signed the Communications Decency Act of 1996 (CDA) into law as part of the Telecommunications Act of 1996.
As enacted, § 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1).
It is conventional wisdom that § 230 was enacted, at least in part, to overrule Stratton Oakmont.
2. But from the very start, courts held that § 230 sweeps more broadly, reading it to immunize online service providers from distributor liability as well.
C.
But this conception of § 230 stretches the statute beyond the text Congress enacted.
As with all cases involving the interpretation of statutes, our job in interpreting § 230 begins with its text.
1.
Candeub, Reading Section 230 as Written, supra, at 146–51. It cannot, in short, be held liable as a publisher.
But § 230 says nothing about immunizing a provider’s knowing distribution of harmful third-party content.
2. Properly read, § 230 shields an online service provider from liability only for merely hosting another’s information, not for the provider’s own conduct.
3. What does all this mean for Anderson’s claims? Well, § 230 bars those claims that would hold TikTok liable for the mere presence of the Blackout Challenge videos on its platform, but not those premised on TikTok’s knowing distribution and targeted recommendation of the videos.
* * *
“It used to be said that there were three great influences on a child: home, school, and church. Today, there is a fourth great
influence . . . .” Newton N. Minow, Speech Before the Nat’l Ass’n of Broads. (May 9, 1961), reprinted in Newton N. Minow, Television and the Public Interest, 55 Fed. Comm. L.J. 395, 399 (2003). When Commissioner Minow spoke of the perils and promise of television, the internet was still two decades from its earliest form. But his description of a “procession of game shows, . . . formula comedies about totally unbelievable families, blood and thunder, mayhem, violence, sadism, murder, . . . more violence, and cartoons” captures the dreary state of the modern internet. Id. at 398. The marketplace of ideas, such as it now is, may reward TikTok’s pursuit of profit above all other values. The company may decide to curate the content it serves up to children to emphasize the lowest virtues, the basest tastes. It may decline to use a common good to advance the common good.
But it cannot claim immunity that Congress did not provide. For these reasons, I would affirm the District Court’s judgment as it relates to any of Anderson’s claims that seek to hold TikTok liable for the Blackout Challenge videos’ mere existence on TikTok’s platform. But I would reverse the District Court’s judgment as it relates to any of Anderson’s claims that seek to hold TikTok liable for its knowing distribution and targeted recommendation of the Blackout Challenge videos. Accordingly, I concur in the judgment in part and dissent in part.
