Anderson v. TikTok, Inc., 637 F. Supp. 3d 276 (E.D. Pa. 2022)
Background
- TikTok is a social-media platform that uses an algorithmically generated "For You Page" (FYP) to recommend third-party short videos to users, including minors.
- In Dec. 2021, ten-year-old Nylah Anderson viewed and attempted the "Blackout Challenge" on TikTok and later died from ligature injuries.
- Plaintiff Tawainna Anderson, Nylah's mother, sued TikTok, Inc. and ByteDance, Inc. on theories of design defect, failure to warn, negligence, wrongful death, survival, and consumer protection, alleging the platform's algorithm promoted dangerous third-party content to children.
- Defendants moved to dismiss, invoking lack of personal jurisdiction, failure to state a claim, and immunity under 47 U.S.C. § 230(c)(1). Plaintiff abandoned her state and California consumer‑protection claims in response.
- The district court held that Section 230 immunity was evident on the face of the complaint because Anderson's claims rest on how defendants published and promoted third-party content (including through algorithmic recommendations), and accordingly dismissed the remaining tort claims.
Issues
| Issue | Plaintiff's Argument | Defendant's Argument | Held |
|---|---|---|---|
| Whether § 230 bars Anderson’s product‑liability and negligence claims | Anderson frames claims as direct designer/manufacturer liability for a defective product (the app/algorithm), not as publisher liability | § 230 precludes treating interactive service providers as publishers or speakers of third‑party content; algorithmic recommendation and promotion are publisher functions | Court: § 230 bars the claims because they are inextricably linked to publication/distribution of third‑party content; dismissal granted |
| Whether algorithmic features (recommendations) are distinct from publisher functions (i.e., akin to Lemmon/Internet Brands exceptions) | Relies on cases where site design defects were independent of user content (e.g., Lemmon, Internet Brands) to argue claims survive § 230 | Defendants: algorithmic curation, promotion, and distribution are publisher decisions protected by § 230; Lemmon/Internet Brands are distinguishable | Court: Distinguishes those cases; algorithmic recommendation is a publisher function covered by § 230 |
| Status of consumer‑protection claims | Plaintiff initially alleged state UTPCPL and California CLRA claims | Defendants moved to dismiss; plaintiff did not defend those claims | Court: Plaintiff abandoned those consumer‑protection claims; they are dismissed |
Key Cases Cited
- Zeran v. Am. Online, Inc., 129 F.3d 327 (4th Cir. 1997) (Congress intended § 230 immunity to preserve robust online speech and minimize provider liability for third-party content)
- Green v. Am. Online (AOL), 318 F.3d 465 (3d Cir. 2003) (§ 230 shields provider decisions about monitoring, screening, and deleting third-party content)
- Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) (substance of claim—whether it treats defendant as publisher—controls over its label)
- Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093 (9th Cir. 2019) (algorithmic recommendations are not independent content creation and fall within publisher functions)
- Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019) (use of algorithms to match content to users falls within publisher functions protected by § 230)
- Doe v. Internet Brands, Inc., 824 F.3d 846 (9th Cir. 2016) (§ 230 inapplicable where duty to warn arose independent of regulation of user content)
- Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021) (site feature allegedly caused harm independent of user content—court declined § 230 immunity)
- Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008) (claims based solely on failure to implement site safety measures are effectively claims about publishing user communications)
- Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135 (4th Cir. 2019) (discussed by plaintiff as an example of product-liability framing; court analyzes whether the claim inherently treats defendant as publisher)
