Patterson v. Meta Platforms, Inc.
2025 NY Slip Op 04385
| N.Y. App. Div. | 2025

Background
- This case arises from four consolidated actions brought after the racially motivated mass shooting at a Buffalo grocery store on May 14, 2022.
- Plaintiffs are survivors and families of victims. Defendants are numerous social media companies whose platforms the shooter allegedly used in the period before the attack.
- Plaintiffs allege negligence, unjust enrichment, and strict products liability, arguing that the platforms’ addictive, algorithm-driven design features caused the shooter’s radicalization and violence.
- Defendants moved to dismiss, citing immunity under Section 230 of the Communications Decency Act (CDA) and the First Amendment.
- The trial court denied the motions, and the defendants appealed.
- The Appellate Division reversed, dismissing the complaints against the social media defendants.
Issues
| Issue | Plaintiffs’ Argument | Defendants’ Argument | Held |
|---|---|---|---|
| Section 230 Immunity | Section 230 does not shield platforms from liability as product designers; the claims target defective, addictive design, not the publication of third-party content | Section 230 bars any claim that treats a platform as the publisher or speaker of third-party content, including content surfaced by its recommendation algorithms | Section 230 immunity applies; the claims are fundamentally about the publication of third-party content |
| First Amendment Protection | Recommending harmful/racist content via algorithms is not protected; claims based on product design, not speech | Content moderation and recommendation algorithms are protected expressive activity; any liability would chill free speech rights | Algorithms recommending third-party content are protected by the First Amendment; immunity applies |
| Strict Products Liability | Social media platforms are products; their defective, addictive design caused the harm, independent of content | Platforms are not products under tort law; the alleged harms arise from content, so a products liability theory does not apply | Even if platforms are products, the claims rest on third-party content and are barred by Section 230 |
| Causation | The platforms’ addictive design and features caused the shooter’s radicalization and actions | The shooter’s criminal acts are a superseding cause; addiction to content, not to the platforms themselves, was the alleged link to the harm | Causation is too remote; the intervening criminal act breaks the causal chain; the claims fail on causation even apart from immunity |
Key Cases Cited
- Packingham v. North Carolina, 582 U.S. 98 (2017) (internet as essential public square)
- Reno v. American Civil Liberties Union, 521 U.S. 844 (1997) (First Amendment protection for internet speech)
- Shiamili v. Real Estate Group of N.Y., Inc., 17 N.Y.3d 281 (2011) (application of Section 230 immunity in New York)
- Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019) (algorithmic recommendations do not deprive platforms of publisher immunity under Section 230)
- Moody v. NetChoice, LLC, 603 U.S. 707 (2024) (algorithmically curated content is expressive activity protected by the First Amendment)
- Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021) (Section 230 does not bar product liability claims based solely on design features unrelated to content)
