Patterson v. Meta Platforms, Inc.
2025 NY Slip Op 04438
| N.Y. App. Div. | 2025

Background

  • This appeal arises from multiple consolidated civil lawsuits following the racially motivated mass shooting at a Buffalo grocery store on May 14, 2022, in which the shooter killed 10 people and injured three others.
  • Plaintiffs are survivors and family members of victims, suing various parties, including major social media companies (Meta/Facebook, Snap, Alphabet/Google/YouTube, Discord, Amazon, Reddit, 4chan, etc.) whose platforms the shooter used prior to the attack.
  • Plaintiffs allege tort causes of action (primarily negligence, unjust enrichment, and strict products liability based on design defect and failure to warn), claiming the social media platforms' algorithms and features promoted addiction, radicalization, and ultimately violence.
  • Social media defendants moved to dismiss, arguing they are immune from liability under Section 230 of the Communications Decency Act and protected by the First Amendment.
  • The trial court denied the motions; the appellate division reversed, dismissing the complaints against the social media defendants.
  • A dissent argued that the platforms themselves are defective products, not simply publishers of user content, and that Section 230 should not bar the claims.

Issues

Issue 1: Applicability of Section 230 immunity
  • Plaintiffs' argument: The social media companies are liable as product designers, not as publishers of third-party content.
  • Defendants' argument: Section 230 immunizes all claims that treat them as publishers of third-party content, regardless of how the claims are pleaded.
  • Held: Section 230 immunity applies; the tort claims are barred.

Issue 2: Legal effect of content-recommendation algorithms
  • Plaintiffs' argument: The algorithms render the platforms defective, so Section 230 should not apply.
  • Defendants' argument: Algorithms are editorial tools; organizing and displaying third-party content is a protected publishing function.
  • Held: Algorithms do not strip Section 230 protection; they remain traditional editorial functions.

Issue 3: First Amendment protection for recommended/curated content
  • Plaintiffs' argument: Section 230 does not apply, and even if the recommendations are treated as first-party content, the First Amendment does not shield design defects.
  • Defendants' argument: Algorithmic curation is protected expressive activity under the First Amendment (per recent Supreme Court dicta).
  • Held: Even if Section 230 did not apply, the defendants would be protected by the First Amendment.

Issue 4: Platforms as "products" for strict products liability
  • Plaintiffs' argument: Social media features are addictive design defects, independent of third-party content, and thus actionable under products liability.
  • Defendants' argument: Platforms are services, not products; the claims still hinge on third-party content, not design alone.
  • Held: The strict products liability claims fail; plaintiffs cannot avoid Section 230 by relabeling their theory.

Key Cases Cited

  • Packingham v. North Carolina, 582 U.S. 98 (2017) (discussing the importance of the internet as a public square for diverse viewpoints)
  • Reno v. American Civil Liberties Union, 521 U.S. 844 (1997) (characterizing the internet as a place of content "as diverse as human thought")
  • Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019) (holding that Section 230 immunizes social media platforms from liability for algorithmic content recommendations)
  • Shiamili v. Real Estate Group of N.Y., Inc., 17 N.Y.3d 281 (2011) (New York Court of Appeals examining the limits of Section 230 immunity for interactive computer services)
  • Moody v. NetChoice, LLC, 603 U.S. 707 (2024) (recent Supreme Court discussion of algorithmic sorting and First Amendment protection)

Case Details

Case Name: Patterson v. Meta Platforms, Inc.
Court Name: Appellate Division of the Supreme Court of the State of New York
Date Published: Jul 25, 2025
Citation: 2025 NY Slip Op 04438
Court Abbreviation: N.Y. App. Div.