- Claims against non-algorithm-based sites seem weaker
- Judge worries about broad impact on defamation cases
New York judges on Tuesday hinted that a lawsuit seeking to hold internet companies—including Meta, Google, Reddit, and Amazon—liable for a 2022 mass shooting in Buffalo may hinge on which companies use algorithms to feed content to users.
Lawyers for survivors and family members of the shooting victims defended their claims that the algorithms those sites use to deliver tailored content to users should be considered products under the state’s product liability law, though they conceded that some of the defendant companies, specifically chat room-focused sites Discord and 4Chan, don’t use algorithms.
The companies that use algorithms seem to be “in a category unto themselves,” and the arguments that the algorithms can be treated as products subject to liability may be stronger than those against the non-algorithm-based companies, said Associate Justice John Curran of the New York Supreme Court’s Appellate Division, Fourth Department, in Rochester.
The internet companies say they’re protected from liability under the federal Communications Decency Act, which says no “interactive computer service” provider can be treated as “the publisher or speaker of any information provided by another information content provider.”
They want the court to dismiss four consolidated cases seeking to hold them accountable for disseminating “extreme and harmful” content that radicalized then-18-year-old Payton Gendron and led him to carry out the May 14, 2022, shooting at Tops Friendly Market.
Everytown for Gun Safety, which represents some of the plaintiffs, is backed by Michael Bloomberg. Bloomberg Law is operated by entities controlled by Michael Bloomberg.
Proximate Cause
The plaintiffs argue the sites were the proximate cause of the shooting—something that Curran noted, in a separate but connected set of appeals in which the plaintiffs seek to hold gun makers accountable for the shooting, is an issue for a trial judge or jury to analyze.
Snap Inc. attorney Jonathan Schneller of O’Melveny & Myers LLP argued there are circumstances where proximate cause can be decided on the law, but Curran said “it’s awfully difficult” to say the plaintiffs don’t have a cause of action “based on the absence of the factors relating to proximate cause.”
If the cases aren’t about the content as the plaintiffs say, then Discord is “in a pretty good position,” Associate Justice Stephen Lindley said. “I’m not sure I wouldn’t see a strong proximate cause defense for you,” he told an attorney for the site, adding that giving users virtual rewards for posting and engaging with content doesn’t seem like it would lead to a foreseeable harm.
The content on 4Chan “by a factor of 100 is more alarming than the content of any other parties” in the case, but the site still might be in a better legal position because it doesn’t use an algorithm to push anything to users, Lindley said.
“If you’re going to rank forums for radicalization, I might put 4Chan at the top of the list, but that doesn’t mean they’re not entitled to protection” under the Communications Decency Act, the judge said.
John Elmore of Buffalo, representing some of the plaintiffs, agreed with the judges that the claims against the algorithm-based defendants are the plaintiffs’ strongest argument for liability, but emphasized that all of the defendants should be kept in the case.
Broad Implications
Lindley cited a 2024 decision, Anderson v. TikTok Inc., in which the US Court of Appeals for the Third Circuit ruled the social media site must again face a lawsuit from the family of a 10-year-old girl who died after trying the “blackout” challenge promoted on the platform. Section 230 of the Communications Decency Act doesn’t apply to TikTok because its recommendation and promotion of blackout challenge videos was the site’s “own expressive activity,” the court said last August.
The Third Circuit seemed to draw a distinction between first-party speech, which isn’t protected under Section 230, and third-party speech, which is, Lindley said. Excluding algorithms from the federal liability shield and accepting Anderson “as authority” would “eviscerate the entire purpose of 230,” which could also broaden the reach of defamation cases, the judge added.
The cases at issue aren’t about defamation, but “if you open the door to this first-party, third-party concept, it covers everything,” he said.
Anderson wrongly conflated the First Amendment and Section 230, said Eric Shumsky of Orrick, Herrington & Sutcliffe LLP, arguing on behalf of Meta, Reddit, and Google. Section 230 was enacted “for the purpose of broadening First Amendment protections,” and protects publishing activity, he said.
Associate Justices Tracey Bannister, Nancy Smith, and Henry Nowak also sat on the panel.
Morrison & Foerster LLP represents Discord Inc. Orrick, Herrington & Sutcliffe LLP represents Meta, Alphabet, and Reddit. O’Melveny & Myers LLP represents Snap Inc. Harris Beach Murtha Cullina PLLC represents 4Chan Community Support. Hueston Hennigan LLP represents Amazon and Twitch Interactive Inc. Tycko & Zavareei LLP represents Jones and Stanfield. John Elmore of Buffalo represents Patterson. Dicello Levitt LLP represents Salter.
The cases are Salter v. Meta Platforms Inc., N.Y. App. Div., 4th Dep’t, No. CA 24-00524; Patterson v. Meta Platforms Inc., N.Y. App. Div., 4th Dep’t, No. CA 24-00513; Jones v. MEAN LLC, N.Y. App. Div., 4th Dep’t, No. CA 24-00515; and Stanfield v. MEAN LLC, N.Y. App. Div., 4th Dep’t, No. CA 24-00527. Oral arguments in all four were held 5/20/25.