Web Giants in NY Mass Shooting Case Say Algorithm Isn’t Product

May 19, 2025, 9:00 AM UTC

Internet giants Google, Meta, Amazon, and Reddit will argue Tuesday before a New York court that they can’t be held liable for publishing racist content that allegedly pushed a gunman to kill 10 people in a 2022 mass shooting at a Buffalo supermarket.

But survivors and family members of the shooting victims say the algorithms those sites use to deliver tailored content to users should be considered products under the state’s product liability law—a question the state’s trial court punted on answering for now.

To get their way, they’ll need to convince the appellate court judges that, for the first time in New York, an intangible item should be considered a product for purposes of strict liability.

“The position plaintiffs are advancing in this case is far outside of any type of strict product liability claim that New York courts have recognized in the past, and I would be surprised if the law swung so far in that direction in a single case,” said Thomas Kurland, a partner at Patterson Belknap Webb & Tyler LLP who focuses on products liability.

The trial court twice last year denied the companies’ motions to dismiss four consolidated cases that seek to hold them accountable for disseminating “extreme and harmful” content. The companies may still prove they can’t be held responsible because they act as publishers who host third-party content, but the judge said at this stage of litigation there are viable allegations from victims of the May 14, 2022, shooting at Tops Friendly Market.

Everytown for Gun Safety, which represented the plaintiffs in lower courts, is backed by Michael Bloomberg. Bloomberg Law is operated by entities controlled by Michael Bloomberg.

Algorithms as Products

The state Supreme Court’s Appellate Division, Fourth Department, is one of many courts around the country wrestling with how to apply Section 230 of the federal Communications Decency Act in wrongful death suits, as well as when to hold websites liable for gun violence.

The plaintiffs argue that the shooter, who was 18 years old at the time, was motivated by racist online content to commit a hate crime by targeting people in a historically Black neighborhood hundreds of miles from where he lived.

The internet companies say they’re protected from liability under the law, which says no “interactive computer service” provider can be treated as “the publisher or speaker of any information provided by another information content provider.”

But the plaintiffs say it’s not about the content that was published, but rather the proprietary algorithms the companies use to serve content to users.

If the automatic ranking and delivery of content is separated from the Section 230 liability shield, it will “suddenly make many, many pieces of content open to liability” and incentivize platforms that rely on automated ranking systems to suppress or eliminate content they’re worried about, said Kate Ruane, director of the Center for Democracy and Technology’s Free Expression Project. The center filed an amicus brief in support of the internet companies.

Ruane said it will be interesting to see if the court addresses whether Section 230 applies to livestreaming platforms, like Twitch—another of the defendant companies. The lower court didn’t address that in its March 2024 opinion.

“We make an argument in our brief that Section 230 does apply to those services because they’re interactive computer providers, they are publishing other content provided by other people, and it should be a straightforward analysis,” Ruane said.

Proximate Cause

Plaintiffs will also have to convince the judges that the sites were the proximate cause of the shooting, said Paul Barrett, who recently retired as deputy director of the Center for Business and Human Rights at New York University’s Stern School of Business. The suits claim platforms like Discord and Facebook are defective in their design, and those defects led to the radicalization of shooter Payton Gendron.

“In this context, there is an actor whose actions are more obviously the proximate cause of the terrible harm and the consequent damages,” Barrett said of Gendron. There’s a “strong argument,” he added, that the plaintiffs “are trying to hold liable a set of parties who are active in the background, but the proper target for holding somebody civilly liable is the person in the foreground.”

The appellate court could reject efforts to expand the scope of New York products liability law to cover websites and social media platforms if it decides to overturn the lower court’s finding. But the easiest approach, Kurland said, would be for the court to conclude it’s premature to dismiss the case at the pleading stage and allow for more discovery to establish a full factual record.

If the court says algorithms aren’t covered under the state’s “fairly expansive tort law,” it could signal to plaintiffs in other states that such arguments are unlikely to succeed, Barrett said.

“This is not a bad state to try and bring an injury suit in,” he added. “So if New York State basically closes the courthouse door to these kinds of arguments, that sends a signal.”

The cases are Salter v. Meta Platforms Inc., N.Y. App. Div., 4th Dep’t, No. 24-00450, and Patterson v. Meta Platforms Inc., N.Y. App. Div., 4th Dep’t, No. 24-00513, with oral arguments scheduled for 5/20/25.

To contact the reporter on this story: Beth Wang in New York City at bwang@bloombergindustry.com

To contact the editors responsible for this story: Alex Clearfield at aclearfield@bloombergindustry.com; Patrick L. Gregory at pgregory@bloombergindustry.com
