- Novel failure-to-warn legal theory may clear hurdle
- Judge skeptical that warning would be undue compelled speech
A Los Angeles judge on Thursday expressed skepticism toward social media companies’ arguments that plaintiffs can’t wield a novel legal theory to hold them liable for the design of their sites.
Social media giants, including Meta Platforms Inc., Snap Inc., TikTok Inc., and Google LLC, asked Judge Carolyn B. Kuhl in a demurrer hearing to scrap several plaintiffs’ non-product negligent failure-to-warn claims that seek to hold platforms accountable for youth addiction.
But Kuhl pushed back on attorney David Mattern’s argument that claims of that kind fall outside the scope California law allows. The failure-to-warn framework isn’t strictly limited to products, she said, suggesting that certain services can be subject to such claims.
For example, California law would hold a dry cleaner—providing a service, rather than a product—to a duty to warn customers of fumes, said Kuhl, of the California Superior Court, Los Angeles County.
To Mattern’s response that those claims would be encompassed by product or premises contexts, Kuhl said, “You’re just defining that there’s no such thing as services, I think. You’re slicing the salami so thin that there’s nothing between—everything’s a product.”
She also raised an eyebrow at Mattern’s argument that warnings would be compelled speech in violation of the First Amendment.
“Tobacco companies tried that, trust me,” Kuhl said. “I was litigating cases when I saw that argument. It didn’t work.”
First Amendment, Section 230
Kuhl also asked Mattern to comment on a U.S. Court of Appeals for the Third Circuit panel’s interpretation of the U.S. Supreme Court’s 2024 NetChoice ruling, under which Section 230—the federal shield protecting online platforms from lawsuits concerning third-party content—doesn’t bar claims targeting video recommendation algorithms.
She added that her “going in proposition” is “I’m bound by Wozniak and none of this matters.”
The Third Circuit panel erred by tethering its reasoning to the view that the NetChoice decision overruled Section 230, Mattern said.
Attorney Josh Autry, for the plaintiffs, pushed back hard against the companies’ claims that the First Amendment and Section 230 immunize them from liability over third-party predatory behavior.
He drew the analogy of a “Meta music festival,” where the company invited children but excluded their parents, paired up strangers, and sent them to private areas.
Autry, of Morgan & Morgan, asked Kuhl to imagine that the arrangement spurred child predation complaints, but the festival pleaded that it had no legal duty to warn attendees because “music is expressive.”
“They want a right that literally no company has,” Autry said. “They want to be unique under the law.”
The case is Social Media Cases JCCP, Cal. Super. Ct., No. JCCP5255, 10/10/24.