Social Media Trial Sparks Reckoning for Product Design

March 3, 2026, 7:07 PM UTC; Updated: March 3, 2026, 9:56 PM UTC

Companies are watching to see whether a bellwether lawsuit accusing Meta’s Instagram and Google’s YouTube of knowingly designing products that harm children will force online platforms to add guardrails around their algorithms and other code.

A ruling that puts Meta and YouTube on the hook for product liability could increase legal risk for developers and companies that implement consumer-facing algorithms, including AI products.

A 20-year-old social media user sued Meta Platforms Inc., Google LLC, Snap Inc., and TikTok Inc., claiming the companies knowingly deployed addictive features she interacted with from a young age, leading to her depression and self-harm. At issue in the trial, brought under product liability law and currently underway in Los Angeles Superior Court, is whether the companies knew about the dangerous effects of their products and failed to warn consumers. Snap and TikTok settled out of court before the trial. Snap declined to comment, and TikTok did not respond to a request for comment.

“It’s an overdue reckoning,” said Peter Jackson, attorney at Greenberg Glusker LLP. “It’s probably the most clarion call moment where businesses that operate online need to be thinking about product design as though they were designing any other type of consumer product.”

Meta and Google have maintained in court that their products weren’t built to hook children and that they already have settings and safeguards in place to protect young users.

“The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles,” Liza Crenshaw, a spokesperson for Meta, wrote in an email. “The evidence will show she faced many significant, difficult challenges well before she ever used social media.”

Google declined to comment.

The lawsuit is the first of thousands targeting tech companies over allegedly addictive technologies to proceed to trial.

Charting New Territory

The Los Angeles judge dealt a blow to tech companies early in the case by ruling that a federal law shielding online platforms from liability tied to user content does not apply.

The Communications Decency Act of 1996, specifically Section 230, protects online platforms from liability for content posted by third-party users. But in the three decades since the law passed, the internet has evolved dramatically. The federal protections have been the subject of bipartisan congressional efforts to prevent the law from shielding online abuse spurred by social media platforms. In the absence of new legislation, litigants are looking to courts, like the one in Los Angeles County, to step in.

“You can’t hide behind Section 230 to the same extent that you once felt assured you were able to,” said Jackson. “It’s not just the Facebooks of the world that need to be worried.”

Private lawsuits aren’t the only way companies are at risk. Texas and Massachusetts have sued tech companies over allegedly addictive design features that violate consumer protection laws. State lawmakers could take their cues from the wave of litigation by enacting bolder laws regulating design features they say harm children.

“State lawmakers are likely to be watching these social media trials closely because they could clarify whether existing tort and consumer protection laws are sufficient to address alleged platform design harms to children,” said Tatiana Rice, senior director of the US legislation team at the Future of Privacy Forum. “Even if the plaintiffs win, showing that existing law may be sufficient, lawmakers may nonetheless look to use those common law determinations and remedies as a legislative blueprint.”

Challenges to design-feature laws have relied on the same arguments surfacing in the civil trials, including arguments around First Amendment rights. Those challenges haven’t deterred states from pushing design-feature legislation, though so far the courts haven’t let the laws take effect. A ruling shooting down companies’ defenses could put state efforts on more solid footing and open a whole new area of enforcement for companies to worry about.

Compliance Now

Companies that develop or use algorithms to personalize customer experiences could see increased legal risk if the Los Angeles ruling goes against Meta and YouTube, said Philip Yannella, partner at Blank Rome LLP.

“There haven’t been a lot of lawsuits that have successfully argued that code is subject to strict liability law,” said Yannella. “This is charting new ground in that area. Algorithms are a fungible product to design defect principles.”

The suit could also expose companies to other legal risks, including having to disclose internal communications in court.

“The theory of liability opens a company to new avenues of discovery that could be risky and problematic,” said Genny Ngai, partner with Morrison Cohen.

Companies need to think proactively about what harms their technology may cause, said Ngai.

“There’s never been a better time to be proactive and affirmative in implementing safeguards,” she said. “It’s all about child safety and vulnerable populations. And the question is, ‘Are social media doing anything to protect these populations?’”

To contact the reporter on this story: Tonya Riley in Washington at triley@bloombergindustry.com

To contact the editors responsible for this story: Michelle M. Stein at mstein1@bloombergindustry.com; Fawn Johnson at fjohnson@bloombergindustry.com
