- Marathon hearing to test limits of Section 230 legal shield
- State court judge already allowed similar suits to proceed
The fate of hundreds of lawsuits claiming social media platforms have caused a youth mental health crisis will soon be in the hands of a federal judge who must evaluate whether the claims can bypass the tech industry’s legal shield.
The companies behind Facebook, Instagram, TikTok, Snapchat, and YouTube have moved to dismiss the addiction cases—currently sitting at 425 lawsuits—that have been filed around the country by parents and children. Hundreds of school districts are also involved, arguing the crisis has strained their mental health resources.
The parties will convene for a marathon hearing on Friday in a federal district court in Oakland, Calif., where the lawsuits were consolidated in what’s known as a multidistrict litigation case.
The hearing comes as the tech industry faces a wave of intense legal scrutiny from state regulators and plaintiffs’ attorneys seeking to hold social media platforms accountable for the alleged harms their products have caused young people.
The states’ complaint was assigned to US District Judge Yvonne Gonzalez Rogers, who is also overseeing the multidistrict litigation. Colorado Attorney General Phil Weiser said at a Tuesday press conference the states’ case against Meta could be folded into the addiction litigation because they contain overlapping evidence and legal questions.
A California state judge recently allowed hundreds of similar addiction lawsuits to proceed in state court against the platforms, finding that immunity provided by Section 230 of the 1996 Communications Decency Act wasn’t a viable defense.
Social media platforms for decades have relied on that federal statute to block lawsuits stemming from the content created and posted by users.
Rogers, who’s overseeing the addiction cases in the US District Court for the Northern District of California, isn’t obligated to follow the state court’s ruling, but the reasoning behind it could be persuasive. Regardless of the ruling, the explosive growth in legal action against the platforms is only just beginning.
“We’re in the very early stages of this issue,” said Matthew Lawrence, a health law professor at Emory University who has been following the social media cases. “Every case about tobacco addiction against the cigarette manufacturers failed for decades, until they didn’t.”
Novel Theory Tested
The plaintiffs in both the federal and state social media cases aim to advance a novel legal theory to bypass the Section 230 shield that for decades has been impenetrable in courtrooms. The 300-page master complaint filed by the plaintiffs in the federal case attempts to treat the platforms as products that use defectively designed algorithms to maximize user attention, similar to a slot machine at a casino.
“They’re taking old theories of tort law and applying them to new technologies in new contexts, which is expected,” said Elizabeth Burch, a law professor at the University of Georgia who studies multidistrict litigation.
Websites that face lawsuits over their publication of user content are traditionally protected by Section 230. The law’s immunity has faced intense criticism in recent years from lawmakers in both parties: Republicans argue it allows platforms to censor conservative speech, while Democrats contend it allows platforms to host hate speech.
But the political parties have found unity in efforts to protect children online. The states’ lawsuit against Meta was brought by both Republican and Democratic attorneys general, and lawmakers in Congress have introduced bipartisan bills seeking to establish greater protections for kids online.
Digital rights groups and free speech advocates argue Section 230 laid the groundwork for the modern internet. Without its protections, the groups have said, most online platforms would shut down out of fear of legal liability.
The lawsuits’ product liability theory attempts to get around the legal shield by arguing children face harms that stem not from the content they see, but from the platforms’ decisions about how they recommend content to young users.
“This case is about conduct, not content—defects, not speech,” the plaintiffs said in a brief responding to the platforms’ Section 230 defense.
Rogers initially told the platforms in the addiction case last year that she wouldn’t evaluate the Section 230 defense until the US Supreme Court issued a decision in Gonzalez v. Google, a case that tested its scope. The high court, however, punted the Section 230 question in May. At oral arguments, the justices appeared to have a difficult time distinguishing between content and conduct.
Now, Rogers will have to parse the existing case law on that question without the Supreme Court’s guidance.
“This is going to be one of the first test cases that really draws the line on the amount of interactivity that is sufficient or insufficient to overcome the protections of Section 230,” said Patrick Luff, a products liability attorney based in Texas.
Past Rulings
Plaintiffs’ attorneys frequently cite a 2021 ruling from the US Court of Appeals for the Ninth Circuit, Lemmon v. Snap, as support for their product liability theory.
In that case, the court found that Section 230 didn’t shield Snap from a negligent design claim over Snapchat’s speed filter, which the plaintiffs alleged encouraged reckless driving, because the claim targeted the app’s design rather than content created by users.
But many other product liability lawsuits have failed to overcome Section 230, with judges ruling that defective design allegations are ultimately about content created by users.
The addiction suits are about the “creation and dissemination of content by others—from videos about viral pranks and online challenges, to images that allegedly invite personal comparisons, to communications with other users,” the platforms said in their motion to dismiss the federal case.
Earlier this month, Los Angeles County Judge Carolyn Kuhl ruled Section 230 doesn’t protect the platforms in hundreds of addiction cases filed in state courts. While she found the platforms aren’t “products” for the purpose of a product liability claim, she said the lawsuits did sufficiently argue the companies had been careless in designing their algorithms, a negligence theory.
Courts should be cautious “not to stretch the immunity provision of Section 230 beyond its plain meaning,” the judge said.
Section 230 advocates say the judge got the ruling wrong. Allegations of addictive recommendation algorithms are ultimately about addiction to particular kinds of content, said Jess Miers, legal advocacy counsel at the tech-aligned think tank Chamber of Progress.
“At the end of the day, it’s the underlying content—and that content was never identified by the court,” she said.
The case is In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, N.D. Cal., No. 4:22-md-03047, hearing 10/27/23.