Plaintiffs seeking to hold online platforms accountable for facilitating crimes committed by their users are finding success using a new avenue: product liability.
Platforms enjoy broad protection from liability for users’ posts under Section 230 of the 1996 Communications Decency Act, a law that’s been credited with the rise of social media and the internet as we know it. It’s also been criticized as shielding some of the world’s largest companies when they turn a blind eye to abuse and harm perpetrated through their offerings.
Federal judges overseeing lawsuits against Omegle.com LLC and Snap Inc. came to opposite conclusions just days apart about whether those companies are protected by Section 230’s liability shield. Omegle, a website that randomly matches users to video chat, and Snapchat, the person-to-person photo sharing app, were each alleged to have facilitated the sexual abuse of a minor by an adult.
While Snap escaped its suit in Texas, Omegle will have to continue fighting a suit in Oregon, where a judge agreed that the company was being treated as the maker of a defective product, not the host of user-made content.
“These social media companies have products that are dangerously designed and should face responsibility for the harms that they cause their users,” said Carrie A. Goldberg, a victims’ rights attorney representing the plaintiff in the Omegle case.
But proponents of Section 230’s protections say the Oregon judge bungled the Omegle decision. The new product liability litigation trend, they say, could have harmful consequences for free speech on the internet and the ability of vulnerable people to speak out.
If courts start limiting protection in these cases, “Section 230 could always be taken away, including when users are using the platform in ways that are good,” said Cathy Gellis, an internet law attorney who has written amicus briefs supporting the law.
Publisher or Product Maker?
Section 230, enacted in 1996 as part of a broad set of changes to communications law, provides a safe harbor for interactive computer services that host user content. The law was intended to facilitate the growth of the internet by allowing platforms to moderate user-provided content without being treated as the publisher, which would open them up to lawsuits over users’ posts.
For more than two decades, the Section 230 liability shield had been nearly impenetrable in courtrooms. But in 2021, the Ninth Circuit ruled that Snap could be sued for the design of a feature on its app.
In that case, Lemmon v. Snap, two boys were killed in a car crash after attempting to take photos using Snapchat’s “speed filter,” which shows how quickly a user is moving. The parents sued Snap, arguing that the use of the speed filter contributed to the deaths.
The appeals court found that Section 230 didn’t apply because the parents were treating “Snap as a products manufacturer, accusing it of negligently designing a product with a defect”—not as a publisher.
The product liability argument has since gained traction among plaintiffs’ attorneys. In recent months, lawsuits filed in courts across the country have accused TikTok of encouraging young users to participate in the dangerous “blackout” challenge and claimed Facebook has caused addiction, depression, and suicide among children.
“Product liability law is a vehicle that is well known to incentivize companies to make safer products by internalizing the cost of safety,” said Matthew Bergman, an attorney with decades of experience in asbestos product litigation who founded the Social Media Victims Law Center last year. He is serving as counsel in many of the new suits.
Goldberg was one of the first to use the product liability claim in a 2017 lawsuit against Grindr, which ultimately failed on appeal in the Second Circuit. But she said “the wave of litigation that we’re seeing is basically an endorsement of that concept.”
The legal tactic has drawn skepticism. Eric Goldman, an internet law professor at the Santa Clara University School of Law, said the product analogy isn’t appropriate in the context of online platforms, which are almost always based around speech.
“We’re talking about services that help people talk to each other,” Goldman said. “It’s not an appropriate analogy, because in the end if we say that plaintiffs can decide whether speech products are properly designed, that just leads to flat-out censorship.”
In the Snap case, US District Judge Lee H. Rosenthal in the Southern District of Texas rejected the product liability theory. Her July 7 decision held that Snap couldn’t be liable for facilitating sexual grooming of a high school student by a teacher at the school.
The plaintiff’s attorneys argued that Snap’s app was negligently designed to allow users to easily lie about their age to create an account and that it should have safety features that would have prevented communication between the plaintiff and the teacher.
The judge, however, said that still treats Snap as a publisher, so the company can’t be held liable.
Only a week later, though, Judge Michael W. Mosman in the District of Oregon mostly denied Omegle’s motion to dismiss similar allegations. The platform could be on the hook for helping facilitate sex abuse by allowing a minor to randomly match with a sexual predator, the judge said.
Mosman was convinced by Goldberg’s product liability argument. “Plaintiff’s case does not rest on third party content,” the judge wrote. “Plaintiff’s contention is the product is designed in a way that connects individuals who should not be connected (minor children and adult men) and that it does so before any content is exchanged.”
Omegle said in a statement to Bloomberg Law that it disagrees with the court’s ruling, and that the lawsuit seeks “to hold Omegle liable for the reprehensible actions of a third-party user of the chat service,” which it said is barred by Section 230.
Goldman said he believes the judge “cut corners” and misapplied the Lemmon precedent, which Goldman said was narrowly written. The harm in this case comes from communication between the users, which is within the scope of the liability shield, he argued.
If Omegle randomly matched adults and minors, but then never allowed them to communicate, the platform could be liable under Mosman’s interpretation, which doesn’t make sense, Goldman said. “The only time harm could kick in isn’t from the matching, it’s from the resulting conversation.”
Curtailing Section 230 protections will also result in fewer safe spaces for vulnerable people across the internet, Gellis argued. Without the protections, platforms that don’t have perfect safety features or moderation policies will close down in the face of heightened legal liability, the internet law attorney said.
When Congress amended Section 230 in 2018 to remove the safe harbor in cases of online prostitution, the legal repercussions became “all consuming,” Gellis said. Consensual sex workers lost access to useful online tools and platforms like Craigslist’s personals section, which the company pulled down over the threat of lawsuits, she said.
“The big picture is that more people are helped by having Section 230 than hurt by having Section 230,” Gellis said.
But Goldberg said platforms should be held accountable for the “catastrophic injuries” in the cases her firm brings. “Holding Omegle liable for this kind of extreme abuse really has no impact on the free exchange of ideas on the internet,” she said. “There has to be limits.”