Bloomberg Law
Nov. 14, 2022, 2:00 AM

ANALYSIS: New Year Threatens §230 Immunity for Social Media

Golriz Chrostowski
Legal Analyst

Until recently, social media platforms have enjoyed broad immunity from certain civil and criminal claims under Section 230 of the Communications Decency Act, codified at 47 U.S.C. §230.

The breadth of this immunity has come under fire, though, as more federal district and appellate courts are asked to interpret its limits. Now the US Supreme Court has agreed to take on the issue.

With a strongly conservative Supreme Court composed of justices who are often textualists—and one justice in particular who’s taken aim at the immunity issue—social media platforms can expect a stricter interpretation of §230 in 2023.

Consequently, social media platforms can expect a wave of new cases in which they face liability for their product design flaws, including their powerful algorithms, potentially without the shield of §230 immunity.

Section 230 Immunity

Enacted in 1996—before the pervasiveness of social media—§230 was intended to protect minors from receiving offensive material online through “interactive computer services.” These include internet service providers and social media platforms.

Section 230 states that providers or users of interactive computer services shouldn’t be treated as a “publisher or speaker” of information provided by a third party. The statute also shields these providers and users from liability when they block (or don’t block) objectionable information provided by a third party.

By limiting liability through the statute, Congress intended to encourage the continued development of interactive computer services, which, in turn, has led courts to interpret and apply §230 immunity broadly.

Most commonly, courts have granted immunity to defendant social media platforms that were sued for offensive or defamatory content posted by a third party.

However, the immunity has also been applied in other cases where the social media platforms arguably played a more active role in advancing objectionable content. This has divided circuit court judges who questioned the breadth of §230, but were also bound by precedent.

The Lemmon Case

In 2021, the Ninth Circuit took a stab at limiting the scope of §230 in Lemmon v. Snap, Inc., a product liability case in which the plaintiffs alleged that defendant Snap Inc. negligently designed a feature on its social media platform, Snapchat, that recorded the real-time speed of drivers or passengers. The social appeal was that users would drive over 100 mph, overlay the speed onto a photo or video, and post it as a “Snap.”

The plaintiffs were the parents of teenage boys who died in a car accident, allegedly prompted by the speed feature. The parents claimed that shortly before the accident, one of the boys opened his Snapchat app and captured their speed at 123 mph. Within minutes, the car ran off the road at an estimated 113 mph and crashed. The parents alleged that Snap’s negligence in designing the speed feature was a critical cause of the accident.

Snap sought a dismissal, arguing that it was entitled to §230 immunity. The trial court agreed, reasoning that the speed feature was a neutral tool—akin to a regular speedometer—that could be used for proper and improper purposes. The court found that the parents were essentially seeking to hold Snap responsible for failing to regulate how their sons used the speed feature. Under §230, Snap couldn’t be held responsible for the content created by a third party, the court said, and dismissed the case.

On appeal, the Ninth Circuit reversed the grant of immunity, and sent the case back to the trial court.

Thomas’s Heavy Hand

The Ninth Circuit’s reversal wasn’t surprising. While the appeal was pending, Justice Clarence Thomas issued a statement in an unrelated §230 case that the high court decided not to hear.

In his statement, Thomas cited Lemmon as one of several cases where immunity may have been inappropriately applied to a defendant’s own misconduct (i.e., the product design flaws).

Thomas has been a vocal critic of the many courts that “have construed the law broadly to confer sweeping immunity on some of the largest companies in the world.” Twice, he explained that in an “appropriate case,” the Supreme Court should consider “whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by internet platforms.”

Ultimately, the Ninth Circuit held that Snap could be sued because the parents’ claim neither treated Snap as a “publisher or speaker,” nor relied on content posted by a third party. Rather, the parents’ claim of negligent design put Snap in the hot seat as the creator of the speed feature, which was designed in a way that could foreseeably lead to injury.

This interpretation of §230 has led to a slew of new cases, allowing plaintiffs to pursue product liability claims arising from a social media platform’s own misconduct—as opposed to that of a third party and their content.

Supreme Changes Ahead

As for Thomas, his sought-after “appropriate cases” have apparently arisen.

Recently, the Supreme Court agreed to hear two Ninth Circuit cases to resolve whether social media platforms are immune from liability under §230 when they make targeted recommendations, via their algorithms, of content provided by third parties.

In these two cases, the plaintiffs alleged that the defendant social media platforms weren’t immune from liability because—as opposed to simply letting ISIS upload videos—the platforms recommended inflammatory videos created by ISIS to specific users based upon information gathered through their algorithms. The court will likely hear the cases early next year, and issue decisions by the end of the term in June.

Predictions for 2023

The outcome in Lemmon foreshadows how the Supreme Court will likely rule. It’s fair to say that the court will limit the scope of §230 immunity, allowing claims arising from a social media platform’s own misconduct (including product design flaws) to proceed.

These platforms may soon be held responsible for their role in targeted recommendations—via their powerful algorithms—of objectionable content, even if the content is provided by third parties.

Consequently, social media platforms can expect an increase in the number of cases in which they’re sued for unreasonably and foreseeably dangerous social media products that caused damages and—in some cases—death.

