Bloomberg Law

TikTok Beats Suit Saying ‘Blackout Challenge’ Caused Child Death

Oct. 26, 2022, 2:16 PM

TikTok Inc. escaped a lawsuit alleging the immensely popular social media platform recommended videos of the “blackout challenge” to a 10-year-old girl who strangled herself to death attempting to replicate the challenge.

A Pennsylvania federal judge ruled Tuesday that the short-form video-sharing app owned by the China-based ByteDance Inc. was protected by Section 230 of the Communications Decency Act, a law that immunizes online platforms from lawsuits over user-generated content.

The case was part of a recent wave of lawsuits advancing a new legal argument in an attempt to get around the historically impenetrable liability shield. The lawsuit, filed in May, argued that TikTok is a product that was defectively designed by promoting blackout challenge videos in users’ feeds.

The challenge encouraged users to strangle themselves with household items and post footage on TikTok. The lawsuit said that in December 2021 Tawainna Anderson found her daughter Nylah hanging in her closet after attempting the challenge. Nylah died five days later.

Judge Paul S. Diamond of the US District Court for the Eastern District of Pennsylvania ruled that the suit ultimately pinned liability on the platform for content posted by other users.

TikTok recommended videos created by other users, which is “exactly the activity Section 230 shields from liability,” Diamond said. “The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”

Diamond distinguished the case from a 2021 Ninth Circuit ruling holding that Snap Inc. could face liability over an allegedly defective design of its Snapchat app. That case, Lemmon v. Snap, focused on Snapchat’s “speed filter,” which showed how quickly a user was moving and allegedly contributed to a fatal car crash involving two teenagers.

“The Court ruled that Section 230 did not apply because the plaintiff’s claims were ‘independent of the content’ created by Snapchat’s users,” Diamond said, which isn’t the case for Tawainna Anderson’s lawsuit.

An attorney for Tawainna Anderson didn’t immediately return a request for comment.

More than 80 similar product liability lawsuits targeting TikTok, Snapchat, and Meta Platforms Inc.'s Facebook and Instagram were consolidated earlier this month under one docket in the Northern District of California. The Judicial Panel on Multidistrict Litigation ruled that the lawsuits raised similar questions of law relating to the Section 230 liability shield and whether apps are classified as products.

The US Supreme Court also announced this month that it would hear a case challenging whether YouTube enjoys immunity under Section 230 for its recommendation algorithms. The family of an American student who died in the Islamic State group’s 2015 terrorist attacks in Paris alleged that YouTube allowed the terrorist group to grow and radicalize its followers.

Saltz Mongeluzzi & Bendesky PC represents Anderson. Campbell Conroy & O’Neil and King & Spalding represent TikTok.

The case is Anderson v. TikTok Inc., E.D. Pa., No. 2:22-cv-01849, 10/24/22.

To contact the reporter on this story: Isaiah Poritz in Washington at iporitz@bloombergindustry.com

To contact the editors responsible for this story: Adam M. Taylor at ataylor@bloombergindustry.com; Jay-Anne B. Casuga at jcasuga@bloomberglaw.com