- Suit says social media app is a breeding ground for trafficking
- Judge has difficulty ‘reconciling’ conflicting legal precedent
A San Francisco federal judge overseeing a lawsuit from a woman alleging a sex trafficker used Instagram to groom her and advertise her for sex grappled with whether Meta Platforms Inc. is immune from the claims.
Judge Rita F. Lin said differing appeals court precedents on the scope of Section 230 of the federal Communications Decency Act left her conflicted about whether that legal shield applies to Meta.
“I’m having a hard time reconciling these cases in some ways,” Lin said of the different rulings.
Section 230, passed by Congress in 1996, immunizes internet platforms from civil lawsuits that stem from actions and content posted by users.
The plaintiff, who goes only by Jane Doe, alleged that Meta has “knowingly created a breeding ground for human trafficking” on Instagram by failing to properly verify the identities of user accounts.
The victim alleged that in 2017 she was contacted on Instagram by another user who groomed her by gaining her trust and then publicly posted photos of her that “obviously advertised Jane Doe for sale for sex,” according to the complaint filed in 2022.
The suit was first brought in Texas state court but later transferred to the US District Court for the Northern District of California.
Meta moved to dismiss the case on Section 230 grounds, arguing that the harm in the case ultimately stems from the trafficker and his communications with the plaintiff, not from any conduct by Meta.
Lin said the Ninth Circuit case Lemmon v. Snap Inc., which found that Section 230 doesn’t apply, “is probably the best case” for the plaintiff. In that case, the parents of two boys who died in a car crash sued Snap over the Snapchat app’s “speed filter,” which shows how quickly a user is moving and allegedly caused the boys to speed while driving.
But Lin also said that a more recent Ninth Circuit ruling, Estate of Bride v. Yolo Technologies Inc., appears to go against the plaintiff’s legal theory. In that case, the family of a child who was cyberbullied and died by suicide sued YOLO, a platform that allows users to send each other anonymous messages. The appeals court found that Section 230 does apply because the family sought to hold YOLO liable for the speech of its users, even if the harm to the child occurred offline.
Meta attorney Kristin Linsley of Gibson Dunn & Crutcher LLP said the ruling in YOLO should resolve the entire case. A “straightforward” analysis shows that the victim was ultimately harmed by the communication from the trafficker, she said. “This is in line with many other Section 230 precedents that say it’s not about artful pleading,” Linsley said.
The plaintiff’s attorney, Walter Simons of Bracewell LLP, countered that the victim is seeking to hold Meta liable for not properly verifying the identities of its users, which has nothing to do with online content. “Checking someone’s identity is not a traditional function of a publisher,” he said.
Lin said what she found most “challenging” about the plaintiff’s argument was determining whether the harm ultimately stems from online content created by users, in which case Section 230 immunity would apply.
The plaintiff’s theory of liability “appears directed at the content that is produced when you have anonymous conduct, much like cyberbullying,” Lin said.
Boucher LLP, Annie McAdams PC, and Sico Hoelscher Harris LLP also represent the plaintiff.
The case is Doe (K.B.) v. Backpage.com LLC, N.D. Cal., No. 3:23-cv-02387, 11/12/24.