Apple Ignored Child Sexual Abuse Material on iCloud, Suit Says

Aug. 14, 2024, 4:15 PM UTC

Apple Inc. allowed the storage and distribution of child sexual abuse material on its iMessage and iCloud products under the pretense of privacy protections, according to a proposed class action.

Apple knew it had a “dire CSAM problem but chose not to address it,” according to a complaint filed Tuesday in the US District Court for the Northern District of California.

Instead, the tech giant engaged in “privacy-washing,” the complaint said, touting its commitment to protecting users’ privacy without implementing practical measures to do so.

Apple wielded privacy as “a justification to look the other way while Doe and other children’s privacy is utterly trampled on through the proliferation of CSAM on Apple’s iCloud,” according to the complaint, which was brought on behalf of an unnamed 9-year-old plaintiff.

The child, referred to as Jane Doe in the complaint, was coerced into producing and uploading CSAM on her iPad, the filing said. The North Carolina resident had been communicating with unknown users on Snapchat when they sent her videos containing child sexual abuse via iMessage.

The individuals asked the child to record and upload her own videos, according to the complaint.

Apple’s privacy policy states that it can screen and scan content to root out CSAM, but it failed to do so in practice, the complaint said.

In 2021, the tech giant announced NeuralHash, a new tool designed to scan images for CSAM. The project was abandoned in 2022 over concerns about potential “bulk surveillance,” according to the complaint.

Other technology providers such as Meta Platforms Inc., Snap Inc., and Google rely on detection techniques like PhotoDNA to identify CSAM, the complaint said.

Apple chose not to adopt industry standards for CSAM detection and instead shifted the responsibility onto users, according to the complaint.

“Apple relies on ‘privacy’ as an excuse to not invest in safety, but it is a false narrative,” it said. “Privacy and safety need not be mutually exclusive.”

Apple didn’t immediately respond to a request for comment.

The complaint cited text messages unearthed during discovery in Epic Games v. Apple that showed Apple’s anti-fraud chief, Eric Friedman, telling a colleague that Apple had become “the greatest platform for distributing” child sexual abuse material.

The complaint accuses the tech giant of violating sex trafficking and consumer protection laws, and includes claims for breach of contract, misrepresentation, and unjust enrichment. The plaintiff seeks injunctive relief, compensatory and punitive damages, and attorneys’ fees and costs.

The putative class action also asks that Apple adopt measures to prevent the storage and distribution of CSAM on its products, ensure that its privacy policy matches its practices, and create reporting mechanisms for users to flag inappropriate images.

The Buche Law Firm PC and Eisenberg & Baum LLP represent the unnamed plaintiff.

The case is Doe v. Apple Inc., N.D. Cal., No. 5:24-cv-05107, complaint filed 8/13/24.

To contact the reporter on this story: Cassandre Coyer in Washington at ccoyer@bloombergindustry.com

To contact the editor responsible for this story: James Arkin at jarkin@bloombergindustry.com