Google-Flagged Child Porn Case Shows Court Split on Privacy

Sept. 24, 2021, 9:01 AM

A Ninth Circuit case concerning images that Alphabet Inc.'s Google flagged as possible child pornography highlights challenges for courts grappling with privacy protections for law enforcement searches of digital communications.

The case, United States v. Wilson, questioned whether a warrant was needed under the Fourth Amendment to search flagged images found in defendant Luke Wilson’s email account. The U.S. Court of Appeals for the Ninth Circuit ruled Sept. 21 that investigators in San Diego should have sought a warrant before viewing the images and acting on the tip.

The decision contributes to what the Ninth Circuit called “a growing tension” in federal appeals courts over how the Fourth Amendment applies to government searches of images that private sector companies like Google flag as child pornography.

Other courts, including the Fifth and Sixth Circuits, have allowed law enforcement officials to review suspected child pornography without a warrant, reasoning that their review merely confirmed the accuracy of the flags. The legal split even stretches across state courts in California, where Wilson was convicted and sent to prison on related child sexual abuse charges.

“We have a split in the same search,” said Jennifer Lynch, surveillance litigation director at the nonprofit Electronic Frontier Foundation. EFF and other privacy advocates submitted briefs in Wilson’s case, arguing that people expect privacy in their emails.

The state-level charges against Wilson stemmed in part from the flagged images, which led law enforcement officers to seek warrants to search his email account and his home. Officers found further evidence used in Wilson’s prosecution.

Wilson’s defense is urging the U.S. Supreme Court to step in, arguing that the initial warrantless search of the flagged images was illegal, meaning the additional evidence can’t be used against him either. Prosecutors in the federal case could also seek Supreme Court review, potentially leaving it up to the justices to decide both cases and resolve the split between them.

A representative for government prosecutors in the office of the U.S. Attorney for the Southern District of California declined to comment on the case or whether they would appeal to the Supreme Court.

Lawyers representing Google and Facebook Inc., which joined Google on a brief in the Wilson case, didn’t respond to requests for comment.

Automatic Reporting

The Ninth Circuit’s opinion shows that automated systems that big tech companies like Google use to detect child sexual abuse online “cannot automate away the requirements of the Constitution,” said Riana Pfefferkorn, a research scholar at Stanford University’s Internet Observatory, in an email.

The San Diego law enforcement task force at issue in Wilson’s case has since changed its policy so that it obtains a warrant before automatically flagged images are viewed, according to the Ninth Circuit opinion. The ruling could prompt other law enforcement agencies to rethink their procedures for investigating child sexual abuse online, Pfefferkorn said.

Tech companies should also come up with a better system for reporting child sexual abuse that protects privacy while also protecting children, according to Megan Iorio, counsel at the nonprofit Electronic Privacy Information Center.

“Automatic referral is not that,” she said. “That’s the lesson of Wilson.”

The appeals court took issue with the process followed for flagged images because no one at Google or the National Center for Missing and Exploited Children, an intermediary between tech companies and law enforcement, verified the content of the images before passing the tip along.

Human Review

The lack of human review meant law enforcement couldn’t rely on a special exception to the Fourth Amendment’s warrant requirements for situations where a person’s privacy has already been invaded by a private party’s search, the court found.

“AI screening alone is not enough in this case to support a warrantless search,” said Susan Hintze, a managing partner at Hintze Law PLLC and former in-house privacy counsel for Microsoft Corp., in an email. “The court based its decision on a volume of precedent that sets a very high bar to meet an exception for a warrantless search.”

Apple Inc. recently delayed plans for a similar system to scan user images for signs of child sex abuse after a backlash from privacy advocates who worry it could lead to tracking for other types of content. Apple intends to include a layer of human review to verify images marked as explicit before a user is reported to law enforcement.

Automated detection relies on a system known as hash matching, which detects duplicates of images previously identified as apparent child pornography.

“A picture speaks a thousand words. A hash match speaks a couple,” said Devin Burstein, a partner at Warren and Burstein who represents Wilson in the federal case. “You learn so much more private information from opening a file and looking at it.”
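Burstein's point can be illustrated with a short sketch. Production systems use perceptual hashing (such as Microsoft's PhotoDNA), which tolerates resizing and minor edits; the sketch below uses an exact cryptographic hash (SHA-256) purely to show the underlying idea that a match reveals only whether a file is identical to a known one, not what the file actually depicts. The hash database here is hypothetical.

```python
import hashlib

# Hypothetical database of hex digests of previously identified images.
# (This value is the SHA-256 digest of the bytes b"test", used as a stand-in.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(file_bytes: bytes) -> bool:
    """Flag a file if its digest matches a known digest.

    Note what this does NOT do: no one opens or views the file.
    A match says only "these bytes are identical to a file seen before."
    """
    return sha256_hex(file_bytes) in KNOWN_HASHES

print(is_flagged(b"test"))   # matches the known digest above
print(is_flagged(b"other"))  # unknown bytes are not flagged
```

This is the gap the Ninth Circuit focused on: the match itself conveys almost no information about the file's contents, so opening the file reveals far more than the hash did.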

The case is United States v. Wilson, 9th Cir., No. 18-50440, opinion issued 9/21/21.

To contact the reporter on this story: Andrea Vittorio in Washington at avittorio@bloomberglaw.com

To contact the editors responsible for this story: Kibkabe Araya at karaya@bloombergindustry.com; Keith Perine at kperine@bloomberglaw.com
