- State AGs say Meta could have used age verification
- Suit could add to push by states to require the technology
A lawsuit brought by California and more than 30 state attorneys general against Meta Platforms Inc. could intensify a state-level push to require age-verification technology on social media.
The lawsuit alleges that, in addition to harming young users’ mental health, Meta violated federal children’s privacy law by failing to get parental consent before collecting data from users younger than 13. Instead, the company relied on what the state attorneys general call a faulty age-gating process to keep young users off its platforms, all while turning a blind eye to evidence that significant numbers of underage users remained.
State attorneys general argued that Meta could have used “feasible alternative age verification methods” such as “requiring young users to submit student IDs upon registration,” according to the complaint filed on Oct. 24 in the US District Court for the Northern District of California.
Pressure from lawmakers on companies to better shield young users from harmful content has created a flood of opportunities for companies that collect forms of identification or use advanced facial recognition, known as age estimation, to verify a user’s age. But the industry’s growing reach has sparked concerns from privacy groups about the technology’s untested nature and data-security risks. Laws requiring the technology have attracted scrutiny from tech industry and civil liberties groups, leading to First Amendment challenges in courts against several states over laws requiring age verification on social media and adult websites.
“You’re going to see a very, very robust battle between the governments and Meta as to whether or not various forms of age verification are practical, deployable at scale, and effectively able to avoid compromise,” said Matthew Woods, a partner at Robins Kaplan LLP. “And that’s a technical question that I think the courts are going to have to wrestle with.”
Verification vs. Consent
The federal Children’s Online Privacy Protection Act doesn’t require platforms to verify the ages of users. Rather, the law requires platforms to obtain parental consent to collect data from users under 13 if they have actual knowledge those users are using the platform or if the platform is targeted toward children.
The attorneys general’s complaint alleges that while Meta says it doesn’t allow users under 13 to join the platform—thereby removing its requirement to collect parental consent—the company essentially ignores the presence of younger users. Meta “routinely obtains actual knowledge that users on Instagram and Facebook are under 13 years old” and targets features and advertising toward young users, making it responsible for ensuring it has parental consent, according to the filing.
“You can age-screen to determine that this person behind the screen is likely to be a child. But we also know that age-screening can be circumvented,” said Sheila Millar, a partner at Keller & Heckman LLP. “I think it becomes really challenging in that environment to determine, from a legal standpoint under COPPA, what are the limits of the online services’ responsibility?”
The argument that Meta could have used age verification to prevent access by underage users puts the state attorneys general in the middle of a growing fight over the role of age-verification technologies in keeping minors away from harmful content.
Several states have already turned to age verification as a solution to growing concerns about how online content can affect children’s mental health. Arkansas enacted a law earlier this year requiring social media companies to perform a “reasonable” level of age verification to make sure users under 18 weren’t using their platforms without parental consent. Utah enacted a similar law, while Texas adopted one requiring age verification on websites featuring sexual content. Louisiana enacted requirements for both social media and adult websites. Meanwhile, California’s Age-Appropriate Design Code requires platforms to use age estimation to sort users into different groups to determine whether content is appropriate.
Industry groups and free-speech advocates say such laws violate users’ First Amendment rights. Federal judges have paused the Arkansas, Texas, and California laws as legal challenges against them proceed, while a case against the Louisiana porn law was thrown out by a federal judge earlier this month.
This week’s lawsuit against Meta makes an end-run around state legislatures and puts the question directly into the hands of the federal courts, just as lawmakers and regulators are seeking to update children’s online privacy laws for the modern age.
“The timing is really interesting in light of this line of cases we’ve gotten reaffirming the First Amendment issues with age verification,” said Bailey Sanchez, senior counsel with the Future of Privacy Forum. “It’s very clear in the COPPA rule there’s no requirement to ask user ages.”
Meta Target
The complaint’s suggestion that Meta could have turned to tactics like collecting users’ student IDs raises serious “privacy implications,” Sanchez said.
Critics have raised concerns about how age-verification companies store and use biometric data collected from users, as well as about the accuracy of the facial-recognition technology. The Federal Trade Commission is currently considering a proposal that would allow companies to offer age-estimation tools under COPPA to verify that an adult has consented to a minor’s online use.
Though the text of COPPA doesn’t directly require any form of age verification, a court ruling against Meta could force the company—and others—in that direction. If Meta is found to have been negligent, the court could require the company to choose between strengthening its age-gating methods through means like age verification, or implementing verifiable parental consent, said Woods.
Meta may already be heading toward such a solution, regardless of how this lawsuit plays out. Last year, Instagram began testing new ways to verify users’ ages, including using technology to estimate a user’s age based on a video selfie or allowing users to upload a driver’s license.
Though the case is focused on Meta, its effects could be much broader—and other platforms with potentially underage users should be on alert, said Woods.
“I think the message here to any operator looking at this lawsuit is, if they come after Meta, they can probably come after me and I better do an audit of my procedures,” he said.
The case is People of the State of California v. Meta Platforms Inc., N.D. Cal., No. 23-cv-05448.