Companies are struggling to navigate compliance with a wave of state age verification requirements that are butting up against protections for kids’ data.
More than 25 states have passed age verification laws for online service providers. The spree of lawmaking has put companies that have long denied having users under 13 in a difficult spot: verify the ages of users they believe are adults, and risk unlawfully collecting children's data in the process, or run afoul of state age verification laws. As a result, companies could find themselves facing additional scrutiny from state and federal regulators as they chart next steps.
“The way things are moving, it’s becoming harder and harder to bury your head in the sand about whether or not there are kids on your platform,” said Jessica Lee, partner at Loeb & Loeb.
The federal government could help provide clarity. Federal Trade Commission Chair Andrew Ferguson said the agency was working on a policy statement regarding age verification technology, as well as a possible amendment to the agency’s Children’s Online Privacy Protection Act regulation.
“We hope to incentivize a wider adoption of age verification technology that would enable operators of online services to know whether children are visiting their website, and if they are, to ensure that the operators take the necessary steps, not only to comply with COPPA but also to impose safeguards for children at a time of rapid technological innovation,” he said at a workshop last month.
An FTC spokesperson declined to comment on the timing of either proposal.
Knowledge Burden
State legislators, who are on track to pass more age verification laws this year, and companies facing new international requirements aren’t waiting for the federal government to act. Both Discord and Roblox this month announced mandatory age verification to access some services and settings. The announcements follow a wave of other companies adopting age-gated services, even as US state laws face ongoing legal challenges.
New age-verification processes introduce a conundrum for companies. By collecting additional information about children, they may find themselves running afoul of the Children's Online Privacy Protection Act, a 1998 federal law that requires companies targeting, or aware of, children on their platforms to obtain parental consent before collecting data from kids under 13. If they don't verify ages at all, they could be subject to penalties under state age verification laws.
As FTC commissioner, Ferguson criticized updates to COPPA finalized under the Biden administration for failing to make an exception to parental consent for the collection of children’s personal information “for the sole purpose of age verification, along with a requirement that such information be promptly deleted once that purpose is fulfilled.”
Both Roblox and Discord have said that data used to verify users’ ages is deleted after the verification process — something required by some state age verification laws.
Data Security
Using third parties to collect that data, whether through the upload of a government ID or a biometric scan, can raise additional data security concerns. In October, for example, a third-party service Discord used to handle age-related appeals suffered a breach affecting approximately 70,000 users, potentially exposing their government IDs. The vendors handling Discord's new age assurance requirements were not involved in that earlier breach, the company said in a press release.
Discord did not respond to a request for comment.
Even after initial verification, companies may continue to collect behavioral data on users to detect potential violations.
Roblox monitors users’ behavior, such as the in-game experiences they use and who they interact with, to detect potential discrepancies with a user’s declared age, according to a spokesperson.
“You’re essentially engaging in a ton of profiling of children,” said Jessica Lipson, partner at Morrison Cohen. “I think there’s going to be a lot of questions from the regulators—what are you doing with that data, what derivative data are you actually creating, and who’s going to have access to that?”
Those questions could include whether data is used to train algorithms. The Federal Trade Commission brought previous enforcement actions against Amazon and Weight Watchers for using data unlawfully obtained from children to train their algorithms.
Companies also need to be careful about legal limitations on what sensitive data can be collected and for how long.
“A lot of the data that might be collected in the course of trying to verify someone’s age can be sensitive,” said Duane Pozza, partner at Wiley Rein. “Personal data needs to be treated very carefully.”
Pozza pointed to a crop of state laws and bills with data retention limits.
While age verification laws vary between states, most generally require companies to use “commercially reasonable” standards to ascertain a user’s age and gate their services accordingly. Companies are waiting on the FTC for a gold standard as to what that means.
“Right now you see companies taking a variety of approaches but it would be great if there was a consistent, uniform way to approach this,” said Lee.
Meanwhile, companies need to make sure they’re carefully vetting third-party vendors and conducting due diligence on new services, said Heidi Tandy, partner at Berger Singerman.
“I think one of the major things for companies to understand in terms of compliance is that ‘commercially reasonable’ does not necessarily mean perfect in absolutely every way,” said Tandy. “So striving for something that is both economically feasible and also meets the needs of the company, and its investors and individual users, is extremely important.”