- FTC seeks stronger child privacy protections
- Proposed rulemaking follows four years of comment review
The Federal Trade Commission proposed updated rules this week to limit how companies monetize children’s online data even as legislation that would further enhance the agency’s powers to protect kids continues to languish in Congress.
Advertising targeted at kids younger than 13 and the sale of their educational data would be banned if the FTC finalizes a proposed rule amending its enforcement of the Children’s Online Privacy Protection Act. The proposal would also formalize requirements for education technology providers and subject data collectors to new security standards, according to the notice published on Dec. 20.
Although the rule changes clarify and codify the FTC’s position on children’s privacy since its last update in 2013, privacy experts contend rulemaking will remain limited unless Congress passes an update to COPPA with expanded powers. Sponsors of a Senate bill to broaden the agency’s authority have expressed interest in protecting all children under 16 while establishing a new youth marketing and privacy division at the agency and tighter restrictions on social media. Privacy advocates have also asked lawmakers to close a loophole used by companies to claim non-liability when collecting children’s data—but in two years these proposals have yet to receive a Senate floor vote.
None of these expanded protections can be included in rulemaking without passage of the Children and Teens’ Online Privacy Protection Act, also referred to as COPPA 2.0, or a similar legislative proposal, said Phyllis Marcus, who led the FTC’s children’s privacy enforcement program before becoming a partner at Hunton Andrews Kurth LLP.
The proposed changes are relatively incremental, Marcus said. “Fundamentally more seismic” shifts to how children’s privacy is enforced federally could come only if lawmakers take action, she said.
Monetizing Data
The FTC’s latest proposal would prohibit companies from retaining data they collect on children “indefinitely” or from using push notifications to encourage children to log onto, or remain on, a digital platform. Industry groups that monitor data collectors for compliance with the law also would face new oversight standards.
Those changes could affect revenue streams for tech companies and content creators, said Garrett Johnson, an assistant professor studying marketing and privacy at Boston University’s Questrom School of Business. Johnson is researching the fallout of a $170 million COPPA settlement between YouTube and the FTC, which forced the platform to eliminate personalized ads and settings for children. The deal, he found, spurred an 18% drop in children’s content and slashed ad revenue for those channels. One YouTuber’s ad prices tumbled 73% in the year after the agreement, Johnson said.
“Here we can see a platform that overnight shuts down precise advertising and shuts down all the personalization on its platform,” Johnson said. “As a researcher, that’s a really rich environment to try to understand the impacts of these moves that you don’t get to see otherwise.”
Sales of student data for commercial purposes, like advertising, would also be banned if the proposed rule is finalized. That would force small education technology providers that rely on selling educational data to third parties to identify new streams of income, said Derek E. Baird, the chief youth and privacy officer at BeMe, a teen mental health app.
“I have advised a lot of edtech startups and the first thing I always do is say, tell me what you’re doing about privacy,” Baird said. “Sometimes there’s a complete absence of understanding that COPPA exists, so I think for smaller companies it’s going to be a rude awakening.”
The FTC laid out its compliance expectations for edtech companies via policy guidance published in May 2022. The agency’s proposed rulemaking seeks to cement those requirements as part of the enforceable rule.
The agency also proposed expanding its definition of personal information to include biometric identifiers like fingerprints.
Transparency
Central to current COPPA enforcement are six organizations approved by the agency to certify digital platforms’ compliance with the children’s privacy law. These “safe harbor” organizations—two non-profit groups and four for-profit entities—would have to publish their membership lists and more detailed annual reports under the FTC’s proposed rule.
One safe harbor organization, the Entertainment Software Rating Board, voiced concern in public comments that competitors would poach clients from published membership lists.
Despite the competitive risks, greater transparency is a positive development, said Dona Fraser, a former director at The Children’s Advertising Review Unit, one of the nonprofit safe harbor organizations.
“Most businesses don’t have to do that, but there’s a level of transparency here that’s required for the public, for everyone who’s in the landscape to know who the good actors are,” Fraser said.
Publicizing safe harbor members runs a slight risk of exposing them to even greater scrutiny from consumer advocates, which could chill some from participating in the first place, she said.
The changes follow several instances of public criticism from the agency itself about the effectiveness of safe harbor programs. Rohit Chopra, then an FTC commissioner, called for greater oversight of the programs in May 2020. A year later the agency removed Aristotle International, Inc. as a safe harbor organization for failing to properly monitor whether its members were complying with the children’s privacy law.
“You can sort of draw a line from Chopra’s comments to Aristotle to this” rulemaking, said Laura Riposo VanDruff, a former assistant director in the FTC’s Bureau of Consumer Protection and chair of Kelley Drye & Warren LLP’s advertising practice.
20 Questions
The FTC spent four years sifting through 175,000 public comments to inform its proposed rulemaking. But the agency’s work on COPPA’s guardrails may remain unfinished, said Fraser, who is now the senior vice president of privacy initiatives at BBB National Programs.
The FTC outlined 20 questions in its proposed rulemaking document for comment on a range of issues, from additional limits on advertising to changing certain definitions. The regulator’s questions indicate it’s “still not quite settled” in its approach to enforcing COPPA, Fraser said, and the volume of comments the agency receives this time around could determine whether the proposed rulemaking is finalized or headed for a second, refined version.
The additional feedback will shape next steps for the agency’s enforcement approach, and any action from Congress would only further bolster an agency focused on children’s internet safety, said Marcus, the former FTC official.
“They are asking enough questions at the back end of this document about whether they should also consider other issues,” said Marcus. “That leads me to believe people are now being called on to comment not just on the text of the changes the FTC proposes, but also on all of these other questions that they raise.”