Bloomberg Law
Sept. 5, 2019, 3:35 PM

YouTube Settlement Offers Lessons for Child Privacy Compliance

Sara Merken
Reporter

Website platform operators can look to the Federal Trade Commission’s settlement with Google over alleged violations of children’s privacy law for guidance on staying within the law, attorneys said.

YouTube, Google’s video-sharing site, agreed to pay $170 million to resolve allegations by the FTC and the New York Attorney General that the company violated the FTC’s rule implementing the Children’s Online Privacy Protection Act (COPPA). The settlement, announced Sept. 4, requires YouTube to change some of its business practices beyond what COPPA requires, including developing a system for channel owners to identify their child-directed content on YouTube.

The case marks the first time the FTC has held a platform liable under COPPA for content posted by someone else, Andrew Smith, the director of the FTC’s Bureau of Consumer Protection, said at a news conference.

The settlement “sends a strong message to children’s content providers and to platforms about their obligation to comply with the COPPA Rule,” FTC Chairman Joseph Simons and Republican Commissioner Christine Wilson said in a statement.

The action comes as U.S. regulators boost scrutiny of big technology companies and how they collect and use consumer data. Google and YouTube will pay $136 million to the FTC—the largest amount the commission has ever received in a COPPA case—and $34 million to New York.

The FTC and New York alleged YouTube collected personal data from kids under 13 without parental consent, in violation of COPPA. YouTube collected “persistent identifiers”—codes that track users over time and across different services—to serve targeted advertisements to viewers of child-directed channels, the regulators said.

For platform operators, the settlement sends a message to “look at how you’re interacting with channels or content creators publishing on your site,” said Kandi Parsons, an attorney at ZwillGen PLLC and former senior staff attorney in the FTC’s Division of Privacy and Identity Protection.

‘Actual Knowledge’

Platforms are generally not responsible for the child-directed content on their sites unless they have actual knowledge that they’re collecting personal data from users of a child-directed site, attorneys said.

The regulators claimed that YouTube did have actual knowledge, because it marketed itself as a “top destination” for kids, among other reasons.

Sites that are popular with children “should be on notice now, after this case, that they too could be held to have actual knowledge they’re hosting kid-directed content,” said Phyllis Marcus, a partner at Hunton Andrews Kurth LLP and former leader of the FTC’s children’s privacy enforcement program.

Companies and platforms should be mindful of any differences between what they say in their marketing efforts and in their terms of service, attorneys said.

“Be careful how you describe your site in your marketing materials, as these descriptions could come back to hurt you, as they did against YouTube,” Allison Fitzpatrick, a partner at Davis & Gilbert LLP, said.

Companies that know their platforms include child-directed content also can’t “hide behind the language in [their] terms of service,” Fitzpatrick said. Those companies should “implement a system, such as an age-gate, to help guard against the collection of personal information from children on the areas of the site that are not intended for children,” she said.
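Neither the settlement nor the attorneys quoted prescribe a particular design for such a system. As a rough, hypothetical illustration of the kind of age-gate Fitzpatrick describes, a site might withhold persistent identifiers until a visitor’s age is known. The Python sketch below is an assumption for illustration only; the function names and session settings are invented, and the under-13 cutoff reflects COPPA’s coverage of children under 13.

    from datetime import date

    COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

    def age_from_birth_year(birth_year, today=None):
        """Rough age estimate from a self-reported birth year."""
        today = today or date.today()
        return today.year - birth_year

    def build_session(birth_year):
        """Return session settings; withhold tracking identifiers for under-13 visitors."""
        if age_from_birth_year(birth_year) < COPPA_AGE_THRESHOLD:
            # Child-directed experience: no persistent identifiers,
            # no behavioral ad targeting, contextual ads only.
            return {"tracking_enabled": False, "ad_mode": "contextual"}
        # General-audience experience (still subject to other privacy rules).
        return {"tracking_enabled": True, "ad_mode": "personalized"}

    if __name__ == "__main__":
        print(build_session(2012))  # under 13 -> tracking disabled
        print(build_session(1990))  # adult -> standard session

In practice, an age-gate would sit at the entry point to the non-child-directed areas of a site, and the resulting flag would govern whether cookies or other identifiers are set at all.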

To contact the reporter on this story: Sara Merken in Washington at smerken@bloomberglaw.com

To contact the editor responsible for this story: Keith Perine at kperine@bloomberglaw.com