Bloomberg Law
Aug. 16, 2021, 9:01 AM

Social Platforms Feel Policy Pressure on Teen Privacy Controls

Andrea Vittorio

Tech companies including TikTok, Alphabet Inc., and Facebook Inc. are tightening privacy controls for teenagers as social media platforms feel policy pressure over protections for younger users.

Short-video platform TikTok, which is popular among teens, is changing privacy settings for users ages 13 to 17 to give them more control over who they share videos and messages with, the company announced in a blog post.

TikTok already makes accounts belonging to users under age 16 private by default, meaning only followers they approve can view their videos. Features like direct messaging with other TikTok users are reserved for those 16 and older. The new controls for teens ages 13 to 17 build on these existing measures by adding privacy prompts for video posts and downloads, along with a new default for direct messages.

TikTok’s measures come on the heels of similar moves to make teen accounts on Alphabet’s YouTube and Facebook’s Instagram more private by default.

Tech companies are likely acting in response to new design standards for children’s privacy in the U.K. and a legislative proposal targeting teen privacy in the U.S., according to Josh Golin, executive director of children’s advocacy group Fairplay.

The U.K.'s so-called age appropriate design code directs online services to build in privacy protections by default and to explain settings in ways that children would understand. Enforcement of the code begins in September.

“You can clearly see the hand of the age appropriate design code in all three of these announcements,” Golin said of the recent moves by TikTok, Google, and Facebook.

U.S. lawmakers are also showing interest in teen privacy, which could lead to new rules for companies to follow. Legislation proposed in Congress earlier this year would require consent for collecting data from teenagers.

Age Ranges

In the U.S., children under age 13 are subject to the Children’s Online Privacy Protection Act. The law, known as COPPA, gives parents control over what information online platforms can collect about their kids.

Companies that violate COPPA can face fines from the Federal Trade Commission. TikTok reached a record $5.7 million settlement with the agency in 2019 for not safeguarding the data of app users who were under 13. Google’s YouTube was also subject to a $170 million children’s privacy pact with the commission.

A bill from Sens. Edward Markey (D-Mass.) and Bill Cassidy (R-La.) would extend privacy protections to teenagers ages 13 to 15. In the House, Rep. Kathy Castor (D-Fla.) has floated a bill, H.R. 4801, with similar protections for teens up to 17.

The Senate bill, S. 1628, also would raise legal expectations for social media companies to know that children are on their platform. Children’s advocates argue that stricter standards are needed, because some children lie about their age to create accounts on platforms like TikTok and Facebook’s photo-sharing app Instagram.

The U.K.'s design code, which covers children under 18, separates them into different age groups depending on their maturity level. Ages 13 to 15 are considered early teens, while those between 16 and 17 are approaching adulthood, according to the code.

TikTok’s new privacy controls are specific to teens in those age bands.

“They clearly had this code in mind,” said Phyllis Marcus, a partner at Hunton Andrews Kurth who previously led the FTC’s program for children’s privacy online. Other online platforms are expected to adopt measures like these as policymakers in the U.K. and U.S. shine a light on safeguarding children online, Marcus said.

Default Settings

TikTok’s measures are meant to help teens understand available settings, according to the post from Alexandra Evans, the company’s head of child safety public policy, and Aruna Sharma, global head of privacy.

The platform, owned by China’s ByteDance Ltd., is adding a pop-up when teens under age 16 post their first video, asking them to choose who can watch it. Teens on TikTok can continue to decide their audience for each video going forward.

“The process of making a TikTok is fun and creative—choosing music, picking effects, and getting the transitions right—but it is just as important to choose who that video will be shared with,” the post from Evans and Sharma said.

When a teen between the ages of 16 and 17 joins TikTok, their account will be set in a way that requires them to choose a sharing option before being able to message directly with other users. Teens in that age bracket will also see a pop-up asking them to confirm their settings before other TikTok users can download their videos.

YouTube is likewise making videos uploaded by users ages 13 to 17 private by default in the coming weeks, while Instagram is already defaulting teens younger than 16 into private accounts.

TikTok also announced that it’s limiting how late at night its app sends push notifications to teens.

That shift “seems to fit in with” what would be required under another piece of legislation from Markey, according to Ariel Fox Johnson, senior counsel for global policy at the nonprofit Common Sense, which researches and rates online platforms used by children.

The bill, S. 3411, would ban push alerts that encourage kids and young teens to spend more time on an app.

To contact the reporter on this story: Andrea Vittorio in Washington

To contact the editors responsible for this story: Melissa B. Robinson, Keith Perine