Certain features of the popular video-sharing app aren’t available to users younger than 18 in the UK and some other European countries, according to a version of TikTok’s privacy policy that the groups cite. The UK’s data watchdog recently began enforcing a design code aimed at ensuring age-appropriate experiences online, with similar guidance issued in countries like Ireland and the Netherlands.
“Where a local practice or policy is found to maximize children’s safety or privacy, TikTok should adopt this globally,” the US-based nonprofit Fairplay, UK-based 5Rights Foundation, and three dozen other groups said in a letter to TikTok Chief Executive Officer Shou Zi Chew.
ByteDance Ltd.-owned TikTok and other social media platforms face a patchwork of policies meant to protect children online, as proposals similar to the UK children’s code are under consideration in California and elsewhere.
The UK policy, which applies to children under age 18, directs online services to build in privacy protections by default and to explain settings in ways that children would understand. The directive has pushed platforms such as TikTok, Instagram, and YouTube to tighten their privacy controls for teenagers.
TikTok makes accounts belonging to users under age 16 private by default. By creating fake accounts for research purposes, the advocacy groups found that default privacy settings for 17-year-olds differ depending on where in the world their account is registered.
“All of TikTok’s younger users deserve the strongest protections and greatest privacy, not just children from European jurisdictions where regulators have taken early action,” the groups’ letter says.