The company’s latest feature comes as lawmakers have gotten more vocal about protecting children online. Testimony last year from a whistleblower at Meta’s Facebook that the company was prioritizing profit over the health and safety of kids prompted a flurry of legislative action on Capitol Hill to hold big tech companies accountable.
New underage Instagram users will be defaulted to the “less” version of the platform, which reduces sexual, graphic, and violent content that doesn’t violate community guidelines but is considered inappropriate for minors. They retain the option to switch to the “standard” version of the platform. Individuals must be at least 13 years old to use Instagram.
Existing underage Instagram users will have a choice between the “less” and “standard” versions, but the company said it will send a prompt encouraging them to opt for the restrictions.
“This is an exercise to empower teenagers and it’s constantly iterating. If we don’t see great adoption, we can change how this works,” said Jeanne Moran, policy communications manager at Meta. In that case, the company would change not the feature itself but its approach to making teenagers aware of it, Moran added.
The Senate Commerce, Science, and Transportation Committee last month unanimously advanced a measure aimed at protecting children online, and also advanced a second proposal. Another, more comprehensive privacy proposal remains pending.
Meta says it builds defaults, protections, and tools for kids that complement legislative proposals, though the company hasn’t taken an official position on any bill.
Instagram Head Adam Mosseri answered senators’ concerns at a hearing last year by saying the company is developing “a new experience” for kids that will make it harder for them to encounter sensitive content.
‘More Needs to Be Done’
Josh Golin, executive director of kids advocacy group Fairplay, said Instagram’s effort won’t lead to significant change.
“The choice to see extreme or harmful content on Instagram should not be left up to the children using the platform — the adults who run Instagram ought to take on that responsibility themselves,” Golin said.
Jim Steyer, founder and CEO of kids advocacy group Common Sense Media, called it a delayed step in the right direction.
“Defaulting young users to a safer version of the platform is a substantial move that could help lessen the amount of harmful content teens see on their feeds,” Steyer said. “However, the efforts to create a safer platform for young users are more complicated than this one step and more needs to be done.”