Instagram’s automated software systems recommended that child “groomers” connect with minors on the app, making it easier for them to find victims, according to a 2019 internal document presented in court by the Federal Trade Commission.
The company found that 7% of the follow recommendations Instagram made to adult users were accounts belonging to minors. The report, titled “Inappropriate Interactions with Children on Instagram,” was shared among company executives in June 2019. It was presented in federal court on Tuesday as part of the FTC’s antitrust lawsuit against Meta.
The report also included an analysis of 3.7 million user reports flagging inappropriate comments to the company. Meta, which was called Facebook at the time, found that about one-third of those reports came from minors. Of the minors who reported an inappropriate comment, 54% were reporting an adult.
Earlier in the trial, the FTC offered evidence that Meta Chief Executive Officer Mark Zuckerberg deliberately limited the resources devoted to Instagram.
“Out-of-context and years-old documents about acquisitions that were reviewed by the FTC more than a decade ago will not obscure the realities of the competition we face or overcome the FTC’s weak case,” a Meta spokesperson said in a statement.
The company added that it has “long invested in child safety efforts,” and in 2018 began work to restrict recommendations for potentially suspicious adults and encouraged the National Center for Missing & Exploited Children to expand its reporting process to include additional grooming situations the company had identified.
Lawyers for the FTC surfaced the internal data as part of an argument that Meta’s acquisition of Instagram ultimately harmed consumers. Government lawyers have used emails and other internal documents, including testimony from Instagram founder Kevin Systrom, to build that case.
Earlier in the trial, Systrom argued that Zuckerberg deliberately withheld resources from Instagram.
The FTC surfaced more emails and documents Tuesday that supported that theory, including an exchange from May 2018.
The FTC painted a portrait of a company reluctant to invest in Instagram over several years. In a different exchange from February 2019, Rosen wrote in an email that he relayed his concerns that Instagram was being underfunded to Zuckerberg during a planning meeting about increasing company headcount. After speaking with Zuckerberg, Rosen concluded the resource allocation “was deliberate.” Zuckerberg thought Instagram had another year or two to catch up to Facebook and didn’t think the app needed as many resources. “I think we are not sure that’s the case anymore,” Rosen said.
An internal presentation titled “Instagram Well-being H1 2019 - planning” — a planning document for the first half of 2019 — acknowledged that Instagram’s integrity team was thin relative to the scope and importance of the work. Given resource limitations, “we will not be doing major proactive work” in areas like harassment, financial scams, credible threats of violence, impersonation, prostitution and sexual solicitation, and forms of child exploitation, the presentation said.
Rosen, when cross-examined by Meta’s lawyers, said it wouldn’t be fair to say that Meta starved the Instagram integrity team, and that Zuckerberg was aligned with him on the need to support Instagram. He said he believed that nobody in the industry invested in or prioritized these challenges as much as Meta.
“We’ve grown substantially,” Rosen said.
The company in September launched Instagram Teen Accounts, which have protections to limit who can contact teens and are private by default. “They’re in the strictest messaging settings, meaning they can’t be messaged by anyone they’re not already connected to,” Meta said in a statement, noting that teens under 16 need a parent’s permission to change the settings.
The company also pointed to technology it introduced in 2021 that helps “identify adult accounts that had shown potentially suspicious activity, such as being blocked by a teen, and prevent them from finding, following and interacting with teens’ accounts.” The suspicious accounts don’t get recommendations to follow teens, “or vice versa.”
To contact the editors responsible for this story: Sara Forden, Michael Shepard
© 2025 Bloomberg L.P. All rights reserved. Used with permission.