Bloomberg Law

Kids’ Social Media Safety Bill Must Advance, US Senators Told

June 14, 2022, 10:00 AM

A coalition of advocates for children’s mental and physical health is urging Senate Commerce Committee leadership to move forward on bipartisan legislation to make social media platforms liable for harms to minors.

The committee should vote this month on the Kids Online Safety Act, more than 100 groups led by the American Psychological Association, the Eating Disorders Coalition, Fairplay, and Mental Health America said in a Tuesday letter to the panel's chair, Sen. Maria Cantwell (D-Wash.), and ranking member, Sen. Roger Wicker (R-Miss.).

“The enormity of the youth mental health crisis needs to be addressed as the very real harms of social media are impacting our children today,” their letter says. It cites studies suggesting that teens report being bullied on such platforms and kids are exposed to content that can promote eating disorders.

The bill (S. 3663) hasn’t advanced since it was introduced in February, though panels of the committee have held hearings focused on protecting kids online. That includes hearing testimony from Frances Haugen, a former employee of Meta Platforms Inc.’s Facebook who leaked internal documents showing the company was aware of social media’s negative impacts on children. Committee members also heard from representatives of Snapchat, TikTok, and YouTube.

Children’s advocates have called on social media platforms to address what they see as addictive design choices that keep kids glued to their devices and expose them to potentially harmful content.

The legislation from Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) would establish a duty of care for social media companies to protect children age 16 and younger from harms such as suicide, eating disorders, or sexual exploitation. Platforms would have to offer settings for kids and their parents to disable features such as autoplay or to establish time limits, according to a summary of the bill.

Another bill provision would require platforms to disclose how children’s personal data is used to fuel recommendation algorithms and how such data is used in targeted advertising.

Companies also would need to undergo outside audits assessing risks to minors and give researchers insight into their platforms.

“The lack of transparency into the inner workings, policies and measured impacts of these platforms must be addressed now,” the groups’ letter says.

To contact the reporter on this story: Andrea Vittorio in Washington at avittorio@bloombergindustry.com

To contact the editor responsible for this story: Jay-Anne B. Casuga at jcasuga@bloomberglaw.com