- Bill directs platforms to prioritize young users’ well-being
- Bipartisan proposal reflects spotlight on mental health
Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) reintroduced the Kids Online Safety Act in the Senate on Tuesday. The bill would direct social media platforms to prioritize the well-being of users under age 17 and protect them from harmful content. The senators first introduced the bill in February 2022, but a push to pass it as part of a spending package late last year fell apart.
Since then, dozens of parents and school districts across the US have sued social media platforms over alleged harms to young users.
Another federal bill proposed last week would bar social media companies from allowing kids under 13 to use their platforms.
Under Blumenthal and Blackburn’s bill, social media platforms would have to adopt safeguards such as settings that restrict access to a minor’s personal information and controls for parents to supervise their kids’ use of a platform. Platforms would be required to enable the strongest safety settings by default, according to the bill. It also would let academic and nonprofit researchers access platforms’ data to study harms to minors.
Blackburn said she and Blumenthal have met with “countless parents, psychologists, and pediatricians who are all in agreement that children are suffering at the hands of online platforms.”
“Big Tech has proven to be incapable of appropriately protecting our children, and it’s time for Congress to step in,” Blackburn said.
The Senate’s Committee on Commerce, Science, and Transportation unanimously approved the bill last year. The legislation didn’t reach a floor vote before the end of the last congressional session as lawmakers debated whether to pursue more comprehensive online protections for American consumers.
The proposed Kids Online Safety Act’s backers include Common Sense Media, the American Psychological Association, the American Academy of Pediatrics, and Fairplay.
The latest version of the bill still hasn’t allayed concerns raised by Fight for the Future and other digital rights groups. They have warned that the directive to shield minors from certain content could be used to restrict access to resources for LGBTQ+ youth, for example. The groups also worry that the bill’s parental controls pose a safety risk for young people facing domestic violence and abuse.
Industry group NetChoice, whose members include Meta and TikTok, questioned the bill’s approach to checking a user’s age, which typically means gathering more personal information. The bill’s authors added language saying age verification isn’t required, but NetChoice says the measure still holds platforms responsible for failing to block users under 13 who lack parental consent.
Another group, TechFreedom, similarly voiced opposition to age verification mechanisms that could force platforms to ask all their users to confirm their age in order to weed out underage users.