- Social media bills focus on kids’ privacy, access
- States consider ID requirements for adult sites
Dozens of proposals pending in statehouses across the country aim to regulate children's experience online, raising concerns over the future of anonymity on the internet.
Lawmakers are pushing a variety of bills aimed at boosting privacy protections for kids’ personal information, limiting their access to social media without parental involvement, or keeping them off of sites that include explicit content such as pornography.
The measures would rely on companies to verify how old their users are.
“Any time you create rules that depend on age, it raises questions around how age is verified and what risks it raises,” said Samir Jain, vice president of policy at the nonprofit Center for Democracy & Technology.
The tradeoff is part of a wide-ranging debate among state and federal officials over how to keep kids safe online amid a heightened focus on potential mental health harms of social media and children’s access to sexually explicit and other adult content.
Online services that fail to abide by the kids-focused legislative proposals would face enforcement from state attorneys general or liability for lawsuits filed on behalf of minors, with a few bills threatening fines between $2,500 and $250,000 for violations.
Some states are looking to follow California’s first-in-the-nation law to strengthen privacy and safety protections for users under 18 and require companies to prioritize a child’s best interests online.
The law recognizes that kids shouldn’t be kept off of the internet and that putting the onus on parents isn’t realistic, said Nichole Rocha, the head of US affairs for 5Rights Foundation, a kids’ digital rights group that pushed the legislation and is working with other states this year. Its provisions outline that age assurance should be as minimally invasive as possible based on risk, she said.
“The code specifically sets up guardrails to make sure that it’s balanced,” Rocha said.
Gauging Age
State lawmakers backing new requirements argue that teens are largely unprotected on the internet, with federal safeguards only applying to children under age 13. Online services that screen users based on age often simply ask them to enter a birthdate, and children’s safety groups say kids can easily lie about their age to access popular platforms.
Legislative approaches differ in what bills would require of companies, how sites would determine a user’s age, and how precise that determination needs to be. Stricter age checks that involve ID cards or selfies could lead companies to collect user data they didn’t collect before—a tradeoff privacy advocates say states must consider in their policies.
States including Maryland, New Mexico, New York, and Oregon are considering bills to create new privacy standards for sites frequented by visitors under age 18 and make online services consider the impact of design features like autoplay or endless scroll. Some legislation is closely modeled after California’s Age-Appropriate Design Code, which allows fines as high as $7,500 per affected child. It faces a court challenge over what opponents allege are vague rules that may conflict with federal law.
Supporters of the Maryland legislation (H.B. 901/S.B. 844) raised examples of youth mental health crises, cyberbullying, and social media addiction during an online press conference this month. The proposal would help make sure young people can safely experience the positive aspects of the internet, said Del. Jared Solomon (D), a lawmaker leading the push.
“We know, frankly, that the status quo online is not acceptable,” Solomon said.
California-Style Bills
California’s law and similar legislation direct companies to estimate a child user’s age with a level of certainty balanced against the risks of an online service and its data practices.
A dating service, for instance, would want more precise age estimations than some other online activities, but that balancing test may be ambiguous for companies, said Bailey Sanchez, policy counsel for youth and education privacy for the Future of Privacy Forum.
“There’s some gray area,” she said.
Online services previously had less incentive to look for and weed out kids on their platforms. Under the federal Children’s Online Privacy Protection Act, platforms are only liable for protecting kids if they know that a user is under age 13. It’s led to a “don’t ask, don’t tell” attitude toward underage accounts, according to Iain Corby, executive director of the Age Verification Providers Association.
“Now with regulations coming through, that is being flipped,” said Julie Dawson, chief policy and regulatory officer at Yoti Ltd. The digital identity company offers age estimation technology that can guess how old someone is by looking at features of their face.
Services including Meta’s photo- and video-sharing app Instagram, Facebook Dating, and Yubo’s streaming platform for kids are already using this technology, which doesn’t recognize who the person is but rather how old they appear to be, usually within a year or two of their actual age. Meta says age verification methods have allowed it to stop the vast majority of younger teens on Instagram who attempted to edit their birthdays to 18 or older.
Some age check providers are also developing ways to signal to devices that a user is a child instead of repeatedly asking for age across different apps and websites.
“If parents could tell devices the age of their child, then we could act as an age-aware signal,” said Denise Tayloe, chief executive officer and co-founder of PRIVO, a company that helps protect minors online through age verification.
Tech industry group NetChoice, which is suing to block California’s age-appropriate design code, questioned how it would apply in scenarios where a kid is using their parent’s device or account to go online, meaning a web platform would need to know who the end user is and how old they are.
“That requires perfect identification on the internet, which just can’t be done,” said Carl Szabo, vice president and general counsel at NetChoice. The group’s members include companies such as Google, Meta, and TikTok.
Meta declined to comment on the pending bills. Google and TikTok didn’t respond to requests for comment.
Data Collection
Lawmakers elsewhere are pushing bills that would more narrowly address how kids use social media. A Texas proposal (H.B. 896), for instance, would prohibit social media use for people under 18 years old.
Users would have to verify their age using a driver’s license and photo of themselves, and companies would have to delete the information afterwards. Rep. Jared Patterson (R), who is pushing the social media bill and several others, said in a statement that social media “is the most destructive product teens have legal access to in this country.”
Proposed age verification requirements faltered in the Utah House, where state Rep. Jordan Teuscher (R) removed them from a social media bill (H.B. 311) following pushback by organizations including the tech industry policy group Chamber of Progress. Age verification through identification would result in more data collection and limit free expression, the group argued.
A separate Utah state Senate proposal (S.B. 152) would still require social media companies to verify the age of users to boost parental involvement for kids online and leave the specifics to future rulemaking.
Some state bills focus less on boosting privacy protections than “putting up a gate” around social media for kids, Sanchez said. Lawmakers considering age verification should weigh how proposals could impact all internet users—not just parents and children—and how companies would be collecting and retaining information, she said.
Adult Content
Age verification is central to a Louisiana law that took effect this year to mandate use of government identification, digital driver’s licenses, or commercial methods to ensure someone is at least 18 years old to access sites, such as Pornhub, where more than a third of content appeals to “the prurient interest.” The law targeting pornography—which states that companies can’t retain identifying information—has sparked similar proposals in Arizona, Arkansas, South Carolina, and other states.
Digital rights advocates are concerned that asking for age verification to access pornography online poses risks for the privacy and security of data being requested. “There’s a question of how that data is shared or deleted or protected,” said Jason Kelley, associate director of digital strategy at the nonprofit Electronic Frontier Foundation.
Envoc, the maker of Louisiana’s digital driver’s license, says compliance with the state’s porn law only requires knowing whether a person is under or over 18 years old. The software firm doesn’t keep a record of where a resident uses their digital ID.
“We are not in the data-warehousing business,” said Calvin Fabre, president of Envoc.
In Arkansas, the state Senate this month unanimously passed a proposal (S.B. 66) that’s similarly designed to have age information “be verified and dumped,” sponsor Tyler Dees (R) said during a legislative debate. He compared the age verification requirements to purchasing cigarettes at a convenience store but noted people should protect their personal information during any online activity.
“Anybody who has a concern that any website they visit is not reputable, I would advise them not to visit that website,” Dees said.