States Move to Protect Minors Online with Patchwork of New Laws

April 7, 2023, 8:00 AM UTC

After children spent two years online for everything from school to socializing with friends, many parents are concerned about the negative mental health effects of excessive online activity. Congress quickly recognized that the Children’s Online Privacy Protection Act, the only national law protecting children’s privacy, was outdated and insufficient to protect children online.

COPPA regulates only the online collection of information from children under 13, leaving minors between 13 and 17 years old without any privacy protections. The law does not address the mental health or well-being of minors of any age group.

At first, it appeared that Congress would fill COPPA’s legal gaps by passing updated children’s privacy legislation, but after a year and a half of missteps, despite bipartisan support, no federal law has materialized. In response, the states and the Federal Trade Commission have taken matters into their own hands and have started to shape the future of children’s privacy with a patchwork of laws and enforcement actions.

These laws and actions reach well beyond standard COPPA privacy protections for children under 13. All companies, even those that serve an audience over 13 or operate a “general audience” online service, must determine whether they fall under these laws. This new patchwork of laws and enforcement actions falls into the following four buckets, all of which provide protections beyond COPPA.

FTC Expansion of Section 5 Powers

In the Epic Games settlement, the FTC argued that use of the interactive features on Fortnite exposed children and teens to potentially harmful topics such as self-harm and suicide. The FTC argued that Epic Games’ failure to require some form of opt-in before exposing minors to these interactive features was a Section 5 violation, that is, an unfair or deceptive trade practice.

Through this settlement, the FTC expanded its interpretation of its Section 5 powers to include teen privacy and the protection of minors’ well-being and mental health online. Neither falls within the FTC’s purview under COPPA, making the Epic Games settlement a significant expansion of the agency’s power in the privacy area.

Online services—especially ones with a large teen audience—should review the Epic Games settlement to determine if their services potentially violate the FTC’s expanded interpretation of Section 5.

General Privacy Laws Beyond COPPA

States such as Virginia, Connecticut, Utah, and Colorado have passed general privacy laws that incorporate COPPA. However, many of these laws are arguably more expansive than COPPA because they protect information COPPA does not cover, such as information collected offline and information collected from parents.

A business that collects children’s information from parents or offline should determine whether its information practices comply with these laws.

Age Appropriate Design Code

Last fall, California passed the Age Appropriate Design Code Act (CAADC), the first of its kind in the US, creating a new framework for children’s privacy. Instead of being triggered by the collection of information, as COPPA is, the CAADC applies to any website that is “likely to be accessed” by children under the age of 18.

Given the broad scope of the definition of websites that are likely to be accessed by children, many sites that consider themselves general audience sites will likely have to comply with the CAADC. If these general audience sites do fall under the CAADC, they will have to start identifying which of their users are under 18.

If they do not identify minor users, they will have to provide the onerous protections required by the CAADC to all their users, which includes setting defaults to the highest privacy settings and providing an obvious signal whenever the online service is collecting precise geolocation information.

Several other states, including Oregon, Connecticut, Maryland, and Minnesota, have introduced legislation similar to the CAADC. Companies that fall under these laws should review their information practices and assess whether their services and features are potentially harmful to minors.

Restrictions on Minors’ Use of Social Media

Recently, Utah became the first state to pass a law restricting minors’ access to social media. The law requires minors to have parental or guardian consent to open a social media account and gives parents tools to limit the time a child spends on the account. In addition, the law prohibits addictive design features on social media platforms.

The Utah law, unlike California’s Age Appropriate Design Code, also includes a private right of action allowing parents to sue social media platforms for emotional harm. This creates an additional litigation threat to social media companies. Several other states are now attempting to pass similar laws, including Arkansas, Florida, Maryland, and New Jersey.

With the number of states that are considering some form of children’s privacy legislation and the FTC’s new self-imposed mandate to protect teen privacy, we expect to see an expansion of the patchwork of laws protecting minors at the state level, along with a marked increase in enforcement actions.

Given the broad approach of these laws to include teenagers, online service providers—including those typically considered general audience online services—should be reviewing their privacy protections for minors.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Nerissa Coyle McGinn, partner with Loeb & Loeb’s advanced media and technology group, counsels companies on matters involving the convergence of advertising and technology.

