The Federal Trade Commission in December entered into two separate settlements, totaling more than $500 million, with Epic Games Inc., the developer and distributor of the popular video game “Fortnite.” The first stipulated order resolves children’s privacy violations, and the second resolves dark-pattern practices that trick or manipulate consumers.
There are two important takeaways from these enforcement actions for commercial website operators.
First, operators need to be aware of all factors that may lead to a determination that a site is directed to children under 13. And second, the actions show the FTC’s enforcement priorities against practices it considers to be dark patterns.
If companies don’t pay proper attention to how content is presented, they risk an enforcement action.
The FTC’s first order imposed a $275 million penalty for violating the Children’s Online Privacy Protection Act. COPPA applies to commercial website operators that have actual knowledge they are collecting personal information from children under 13, or that operate sites directed to children.
Whether a site is directed to children is based on several factors, including subject matter, visual content, use of animated characters, or other child-oriented activities and incentives.
Sites subject to COPPA must, among other steps, obtain verifiable parental consent before collecting, using, or disclosing personal information from children under 13.
The FTC, in support of its finding that “Fortnite” was directed to children under 13, cited a 2019 survey reporting that 53% of US children aged 10-12 played “Fortnite” weekly. The commission also noted “Fortnite’s” cartoon-like graphics and colorful animation, and pointed out that Epic holds approved licensing deals for “Fortnite”-branded merchandise aimed at children, including clothing, costumes, school supplies, and toys.
The FTC further alleged that “Fortnite” launched with no parental controls and minimal privacy settings. The FTC also said that Epic engaged in unfair practices that put children at risk by enabling voice and text chat by default, which resulted in harm that included threats, bullying, and sexual harassment.
Epic agreed to delete all personal information associated with “Fortnite” users unless the company obtains verifiable parental consent or the user identifies as 13 or older. The company must also adopt strong privacy default settings for minors, ensuring that voice and text communications are off by default.
Under the second order, Epic will pay $245 million to refund consumers harmed by its dark patterns and billing practices.
The FTC has publicly identified dark patterns as an enforcement priority. In September 2022, the FTC released a report summarizing concerns that companies are increasingly using sophisticated design practices, known as dark patterns, to trick or manipulate consumers into buying products or services or providing their personal data.
The report reflects the FTC’s findings that dark patterns are used in a variety of industries and contexts, including e-commerce, cookie consent banners, children’s apps, and subscription sales. Unlike neutral interfaces, dark patterns often take advantage of consumers’ cognitive biases to steer their conduct or delay access to information needed to make fully informed decisions.
The FTC’s research noted that dark patterns are highly effective at influencing consumer behavior. Dark patterns include disguising ads to look like independent content, making it difficult for consumers to cancel subscriptions or charges, burying key terms or junk fees, and tricking consumers into sharing their data.
Because dark patterns are covert or otherwise deceptive, many consumers don’t realize they are being manipulated or misled.
While Epic did allow users to cancel or undo charges for certain in-game items, it used design tricks or dark patterns to deter consumers from canceling or requesting refunds for unauthorized charges.
What to Know Going Forward
First, commercial website operators, including those designing new services and apps, should pay close attention to factors that may lead to a determination that a site is directed to children under 13, including the subject matter, visual content, and use of animated characters or child-oriented activities and incentives.
Sites directed toward children under 13 must comply with rigorous COPPA requirements, including the obligation to obtain verifiable parental consent before any personal information is collected from users. Non-compliance can lead to fines of up to $46,517 per violation, injunctive relief (including deletion of unlawfully collected data and audit or oversight orders), reputational harm, and other legal and business consequences.
Second, the FTC is cracking down on dark patterns. Whether a particular design constitutes a dark pattern involves an element of subjectivity, so interface and design choices will require careful legal scrutiny.
These trends align with similar regulatory attention emerging at the state level. This includes restrictions under California privacy laws regarding the use of dark patterns in the context of privacy choices.
In addition, under the EU’s General Data Protection Regulation, the European Data Protection Board has issued detailed guidelines aimed at curbing the use of dark patterns.
For these reasons, and in accordance with the FTC report, companies should pay close attention to how content is presented, how default settings are deployed, how opt-out options are accessed, and other features and functionality.
Pre-launch diligence should include a legal review designed to identify and mitigate potentially unlawful design patterns in existing or new products, and to document suitable due diligence and risk management.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Brian Hengesbaugh is a partner and chair of Baker McKenzie’s global privacy and security practice group. He focuses on global data privacy and data security issues in business transformations, compliance activities, and incident response/regulatory inquiries.
Harry Valetk is a partner in Baker McKenzie’s global privacy and security practice. He advises global organizations on privacy and data security compliance requirements.