California’s first-in-the-nation ban on web and app designs that make it difficult for consumers to control use of their data for advertising could usher in greater regulatory scrutiny of tech companies’ privacy practices.
Newly finalized rules issued under the state’s privacy law prohibit so-called dark patterns that restrict consumers’ ability to tell companies not to sell their data. Such patterns can also be used to nudge consumers into settings that allow for more data gathering, or to influence consumer behavior in other contexts such as online shopping.
Policymakers in other states, such as Washington, and at the federal level could be next to regulate user interface designs as attention on dark patterns shifts from academia to policymaking.
The Federal Trade Commission is holding a workshop on dark patterns April 29 that likely foreshadows ramped-up enforcement in this space, given the agency’s track record following similar events. The workshop could also lead to new industry standards on when user interface design crosses the line between persuasion and manipulation.
“It’s really hard to regulate because ads are designed to manipulate and products are designed to be engaging,” said Justin Brookman, a former FTC official who’s now director of consumer privacy and technology policy for the nonprofit Consumer Reports. “The question is where do you draw the line?”
The FTC has taken legal action against companies for alleged violations of consumer protection laws after workshops on other topics, such as event ticket sales. Workshops have also coincided with new agency rules on issues including children’s privacy. The events can likewise lead to new self-regulatory standards from industry groups, as with video game loot boxes.
Until recently, dark patterns online drew attention mostly in academia; that research has since started informing tech policy, according to Colin Gray, a professor at Purdue University who studies the topic. He said the phrase emerged about a decade ago to describe digital designs that apply insights from behavioral psychology to steer consumers toward a particular action.
One example in the privacy context is choices on whether to allow cookies that track users online. Options for allowing more data collection are often presented as the easier or more visually prominent choice, while more restrictive settings tend to take more clicks to reach.
Gray called California’s new privacy rules, issued under the California Consumer Privacy Act, an “entrance point” for policy oversight of the issue. The rules could have a national impact, given how many companies do business in California.
The state’s rules also could interact with the FTC’s examination of dark patterns by pushing federal regulators to consider more detailed proposals for addressing design concerns, according to Christopher Savage, a partner in Davis Wright Tremaine LLP’s privacy and security group.
“That may feed back into what regulations California puts in place” for an update to its privacy law that voters approved last year, Savage said of the FTC’s work.
California’s regulations don’t mention dark patterns by name, but they do ban businesses from using confusing language or adding unnecessary steps for consumers to opt out from the sale of their data. They come after a Consumer Reports study found opt-out processes were often complicated and confusing.
A proposed privacy law in Washington’s state legislature also takes aim at dark patterns. Consumers would need to agree to how their data is handled, and any agreement obtained through dark patterns wouldn’t qualify as consent, according to the legislative language. The language echoes wording included in the update to California’s privacy law, known as the California Privacy Rights Act, which goes into effect in 2023.
“That’s an indication that policymakers broadly speaking are paying more attention to this issue,” Savage said. “It’s likely that we’ll see more of this as time goes on.”
Washington’s bill defines dark patterns as user interfaces designed to subvert or impair consumer autonomy, decision making, or choice. Federal policymakers could follow a similar path toward defining design practices that are prohibited, Gray said.
“Privacy is one area where the FTC could come up with rules around what patterns aren’t allowable,” Gray said, adding that it’s not yet clear “how activist the FTC is willing to get about these issues.”
The commission already has used its authority under consumer protection law to bring enforcement actions against companies that have employed what could be considered dark patterns. Snap Inc., for example, settled with the FTC in 2014 over allegedly deceiving users of its popular messaging app about how much personal data it collected and making other promises that turned out to be false.
Labeling user interface designs as dark patterns would help determine which practices may violate the Federal Trade Commission Act or similar state laws. Section 5 of the FTC Act allows the agency to pursue web or app designs deemed deceptive or unfair. The challenge is defining which designs raise regulatory concern.
“You know it when you see it, when there’s a manipulative design that clearly subverts user choice,” said Pollyanna Sanderson, policy counsel at the nonprofit Future of Privacy Forum. “But there’s a huge gray area,” she said, between designs that are meant to influence consumer behavior and those that are meant to manipulate.
“This is something that will have to be worked out through guidance and FTC enforcement actions,” Sanderson said.
The agency could also use its investigative powers under Section 6(b) of the FTC Act. That authority lets the commission order tech companies to turn over information, which could be used to scrutinize how and why online platforms deploy dark patterns in their designs.
“That would help us understand if platforms are making design decisions that they know are against the best interest of their users,” said Gus Rossi from the Omidyar Network’s responsible technology team. Rossi said tech companies use “psychological tricks and learnings” to make consumers share information without fully informed consent and in ways that aren’t always necessary for the function of an app or website.
The FTC’s dark patterns workshop could help make the argument for added legal authority over business practices that are seen as abusive, according to Woodrow Hartzog, a professor of law and computer science at Northeastern University. Hartzog said he has advocated for Congress to modify Section 5 of the FTC Act to include oversight over abusive practices, “which would squarely address a range of dark patterns.”
“A little manipulation here and there might not reach the level of policy concern,” he said. “But when you can do it consistently, then it starts drawing attention.”