Bloomberg Law
Aug. 9, 2022, 8:00 AM

Dark Patterns: Deceptive App Design Isn’t Good for Business

Jamie Nafziger
Dorsey & Whitney LLP
Bianca Tillman
Dorsey & Whitney LLP

It’s time to reevaluate your company’s user interfaces. Companies operating online must not only present privacy policies and terms of use that are thorough, up-to-date, and legally compliant; they must also ensure that their user interfaces do not run afoul of new laws prohibiting the use of dark patterns.

Dark Patterns = Deceptive Design

You’re shopping online when, in the last step of the checkout process, you discover unexpected charges, such as a subscription or a protection plan, suddenly added to the bill. Or you decide it’s finally time to cancel that expensive online subscription, only to spend 20 minutes clicking through help-screen sub-pages and then another 45 minutes on hold, begging a customer service representative to let you unsubscribe.

These are “dark patterns”—a term describing deceptive or confusing online interface designs used to trick or pressure users into doing things they didn’t mean to do. Dark patterns harm users by hindering their ability to effectively protect their personal data and make conscious choices.

In some cases, website operators take advantage of known cognitive biases. In others, manipulative design makes the user experience so complicated that users become fatigued and give up on their desired outcome, such as canceling or unsubscribing.

The Legal Landscape and Risks of Non-Compliance

Companies are starting to face legal and financial consequences for their alleged use of dark patterns. In a recent complaint filed against GrubHub in the D.C. Superior Court, the D.C. attorney general alleges that the food delivery platform’s hidden fees and deceptive tactics, such as creating microsites to trick consumers into ordering from GrubHub, are harmful to both restaurants and consumers.

In a complaint filed against a credit bureau in the Northern District of Illinois, the Consumer Financial Protection Bureau alleges that the credit bureau tricked customers into spending extra money on its services by using dark patterns, such as including a disclosure in an image that took longer to load than the rest of the webpage, and by putting information in low-contrast fine print such that it was easily missed.

Consumers who requested their free credit report from the credit bureau were allegedly asked to provide credit card information that appeared to be part of an identity verification process; instead, they were allegedly signed up for recurring monthly charges.

Comprehensive data privacy laws have begun to include prohibitions on “dark patterns.” Under the new California Privacy Rights Act, which comes into effect on Jan. 1, 2023, “dark patterns” are defined as “user interface[s] designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice.”

The CPRA, the Connecticut Data Privacy Act, and the Colorado Privacy Act prohibit the use of dark patterns to obtain consent. Companies that continue to use tactics such as confirm-shaming, trick questions, or forced continuity could face severe civil and administrative penalties.

Avoiding, Removing Dark Patterns

To minimize the risk of litigation and serious penalties, companies should review their online service offerings for dark patterns and consider making changes including the following:

  • Review your clickwrap agreement and privacy consent interfaces for hidden elements. Make privacy consent and legal agreement elements visually prominent by using high-contrast colors and underlining. Avoid highlighting the most invasive options.
  • Replace clever or catchy language with neutral and concise language. When users face the option to “sign up for free,” the option not to sign up should be equally neutral and clear. While it may seem cute to replace “no thank you” with “no, I don’t like free stuff!” the latter is a classic play on the user’s emotions known as “confirm-shaming.” (A brief sketch of neutral, symmetric consent wording appears after this list.)
  • Use clear and consistent wording. The same terms and definitions should be used for the same privacy concepts across the website. To avoid confusion, or even worse, a “bait-and-switch,” review consent and sign-up flows for inconsistencies or double negatives that may lead users to share more information than they would otherwise have chosen.
  • Explain consequences. When users want to give or withdraw their consent or activate or deactivate a particular data control, inform them in a neutral way of the consequences of such actions.
  • Ensure cross-device consistency. If your platform is available on multiple devices (e.g., desktop, mobile, app), ensure that privacy settings and information are equally accessible and located in the same places across the different versions.
  • Implement a symmetrical off-boarding experience. If your users can create an account or start a subscription in just a few simple clicks, they should not have to wait on hold with a customer service representative or search through multiple subsections of your website to cancel or unsubscribe. While it’s OK to require a password or other simple confirmation step in the off-boarding process, user off-boarding should not take much more time or energy than the on-boarding experience.
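
To make the neutral-wording, consequence-explanation, and symmetry points above concrete, here is a minimal sketch of a consent prompt, written in TypeScript against the browser DOM. Every name, label, and style in it is an illustrative assumption, not a requirement drawn from any statute or regulator guidance.

```typescript
// A minimal sketch (all names and labels here are illustrative assumptions):
// a consent prompt with neutral wording, equally weighted choices, and a
// plain-language statement of what each choice means.

type ConsentChoice = "accepted" | "declined";

interface ConsentPrompt {
  purpose: string;      // what the data will be used for, in plain language
  consequence: string;  // what happens if the user declines
  acceptLabel: string;  // neutral, e.g. "Allow", not "Yes, I love deals!"
  declineLabel: string; // equally neutral and equally visible
}

function renderConsentPrompt(prompt: ConsentPrompt): HTMLElement {
  const container = document.createElement("div");

  for (const text of [prompt.purpose, prompt.consequence]) {
    const p = document.createElement("p");
    p.textContent = text;
    container.append(p);
  }

  // Both buttons get the same size, color, and placement, so neither
  // choice is visually privileged over the other.
  const makeButton = (label: string, choice: ConsentChoice) => {
    const button = document.createElement("button");
    button.textContent = label;
    button.style.cssText = "font-size:1em;padding:8px 16px;margin:4px;";
    button.addEventListener("click", () =>
      container.dispatchEvent(
        new CustomEvent<ConsentChoice>("consent", { detail: choice })
      )
    );
    return button;
  };

  container.append(
    makeButton(prompt.acceptLabel, "accepted"),
    makeButton(prompt.declineLabel, "declined")
  );
  return container;
}

// Usage: consequences are stated up front, and declining is one click,
// just like accepting.
const prompt = renderConsentPrompt({
  purpose: "We would like to use your order history to personalize recommendations.",
  consequence: "If you decline, you will still see recommendations; they just won't be personalized.",
  acceptLabel: "Allow personalization",
  declineLabel: "Don't allow",
});
prompt.addEventListener("consent", (e) =>
  console.log("User chose:", (e as CustomEvent<ConsentChoice>).detail)
);
document.body.append(prompt);
```

Because both buttons share one style and the consequence of declining is stated in plain language up front, neither choice is visually or emotionally privileged, which is the symmetry the list above recommends.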

With the increased focus on dark patterns by legislators and enforcement agencies, companies should assess their user interfaces now. In addition to reducing your company’s legal risk, making the user experience clearer and easier might just produce happier customers, too.

This article does not necessarily reflect the opinion of The Bureau of National Affairs, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.


Author Information

Jamie Nafziger is a partner at Dorsey & Whitney LLP and Chair of the firm’s Cybersecurity, Privacy and Social Media Practice Group.

Bianca Tillman is an associate at Dorsey & Whitney LLP and a member of the firm’s Cybersecurity, Privacy and Social Media Practice Group.