Legal Tips with Richard Sheinis: Beware Using Dark Patterns
By Richard Sheinis | August 30, 2023
Dark patterns in advertising are nothing new. We are all familiar with retail stores strategically displaying their products so you will buy more than just the item for which you initially entered the store. Why do you think grocery stores place racks of gum, mints, and candy at the checkout line? While waiting to pay for your groceries, why not add a candy bar to your other purchases?
When it comes to data privacy, recent laws prohibit the use of dark patterns to subvert a user's autonomy or free decision-making. The California Privacy Rights Act defines a dark pattern as “… a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” The concern is that the consumer will be socially engineered into making decisions about their personal data that they would not otherwise make, or that are adverse to their own interests. The Connecticut Data Privacy Act and the Colorado Privacy Act also prohibit the use of dark patterns to obtain personal data.
Although the General Data Protection Regulation (GDPR) in the European Union does not specifically ban dark patterns, Supervisory Authorities recently found that using an “Accept All” button for cookies, without also providing a “Reject All” button, vitiated the user’s consent by encouraging users to accept all cookies.
The new European Digital Services Act (DSA), however, addresses dark patterns in Article 25: “Providers of online platforms shall not design, organize or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.”
The Federal Trade Commission (FTC) has taken note of dark patterns used to obtain personal data and will take action against companies that use tactics designed to trick consumers. In fact, the FTC has said dark patterns are “squarely on the FTC’s radar.” In an action against smart-TV maker Vizio, the FTC alleged Vizio enabled default settings allowing the company to collect and share consumers’ viewing activities with third parties, only providing a brief notice to some consumers that could easily be missed.
Examples of dark patterns in data privacy include:
- Default settings that lead to the collection or use of more personal data than is necessary to provide the product or service requested, or that use data in a way the consumer did not expect.
- Making a privacy policy or terms of use difficult to find or understand.
- Toggle options that make one option more prominent than another. (In the cookie example noted earlier, the toggle used to select or allow certain cookies should not be set to a default “on” position.)
- When consent to process personal data is needed, using confusing language so the consumer is not sure what they are consenting to.
- Tactics which make it difficult for the consumer to withdraw consent or cancel an account. (Sometimes called a “roach motel” because it is easy to get in but difficult to get out, a well-known tag line from an old commercial for a pest control product!)
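The default-settings and toggle-prominence points above can be illustrated in code. The sketch below is a hypothetical example of privacy-respecting consent defaults, not legal guidance or any law's required implementation; all of the names (`ConsentCategory`, `defaultConsent`, `acceptAll`, `rejectAll`) are invented for illustration.

```typescript
// Illustrative consent categories for non-essential cookies.
// (Strictly necessary cookies generally do not require opt-in consent.)
type ConsentCategory = "analytics" | "advertising" | "personalization";
type ConsentState = Record<ConsentCategory, boolean>;

// Non-essential processing defaults to "off" — the consumer must
// actively opt in, rather than having to hunt for a pre-ticked toggle.
const defaultConsent: ConsentState = {
  analytics: false,
  advertising: false,
  personalization: false,
};

// Symmetric choices: rejecting everything is exactly as easy as
// accepting everything, mirroring the "Reject All" button the
// Supervisory Authorities expected alongside "Accept All".
function acceptAll(): ConsentState {
  return { analytics: true, advertising: true, personalization: true };
}

function rejectAll(): ConsentState {
  return { ...defaultConsent };
}
```

The design choice that matters here is symmetry: neither path requires more clicks or more prominent UI than the other, and the starting state collects nothing the consumer has not chosen.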
There are other types of dark patterns, but you get the idea. They all involve some type of deception or tactic to influence the consumer to act in a way that benefits the company without regard for the consumer's privacy. Steering clear of dark patterns should not be difficult. To avoid using them, even unintentionally, be transparent: say what you do, and do what you say. Be open and honest in your data collection and use practices.