The FTC’s Effort to Label Practices “Dark Patterns” Is an Attempt at Regulatory Overreach That Will Ultimately Hurt Consumers
Over the past year, the Federal Trade Commission (FTC) has noticeably increased its use of the alarmist term “dark patterns” in press statements and speeches, creating unwarranted fear of and animosity toward the tech industry. As described by former FTC Commissioner Rohit Chopra, dark patterns are “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.” The FTC’s workshop on the topic, its recent staff report on the issue, and its numerous press releases give the impression that dark patterns are widespread in the tech industry, but actual evidence of an unchecked problem is thin.
Consider the three dark pattern settlements the FTC announced last year. First, in September 2022, the FTC touted its $3 million settlement with Credit Karma as an example of its enforcement against dark patterns. The FTC’s complaint alleged that the company used “experiments, known as A/B testing” to discover that consumers responded more to certain offers. However, the alleged illegality was not that Credit Karma tested the effectiveness of its online marketing but that the company falsely claimed consumers were “pre-approved for credit offers”—a deceptive act that clearly violates Section 5 of the FTC Act.
Second, in November, the FTC described its $100 million settlement with Vonage as another example of “customers trapped by illegal dark patterns.” But digital design had nothing to do with the illegal activity. The legal complaint alleged that Vonage charged consumers without their informed consent, failed to properly disclose cancellation fees, and made it unnecessarily difficult for customers to cancel their subscriptions. All of these actions are explicit violations of the Restore Online Shoppers’ Confidence Act. Despite the case having almost nothing to do with digital design, the FTC still insisted on describing it in its press releases as a case of illegal dark patterns.
Finally, in one of its last moves of 2022, the FTC reached an agreement with Epic Games, maker of the popular video game Fortnite, under which the company will pay $245 million to settle allegations that it duped players into making unintentional purchases. The FTC press release states that Fortnite had a “counterintuitive, inconsistent, and confusing button configuration” that resulted in players making inadvertent purchases. That may be true, but accidental purchases, whether from user error or poor design, are fairly common in games, apps, and online retail sites and are not, on their own, illegal. Indeed, the FTC’s legal complaint details two violations of the FTC Act: first, that Epic engaged in unfair billing practices by charging consumers without obtaining their express, informed consent; and second, that the company unfairly penalized some consumers who disputed unauthorized charges. Neither of these counts was based on a confusing design of Epic’s in-game purchasing interface.
To be clear, the FTC can and should bring complaints against companies that unlawfully deceive consumers and seek redress for those individuals. But its complaints against Credit Karma, Vonage, and Epic are also part of an ongoing campaign to portray the tech industry as engaging in widespread manipulation of consumers through dark patterns and to insert regulators into the design of digital products. Indeed, the pejorative term “dark patterns” itself has been popularized by anti-technology activists to spread fear that technology companies are using behavioral psychology to design products that manipulate their users into performing actions contrary to their best interests. It is a convenient label for these activists because it can explain away inconvenient consumer behaviors. For example, if consumers consent to share personal data with an online service, click on ads, or watch videos (as they routinely do on many social media platforms), privacy activists can claim the online service manipulated them into doing so against their wishes. In fact, privacy activists justify their calls for paternalistic data privacy laws and regulations by arguing that dark patterns prevent consumers from effectively exercising control over their personal data.
Moreover, organizations from virtually all sectors, including private sector businesses, non-profit organizations, political campaigns, and government agencies, use behavioral insights to design and refine the user interface (UI) and user experience (UX) of their apps, websites, and services. Using A/B testing for marketing messages, online ads, or email campaigns is a normal business practice that helps organizations communicate clearly and effectively. However, because the term “dark patterns” is so vague and subjective, critics can easily label design choices they dislike as nefarious simply because those choices involved user testing.
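To illustrate how routine and unremarkable this practice is: an A/B test typically amounts to showing two variants of a message to randomly split audiences and comparing their response rates. A minimal sketch of that comparison, using a standard two-proportion z-test and hypothetical numbers (the variant names, counts, and threshold here are illustrative assumptions, not figures from any FTC case), might look like this:

```python
import math

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: did variant B respond at a different rate than A?

    conv_a / conv_b: number of users who clicked (converted) in each group
    n_a / n_b: total users shown each variant
    Returns the two observed rates and the z-statistic.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants perform equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical campaign: 2,400 users see each version of a marketing email
p_a, p_b, z = ab_test(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
# |z| > 1.96 corresponds to significance at roughly the 5% level,
# suggesting variant B's higher click rate is unlikely to be chance
```

Nothing in this workflow is deceptive; it is simply measuring which wording or layout communicates more effectively, which is why treating user testing itself as evidence of manipulation sweeps in ordinary practice.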
If the FTC continues down this path of labeling data-driven design practices as potentially illegal and conflating illegal practices with bad design, businesses will face a legal minefield, risking penalties for failing to anticipate regulators’ subjective judgments about their product design decisions, ultimately limiting the development of better online apps, games, and services for consumers. Moreover, if the FTC continues to promote misleading terms like “dark patterns” and “surveillance economy” to attack the tech industry, it risks further eroding its credibility as an objective regulator. Instead of seeking a role in the design of online services, an area where the average regulator lacks expertise, the FTC should keep its focus on enforcing the laws already on the books to protect consumers.