Dark patterns are user interfaces whose designers knowingly confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions. They typically exploit cognitive biases and prompt online consumers to purchase goods and services that they do not want, or to reveal personal information they would prefer not to disclose. Research by computer scientists suggests that dark patterns have proliferated in recent years, but there is no scholarship that examines dark patterns’ effectiveness in bending consumers to their designers’ will. This article provides the first public evidence of the power of dark patterns. It discusses the results of the authors’ large-scale experiment in which a representative sample of American consumers was randomly assigned to a control group, a group that was exposed to mild dark patterns, or a group that was exposed to aggressive dark patterns. All groups were told they had been automatically enrolled in an identity theft protection plan, and the experimental manipulation varied what acts were necessary for consumers to decline the plan. Users in the mild dark pattern condition were more than twice as likely to remain enrolled as those assigned to the control group, and users in the aggressive dark pattern condition were almost four times as likely to remain enrolled in the program. There were two other striking findings. First, whereas aggressive dark patterns generated a powerful backlash among consumers, mild dark patterns did not, suggesting that firms employing them can generate substantial profits without alienating their customers. Second, less educated subjects were significantly more susceptible to mild dark patterns than their well-educated counterparts. Both findings suggest that there is a particularly powerful case for legal interventions to curtail the use of mild dark patterns.
The article concludes by examining legal frameworks for ameliorating the dark patterns problem. Many dark patterns appear to violate federal and state laws restricting the use of unfair and deceptive practices in trade. Moreover, in those instances where consumers enter into contracts after being exposed to dark patterns, their consent could be deemed voidable under contract law principles. The article proposes a quantitative bright-line rule for identifying impermissible dark patterns. Dark patterns are presumably proliferating because firms’ secret and proprietary A-B testing has revealed them to be profit-maximizing. We show how similar A-B testing can be used to identify those dark patterns that are so manipulative that they ought to be deemed unlawful.
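To illustrate the kind of A-B analysis the proposal contemplates, the sketch below runs a standard two-proportion z-test comparing the share of users who remain enrolled under a control interface against the share who remain enrolled under a dark-pattern interface. The function name and the enrollment counts are hypothetical, chosen only to echo the rough magnitudes reported above (a dark-pattern arm roughly doubling the control rate); they are not the study's actual data.

```python
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test comparing the retention rate
    in a control arm (a) with that in a dark-pattern arm (b)."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis of equal rates.
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 26% of 600 control users vs. 52% of 600
# dark-pattern users remain enrolled (illustrative numbers only).
z, p = two_proportion_ztest(156, 600, 312, 600)
print(f"z = {z:.2f}, significant at 5%: {p < 0.05}")
```

Under a bright-line rule of the sort proposed, a regulator could require firms to run exactly this comparison and treat a design as presumptively unlawful when the acceptance-rate lift over a neutral control exceeds a fixed threshold.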