“DARK PATTERN”: HOW DOES THE LAW DEAL WITH THE EXPLOITATION OF OUR COGNITIVE BIASES?

While addiction to social networks and screens has entered everyday language, several American states have recently chosen to regulate the use of techniques that adapt digital interfaces to our cognitive biases in order to capture our attention and trigger feelings of frustration, pleasure, or social gratification. This development raises the question of whether national, or even European, regulation of the exploitation of our cognitive biases by digital interface designers is advisable.
The adoption of the new California Consumer Privacy Act regulations on March 15, 2021 [1] made California the first state to ban the use of “Dark Patterns” by digital interface designers. On June 8, 2021 [2], the State of Colorado followed suit, and the version of the Washington Privacy Act currently under discussion [3] contains similar provisions against “Dark Patterns”.
Many people have never heard of “Dark Patterns”. The term was first used in 2010 by Harry Brignull, a user experience designer from London, to refer to user interfaces that exploit human cognitive biases so that users make choices without being aware of them.
Cognitive biases are falsely logical, unconscious, and systematic reflexes of human thought. Their original function is to allow the human brain to save time and energy by developing mental shortcuts. The concept of cognitive bias was first introduced by the psychologists Daniel Kahneman (Nobel Prize in Economics in 2002) and Amos Tversky to explain certain tendencies toward irrational decisions in the economic field. Since then, research in cognitive and social psychology has identified a multitude of biases operating in many fields.
“Dark Patterns” rely on several techniques, including “nudges”, which adapt a company's communication to our cognitive biases in order to capture our attention, trigger feelings of frustration, pleasure, or social gratification, and lead us to hand over our personal data without thinking or to make compulsive purchases.
These techniques, which belong to the science of “captology” [4], are not new: advertisers have long used them to sell their products and services. Digital technology, however, changes the situation considerably, since it allows algorithms to adjust to our reactions in real time and continuously, adapting to them through feedback. As early as 2016, Tristan Harris, a former Google engineer, warned about the practices used by large Silicon Valley groups to intercept the signals perceived by our brains in order to manipulate us [5].
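To make this mechanism concrete, here is a minimal sketch, in TypeScript, of the kind of feedback loop described above: an epsilon-greedy rule that continuously re-ranks notification variants according to observed user reactions. All names and variants are invented for illustration; this is not the code of any actual platform.

```typescript
// Hypothetical feedback loop: the interface learns, impression by
// impression, which notification style best captures each user.

type Variant = { label: string; shows: number; clicks: number };

const variants: Variant[] = [
  { label: "badge-only", shows: 0, clicks: 0 },
  { label: "sound-and-badge", shows: 0, clicks: 0 },
  { label: "social-proof-text", shows: 0, clicks: 0 }, // "3 friends reacted..."
];

const EPSILON = 0.1; // fraction of impressions kept for exploration

function clickRate(v: Variant): number {
  return v.shows === 0 ? 0 : v.clicks / v.shows;
}

function pickVariant(): Variant {
  if (Math.random() < EPSILON) {
    // Explore: occasionally try a random variant.
    return variants[Math.floor(Math.random() * variants.length)];
  }
  // Exploit: otherwise show the variant with the best observed click rate.
  return variants.reduce((best, v) => (clickRate(v) > clickRate(best) ? v : best));
}

// Feedback: every reaction immediately updates the statistics, so the
// interface "adapts in real time and continuously" to the user.
function recordReaction(v: Variant, clicked: boolean): void {
  v.shows += 1;
  if (clicked) v.clicks += 1;
}
```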
To be convinced of this, one need only observe the rise of addictive behaviors caused by platforms and social networks, which in recent years has given rise to multiple reports warning about the deleterious effects of overexposure to screens [6]. Moreover, the market for exploiting human cognitive biases is booming, and some companies offer this type of service without any ambiguity, like Dopamine Labs, which promised its customers: “Connect your smartphone application to our ‘persuasion AI’ and increase engagement and revenue by 30% by giving your users our perfect dopamine shots” [7].
One of the primary goals of using captology techniques within interfaces operated by intelligent algorithms is to capture our attention without our knowledge, because the more time we spend on an interface, the more we are likely to spend on it. Behind the capture of our attention lies not only a prospect of profit for manufacturers, but also a risk of deregulating our ability to identify stimuli in our environment, to respond to them, to sort them, to ignore them, or to focus on some of them. It is both our ability and our inability to read and react to the world around us that is at stake [8].
These phenomena are currently not subject to any comprehensive regulation. However, the CNIL raised the subject of “Dark Patterns” in its “Cahier IP Innovation & Prospective No. 06” of January 18, 2019 [9], which addresses the use of captology techniques embedded in interfaces.
The recent changes adopted in the United States show that a movement toward regulating the use of captology techniques within interfaces operated by intelligent algorithms is under way. In Europe, two trends are already emerging: the first sanctions the exploitation of our cognitive biases on the basis of consumer law (I); the second argues for the creation of a “right to the protection of attention” (II).
I) Sanctioning the exploitation of our cognitive biases on the basis of consumer law.
Like the new Consumer Privacy Acts adopted in California and Colorado, it is on the basis of consumer law that users could be protected, by preventing the ergonomics of digital services from creating a significant imbalance to their detriment by biasing their consent.
On February 5, 2020, a bill to guarantee free consumer choice in cyberspace [10] was adopted unanimously by the French Senate's Committee on Economic Affairs. The bill aims in particular to add a chapter on the sincerity of user interfaces to the Consumer Code in order to fight the use of “Dark Patterns” by ensuring the clarity and readability of those interfaces. According to the Committee, this amendment is in line with the spirit of the Consumer Code, which states in article L211-1 that:
“The terms of contracts offered by professionals to consumers must be presented and written in a clear and understandable manner.”
The emerging litigation over the control of unfair terms in the general conditions of use of certain social networks corroborates this trend of sanctioning the exploitation of our cognitive biases on the basis of consumer law.
Recently, in two decisions, the Tribunal de Grande Instance de Paris deemed unlawful a number of clauses contained in the “Terms of Use” and the “Privacy Rules” offered to users when signing up for Google and Twitter [11].
These two decisions shed decisive light on the relationship between consumer law and personal data protection law. On the one hand, the court confirms that article L212-1 of the Consumer Code, on unfair terms, applies regardless of whether the contract is onerous or free of charge, and regardless of the fact that it is addressed not only to consumers but also to professionals. On the other hand, the court considers that, as data controllers, the platforms are subject to the provisions of Law No. 78-17 of January 6, 1978 (“Informatique et Libertés”) and that the principles of this law are in perfect harmony with the objectives of consumer law, which aims to protect consumers “in their various consumer activities”.
There therefore seems to be a desire, both legislative and judicial, to bring personal data legislation and consumer law closer together in order to punish abusive practices that exploit our cognitive biases. Designers of digital interfaces will therefore be watching the coming developments in consumer law as closely as those in personal data protection.
II) Sanctioning the exploitation of our cognitive biases through the creation of a “right to the protection of attention”.
In 2019, an article by Célia Zolynski, Marylou Le Roy and François Levin entitled “The attention economy captured by law” [12] made waves by arguing for the creation of a right to the protection of attention. The momentum continued with the publication, in May 2020, of the consultation summary of the Etats Généraux du Numérique, which mentions the creation of a “right to control attention” [13]. Finally, on April 15, May 12 and June 2, debate-workshops were held on the question “How can digital law regulate the attention crisis?”.
The recognition of a right to the protection of digital attention would build on rules that are already well known, such as the loyalty of platforms. This obligation requires operators of digital platforms to provide users with complete information on potentially addictive content, which could include information on the captology processes the platforms use. The information obligation could also extend to access to the attention-capture metrics used by platforms, so as to give users perfect information.
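Purely as an illustration, such a disclosure could take the form of a machine-readable manifest published by the platform; the format and field names below are hypothetical and not drawn from any existing text.

```typescript
// Hypothetical disclosure manifest: each attention-capture technique is
// listed with the bias it exploits and the engagement metric it optimizes.

interface CaptologyDisclosure {
  technique: string;               // e.g. "infinite scroll", "autoplay"
  cognitiveBiasExploited: string;  // e.g. "variable reward", "social proof"
  metricOptimized: string;         // e.g. "session duration", "daily opens"
}

const disclosureManifest: CaptologyDisclosure[] = [
  {
    technique: "infinite scroll",
    cognitiveBiasExploited: "absence of stopping cues",
    metricOptimized: "session duration",
  },
  {
    technique: "notifications batched at peak hours",
    cognitiveBiasExploited: "variable reward loop",
    metricOptimized: "daily app opens",
  },
];
```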
Once informed of the presence of “Dark Patterns” on a digital interface, users could be given the possibility of customizing the parameters of the service they use, following the same logic as the personalization of personal data collection and of cookie deposits on websites.
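Following the cookie-consent analogy, the user-facing side of such customization could be a simple settings object with protective defaults; the fields below are, again, purely illustrative.

```typescript
// Hypothetical attention settings: like non-essential cookies, each
// captology feature is off until the user opts in.

interface AttentionSettings {
  infiniteScroll: boolean;
  autoplayNextVideo: boolean;
  engagementNotifications: boolean;
}

const defaultSettings: AttentionSettings = {
  infiniteScroll: false,
  autoplayNextVideo: false,
  engagementNotifications: false,
};

// Merge the user's explicit choices over the protective defaults.
function applyUserChoices(chosen: Partial<AttentionSettings>): AttentionSettings {
  return { ...defaultSettings, ...chosen };
}
```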
Continuing on the model of the European General Data Protection Regulation, which advocates “Privacy by Design”, that is, adopting organizational and technical measures at the design stage to ensure the compliance of data processing, digital interfaces could be designed in accordance with a principle of “Ethics by Design”. This design method would produce interfaces that respect the attention of their users.
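As a sketch of what one “Ethics by Design” check could look like at the design stage, assume a rule (stated in no current text) that consent dialogs must give acceptance and refusal equal prominence, so that layout alone cannot bias the choice:

```typescript
// Hypothetical design-stage check, run like a linter over UI specifications.

interface ButtonSpec {
  label: string;
  fontSize: number;       // in pixels
  clicksToReach: number;  // 0 = visible on the first screen
}

function isChoiceSymmetric(accept: ButtonSpec, refuse: ButtonSpec): boolean {
  return (
    accept.fontSize === refuse.fontSize &&
    accept.clicksToReach === refuse.clicksToReach
  );
}

// A classic dark pattern fails the check: refusal hidden behind a small link.
console.log(
  isChoiceSymmetric(
    { label: "Accept all", fontSize: 16, clicksToReach: 0 },
    { label: "Manage options", fontSize: 11, clicksToReach: 2 }
  )
); // => false
```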
Finally, again in line with the principles governing the processing of personal data, a purpose-limitation principle for the capture of attention could be envisaged (in particular, the capture of attention only for specific, explicit and legitimate purposes, and no further processing in a manner incompatible with those original purposes).
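Transposed into code, such a purpose-limitation rule might look like the following sketch, in which every attention-capturing feature must declare its purpose and any incompatible further use is rejected; the rule and all names are hypothetical.

```typescript
// Hypothetical purpose registry, modeled on the GDPR purpose-limitation
// principle: attention may only be solicited for the declared purpose.

const declaredPurposes = new Map<string, string>([
  ["breaking-news-alert", "inform the user of urgent events"],
  ["friend-activity-feed", "show updates from chosen contacts"],
]);

function isUseCompatible(feature: string, proposedUse: string): boolean {
  const declared = declaredPurposes.get(feature);
  // Undeclared features, or reuse for another purpose, are rejected.
  return declared !== undefined && declared === proposedUse;
}

console.log(isUseCompatible("breaking-news-alert", "drive ad impressions")); // => false
```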
As businesses design their customer and user interfaces, as well as their consent requests, it therefore seems wise to take into account the growing attention that regulators, legislators and researchers are paying to “Dark Patterns” and to the exploitation of our cognitive biases on digital interfaces.
Link to the article published on Village Justice
[1] https://www.theverge.com/2021/3/16/22333506/california-bans-dark-patterns-opt-out-selling-data
[4] The term “captology” was coined in 1998 by B.J. Fogg at Stanford University and popularized in his 2003 book “Persuasive Technology: Using Computers to Change What We Think and Do”.
[6] https://www.academie-sciences.fr/pdf/rapport/appel_090419.pdf and https://cnnumerique.fr/files/uploads/2020/CNNum%20-%20EGNum%20-%20Surexposition%20aux%20e%CC%81crans.pdf
[7] https://usbeketrica.com/fr/article/dopamine-labs-renforcer-addiction-smartphones-notre-bien
[8] The sociologist Gérald Bronner, a member of the Academy of Technologies and of the National Academy of Medicine, offers a summary of the “deregulation of the cognitive market” in his book “Apocalypse Cognitive”, PUF, 2021.
[11] TGI Paris, February 12, 2019, Google/UFC-Que Choisir, no. 14/07224, and TGI Paris, August 7, 2018, Twitter/UFC-Que Choisir, no. 14/07300.
[12] Célia Zolynski, Marylou Le Roy and François Levin, “The attention economy captured by law”, Dalloz IP/IT, 2019.