1. Introduction
In the digital age, user interface (UI) and user experience (UX) design has become a fundamental element of commercial platforms. While these designs are meant to facilitate user interaction, it has become common, especially on platforms where commercial considerations are paramount, for ethical design principles to give way to manipulative design strategies. These strategies, called dark patterns or deceptive patterns, are deliberately designed to steer users toward behaviors that do not align with their true will.
2. The Concept of Dark Patterns
Dark patterns are design strategies that exploit cognitive biases and behavioral tendencies, causing users to take actions they would not normally take, such as purchasing or signing up for something. Typical examples include trick questions, hidden costs, deliberately complicated cancellation processes (the "Roach Motel"), pre-ticked boxes, and the manipulative use of color.
Because of these misleading prompts, users often fail to grasp the full meaning of the question being asked, add unnecessary items to their online shopping carts, or subscribe to a newsletter or create an account with a single click, only to later find unsubscribing difficult. Dark patterns also frequently disguise advertisements as ordinary page content, encouraging users to click on them. Similarly, they can lead users to accept website cookies without realizing it: prominently featuring only the "Accept" button in a cookie banner while relegating the "Decline" option to a second layer, or rendering it in muted, low-contrast colors, can cause users to share data they never intended to share.
3. Dark Patterns and Valid Consent
Article 4(11) of the EU General Data Protection Regulation (GDPR) defines valid consent as a freely given, specific, informed, and unambiguous indication of the data subject's wishes. As Recital 32 of the GDPR makes explicit, consent requires a clear affirmative action demonstrating the data subject's acceptance of the processing of their personal data; silence, pre-ticked boxes, or inactivity cannot constitute consent. In this context, the use of dark patterns systematically limits users' free will, making it virtually impossible to obtain valid consent.
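The cumulative conditions above can be illustrated with a minimal, hypothetical sketch: a consent record counts as valid only when every Article 4(11) element is present and the box was actively ticked by the user. The type and function names below are illustrative assumptions, not a real compliance library.

```typescript
// Hypothetical model of a recorded consent event. Field names are
// illustrative assumptions chosen to mirror the GDPR Art. 4(11) elements.
interface ConsentRecord {
  purpose: string;            // "specific": consent is tied to one stated purpose
  informed: boolean;          // user saw a clear notice before consenting
  affirmativeAction: boolean; // user actively ticked the box themselves
  preTicked: boolean;         // box was checked by default when shown
}

function isValidConsent(c: ConsentRecord): boolean {
  // Recital 32: silence, pre-ticked boxes, or inactivity do not count.
  if (c.preTicked || !c.affirmativeAction) return false;
  // Consent must also be informed and given for a specific purpose.
  return c.informed && c.purpose.length > 0;
}
```

On this sketch, a pre-ticked newsletter box fails immediately, regardless of how well the user was informed, which is precisely the point of the Recital 32 rule.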
The characteristics required for valid consent, particularly in relation to cookies, are of considerable importance in data protection law. The Article 29 Working Party, which examined this question in 2013, emphasized that consent must be obtained before the data processing activity begins, must be freely given and based on information about a specific subject, and must be expressed in unambiguous terms through the individual's active behavior. On this view, a person's mere failure to leave a website does not qualify as active behavior. Consistent with this, the CJEU ruled in its Planet49 decision (C-673/17) that pre-ticked boxes do not constitute valid consent, and it would not be wrong to say that this approach is also reflected in Turkish law.
Similarly, hiding the "Reject" option in the second layer of a cookie banner, or presenting it as a mere link, practically prevents the user from refusing or withdrawing consent; and forcing users to share unnecessary data in order to use a service is considered a violation of freely given consent under Article 7(4) of the GDPR.
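The equal-prominence point can likewise be sketched as a simple layout check. The banner model below is a hypothetical assumption for illustration: each option records which layer it appears on and whether it has been demoted to a text link, and only a banner where "Reject" is as reachable as "Accept" passes.

```typescript
// Hypothetical model of one button in a cookie banner. These names are
// illustrative assumptions, not a real consent-management API.
interface BannerButton {
  label: string;
  layer: number;         // 1 = first layer; 2 = hidden behind "settings"
  styledAsLink: boolean; // demoted to a plain text link instead of a button
}

// Returns true only when refusing is as easy as accepting:
// both options sit on the first layer and "Reject" is not visually demoted.
function equalFirstLayerChoice(accept: BannerButton, reject: BannerButton): boolean {
  return accept.layer === 1 && reject.layer === 1 && !reject.styledAsLink;
}
```

A banner that shows "Accept" on the first layer but tucks "Reject" behind a settings screen fails this check, which mirrors the dark-pattern scenario described above.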
4. Types and Effects of Dark Patterns
While numerous studies have been conducted to classify types of dark patterns, due to the constant evolution of technology and the rapid change of user interfaces, a complete and permanent classification seems unlikely at this stage.
According to Brignull's classification, one of the leading taxonomies in the field, dark patterns include Comparison Prevention, Confirmshaming, Disguised Ads, Fake Urgency, and Hidden Subscription, among others.
Another study examined how dark patterns are integrated into digital interfaces and developed a classification based on their characteristics. Three main categories stand out. The first encompasses dark patterns built from clearly visible and distinct elements of the user interface. The second is more complex, covering strategic information hiding, whose consequences the user may only notice later, and practices that amount to coercion. The third covers dark patterns integrated into the system architecture of online services and operating in the background; these typically use complex algorithms or artificial intelligence to create personalized experiences that steer users toward choices that are not in their best interest. In this way, dark patterns often cause users to incur additional costs without their knowledge and lead to consent being obtained through illegitimate means.
5. Legal Assessment
The European Commission’s 2022 report addresses the obligation to provide information under the General Data Protection Regulation, emphasizing that this obligation must be fulfilled in a concise, transparent, understandable, and easily accessible manner, using clear and plain language. It also states that this obligation applies to all commercial applications involving the processing of personal data, and that personalization applications are also considered within this scope. The report states that failure to fulfill this obligation will not be considered valid under the General Data Protection Regulation and will constitute a failure to provide specific, informed, and unambiguous consent to the data subject’s requests for the processing of their personal data. The design and default settings of user interfaces must also comply with this Regulation. Therefore, the report considers fraudulent use of Dark Patterns, which aim to mislead users or maximize data collection, to be contrary to data protection principles. Similarly, the European Commission, in its guide to the interpretation and implementation of the Unfair Commercial Practices Directive published in December 2021, stated that Dark Pattern practices could be considered a breach of duty of care by traders, a misleading practice, or an offensive practice.
Article 5(1)(a) of the Artificial Intelligence Act (AI Act), which has attracted considerable attention recently and was published in the Official Journal of the European Union on July 12, 2024, prohibits the placing on the market, putting into service, or use of AI systems that are intended to, or have the effect of, materially distorting the behavior of a person or a group of persons by appreciably impairing their ability to make an informed decision. Within this scope fall AI systems that use manipulative or deceptive techniques causing a person to take a decision they would not otherwise have taken, where that decision is reasonably likely to cause significant harm to that person or to another.
Article 5(1)(b) of the Act likewise prohibits the placing on the market, putting into service, or use of AI systems that exploit the vulnerabilities of persons with disabilities or of groups made vulnerable by certain social or economic circumstances, thereby materially distorting their behavior in a manner reasonably likely to cause significant harm.
In Türkiye, the term “dark pattern” is not directly regulated; however, Law No. 6502 on Consumer Protection, the Personal Data Protection Law (KVKK), the Law on the Regulation of Electronic Commerce, and the Regulation on Commercial Advertisements indirectly prohibit manipulative interface designs. Under the KVKK, designs that violate the obligation to inform and data processing without explicit consent are considered unlawful.
For instance, Articles 61, 62, and 48 of Law No. 6502 on Consumer Protection define commercial advertising and unfair commercial practices, and classify interventions based on incomplete or misleading information, or that harm consumers' economic interests, as unfair commercial practices. Similarly, the rules on interface design introduced by the 2022 amendment to the Regulation on Commercial Advertising and Unfair Commercial Practices treat methods that negatively affect consumers' ability to make decisions and choices, or that aim to alter, in favor of the seller or provider, the decision consumers would make under normal circumstances, as deceptive commercial practices.
6. Conclusion
Dark patterns may obtain user consent in appearance, but such consent is not valid in substance. Design principles such as transparency, equal access, easy opt-out, and clear language should therefore be adopted, and consent mechanisms should be simple, understandable, and based on explicit consent. In addition, the use of dark patterns should be monitored, and the necessary legislative regulations implemented, to ensure ethical design, fair competition, and a safe digital environment that protects users.