On 1 December 2022, FTI Consulting Brussels and The Institute for Competitiveness (I-Com) will host the roundtable “Protecting EU consumers in the digital age” to discuss the Commission’s “Digital fairness – fitness check on EU Consumer Law” initiative.
The fitness check will evaluate three fundamental pieces of EU consumer policy legislation: the Unfair Commercial Practices Directive (UCPD), the Consumer Rights Directive (CRD) and the Unfair Contract Terms Directive (UCTD). It will assess whether these horizontal consumer law instruments remain adequate for protecting consumers online, looking at issues including dark patterns, personalisation practices, influencer marketing, the marketing of virtual items and the addictive use of digital products.
Ahead of the publication of the public consultation on the EU Consumer Law fitness check, and in parallel with the 2nd Annual Digital Consumer Event, relevant stakeholders have an opportunity to discuss consumer protection in the digital environment.
This review on dark patterns is the first in a series that FTI Consulting and I-Com are co-developing to inform the public debate ahead of the event.
The concept of dark patterns refers to commercial practices, such as interface design elements, that impair users' ability to make autonomous choices. While the meaning of the concept is still widely debated (see Lupianez-Villanueva et al. (2021), section 2.1), the recently adopted Digital Services Act (DSA) defines dark patterns as online interface design elements that deceive, manipulate or otherwise materially distort or impair users' ability to make free and informed decisions (Art. 25). Concrete examples of such interface design elements are given in Recital 67 and include exploitative design choices, repeatedly requesting a recipient of a service to make a choice, or making the procedure for cancelling a service significantly more cumbersome than signing up.
Although the updated guidance document on the application of the Unfair Commercial Practices Directive (UCPD) identified some forms of dark patterns as explicitly prohibited, the scope of this ban is limited. The Commission clarified that it covers practices such as fake countdown timers and limited-stock claims on websites, repeated intrusions during normal interactions, and misleading free trials and subscription traps. In other cases, a case-by-case assessment may be needed to identify a dark pattern.
Other forms of dark patterns are clearly banned under the requirements established by the General Data Protection Regulation (GDPR), in particular the principles relating to the processing of personal data (Art. 5) and the conditions for consent to processing (Art. 7). For example, because consent should not be regarded as freely given if the data subject has no genuine or free choice (Recital 42), dark patterns such as requests for consent to data processing presented in an incomprehensible form, or relying on subliminal techniques, are effectively banned by the GDPR.
Following up on this, on 14 March 2022 the European Data Protection Board (EDPB) released guidelines titled “Dark Patterns in Social Media Platform Interfaces: How to recognise and avoid them”. The Guidelines provide best-practice recommendations to designers and providers of social media platforms on how to assess and avoid dark patterns in social media interfaces that violate the requirements of the GDPR. The EDPB recognises six major categories of dark patterns: overloading, skipping, stirring, hindering, fickle, and left in the dark.
Additionally, as discussed above, the DSA establishes a horizontal ban on dark patterns, prohibiting providers of online platforms from designing, organising or operating “their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions” (Art. 25). The DSA also gives the European Commission the possibility to issue guidance on how the prohibition applies to specific dark patterns, particularly false hierarchies, nagging, and making cancellation harder than sign-up, and clarifies that the prohibition does not apply to practices already covered by the UCPD or the GDPR. However, it is important to note that the DSA's ban applies exclusively to online platforms and therefore has a much narrower scope than the UCPD and the GDPR.
The Digital Markets Act (DMA) also addresses the issue of dark patterns. Article 13, echoed in the corresponding Recital 70, prohibits behaviour aimed at subverting or impairing user autonomy, decision-making or choice in order to circumvent the prohibitions and obligations that the DMA lays down.
It is also noteworthy that in October 2022 the OECD published the paper “Dark Commercial Patterns”, which proposes a working definition of dark commercial patterns; demonstrates their prevalence and underlines their effects on consumer decisions; identifies educational, technical and business initiatives to address them; describes the applicable regulatory framework; and puts forward proposals to address the issue.
While several civil society organisations, such as BEUC, have been calling for additional rules on dark patterns, the private sector has reacted more cautiously. Three sets of arguments have been raised in this context. First, stakeholders have emphasised the difficulty of drawing a line between legitimate marketing or personalisation techniques and online interface design elements that some would construe as dark patterns. Second, others have pointed to the regulatory fragmentation surrounding the concept of dark patterns, which in turn creates legal uncertainty. Finally, a third set of arguments holds that the regulatory instruments needed to tackle dark patterns are already available, and that what is missing is not an update of the UCPD but an effective enforcement framework.