Deceptive patterns in data protection (and what UX designers can do about them)

A dive into the infamous world of deceptive design.

Luiza Jarovsky
UX Collective


Deceptive patterns, according to Harry Brignull, the designer who coined the term, are “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.” Some common examples are websites that make it almost impossible to cancel or delete an account, nearly invisible links to unsubscribe from unrequested newsletters, and insurance products that are surreptitiously added to your shopping cart. You can check more examples here, or tweet and expose your own personal findings using the hashtag #deceptivepattern (they might be retweeted by this account; it is worth checking out some of the outrageous examples there). There are many interesting academic papers on dark patterns, such as the research by Gray et al. and Mathur et al., and you can find many more in public repositories.

One of the chapters of my ongoing Ph.D. in data protection, fairness and UX design is about deceptive patterns in the context of data protection (you can download the full article I wrote about the topic here). I defined them as “user interface design choices that manipulate the data subject’s decision-making process in a way detrimental to his or her privacy and beneficial to the service provider.” In simple terms, they are deceptive design practices used by websites and apps to collect more, or more sensitive, personal data from you. They are everywhere, and you have most likely been encountering some form of them daily when navigating online. Below are some examples of practices I consider deceptive:

1- Screenshot from TikTok’s sign-up page:

Pop-up window asking the user to "Confirm you are above 18 and allow personalized ads," with "No" and "Yes" options below. The "Yes" option is in bold.

In this example, you cannot tell whether the “Yes” and “No” buttons answer the “are you over 18” question or the “do you allow TikTok to show personalized ads” question. According to my taxonomy (which you can also find in the academic article, and which was mentioned in the European Union’s report about deceptive patterns), this is a “mislead” type of deceptive pattern: it misleads the user into not opting out of personalized ads, since the user must click “Yes” to answer the “are you over 18” question.
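To make the fix concrete, here is a minimal sketch in React/TSX of how a designer could unbundle the two decisions, so that confirming age cannot be mistaken for consenting to personalized ads. The component, props and copy are all hypothetical; this is not TikTok’s actual interface code, just one way to separate a required age check from an optional, opt-in ad-personalization choice:

import { useState } from "react";

// Hypothetical sign-up dialog: the age confirmation and the
// ad-personalization choice are separate questions with separate
// controls, so answering one cannot be mistaken for answering the other.
export function SignUpConsentDialog(props: {
  onSubmit: (answers: { isOver18: boolean; allowPersonalizedAds: boolean }) => void;
}) {
  const [isOver18, setIsOver18] = useState(false);
  // Personalized ads default to "off": the user must actively opt in.
  const [allowPersonalizedAds, setAllowPersonalizedAds] = useState(false);

  return (
    <form
      onSubmit={(e) => {
        e.preventDefault();
        props.onSubmit({ isOver18, allowPersonalizedAds });
      }}
    >
      <label>
        <input
          type="checkbox"
          checked={isOver18}
          onChange={(e) => setIsOver18(e.target.checked)}
        />
        I confirm that I am over 18
      </label>
      <label>
        <input
          type="checkbox"
          checked={allowPersonalizedAds}
          onChange={(e) => setAllowPersonalizedAds(e.target.checked)}
        />
        Show me personalized ads (optional)
      </label>
      {/* Age confirmation is required to proceed; ad personalization is not. */}
      <button type="submit" disabled={!isOver18}>
        Continue
      </button>
    </form>
  );
}

Note that neither choice is pre-ticked and neither answer is visually privileged: the required question and the optional question are logically and visually independent.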

2- Screenshot from LinkedIn settings:

Screenshot of LinkedIn's settings (mobile). The sections are "account preferences," "sign in and security," "visibility," "communications," "data privacy" and "advertising data."

Each of the sections above contains choices related to data protection, such as “who can contact you” or “who can see your full profile” (even though a few of the section titles do not seem related to data protection), amounting to almost a hundred options. This excessive number of available choices is not empowering; on the contrary, it distracts users from the big picture and from the most important data protection issues, and confuses them. This type of deceptive pattern is a “hinder”: by offering an excessive number of choices, it overwhelms users, hindering their ability to choose according to their true preferences.

3- Screenshot from the website Groopdealz:

Pop-up window reading "Daily Boutique Deals up to 70% off," with a field for the user to enter an email address and a "sign up" button next to it. Under the email field it reads: "No, thanks. I don't like to save money."

Here, through manipulative language, the website pressures the data subject into entering his or her email address and subscribing to the newsletter (see the underlined sentence “No, thanks. I don’t like to save money” under the email field), so this deceptive pattern falls into the “pressure” category.
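By way of contrast, here is a hypothetical sketch, again in React/TSX with made-up names and copy, of the same prompt with a neutral decline option: declining uses plain wording and carries the same visual weight as signing up, instead of shaming the user into handing over an email address.

import { useState } from "react";

// Hypothetical newsletter prompt: the decline action uses neutral copy
// ("No, thanks") rather than guilt-tripping language, and is rendered as
// a regular button instead of a buried, self-deprecating link.
export function NewsletterPrompt(props: {
  onSignUp: (email: string) => void;
  onDecline: () => void;
}) {
  const [email, setEmail] = useState("");

  return (
    <div role="dialog" aria-label="Newsletter sign-up">
      <p>Daily Boutique Deals up to 70% off</p>
      <input
        type="email"
        value={email}
        placeholder="Email address"
        onChange={(e) => setEmail(e.target.value)}
      />
      <button onClick={() => props.onSignUp(email)}>Sign up</button>
      {/* A neutral, equally prominent way to say no. */}
      <button onClick={props.onDecline}>No, thanks</button>
    </div>
  );
}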

*

Deceptive patterns exploit various cognitive biases in order to manipulate users into sharing more personal data. As such, it is sometimes very difficult to “resist” deceptive patterns, because we often do not realize they are there. This, as a consequence, gives a lot of power to UX designers, who can use their work either to help protect fundamental values such as privacy, autonomy, human dignity and fairness, or to serve unlimited corporate power to harvest more profits, regardless of the consequences to the public.

As a lawyer, I recognize that law has an important role in restricting manipulative practices online, especially in the data protection context, where the complexity of data collection and processing activities may help disguise corporate abuse and disregard for fundamental values. However, even data protection laws that are considered models worldwide, such as the General Data Protection Regulation (GDPR) enacted in the European Union, have not successfully tackled the issue of deceptive design, as companies always seem to find a way to embed and disguise deceptive patterns so that they cannot be clearly identified as such.

And here I see a big opportunity for UX designers, both as individuals and as a community. The interaction between companies and users happens mainly through the user interface and the user experience design of products or services. It is there that companies can show who they are: their branding, their value proposition, their products and services, and also how they handle privacy. Do they acknowledge privacy when planning and designing a new product or service? Do they offer meaningful choices? Do they think about possible vulnerabilities that might lead people to unwittingly share more than they would like to? Do they present privacy choices in a fair way, one that does not mislead, pressure or hinder people from reflecting and deciding what they really want regarding their personal data? Do they acknowledge cognitive biases and design products that protect people from committing errors that might lead to privacy harm?

UX designers can take the lead and bring all these questions to the surface when doing their job. Designers themselves know better than anyone that there is no “neutral” design. Every design carries values and has the power to shape choices, behaviors and people. Designing is power, and as with any power, it must be used wisely.

With this post, I hope to invite UX designers to join the privacy conversation. If UX designers start working with privacy in mind and designing products and services that take into consideration cognitive biases, informational vulnerabilities and user autonomy, soon we will have a much fairer and more respectful web. I do not know about you, but I think that is a great legacy to leave to the next generations.

See you next week. All the best, Luiza Jarovsky


CEO of Implement Privacy, LinkedIn Top Voice, Ph.D. Researcher, Author of Luiza's Newsletter, Host of Luiza's Podcast, Speaker, Latina, Polyglot & Mother of 3.