Deceive, confuse, wear down: the dark patterns of UX

When profit is at stake, content and interaction designers put ethics aside all too often. It’s time to say: enough.

Piotr Ślusarski
UX Collective

--

A black, scary spider weaving its web.
Photo by Photoholgic on Unsplash

Top left corner, menu. Now: Account. Lower down. There it is — Prime Membership, a tab hard to find in the clutter of all the others. Header: Manage Membership. Looks like the lead is good, except the only thing it takes you to is a button saying: See All Your Prime Benefits. Well, you already figured it’s the benefits that are actually lacking here: that’s why you want to cancel your account. But how?

Oh, you need to click the header. Behind it is the End Membership button. Well, that concludes it…but then it doesn’t. You’re assaulted by another big, fat header saying Exclusive Benefits You Will Lose. There are three benefits listed there in total — but arranged in such a way that your finger hurts from scrolling down.

Finally, there are four buttons to choose from — Use Your Benefits Today, Keep My Benefits, Cancel My Benefits, and Remind Me Later. You select the third one, but Prime doesn’t go down without a fight; it now says: Save $36.80 By Switching To Annual Payments. Phew…

What’s that then, more scrolling? By Cancelling, You Will No Longer Be Eligible…mhm, next. Huh? There’s a weird graphic with a dog and a caption that says: woof woof.

Plus four more buttons: Switch To Annual Payments, Keep My Membership, Continue To Cancel, Remind Me Later. So you go with continue. And again, the entanglements: Pause The Membership on January 18, 2021, Keep My Membership, Remind Me Later.

Wait a minute. None of those options! Oh well, let’s scroll. This time it’s your head that hurts, your finger is already numb. There you go! End Now. You can also End On January 18, but now just feels right. Still, now means…March 1, as the final message informs you.

Any chance you want to change your mind after all?

Masters of puppets

That’s how hard you had to work, until recently, to cancel your subscription to Amazon Prime, a service that includes cloud-based access to movies, music, and games.

The way Amazon muddled the opt-out path is one of the classic types of dark patterns that increasingly accompany interface design. A dark pattern is designed to push you to do something you don’t feel like doing: sign up for this, buy that, stay with whatnot.

When you do sign up for a service — and it’s easy — but then you want to part ways with it — and it’s nowhere near easy — it means you’re staying at a roach motel. Amazon Prime is just such a motel.

Roach Motel is the brand name of a US-made insect trap that lures cockroaches with its scent and holds them on a sticky surface until they die. Roaches check in, but they don’t check out, the brand’s advertising slogan has proclaimed for years.

Over time, the term roach motel came to be used as a metaphor for a no-go situation.

It was also used by a British UX designer named Harry Brignull, who in 2010 compiled a list of disreputable industry practices, which he was the first to call dark patterns. Brignull listed 11 types of such practices: in addition to roach motel, they included confirmshaming, trick questions, and privacy zuckering. The first plays on your guilt to get you to do something, the second steers your answers with deliberately confusing wording, and the third, named after Facebook’s CEO, Mark Zuckerberg, is aimed at getting hold of data you’d rather keep to yourself.

In 2018, Brignull’s typology was revised by Colin Gray, head of the UX Pedagogy and Practice Lab at Purdue University. Gray categorized dark patterns into: nagging, obstruction, sneaking, interface interference, and forced action.

The shades of gray

The typology of dark patterns by Colin Gray:

Nagging: persistent disruption of an activity by a message not directly related to that activity. Example: an Enable notifications prompt offering only OK and Not now options.

Obstruction: the act of discouraging an action, putting obstacles in the way. Examples: hiding a link in a wall of text, roach motel.

Sneaking: concealing, masking, or delaying the disclosure of information that influences final decisions. Examples: adding costs in the fine print, promising one thing, delivering another.

Interface interference: putting some actions above others in order to confuse the user. Examples: highlighting supposedly ending offers, confirmshaming, trick questions.

Forced action: making access to a function or resource conditional on performing a certain action. Examples: offering a file for download only after creating an account, forced updates.

More: darkpatterns.uxp2.com

The five deadly sins

Nagging is sort of like a commercial break: it occurs when what you are doing is disrupted, once or repeatedly, by a message not directly related to it.

A dialog offering the YouTube Premium service; no ‘close’ mark given.
Nagging: no close button on YouTube Premium’s offer

Obstruction you have already watched in action: it’s the roach motel, the practice of putting obstacles in the user’s path. Sneaking, in turn, is about hiding, masking, or delaying the disclosure of important information: if you knew it beforehand, you wouldn’t do what the screen wants you to do.

Interface interference puts some actions above others, confusing you and limiting your full view of the situation. Finally, forced action occurs when you gain or maintain access to a feature or resource only in exchange for performing a certain action.

A prompt to set up an account on ssrn.com in order to download a research paper — accompanied by a ‘why-you-should’ note from SSRN’s CEO.
Forced action: sign up to download

The sad truth? The Amazon Prime cancellation process fits most of these categories. It’s all the sadder when you consider that dark patterns are, after all, used deliberately. It’s not good intentions backed by too little expertise; it’s bad intentions backed by more than enough of it.

Because at the root of the dark side lies a deep understanding of human irrationality and the cognitive biases that stem from it — biases we fall for time and time again. The default effect makes us stick with what we’ve been assigned because it’s more convenient. The sunk cost fallacy makes us keep doing something once we have invested money or effort in it — even when doing so harms us. The scarcity effect means that the less of something there is, the more we value it.

And that’s why it’s all the easier for those using dark patterns to hijack the three resources at stake: our money, our data, and our attention.

Struggling for breath

Circumstances conducive to the use of dark patterns:

1) You believe too much in growth. You stop caring about anything that stands between the needle and where you want it to be. Too many unsubscribes? You make them harder; that’s easier than preventing them.

2) You are stuck in a rut with your product and have no idea how to move forward. Where are the users and their money? Investors are getting impatient. You’re fighting to keep your head above water.

3) People only matter to you as customers; you prioritize customer experience over user experience. You care about someone if they pay you, if not — you don’t.

A new order?

In mid-January of this year, the Norwegian Consumer Council (Forbrukerrådet) filed a complaint with the Norwegian consumer protection authority against Amazon for violating the EU’s Unfair Commercial Practices Directive — in effect in Norway under the European Economic Area Agreement. The reason for the complaint: Amazon weaving complications into the Prime cancellation path, which, the complaint argued, should be as easy as signing up.

Shortly thereafter, Forbrukerrådet was followed by 17 consumer organizations from Europe and the US, which asked the relevant authorities in their countries to investigate the matter. Overseas, the grounds for taking action were the Federal Trade Commission Act, similar in scope to the European directive.

The trouble is, this directive is more than 15 years old. The US law has an even longer tradition: its origins date back to…1914. Laws that would comprehensively protect us from Amazon-like practices simply don’t keep up.

There is some hope in the European Commission’s new consumer program for 2020–2025. One of its areas is digital transformation. It’s about countering commercial practices that disregard consumers’ right to make an informed choice, abuse their behavioural biases, or distort their decision-making processes. The term dark patterns is used in the program description.

In the US, the DETOUR (Deceptive Experiences To Online Users Reduction) bill from 2019 would prohibit services with more than 100 million users from deliberately manipulating their decisions. The legislation has gone nowhere since, though.

Dark patterns are like porn: both are hard to define.

The situation is not helped by the fact that dark patterns are like porn: both are hard to define. In the scientific world, as Princeton University researchers have described, there are at least 19 definitions of the patterns. This makes I know it when I see it the only universal assessment criterion.

Ethics extended

Chrome extensions to curb unethical online practices. A sketch of how such an extension might work follows the list.

No Stress Booking

Hides alerts like Last room or Availability drops on booking.com

Zoom Redirector

Automatically redirects every meeting on Zoom to the poorly visible Join via browser option

Consent-O-Matic

Automatically clicks cookie consent pop-ups — according to user preferences on what to consent to and what not to
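
How simple such an extension can be is easy to underestimate. Below is a minimal, hypothetical content-script sketch in TypeScript, in the spirit of No Stress Booking: it hides page elements whose text matches typical urgency phrases. The phrase list, the element selectors, and the function name are all illustrative assumptions, not the actual extension’s code.

```typescript
// content-script.ts - hypothetical sketch, not the real extension's code.
// Hides page elements whose text matches common urgency phrases.

const URGENCY_PHRASES: RegExp[] = [
  /last room/i,
  /only \d+ (rooms? )?left/i,
  /availability .* (drops|is decreasing)/i,
  /selling out fast/i,
];

function hideUrgencyAlerts(root: ParentNode): void {
  // Check small, leaf-like elements so whole page sections aren't hidden.
  root.querySelectorAll<HTMLElement>("span, div, p").forEach((el) => {
    const text = el.textContent ?? "";
    if (text.length < 120 && URGENCY_PHRASES.some((re) => re.test(text))) {
      el.style.display = "none";
    }
  });
}

// Run once on load, then re-run whenever the page mutates:
// booking sites render these alerts dynamically.
hideUrgencyAlerts(document);
new MutationObserver(() => hideUrgencyAlerts(document)).observe(document.body, {
  childList: true,
  subtree: true,
});
```

Packaged with a manifest that injects it into the right pages, a couple dozen lines like these are enough to strip a page of its manufactured scarcity.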

Wolf in sheep’s clothing

We use cookies to…If you browse our site, we take it you accept this. Please see our privacy policy for more.

This is a classic form of obtaining consent to allow cookies to analyze what pages you visit on a website or what you do on them. Somewhere next to it is an OK button. Usually, there is nothing else: consent is the only option available.

Ever since GDPR, the EU’s new data protection law, went into effect, things were supposed to be different. Consent Management Platforms (CMPs) have sprung up on the market. From that point on, internet users could exercise broader control over the footprint they leave behind online. We value your privacy, announced the welcoming dialogs.

Today, we know that’s just smoke and mirrors. As a Danish-British-American study from last year shows, only 1 in 10 CMP-type platforms operates in compliance with GDPR. In compliance meaning: it lets you give cookie consent with an explicit click, and withdraw it as easily as you gave it. Every second platform doesn’t even have a Reject all button.

A cookie consent dialog only allowing you to accept all settings.
Interface interference: no choice but to allow all

At the same time, researchers at the University of Zurich point out a disturbing fact: people are blind to dark patterns. The majority of our users were either not able to detect dark patterns or were not sure about it. Some (…) explained that dark patterns are so widely spread and common among modern applications that they become part of the normal interaction flow, reads the report from the Swiss study (2020). Its theme is dark pattern blindness.

For this study, the researchers analyzed 240 of the most popular apps in the Google Play store, drawn from eight categories. 95 percent of these applications used dark patterns — the Swiss researchers identified nearly 1800 of them, about seven per application. The most common were: nagging, false hierarchy (e.g. adding a Recommended caption), and preselection (i.e. boxes checked upfront).

There are companies on the market today offering dark patterns as a service.

But what’s even more disturbing is that there are companies on the market today offering dark patterns as a service — possible to run on web pages with just a few lines of JavaScript. One such company openly extols its services on boostplugin.com: If you are just starting, you can use fake boosts. It’s about social proof pop-ups saying: Eva from Gothenburg bought this and that a second ago.

Screenshot of a website offering fake social proof pop–ups (e.g. ‘Person X bought Y a second ago’).
Fake it till you make it (boostplugin.com)
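
What “a few lines of JavaScript” can mean here is easy to picture. Below is a hypothetical sketch in TypeScript (the names, products, and timing are all invented; this is not boostplugin.com’s actual code) of a snippet that fabricates “someone just bought this” pop-ups:

```typescript
// fake-boost.ts - hypothetical sketch with invented data;
// not boostplugin.com's actual code. Fabricates "social proof"
// pop-ups entirely in the visitor's browser.

const FAKE_EVENTS = [
  { name: "Eva from Gothenburg", product: "Wireless Headphones" },
  { name: "Tomasz from Sandomierz", product: "Running Shoes" },
];

function showFakeBoost(): void {
  const event = FAKE_EVENTS[Math.floor(Math.random() * FAKE_EVENTS.length)];

  const popup = document.createElement("div");
  popup.textContent = `${event.name} bought ${event.product} a second ago`;
  popup.style.cssText =
    "position:fixed;bottom:16px;left:16px;padding:12px 16px;" +
    "background:#fff;box-shadow:0 2px 8px rgba(0,0,0,.2);border-radius:8px;";
  document.body.appendChild(popup);

  // Disappear after a few seconds, like a real notification would.
  setTimeout(() => popup.remove(), 4000);
}

// One fake "purchase" every 15 seconds.
setInterval(showFakeBoost, 15_000);
```

No backend, no real purchases: the social proof is produced entirely on the visitor’s screen.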

Cul-de-sac

Amazon is by no means the only big player making it difficult to give up on what it offers.

Teamwork is project management software used by 20,000 companies around the world, including Disney, Spotify, and HP. If any of them wanted to quit, a phone call would be their only way out. To cancel your plan, you’ll need to jump on a call with us, proclaims a message in the app.

The interface of Baremetrics, an analytics solution for business, says in turn: If you’re 100% sure you want to cancel your account, please call Brian. Don’t worry, he’s not a salesperson! Brian’s smiling face is shown right next to the message.

The ‘Call Brian to cancel’ dialog shown in the Baremetrics app.
Obstruction: call to cancel

Want to cancel your account? Please call Brian. Don’t worry, he’s not a salesperson!

The Economist, for its part, makes no secret of the fact that it will apply sales techniques whenever someone notifies it of their desire to cancel — the canceling must be done over the phone or via chat. We’d love to discuss a more affordable or flexible subscription option, reads economist.com.

The app of SiriusXM satellite radio (35 million subscribers), which broadcasts in the U.S. and Canada, communicates with people in a similar vein. It doesn’t say I want to cancel; it says I would like to discuss canceling. And it provides a number to call.

‘I would like to discuss canceling my subscription’ dialog; a number to call provided.
Obstruction: convince us to let you leave

You’ll regret that

Even when opting out of a service is just as easy as signing up for it, those who want out are sometimes urged to stay by being manipulated into feeling guilty.

Until recently, one of the more sophisticated examples was the Facebook account deactivation path. Are you sure you want to do this?, Facebook asked. Underneath were profile pictures of the friends you intended to leave behind. Peter will miss you, said the captions, Amy will miss you, Alex will…After 2018, Facebook stopped using this ploy.

It was a classic example of the aforementioned confirmshaming: approving something comes with a major guilt trip. In Facebook’s case, the trick was quite unusual: normally, confirmshaming takes the form of a link with secondary-action text. Want to subscribe? No, thank you. Only instead of No, thank you, it reads: I don’t like delicious food, I’m fine with losing customers, or…I’m a bad person.

A dialog nudging people to try Optinmonster, a lead generation tool. A ‘Get More Email Subscribers’ button is accompanied by a link saying ‘No, thanks, I’m fine with losing customers’.
Interface interference: confirmshaming

Viagogo, the infamous online ticketing platform that many have warned against, from the Polish actress Barbara Kurdej-Szatan to the British minister of culture Margot James, confirmshames in its own inverted way. Along with an offer for a Lady Gaga concert ticket, it displays a button that says: Take it or be sorry.

A portion of search results on Viagogo, a ticketing platform. One of the offers has a ‘Bierz albo żałuj’ button next to it, which in Polish means ‘Take it or be sorry’.
Interface interference: ‘Bierz albo żałuj’ (Polish for: ‘Take it or be sorry’)

Viagogo has other messages too: Last items available, This event is selling out fast, A customer from Poland bought 2 tickets…very similar to the ones booking.com paid dearly for. Last year the Hungarian Competition Authority (GVH) fined booking.com more than 7 million euros for exerting undue psychological pressure on customers. Despite this, the online travel agency’s website still says: Booked twice on the selected date within the last 6 hours or Availability in the city of Sandomierz on the selected date is decreasing.

Nasty but effective?

Until 2019, there was no publicly available study on whether dark patterns in interface design are effective at getting someone to do something. That gap was filled by researchers at the University of Chicago.

They asked a large group of people to complete a survey about online privacy. Those who agreed were automatically enrolled in an identity theft protection program. The program was free for the first six months, then paid monthly.

The surveyed were able to decline the offer, though not all of them equally easily. The researchers divided them into three subgroups: the first faced no are-you-sure tactics, the second a few mild ones, and the third many aggressive ones.

The mild ones included a highlighted Confirm and continue button with the word Recommended under it, a hidden opt-out option, and the I don’t want to protect my data or credit history link. The aggressive ones, in turn, included being forced to read additional eye-opening information and to stay on the screen with it for a certain amount of time, alarming language (Identity theft can damage your credit status), and trick questions (a Cancel button that canceled the cancellation).

The results? Where manipulative patterns were absent, only 11 percent of the respondents remained in the program. Where they were mild, that percentage more than doubled, and where they were aggressive, it nearly quadrupled. With A/B testing, the researchers note, companies can crank up those results even more.

Does increased enrollment have any value if we give it a hard push?

However, not everything can be measured by A/B testing. Does increased enrollment have any value if we give it such a hard push? In the Chicago experiment, the aggressive patterns, though they performed best, were met with strong opposition among the subjects. But the mild ones were not — and it was these that the researchers found to be the most dangerous.

Tristan vs. Predator

Its mouth is riddled with sharp teeth. It lives in the deep sea, where the sun’s rays do not reach. It has a flexible, glowing pole-like organ on its head — akin to a fishing rod with a light bulb at the end. It is with this rod that it catches its prey: when one comes closer, attracted by the glow, it is devoured in the blink of an eye.

The anglerfish opening its jaws.

It’s the anglerfish, one of the strangest animals of the ocean. Some compare that creature to our industry today. This is all too often how UX design is considered and practiced, tweeted Erika Hall, co-founder of the Mule Design agency, late last year. She illustrated her tweet with a photo of an anglerfish, noting: its mouth is a business model, its rod is UX. Too often, Hall explained, UX is just a decoy to distract people from predatory business practices.

UX is just a decoy to distract people from predatory business practices.

And that’s what makes UX veterans lose faith in the industry. As Mark Hurst, whom Forbes once described as a user experience jedi master, writes on his blog, UX no longer deserves its name. Why? Because these days it doesn’t stand on the side of the user. Designers have stopped building products for people; they now build them at people’s expense. Hurst removes experience from UX and puts another word beginning with ex- in its place. Welcome to the era of user exploitation.

Ann Light from the University of Sussex talks, in turn, about bovine design. Bovine design creates tools that promote passivity and unreflectiveness, making us more like grass-chewing cows.

The Center for Humane Technology (CHT), an organization dedicated to aligning technology with humanity’s best interests, aims to resist this trend. Led by ex-Googler Tristan Harris, CHT stresses that the main problem is this: by upgrading technology, we downgrade humanity. The organization gathers such heavyweights as Tom Gruber, co-creator of Siri, Evan Sharp, co-founder of Pinterest, and Roger McNamee, an early investor in Facebook.

By upgrading technology, we downgrade humanity.

The Center for Humane Technology has some advice for designers. First: be obsessed with values, not metrics. Second, and most intriguing: feel the pain of those you create for. And not just in an hour-long workshop. Designed an airplane? Your partner is to be its passenger on the first flight. Designed an algorithm to recommend the next videos to watch? Your mother ignores covid recommendations because of it. Designed a social networking app? Your child is being bullied on it.

Within a few months, CHT is expected to launch an online course teaching design with a friendly face. Until then, perhaps all we need to do is ask ourselves a simple question: would we do what we happen to be doing the same way if it were for our families and friends?

--

Ex-journalist turned content designer. Once almost died at a shadowing session gone wrong. Thinks he prefers the first person to the third. More: piotr.fyi