This is Part 3 of a series on Ethical Design.
In Part 2, we discussed ways of implementing Compassionate UX design.
Implementing compassionate UX design isn't about grand gestures, but a continuous commitment to the user's well-being. It begins with deep, empathetic research to truly understand their emotional states and pain points, not just their tasks. This insight then translates into proactive design choices, like clear error messages, intuitive progress indicators, and readily available, contextual help, anticipating frustration before it even sets in. Finally, it involves empowering users with control, transparency about their data, and easy recourse if things go wrong, building a foundation of trust that benefits both the user and the business in the long run.
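To make that concrete, here is a minimal sketch (in TypeScript; the error codes and copy are hypothetical examples, not a real API) of what "clear error messages" can mean in practice: every failure tells the user what happened and what they can do next, so no one is left at a dead end.

```typescript
// A minimal sketch of compassionate error handling: every failure state
// tells the user what happened and gives a concrete next step.
// The error codes and copy below are hypothetical examples.

type FriendlyError = {
  title: string;    // plain-language summary of what went wrong
  whatToDo: string; // a concrete next step, so the user is never stuck
};

function explainError(code: string): FriendlyError {
  switch (code) {
    case "PAYMENT_DECLINED":
      return {
        title: "Your card was declined.",
        whatToDo: "No charge was made. Try another card, or contact support and we'll help.",
      };
    case "NETWORK_TIMEOUT":
      return {
        title: "We couldn't reach the server.",
        whatToDo: "Your work is saved locally. Check your connection and press Retry.",
      };
    default:
      return {
        title: "Something went wrong on our end.",
        whatToDo: "Nothing you did caused this. Please try again in a moment.",
      };
  }
}
```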
So what should we AVOID?
Enter the world of “Dark UX Patterns”
These are user interface tricks designed to manipulate users into doing things they didn't intend to do, often for the benefit of the company. They are "technically effective" because they increase conversions, sign-ups, or data collection, but they are harmful because they exploit cognitive biases, erode trust, and can lead to frustration, regret, or financial loss for the user.
They can be enticing for business owners. I myself have faced pressure from Product Owners chasing short-term gains, but ultimately it is a no-win scenario.
Examples of Dark Patterns and how they harm the user:
"Roach Motel"
Easy to sign up for a service, but extremely difficult to cancel.
It may prop up retention rates and subscription numbers in the short term, but it leaves users frustrated. They feel trapped, like they wasted money. Ultimately it creates added stress and, over time, higher churn, because the people who finally escape evangelize against you.
A perfect example of this is trying to delete a Facebook account. The option is buried incredibly deep in the menu system and barely documented. Even after you cancel, your information remains for 30 days, and a single login restores everything. There is no real sense of leaving, even after the user puts in the effort to do so.
–––––
"Forced Continuity"
Automatically charging a user after a "free trial" without clear notification or easy cancellation.
This might convert free users into paying subscribers, but it creates unwanted charges and feelings of deception. No one likes financial surprises, and the result is an adversarial relationship: you might have paying customers in the short term, but at the expense of long-term ones.
Tinder is a well-known example: it entices you with a free trial and silently charges you once it ends.
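For contrast, here is a rough sketch of trial billing designed compassionately. All the names and types here are hypothetical, but the logic is the point: the user is warned before any charge, and silence never converts into a payment.

```typescript
// A rough sketch of trial billing that avoids Forced Continuity.
// All types, fields, and thresholds here are hypothetical placeholders.

interface Trial {
  userId: string;
  endsAt: Date;
  reminderSent: boolean;
  userConfirmedPaidPlan: boolean; // explicit opt-in, never assumed
}

const MS_PER_DAY = 24 * 60 * 60 * 1000;

function processTrial(trial: Trial, now: Date): "remind" | "charge" | "downgrade" | "wait" {
  const daysLeft = (trial.endsAt.getTime() - now.getTime()) / MS_PER_DAY;

  // Warn well before the trial ends, so a charge is never a surprise.
  if (daysLeft <= 3 && !trial.reminderSent) return "remind";

  if (daysLeft <= 0) {
    // Only charge users who explicitly said yes; silence means a
    // graceful downgrade, not a surprise charge.
    return trial.userConfirmedPaidPlan ? "charge" : "downgrade";
  }
  return "wait";
}
```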
–––––
"Confirmshaming"
Making the user feel bad for opting out of something.
This usually appears early in the process as a pop-up requesting an email opt-in in exchange for a discount. It might increase opt-ins, but it comes at the expense of the user's well-being: it hinges on guilt, psychological pressure, and the undermining of personal autonomy. No one likes passive-aggressive pressure.
Examples of this are decline links like "No thanks, I don't like saving money."
–––––
"Sneak into Basket"
Adding unwanted items to a user's cart during checkout.
Sure, this increases average order value, but it is not only unethical; in some cases it is illegal. Aside from the unexpected financial cost, users feel tricked and learn they must stay on guard, carefully reviewing every purchase.
In 2015, the UK sports retailer "SportsDirect.com" was found guilty of sneaking an unwanted magazine subscription into users' baskets during the checkout process. Customers had to actively remove it if they did not want to purchase the subscription.
–––––
"Disguised Ads"
Making advertisements look like native content or functional buttons.
Everyone wants higher click-through rates on ads, but clicking on unwanted content is not only frustrating and misleading; it can also lead to malware and unwanted downloads.
This is especially egregious on download sites, where a designer looking for a font or a PNG for a mockup is confronted with three "Download Now" buttons and must decipher which one actually leads to the file. I have run into this many times; I always leave and never return.
–––––
"Trick Questions"
Using confusing language in forms or checkboxes to get users to agree to something unintentionally.
This might gather more user data, but it does so without consent. People unknowingly share their data or sign up for something, and often their email address gets sold and used for spam.
Examples of this are checkboxes that come pre-checked (which can actually result in a $10,000 fine) or checkboxes phrased as "Uncheck this box if you don't want…"
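For contrast, the honest version of this form element is almost embarrassingly simple. A sketch, with a hypothetical field name and copy:

```typescript
// A sketch of the honest alternative: consent is unchecked by default
// and phrased affirmatively, so checking the box means exactly one thing.
// The field name and copy are hypothetical.

function renderConsentCheckbox(): string {
  return `
    <label>
      <input type="checkbox" name="marketingOptIn" />  <!-- unchecked by default -->
      Yes, send me the monthly newsletter.              <!-- affirmative, no double negative -->
    </label>
  `;
}

// Reading the value is equally unambiguous: checked means consent, full stop.
function hasConsented(form: FormData): boolean {
  return form.get("marketingOptIn") === "on";
}
```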
–––––
Data Exploitation and Lack of Privacy
Products might be technically effective at collecting vast amounts of user data, but without transparency and user control, this can be harmful.
Default Opt-ins
Automatically enrolling users in data sharing or marketing without explicit consent.
Ever search for something and suddenly get Instagram ads for that subject? Or worse, merely speak about it out loud, only to find your IG account serving up the very product you mentioned on the phone to a friend? And it seems that every time you sign up for something, you get more spam. This is perhaps the most common Dark Pattern alive today (barring social media).
It might yield larger data sets and broader marketing reach, but it is a privacy breach that breeds distrust, feelings of exploitation, and unwanted communications.
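In code terms, the difference between a dark default and a compassionate one is a single initial value. A minimal sketch, with a hypothetical settings shape:

```typescript
// A minimal sketch of privacy-respecting defaults: every data-sharing
// setting starts as false and only flips on an explicit user action.
// The settings shape is a hypothetical example.

interface PrivacySettings {
  sharePurchaseHistoryWithAdvertisers: boolean;
  personalizedAdsFromBrowsing: boolean;
  marketingEmails: boolean;
}

// Dark pattern: opted in by default, with the off switch buried in menus.
// Compassionate default: everything off until the user says otherwise.
const defaultSettings: PrivacySettings = {
  sharePurchaseHistoryWithAdvertisers: false,
  personalizedAdsFromBrowsing: false,
  marketingEmails: false,
};

function optIn(settings: PrivacySettings, key: keyof PrivacySettings): PrivacySettings {
  const updated = { ...settings };
  updated[key] = true; // the only path to `true` is a deliberate user action
  return updated;
}
```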
–––––
Vague Privacy Policies
Making it difficult for users to understand how their data is being used.
This is legal compliance without true transparency. It may not seem that dark because it usually goes unnoticed, but the harm is real: users lack informed consent and cannot make informed choices about their data.
In 2010, GameStation added a clause to its license agreement stating that if users didn't uncheck a certain box, they would grant the company a "non-transferable option to claim, for now and forever more, your immortal soul." Today, many social media terms and conditions grant the platform rights to all the content users create and share, and several platforms have come under recent scrutiny for using that content to train their AI models.
–––––
Addictive Design Elements
While not always "dark patterns" in the sense of overt deception, some design choices are optimized for maximizing engagement and screen time, which can lead to addictive behaviors and negative impacts on mental health.
This is especially prevalent in social media, which optimizes engagement at the direct expense of the user's mental health.
Infinite Scroll
Prevents natural stopping points, keeping users perpetually engaged.
This is designed to maximize your time on the platform and expose you to as much content and as many ads as possible. It results in reduced focus, sleep deprivation, and FOMO, and can strongly exacerbate anxiety or depression.
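The mechanism itself is remarkably small. Here is a sketch using the browser's standard IntersectionObserver API, with the one change that restores a natural stopping point; the page threshold and the fetch helper are hypothetical:

```typescript
// A sketch of infinite scroll via IntersectionObserver, with one
// compassionate change: after a few pages, stop auto-loading and show a
// "You're all caught up" marker instead. The threshold is arbitrary.

const PAGE_LIMIT = 3; // natural stopping point; a dark version has no limit
let pagesLoaded = 0;

const sentinel = document.querySelector("#feed-end")!; // marker at the list's end

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;

  if (pagesLoaded >= PAGE_LIMIT) {
    observer.disconnect(); // stop pulling; let the user decide to continue
    sentinel.textContent = "You're all caught up. Load more?";
    return;
  }

  pagesLoaded += 1;
  await loadNextPage();
});

observer.observe(sentinel);

async function loadNextPage(): Promise<void> {
  // Hypothetical helper: fetch the next page of items and append them
  // above the sentinel element.
}
```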
–––––
Gamification and Variable Rewards
The unpredictable nature of positive feedback triggers dopamine, creating a compulsive checking habit.
Gamification drives repeated engagement, keeping you coming back for that little hit of dopamine. But this is a subtle form of addiction: it drives anxiety from constant checking and, at times, damaged self-esteem from comparing oneself to others.
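The mechanism behind this is disturbingly simple; a variable-ratio reward is little more than a random check, as this sketch shows (the probability and copy are arbitrary examples):

```typescript
// A sketch of a variable-ratio reward schedule, the core of "slot machine"
// engagement loops: the reward rate is fixed on average, but each
// individual check is unpredictable, which is what makes checking compulsive.

const REWARD_PROBABILITY = 0.3; // arbitrary example rate

function checkForReward(): string | null {
  // Unpredictable payoff on a predictable average: classic variable ratio.
  return Math.random() < REWARD_PROBABILITY
    ? "You have new likes!" // the dopamine hit
    : null;                 // nothing this time, so check again soon...
}
```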
–––––
Personalized, Algorithmic Feeds
Content is relentlessly optimized to keep users hooked.
Circular, algorithmic content is the hallmark of social media platforms like Facebook. It can push users into echo chambers or expose them to harmful content (e.g., extreme views, self-harm material) when the algorithm prioritizes engagement over well-being.
A tailored experience may seem enticing, but it is a trap that can be hard to escape because it reinforces one's own views, real or not. This is where conspiracy theories and radicalization are born, and it results in severe mental health issues, a distorted perception of reality, and reduced exposure to diverse viewpoints (or even facts).
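Stripped of its scale, the trade-off lives in the scoring function itself. A sketch comparing a pure engagement score with a well-being-aware one (all fields and weights here are hypothetical; real ranking systems are vastly more complex, but the incentive structure is the same):

```typescript
// A sketch of the feed-ranking trade-off: whatever the score rewards,
// the feed amplifies. Fields and weights are hypothetical examples.

interface Post {
  predictedEngagement: number; // 0..1, likelihood of a click/comment/share
  harmRisk: number;            // 0..1, e.g., outrage bait or harmful content
}

// Engagement-only ranking: extremity rises to the top because it engages,
// regardless of what it does to the user.
const engagementScore = (p: Post): number => p.predictedEngagement;

// A well-being-aware alternative: the same signal, with harm penalized.
const compassionateScore = (p: Post): number =>
  p.predictedEngagement - 2 * p.harmRisk;

const rankFeed = (posts: Post[], score: (p: Post) => number): Post[] =>
  [...posts].sort((a, b) => score(b) - score(a));
```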
–––––
In Conclusion…
Dark patterns represent a critical ethical challenge in UX/UI design, deliberately exploiting human psychology for short-term business gains at the user's expense. These designs erode trust, induce anxiety, and can lead to financial and psychological harm. While tempting for quick wins, embracing such practices ultimately undermines user loyalty and brand reputation. True, sustainable success lies in compassionate design that prioritizes transparency, autonomy, and genuine user well-being over manipulative persuasion.