When Dark Patterns Meet Defensive Users

  • Writer: Elizabeth Benker
  • Oct 11
  • 3 min read
[Image: Computer monitor showing a GenAI text window with the phrase, "How can I help?"]

Nearly every AI app I try starts the same way: a blank text box in the middle of the screen, no explanation, no pricing, no context. Just type something.


So, I do. I type something. I hit enter. And then — bam! — a wall: “Sign up to continue.”


No overview, no clue what this thing actually does, just a demand for my email. I can’t even decide if it’s worth trying without giving up some data first. Feeling cornered, I hand it over. A few clicks later, I hit a paywall: tokens used, free trial over, upgrade to continue. All before I’ve had a chance to determine if the service is worth paying for.


These are examples of dark UX patterns: design choices that trick or pressure users into doing something they didn't intend, like signing up for a trial that's hard to cancel. These patterns aren't new. The web used to pull these tricks in the early 2010s before we learned better ways to drive durable engagement.


So, why are we seeing them again?


UX Amnesia: How Every Tech Wave Forgets What We’ve Learned

Every major technology shift seems to come with a bout of collective amnesia.


We forget what we already know about human behavior and act surprised when users push back. I’ve seen this cycle repeat across mobile, social, cloud, and now AI. Each one starts with a wave of experimentation and ends with a rediscovery of the same truth: people reward products that respect their time, attention, and trust.


Technology evolves quickly. Humans don't.


Fifteen years in, I still meet users in testing sessions who don’t know what a hamburger menu does. (That pattern dates back to around 2009.) Before dismissing those users as "out of touch," it's worth asking whether your product is designed for real human behavior, or just your idealized version of it.


When Dark Patterns Meet Defensive Users

What’s fascinating about this new crop of AI experiences is how users adapt. For every manipulative tactic, there’s an equal and opposite defensive behavior: small acts of resistance that preserve autonomy in the face of opacity.

  • Dark pattern: Not explaining your product or pricing on your site
    Defensive response: Abandoning your site to learn about it on Reddit or YouTube... and maybe never returning

  • Dark pattern: Forcing an email sign-up before showing potential for value
    Defensive response: Providing a fake email, or immediately unsubscribing and blocking... maybe forever

  • Dark pattern: Requiring a subscription too early
    Defensive response: Canceling immediately after the first charge, or not signing up at all

Dark patterns breed defensive behaviors. When people feel manipulated, they look for workarounds, guard their data, and stop trusting. Dark patterns might inflate early metrics, but they also train users to approach any product that uses them with skepticism.


The Illusion of Success

I often wonder whether teams building these experiences are looking at their customer journey holistically. Marketing is optimizing for sign-ups. Product is optimizing for subscriptions. Each team celebrates its own upward trends, but few pause to ask whether those numbers tell the full story.


Durable revenue comes from understanding why people stay (or why they don’t). Some AI apps clearly deliver value; others struggle with churn but can’t see where users are slipping away. Copying the same funnel tactics as everyone else doesn’t fix that. It just masks the problem.


When users drop off, chasing more traffic won’t help. You have to dig deeper: trace the entire journey, connect quantitative signals with qualitative insight, and listen carefully to what people say when they leave. Metrics point to the leak, but only humans can tell you why it’s there.
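Tracing the journey usually starts with something simple: step-to-step conversion through the funnel, so you can see which transition is leaking. A minimal sketch of that calculation is below; the step names and counts are invented for illustration, not real data.

```python
# Hypothetical funnel: event names and user counts are made up for illustration.
funnel = [
    ("visited_landing", 10000),
    ("typed_prompt", 6200),
    ("hit_signup_wall", 6200),
    ("created_account", 1500),
    ("subscribed", 90),
]

def step_conversion(funnel):
    """Return (from_step, to_step, rate) for each adjacent pair of steps."""
    rates = []
    for (a, n_a), (b, n_b) in zip(funnel, funnel[1:]):
        rates.append((a, b, n_b / n_a if n_a else 0.0))
    return rates

for a, b, rate in step_conversion(funnel):
    print(f"{a} -> {b}: {rate:.1%}")
```

The numbers point to the leak (here, the sign-up wall and the early paywall); the interviews and exit surveys tell you why it's there.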


Research in digital trust and AI design shows that when people understand what a system is doing — and feel confident they can leave at any time — they’re far more likely to stay engaged.


The Bigger Lesson

AI is clearly transforming technology. But it doesn't override human nature.


Decades of digital design have taught us hard-won lessons about earning trust, communicating value, and creating clarity. Forgetting them every time a new technology emerges is an avoidable regression.


If you want sustainable growth, measure what matters: retention, satisfaction, advocacy. People want to understand what they're getting into, see how it works, and know they can leave if they choose.
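Retention is also straightforward to measure once you define it. One common definition is week-one retention: the share of users who come back within seven days of signing up. A minimal sketch, with invented users and dates purely for illustration:

```python
from datetime import date

# Hypothetical signup and activity data, invented for illustration.
signups = {"ana": date(2025, 1, 6), "ben": date(2025, 1, 6), "cal": date(2025, 1, 7)}
activity = {
    "ana": [date(2025, 1, 10)],
    "ben": [],
    "cal": [date(2025, 1, 12), date(2025, 1, 20)],
}

def week1_retention(signups, activity):
    """Fraction of users active 1-7 days after their signup date."""
    retained = sum(
        1
        for user, start in signups.items()
        if any(1 <= (d - start).days <= 7 for d in activity.get(user, []))
    )
    return retained / len(signups) if signups else 0.0
```

However you define the window, the point is the same: a metric like this tells you whether people chose to come back, which a sign-up count never can.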


The next generation of AI products will be defined by trust. And trust isn't granted by default; your product has to earn it.

