Dysfunctional systems

Addictive mental models in product design create patterns of thought that become harmful patterns in UI that become harmful patterns of behavior

A neon sign of an empty like notification

Photo by Prateek Katyal from Pexels

My path to product design and design systems wasn’t linear. I’ve bounced around different design disciplines and working environments, gravitating towards places with loose definitions of what a designer’s role is.

I like too much of the process — from the strategy, to the implementation, to the way teams get work done together.

That’s what I love about design systems. You get to touch all these different parts of the business, pull it all together into principles, processes, and designs that become a real experience for other people.

It can be hard to draw a line where my job ends and someone else’s begins. Because every time I find something that makes me say, “that’s not my job,” a little voice in the back of my head says, “Well, maybe it could be.”

I have to tell you, the conversation I want to have isn’t easy. There are parts that aren’t easy for me to share and there are parts that may not be easy to read.

It’s a conversation about addiction. How addiction shapes the work we do. How the work we do impacts the human beings who use our products. And what a healthier future for our field might look like.

Some of this might not feel like it should be your job. But I hope that when you hear that voice in the back of your head suggesting that maybe it could be, you hear it out with an open mind.

But first, a story about my dog.

To understand why this matters so much to me, I need to tell you about my dog. His name is Guff. Guff is a rescue—half bulldog, half pug, and 100% attitude. (His folds give him enough surface area for that math to check out, ok?)

Everyone loves Guff, and Guff mostly loves everyone. Except for birds. Guff hates birds. Hates. Them. Especially pigeons, who are always beating him to the best garbage.

Me, Guff, and my husband—all wearing suits.

Guff as the Goodest Man at my wedding

The happiest I’ve ever seen him is charging into a flock of pigeons in a park. He’d run at them — in as close to a run as he ever gets — and they’d scatter and settle. He’d charge right back in. They’d scatter and settle again. The sheer joy was like watching a child jump into a ball pit.

My husband, Guff, and I recently bought a house on the Sonoma coast, and there aren’t any pigeons there. But there are wild turkeys.

So one day, Guff’s wandering in our backyard, and he sees a huge flock of turkeys. He charges at them as fast as he can until he’s about 6 feet away and they’re not scattering. So he skids to a stop, realizing in that moment how different his odds look, now that he’s this close.

The lesson in that, the reason I’m talking to you today, is that things often look different when you get close to them.

Eight years ago, my brother Jesse died of a heroin overdose.

Jesse and me at one of our last Christmases together.

Jesse struggled with what my family believes was undiagnosed bipolar disorder. He was depressed and anxious about his future. He wrote about being aimless falling leaves. He was an artist, and much better at words than me.

He was warm and funny. He once threatened to poison me so he’d never have to hear another teacher ask if he was John’s brother. It’s still one of the nicest things anyone’s ever said to me.

He was the type of person who would direct all the love and kindness he couldn’t afford himself to you.

His loss was sudden, and sharp. It was the kind of loss that completely severs the thread between your life before it and your life after it.

A typewritten poem on a brown paper bag

”I am aimless falling leaves, waiting to be caught up in your breeze.” One of the many scraps of poetry that filled Jesse’s bedroom.

My brother was not the first young person to lose his life to heroin in my hometown. The suburbs are not the first community to be devastated by a drug crisis. But this was the first time in my life I was so close to this kind of loss, and things started looking different.

Things started sounding different after losing Jesse, too. Any talk of substance use or addiction sounded harsh. It stung like paper cuts, reminding me of that larger, sharp loss.

I hadn’t realized before then how often and how casually we talk about addiction, from TV plots to our daily conversations. We joke about the things we love being our “crack” or “heroin” and describe ourselves as “reality TV junkies.” And as I moved into the world of product design, I heard it more and more.

I was struck by how thoughtlessly we use the language of addiction in our work. We talk about building “addictive” features, making bingeable content, and chasing likes for the dopamine hit.

One of the most popular models of engagement is Nir Eyal’s Hook Model, which is usually presented as either an infinity symbol or a downward spiral.

Nir Eyal’s Hook Model envisions engagement as an endless cycle of Trigger, Action, Reward, and Investment.

Depictions of Nir Eyal’s Hook Model suggest a vision of engagement in which a user never escapes our product.
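
Written out as a structure instead of a picture, the model’s defining property is hard to miss: every phase leads to the next, and the last leads back to the first. (A minimal sketch in TypeScript; the phase names are Eyal’s, the code is mine.)

```ts
// The Hook Model's four phases, sketched as a cycle.
// Note what's missing: there is no exit state.
type HookPhase = "trigger" | "action" | "variableReward" | "investment";

const nextPhase: Record<HookPhase, HookPhase> = {
  trigger: "action",            // a cue, internal or external, prompts use
  action: "variableReward",     // the simplest behavior done in anticipation of reward
  variableReward: "investment", // an unpredictable payoff keeps anticipation high
  investment: "trigger",        // the user's own investment loads the next trigger
};
```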

By adopting the language of addiction in how we think and talk about our work…by shifting our thinking to behaviors like daily active use, time in app, and binging, instead of the people engaging in those behaviors…we make addiction not just a risk of our work, but the goal.

And we’ve been successful in building addictive products.

People spend uncontrollably on freemium games, chasing incentives we’ve designed to be unpredictable using variable reward schedules.
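
“Variable reward schedule” sounds clinical, but in code it amounts to almost nothing. Here’s a hedged sketch (the names and numbers are hypothetical, not any real game’s) of the same reinforcement schedule a slot machine uses:

```ts
// Each open is an independent draw against a fixed, hidden probability,
// so a win never feels ruled out: the next try always might be the one.
const RARE_DROP_RATE = 0.05; // tuned by the business, invisible to the player

function openLootBox(): "rare" | "common" {
  return Math.random() < RARE_DROP_RATE ? "rare" : "common";
}
```

The unpredictability is the point. A fixed schedule (“every twentieth box is rare”) would be cheaper to build and far easier to walk away from.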

Young people fall into depressions over other people’s meticulously curated online lives.

Because in our hyperconnected world, their own social networks are turned against them, fostering a fear of missing out or falling behind at the very time those relationships are most important to their development.

I think we’ve all experienced the adverse effects of scrolling instead of sleeping because someone else has removed both the friction from our consumption and our ability to track it.

Texting while driving is six times more likely to cause an accident than driving under the influence of alcohol, because in a world of interruptions that prioritize driving metrics over driving safely, a driver’s attention is not their own.

By adopting the language of addiction in how we think and talk about our work…we make addiction not just a risk of our work, but the goal.

This is not to conflate the use of technology and the use of substances — or to stoke a moral panic about either.

I’m not someone who thinks tech is inherently harmful because the people who use it look weird to the people who don’t. And I’m not talking about kids these days being on their phone too much.

We live in a digitally connected world that’s made amazing things possible. I met my husband online.

Digital products allow people from marginalized groups who might otherwise feel completely alone in their day-to-day lives to find each other and build communities. When your support network exists solely online, how much time in an app is too much?

Even researchers aren’t totally aligned on digital addiction: whether it’s a distinct disorder or an expression of other mental health issues.

That doesn’t mean the harm isn’t real.

Tech has changed the world, and when you change the world, it can take decades before you can even begin to understand how you’ve changed it.

It’s 1964 for our industry.

It wasn’t until 1964 that the Surgeon General published a report linking smoking with heart disease and lung cancer. It took thirty years after that for the FDA to officially recognize nicotine as a drug that produces dependency.

Of course, tobacco companies knew that cigarettes cause cancer by the late 1950s.

A vintage Camel ad featuring a physician urging people to smoke a “fresh cigarette”

Over decades, the complexity of the digital products we build has grown so much that we’ve created digital products to help us manage our digital products. And in some ways, we’re just starting to appreciate the complexity of the systems we build.

We can’t wait for our work to show up in a diagnostic manual before we consider the impact it has on the people we create it for.

Because while most of us may not be doctors or scientists, we get up every day and make decisions meant to shape someone else’s behavior.

Decisions like:

  • Whether we’ll implement pagination, a “load more” button, or infinite scroll.

  • Will that video in someone’s feed autoplay? And will another one start right after?

  • Just how many anxiety-producing red badges will we show the people using our apps, and how much agency do we give them in whether or not they see them? (One shape of that choice is sketched after this list.)

  • What do we give prospects a taste of during their free trial so they’ll let it auto-renew?

  • How much do we play into people’s fear of missing out when they try to stop using our product? And can they take their data with them when they leave?
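
None of these is a hard engineering problem. To make the badge question concrete, here’s a hypothetical sketch (the names are mine, not any real API). Respecting someone’s preference costs exactly one conditional:

```ts
interface NotificationPrefs {
  showBadgeCounts: boolean; // does the person get a say at all?
}

// Returns the badge text to render, or null to render no badge.
function badgeLabel(unreadCount: number, prefs: NotificationPrefs): string | null {
  if (!prefs.showBadgeCounts || unreadCount === 0) return null;
  return unreadCount > 99 ? "99+" : String(unreadCount);
}
```

Whether that preference exists at all, and what it defaults to, is a values decision wearing a product decision’s clothes.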

Unlike other professions that deal in human behavior, we have no agreed-upon scope of practice. Our work touches every facet of life. We build products people depend on for their medication management, fitness, community connections, and mental health.

Tech has changed the world, and when you change the world, it can take decades before you can even begin to understand how you’ve changed it.

When we frame our work in terms of addiction — of hooking people, of creating habit-forming experiences they can’t get enough of — it’s easy to let harmful patterns of thought become harmful patterns in our UI.

Those can become harmful patterns of behavior in the people who use our products.

We don’t set out to cause harm, but in accepting the wrong framing of the problems to be solved, we become inured to the problems we create.

You get what you measure.

We track whether we’re driving use, but not whether we’re driving value.

Because it’s easier to measure whether somebody has seen something or clicked on something than it is to measure whether that click or impression was beneficial to them, or how that benefit compares to what they could have been doing instead of engaging with our product.
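
The asymmetry shows up as soon as you try to write it down. In this sketch, track and askUser are hypothetical stand-ins for an analytics SDK and an in-app survey prompt:

```ts
// Hypothetical stand-ins; any real analytics SDK looks broadly similar.
declare function track(event: string, props: Record<string, unknown>): void;
declare function askUser(question: string): Promise<"yes" | "no">;

// Measuring use: one fire-and-forget line per click or impression.
function measureUse(videoId: string, watchMs: number): void {
  track("video_autoplayed", { videoId, watchMs });
}

// Measuring value: there is no one-liner. The closest proxy means
// interrupting the person and trusting their self-report.
async function measureValue(videoId: string): Promise<void> {
  const feltWellSpent = await askUser("Was that time well spent?");
  track("self_reported_value", { videoId, feltWellSpent });
}
```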

What we use to measure engagement is not the full measure of our impact on people’s lives.

That’s why we see active Twitter users calling it a hell site they can’t escape. And YouTube viewers logging on for video game hacks and leaving radicalized. And people on Facebook choosing misinformation over family, friends, and their own health.

For all the good these products offer, they’ve framed success in such a way that fixing these problems would mean failure in the eyes of market analysts.

What we use to measure engagement is not the full measure of our impact on people’s lives.

In that way, we also fall victim to the addictive mental models we’ve adopted.

We chase less and less sustainable metrics because we’ve built our work around a principle that can never be satisfied. Addiction only ever wants more.

It wants bigger user bases, more daily active users. And it makes it easy to lose sight of what’s in the best interest of those people.

Because getting new people to sign up for your product is hard. But getting that person who’s already using it a lot to look at it a couple more times each day is easy.

And that person who’s already using your app a lot might not think anything of those couple extra visits. They may even enjoy them.

Research, and its interpretation, is not neutral.

That’s the nature of dependency. When we find something that gives us the sweet, sweet dopamine that’s so desperately missing from this, the year 2021, we want more.

So we may run usability studies in which participants tell us they prefer endless scroll to having to tap to load more. Or that disruptive notifications aid discoverability.

The tobacco industry ran studies that showed smokers preferred the cool taste of menthols.

We hide complexity and ignore responsibility.

We can’t expect every person who uses our products to have perfect knowledge of their potential downsides.

This is not about absolving people of personal responsibility for their use of technology. Anyone who’s known someone doing the hard work of recovery knows people want to be healthy — when the systems they live in support it and they can access the resources to achieve it.

Addiction is a result of complex individual factors, yes, but also environmental, societal, and systemic influences.

Right now, though, while we may work hard to reduce the complexity of our products’ UIs, we put all the complexity of our products’ effects onto the individual.

And we’ve seen the devastating effects of expecting people to understand every aspect of the world we live in as well as the experts.

Just look where we’ve ended up, requiring everyone to play their own immunologist, pharmacist, and media analyst.

We chase less and less sustainable metrics because we’ve built our work around a principle that can never be satisfied. Addiction only ever wants more.

We can’t just look at KPIs and assume our systems are working as intended.

To understand how our decisions are affecting the people who use our products, we need to talk to them. We need to understand them on a human level. We need to value that qualitative input as much as the quantitative.

What are A/B tests if not studies being run on human participants?

You may remember that in 2012, data scientists at Facebook manipulated what people saw in their news feeds to gauge the impact on their moods.

If that experiment had been planned by an academic institution, no institutional review board would have approved it. Maybe user research needs external review boards to ensure its safety.

We need to start thinking about what “informed consent” looks like in the studies we run. How do we help the people who use our products understand the mechanics of them in a way that gives them agency?

Is there a future in which the legally mandated cookie settings we see today live alongside settings for which in-app experiments you participate in?
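
As a thought experiment, the plumbing for that future isn’t the hard part. Here’s a speculative sketch (none of this is an existing API, and the consent categories are invented for illustration):

```ts
interface ExperimentConsent {
  uxTests: boolean;         // e.g., copy, layout, and navigation studies
  persuasionTests: boolean; // e.g., emotion, urgency, and pricing studies
}

interface Experiment {
  id: string;
  kind: keyof ExperimentConsent;
}

// People who haven't opted in simply get the default experience.
function assignVariant(
  userId: string,
  exp: Experiment,
  consent: ExperimentConsent
): "control" | "treatment" {
  if (!consent[exp.kind]) return "control";
  // Deterministic bucketing so each person sees one consistent variant.
  let hash = 0;
  for (const ch of userId + exp.id) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return hash % 2 === 0 ? "control" : "treatment";
}
```

The hard parts are the taxonomy (what counts as a persuasion test?) and the defaults, not the code.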

Of course, a solution that doesn’t add to the sticky-bannerfication of the web would be nice…

If that idea makes you uncomfortable, sit with why. If you’re testing how the way you describe your product affects understanding, or how imagery affects conversion, why not be transparent about that?

The people who use our products want us to do what we can to create the best experiences possible. When I go on an e-commerce site, I want to understand the product, and usually I want to buy something.

But if you’re a social media giant studying how manipulating emotions increases content sharing, or a ride share app testing just how much you can charge during peak hours, I understand why that transparency would give you pause.

We’ve been moving fast and breaking people for so long, maybe a pause is just what we need.

Chinese regulators are drafting legislation that would make the basic principles behind algorithmic recommendations transparent to—and manageable by—consumers. The rules would also include an explicit ban on encouraging excessive consumption.

Regulation has its own complexities and dangers that are very real. Of course, those dangers are often faced not by those in tech but by the most marginalized communities using our products.

At the same time, we expect bartenders to know when someone has had too much, and we hold them responsible if they don’t cut them off. Why not expect the same of an algorithm?

What are A/B tests if not studies being run on human participants?

Mark Zuckerberg, Jeff Bezos, and the like have all but dared the world to legislate them, knowing that whatever risks legislation carries for their users and the web as a whole, it won’t impact them personally any more than a parking fine.

Meanwhile, those of us creating digital products will be held responsible, too — certainly morally, and very likely legally.

It’s inevitable that we’ll see more laws protecting technology consumers from irresponsible handling of data, untested algorithms, and manipulative design patterns.

We shouldn’t wait for the lawsuits to start rolling in before we start creating more beneficial, sustainable, and ethical products.

Where do we go from here?

As the people building these experiences, we have a unique opportunity to start creating healthier products today.

We can continue to examine how the mental models we share influence the experiences we put out into the world. We can challenge each other to think critically about what we mean when we say we want to build something addictive and promote a more intentional dialogue around our work.

We can make the right decisions easier to implement. If the fact that a solution is more ethical or accessible doesn’t persuade a business partner, maybe the fact that it’s ready to go will.

We can make empathy, and kindness, and responsibility to one another foundational principles of our work. And evaluate every new decision based on those values instead of the harmful mental models we’ve allowed to shape our industry.

These are small steps, but every journey to recovery begins one small step at a time. Systems built over decades don’t change overnight—and they don’t change themselves.

Before this is a technology problem, this is a people problem. Solving it requires centering the people who are impacted and committing together to being the people who will do something about it (and recommitting every day).

Addiction is a system, and it’s been designed. If we can be more thoughtful, more human in our design approach, we can break the cycle that keeps us always striving for more instead of better.

We have to. It’s our job.

This post is adapted from my Clarity 2021 talk. Check it out, and please get in touch if you have other speaking opportunities! It was also published to UX Collective on Medium, if you’d like to save or clap for it.

John Voss

John Voss is a design systems designer with a generalist background and a specific vision for the design field: one in which designers think about their impact beyond the screen.
