The UX behind screen time guilt

How digital wellness tools offload responsibility on users instead of fixing the core problems

Illustration of a person holding a phone and looking stressed as speech bubbles with negative symbols appear around them (illustration by author)

Digital wellness is a strange concept when you think about it. Our phones track how much we use them, present this data as evidence of failure, and then leave us alone with our guilt, never really addressing why we reached for them in the first place or offering constructive suggestions for improvement.

A brief history of digital wellness tools

Digital wellness tools emerged around 2018 in response to concerns about device addiction. With Apple’s Screen Time and Google’s Digital Wellbeing, suddenly our phones came equipped with features to monitor and limit their own use through dashboards and constraints.

The digital wellness market is now worth almost $1 billion — which is not an insignificant number for an industry built on helping us use our phones less yet entirely dependent on our phones to exist.

Apple’s Screen Time app (collage by author, image source)

These products’ framing deliberately borrows from self-care vocabulary (using words like “wellbeing” and “focus”; some apps even “diagnose” your focus levels), positioning themselves as caring for our health. But under the surface, these tools share a common mechanism: they make us feel guilty about our behaviour instead of questioning the patterns that influence it.

There are many digital wellness apps out there, and most follow similar patterns. A couple of the most popular ones are Forest and Opal.

Forest is a focus-timer app that uses simple gamification: you set a timed session (like a Pomodoro timer) and a virtual tree begins to grow. If you leave the app to check notifications or anything else, the tree dies. Completing sessions adds trees to your collection, so you can gradually build a virtual forest.

Instead of growing a tree, Opal works by blocking selected apps and notifications for a set period of time, creating intentional friction between you and your distractions. When a session is active, attempting to open a blocked app triggers a screen reminding you of your focus goal, making the act of “breaking” your session a more conscious decision. Completing sessions adds gems to your collection and ranks you on a leaderboard.

Examples of how guilt is used by Forest app to nudge users (image by author)

Guilt as the special sauce

These are great examples of behavioural design at work, and even better examples of how digital wellness tools have convinced us that excessive phone use is a personal failure requiring personal intervention.

The problem is that this generates a lot of self-directed guilt that doesn’t just arise naturally from our phone habits; it’s also baked into these tools. From a psychological perspective, guilt emerges when we believe we’ve violated a personal standard. Digital wellness features tap into this directly: they highlight “failures” (e.g. your tree died because you made a phone call), amplify them with comparisons (e.g. your friend is in the top 1% on the focus leaderboard and you are not), and present usage as deviations from a norm.

Every dead tree or broken streak is a feedback mechanism the apps use to prove their worth, but it’s also a guilt trigger aimed directly at users.

Why data alone is not enough

I don’t think we should have to pay our way out of the attention economy, so as an iPhone user I rely on Screen Time to monitor my habits. It’s free, built in and, at least at surface level, useful. In my opinion, however, it has one major flaw: it tries to teach users how to build better habits while making no changes whatsoever to the core product mechanics. In other words, it’s a diagnostic tool, and as such its benefits are limited to making the user aware of a problem without necessarily helping find a solution to it.

This is a common problem among digital wellness tools and it’s become one of the most widely accepted examples of how technology frames our behaviour as a personal problem to fix, and therefore a source of guilt.

Apple’s Screen Time is a great example of a data-rich but insight-poor tool. It shows you the numbers but doesn’t help you interpret or act on them. In UX terms, it lacks actionable insights. Data without support often turns into guilt, not change (image by author)

At the same time, the metrics presented aren’t even that accurate. Screen Time can’t distinguish between types of use, and it doesn’t let you exclude certain apps from the totals. So users are confronted with numbers that tell them almost nothing useful about their actual behaviour.

I will use my own data as an example. Screen Time told me last Monday I’d spent 5 hours on my phone. If that were 5 hours of scrolling TikTok, fair enough — that could be concerning. But when I dug into the breakdown, I found an hour of Google Maps because I was driving and another 30 minutes of YouTube because my son wanted a dance party after school and we had music playing in the background.

So my “real” screen time was closer to 3.5 hours. Still a lot, probably, but notably less alarming than the initial number. The app had done its job by presenting decontextualised numbers and letting my brain fill in the narrative of personal failure, so before I even looked into what the total meant, I felt that pang of shame.
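The adjustment I made by hand is simple enough to express as a filter over a usage log. A minimal sketch, assuming a hypothetical log where each session is tagged as active or passive (Screen Time exposes nothing like this; the apps and durations mirror my example):

```python
# Hypothetical usage log: (app, minutes, context) tuples.
# "passive" marks sessions where the screen was on but I wasn't really using it.
usage = [
    ("TikTok", 90, "active"),
    ("Google Maps", 60, "passive"),   # navigation while driving
    ("YouTube", 30, "passive"),       # background music at the dance party
    ("Safari", 120, "active"),
]

total = sum(minutes for _, minutes, _ in usage)
active = sum(minutes for _, minutes, ctx in usage if ctx == "active")

print(f"Reported screen time: {total / 60:.1f} h")   # 5.0 h
print(f"Active screen time:   {active / 60:.1f} h")  # 3.5 h
```

A distinction this small is all it would take to turn an alarming headline number into something closer to the truth.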

There is a clear discrepancy between this app’s intended purpose and what it actually delivers: an incomplete experience that offers no guidance or suggestions on how to take more control over our time.

Quick exploration of what it could look like if Screen Time leveraged data to prompt users to take action by providing guidance through insights (image by author)

What guilt-free design could look like

Screen Time presents usage as a static result rather than a dynamic behaviour, showing numbers but offering no interpretation.

An alternative to guilt-based design wouldn’t be to remove all this feedback but rather empower users to make change by contextualising the information and offering proactive suggestions.

My phone knows I’m at work — I have Work Focus on — so why can’t it contextualise my usage? Why not tell me I spent 30 minutes on Instagram during work hours specifically, letting me take accountability where it matters? Or if my Saturday usage suddenly spikes compared to my usual pattern, why not nudge me with a reflection point by letting me know that this is unusual for me at the weekend?

All these small interventions could help us understand our behaviour instead of just measuring it.
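The nudges described above are essentially simple rules evaluated over usage data plus context. A rough sketch of what such rules could look like (the thresholds, focus-mode labels, and data shapes are all my assumptions, not anything Screen Time actually exposes):

```python
from statistics import mean

def contextual_insights(usage_by_app, focus_mode, weekday, history):
    """Turn raw usage numbers into human-readable insights.

    usage_by_app: {app name: minutes used today}
    focus_mode:   active focus mode, e.g. "Work" (assumption)
    history:      total daily minutes for previous same weekdays
    """
    insights = []

    # Rule 1: social media use while a Work focus is on.
    if focus_mode == "Work":
        social = usage_by_app.get("Instagram", 0)
        if social >= 30:
            insights.append(
                f"You spent {social} min on Instagram during work hours."
            )

    # Rule 2: today's total is well above your usual for this weekday.
    total = sum(usage_by_app.values())
    if history and total > 1.5 * mean(history):
        insights.append(
            f"{total} min is unusual for you on a {weekday}. "
            "Want to set a wind-down reminder?"
        )
    return insights

print(contextual_insights(
    {"Instagram": 45, "Mail": 60},
    focus_mode="Work",
    weekday="Saturday",
    history=[60, 70, 65],
))
```

The point isn’t the rules themselves but the shape of the output: a suggestion tied to context, rather than a bare total left for the user to interpret.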

Examples of how Slack proactively reduces noise and distraction by decluttering the interface. It doesn’t inform the user about the number of inactive or muted channels, but rather offers proactive suggestions on what to do with them, guiding users to a solution (screenshot by author)

Digital wellness, to truly live up to its name, should feel more like an environment that supports the user, rather than just a dashboard that informs.

That could mean more supportive defaults, interfaces and prompts that are able to adapt to context, so instead of leaving users alone with the numbers, design can create conditions where healthy behaviour can develop more naturally.

I saw a comment on a LinkedIn post about the launch of yet another digital wellness app that I wish I had screenshotted, but it was along the lines of: “I don’t need another app to manage how I use my phone. If I want to stay away from it, the best way to do it is simply to leave it in another room.”

Apps like Opal still rely on frequent motivational notifications to encourage “healthy” phone habits, but this means users have to pick up their phone in order to be reminded to put it down. These interruptions, however, generate additional screen interactions that end up plugging users right back in (image by author)

Using our phones to manage our phone usage will always have its obvious limitations. Digital wellness apps are still apps — they need our attention to teach us about attention and require engagement to promote disengagement, and it feels like a lot of the current solutions are just slapping a timer on a problem that’s way more complex.

They can track, nudge, or create a bit of friction, but they can’t fundamentally change the habits or patterns that the system itself creates. However, acknowledging those limits makes the whole experience less about judgment and more about awareness.

As users, what we can do with that awareness is to make small, realistic adjustments, and as designers what we can do in turn, is to build interfaces that make those adjustments feel a little easier instead of adding more cognitive load, working towards making the relationship between us and our devices feel a bit more intentional.


The UX behind screen time guilt was originally published in UX Collective on Medium, where people are continuing the conversation by highlighting and responding to this story.
