How to choose a trustworthy mental health app in a market bloated with options



I like to start my day with a bowl of cereal. It’s quick, light and, if I make the right choice, good for me. I also have the freedom to make a less healthy choice – but when I go to the supermarket and look over the array of options in front of me, I need only look at the back of the box to understand what I’ll be eating and, most importantly, whether the ingredients are nutritious.

Today’s vast market in mental health apps feels similar to choosing cereal. With waiting lists for NHS mental health services stretching to months, apps appear to have the potential to ease pressure, and they are relatively cheap. Their accessibility and light-touch approach suggest that people in need of support will be able to manage their own mental health.

But increasingly, we’re seeing reports that, for all their promise, mental health apps might not be all they’re made out to be. With questions being asked about the need for regulation, is it possible that these apps are doing more harm than good?

The frustrating answer is: we don’t yet have enough information to say one way or another.

Unlikely to cause damage, but may not support wellbeing

It’s unlikely that most apps are actively damaging people’s mental health, although some encourage behaviour that is unlikely to support wellbeing.

To get around the problem of responsibility, many apps categorise themselves as wellness rather than therapy. They cannot offer advice that needs to be regulated, but they can point to services that might offer more help. This approach also reduces their responsibility for monitoring problems such as someone reporting they are going to self-harm.

Apps are also different from face-to-face therapy as they are generally designed to be used in short, ten-minute bursts that are accessible as and when they’re needed.

There’s certainly a lot of choice. App stores are bloated with options offering different levels and types of support. Unfortunately, hardly any offer extensive evidence of their effectiveness – in terms of controlled trials and in-depth analysis rather than user reviews – and even if they did, the app store would not tell you so before you downloaded them.

This leaves potential users in a situation where they don’t know what they’re getting, and it could be stopping them from accessing actual evidence-based care.

Effective interventions should always be based on evidence, but they also require the user to engage with them over a period of time. While they are easy to download, apps are also easy to ignore. There are numerous examples of trials using app-based interventions in which participants download but never actually open the app, or which have a very steep drop-off in engagement after a few sessions.

It’s clear to me that these apps are more than a passing fad. When the National Institute for Health and Care Excellence (NICE) made the decision to approve eight online interventions in March 2023, I (cautiously) welcomed the idea. Digital therapies have the potential to offer additional support for people in need and provide a welcome bridge between sessions of therapy.

Importantly, these eight apps will be scientifically assessed for how and where apps can be effective in the real world, laying the foundations for the rest of the market to follow.

Four principles for mental health apps

More than anything, people need to know what they are getting, and we need to see greater transparency from providers. This is likely to increase their market share, so it really is in their interests.

In 2019, Stephen Schueller and I set out four principles that all mental health apps should work towards, in order to provide the most transparent possible offering to their users. What personal information is collected, and how is it used? Were target users involved in the design of the app? How much should you use it, and is it safe? Are there measurable benefits to using the app?

My view in 2019 was that formal regulation wasn’t necessarily needed. Anyone making false claims on an app can be reported to advertising standards authorities, and anyone providing actual therapy already operates within a regulated market – although it’s always worth reminding those seeking therapy to do their due diligence and find the therapist that is right for them.

My view about regulation hasn’t changed, but this isn’t to say that no changes are needed. Although app stores such as Google Play and Apple’s App Store were set up as libertarian havens, they need to adopt some rules for the way that health apps are marketed.

By tightening up what’s considered a “health” app and setting out clear rules like those I’ve listed, mental health app consumers should have greater confidence that the app they’re downloading has the power to help them. Essentially, consumers need to be able to choose their mental health apps as easily as I can make a choice about my cereal.


Til Wykes has received government research grants to investigate the benefit of digital therapies.