The Woebot Will See You Now

Could chatbots be the answer to a lack of mental health provision?

It’s no secret that mental health services are in crisis. A lack of funding and huge levels of demand have pushed the system to the brink – and it’s patients who suffer. Waiting lists for talking therapies routinely exceed several months, leaving patients unsupported during times of need. As ever, new technology could help to provide the solution, and that’s exactly what the creators of therapy chatbot Woebot hope to achieve.

Silicon Valley-based Woebot Labs have developed an AI-powered chatbot to help people suffering from symptoms of depression and anxiety. Aware that many are put off from seeking help for their mental health because of the social stigma attached to sufferers, the team’s aim is to enable people to open up without any fear of judgement. Woebot, a chatbot hosted on the Facebook Messenger platform, is always ready and waiting to listen. Programmed to mimic natural human conversation, it checks in with users at random points throughout the day. Drawing on clinical practices such as Cognitive Behavioural Therapy, it is on hand to offer expert advice or simply to provide a sounding board for its users’ thoughts.

Can robots handle our emotions?

Woebot joins a list of previous chatbots expressly developed to get people to talk. In 2016, tech startup X2AI created Karim, a psychotherapy chatbot designed to help Syrian refugees. Where other sources of therapy aren’t easily available, the benefits of talking to a chatbot are clear. Opening up about your emotions in any way has got to be better than keeping everything bottled up inside. What’s more, a clinical trial recently conducted by Woebot Labs did indeed show that talking to Woebot reduced depression in college students. The group of students who regularly talked to Woebot reported significantly lower levels of depression and anxiety than the group that merely had access to a mental health e-book. The fact that Woebot checked in with its users regularly seemed to be particularly effective, with students saying they felt supported and cared about. Since a stated aim of Woebot’s creators is for their bot to make an emotional connection, they seem to have resoundingly achieved their goal.

A sticking plaster, not a solution

One of the major problems with bots such as Woebot is that they operate in a grey area between medical therapy and the simple provision of information. For good reason, Woebot Labs cannot market their chatbot as a source of medical treatment, or imply in any way that its services constitute medical practice. It should therefore be seen as a supplement to – not a replacement of – traditional therapies. Whilst Woebot is cheap and more easily available than traditional treatments, the conditions it is suitable for are also far more limited. Chatbots are best at helping people in the moment, when they are really distressed and would benefit from talking things through to calm them down. They are not suited to dealing with complex needs, where people have a lot of different issues that they wish to talk about.

The way that Woebot delivers its services could also be a sticking point for many people. The bot is currently only available via Facebook Messenger – a fact which will put off many users who do not wish to divulge their sensitive personal information to the technology behemoth. In fact, wider privacy concerns could prove to be a real issue with therapy bots as a whole. In the US, they are not covered by the Health Insurance Portability and Accountability Act, which prevents healthcare providers from sharing information about their patients. Personal data collected by chatbots is therefore held at the discretion of individual companies.

Underpinning all of these issues is the central fact that users of these chatbots need to be comfortable with talking to a robot. Woebot Labs’ clinical trial may have shown the bot to be effective, but it should be noted that it was only tested on students aged 18-28. Young people might be ready to accept technology into their lives, but those outside of this demographic might be a little more reluctant to do so. It all comes down to the fact that for a bot therapist to be effective, its user has to believe that it truly cares. When Woebot tells you it hopes you feel better soon, for many people it may be impossible to escape the knowledge that it was merely programmed to do so.

Can chatbots help to reduce the stigma around mental health? Would you feel comfortable opening up to Woebot? What advantages do chatbot therapists have over their human equivalent? Comment below with your thoughts.