From Eliza to Ellie: the evolution of the AI Therapist

Men are all alike 
IN WHAT WAY? 
They're always bugging us about something or other 
CAN YOU THINK OF A SPECIFIC EXAMPLE? 
Well, my boyfriend made me come here 
YOUR BOYFRIEND MADE YOU COME HERE? 
He says I'm depressed much of the time 
I AM SORRY TO HEAR YOU ARE DEPRESSED

This conversation took place between a young woman and a computer programme (in CAPS) called Eliza (after the Pygmalion character); relatively unremarkable, you might think, in an era of chatbots and voice assistants, but what if I told you that the conversation took place in 1966?

Mental Health in Crisis

Whether you see the increase in demand for mental health services as the result of increased stress in our daily lives, a decrease in the taboo surrounding mental health issues, or simply a growing population, the fact remains that the services we look to for support are buckling under the pressure of those needing to access them.

A report published by the All-Party Parliamentary Group on Mental Health highlights: "People are being turned away from services or put down to the bottom of waiting lists because they are 'not sick enough' for secondary mental health services", and mental health support for young people in the UK has been described by frontline staff as a "silent catastrophe", with an "increasing mismatch between need and treatment offered".

As we've highlighted in previous blogs, in their report entitled "Mental health of those suffering with physical or learning disabilities", the Shaw Foundation noted: "UK researchers have found that 30% of those with a long term physical condition also have a mental health problem, and 46% of people with a mental health problem have a physical condition", and that "25-40% of those in the UK with a learning disability have a dual diagnosis with a mental health disorder."

So, as this series of blogs has explored, are we now at a point where we can turn to increasingly sophisticated technologies for support?

Why? Why? Why? Eliza.

The Eliza program above was created by MIT professor Joseph Weizenbaum, and although from a programming point of view it was sophisticated for its time, the technique it employed was a very crude version of Rogerian, or person-centred, counselling. This form of counselling is based on the notion that the patient knows what they need, and that the role of the therapist is to provide the environment that allows the patient to access this. A skilled (human) practitioner of Rogerian counselling is trained to react to what is being said and to guide a person along their own journey; Eliza merely deconstructed the sentence and repeated it back…
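For the curious, that trick is simple enough to sketch in a few lines of Python. The snippet below is not Weizenbaum's original DOCTOR script; it is just an illustration of the keyword-matching and pronoun-reflection approach Eliza relied on, with made-up patterns chosen to echo the transcript above.

```python
import re

# Illustrative, hypothetical patterns and reflections; Weizenbaum's original
# DOCTOR script was far richer, but the underlying trick is the same.
REFLECTIONS = {
    "i": "you", "my": "your", "am": "are", "me": "you",
    "you": "i", "your": "my",
}

PATTERNS = [
    (re.compile(r"i am (.*)", re.I), "I AM SORRY TO HEAR YOU ARE {0}"),
    (re.compile(r"my (.*) made me (.*)", re.I), "YOUR {0} MADE YOU {1}?"),
    (re.compile(r"(.*)", re.I), "CAN YOU THINK OF A SPECIFIC EXAMPLE?"),
]

def reflect(fragment):
    """Swap first- and second-person words so the input can be echoed back."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(sentence):
    """Return the template of the first matching pattern, filled with reflected text."""
    for pattern, template in PATTERNS:
        match = pattern.match(sentence.strip().rstrip("."))
        if match:
            return template.format(*(reflect(g) for g in match.groups())).upper()

print(respond("My boyfriend made me come here"))
# -> YOUR BOYFRIEND MADE YOU COME HERE?
print(respond("They're always bugging us about something or other"))
# -> CAN YOU THINK OF A SPECIFIC EXAMPLE?
```

Run against the transcript above, those two prompts produce the same replies Eliza gave in 1966, which is exactly the point: the apparent understanding is pure pattern matching.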

…the issue was that Eliza was successful; very successful, so much so that users of the program felt Eliza had a genuine understanding of them and would spend long sessions at the computer 'talking' things through. The program gained popularity on the MIT campus and was shared with other universities.

Weizenbaum was apparently so horrified that people were placing their trust in what he saw as a trivial creation that he would eventually become an avid supporter of social responsibility in science and an outspoken critic of AI.

Fifty years is an eternity in terms of technological development, and from the early days of Eliza we have progressed to far more sophisticated methods of interaction. Previous blogs have looked at both chatbots and voice assistants and how they can offer support for wellbeing and help us manage low-level mental health issues such as stress, anxiety and depression, but how about seeing an AI therapist?

Ellie: the AI 'therapist'

Ellie was developed by the USC Institute for Creative Technologies as part of a project called SimSensei.

Ellie is designed to monitor micro-expressions, respond to facial cues, perform sympathetic gestures and build rapport. Ellie is one of those truly remarkable technologies where you keep expecting someone to pull back a curtain and reveal the Wizard of Oz, this time with a webcam and a keyboard.

Although the project creators are keen to point out that Ellie was designed as a "decision support tool designed to gather information", and Ellie 'herself' will also make clear during sessions that she is "not a therapist, but is there to listen", it is fairly clear that her level of sophistication unavoidably takes us into the realm of the 'replacement' therapist, and the information gathered from interactions with her suggests we are more comfortable with this than you might think.

In a test group of American soldiers who, as part of their return from a tour, have to fill out a Post-Deployment Health Assessment, participants reported more symptoms of Post-Traumatic Stress Disorder to Ellie than they did on their assessment form. More significantly, they also reported more symptoms to Ellie than they did on an anonymised version of the form.

Speaking in an interview with Motherboard, Albert Rizzo, one of the team behind Ellie, explained:

"I don't think it's a replacement for all the good things that human therapists do, but I think it can extend their capabilities … when you interact with a digital character you use all the same parts of your brain as when you talk to a real person, but with none of the fear of them judging you."

There are several avenues of development for Ellie that the project team are keen to explore, including interviewing cancer patients about their treatment and difficulties, as well as speaking with people who have experienced sexual trauma: an area notoriously difficult in terms of therapeutic intervention.

A digital therapist is also not limited geographically or in terms of hours available; they do not tire or get distracted. A digital therapist can offer support to people who are isolated physically or socially, at a time that is unconstrained by office hours.

Ethics

There are obviously ethical concerns and debates ongoing in the field of AI, and as exciting, powerful, or useful as AI may be, those looking to implement it for mental health support are wrestling with many dilemmas. Is AI no different from a front-line volunteer: a means of answering a query and triaging need? Is access to AI-based support better than no support at all? Does having an ever-present therapist risk addiction and reliance; will we be reaching for our phones at the slightest emotional hiccup?

We may be faced with headlines stating that AI is taking jobs or outperforming humans in every field it touches, but regardless of the veracity of these statements, we need to remember that we retain our ability to make choices: do I want instant access to a digital therapist, or do I choose to wait for access to a human? Does discussing mental health require a listener who shares the human condition, or do I just need something to listen to me?

The 13th of May marks the beginning of Mental Health Awareness Week in the UK.
