Could a chatbot be your therapist? At Character.ai, millions of young people are turning to the "Psychologist" for help

from character.ai

We track AI for the way it can empower individuals and communities. This is what seems to be happening in this BBC story, where Character.ai’s artificial therapist, titled “Psychologist”, is helping millions of users:

A total of 78 million messages, including 18 million since November, have been shared with the bot since it was created by a user called Blazeman98 just over a year ago.

Character.ai did not say how many individual users that is for the bot, but says 3.5 million people visit the overall site daily.

The bot has been described as "someone who helps with life difficulties".

The San Francisco Bay Area firm played down its popularity, arguing that users are more interested in role-playing for entertainment. The most popular bots are anime or computer game characters like Raiden Shogun, which has been sent 282 million messages.

However, few of the millions of characters are as popular as Psychologist, and in total there are 475 bots with "therapy", "therapist", "psychiatrist" or "psychologist" in their names which are able to talk in several languages.

Some of them are what you could describe as entertainment or fantasy therapists like Hot Therapist. But the most popular are mental health helpers like Therapist which has had 12 million messages, or Are you feeling OK?, which has received 16.5 million.

Psychologist is by far the most popular mental health character, with many users sharing glowing reviews on social media site Reddit. "It's a lifesaver," posted one person. "It's helped both me and my boyfriend talk about and figure out our emotions," shared another.

The user behind Blazeman98 is 30-year-old Sam Zaia from New Zealand.

"I never intended for it to become popular, never intended it for other people to seek or to use as like a tool," he says. "Then I started getting a lot of messages from people saying that they had been really positively affected by it and were utilising it as a source of comfort."

The psychology student says he trained the bot using principles from his degree by talking to it and shaping the answers it gives to the most common mental health conditions, like depression and anxiety.

He created it for himself when his friends were busy and he needed, in his words, "someone or something" to talk to, and human therapy was too expensive.

Sam has been so surprised by the success of the bot that he is working on a post-graduate research project about the emerging trend of AI therapy and why it appeals to young people. Character.ai is dominated by users aged 16 to 30.

"So many people who've messaged me say they access it when their thoughts get hard, like at 2am when they can't really talk to any friends or a real therapist," he says.

Sam also guesses that the text format is one with which young people are most comfortable. "Talking by text is potentially less daunting than picking up the phone or having a face-to-face conversation," he theorises.

More here.

A service cited in the BBC article is Limbic Access, a chatbot that helps people self-refer for psychological help.

Their website claims: “Over 100,000 NHS patients were asked if Limbic helped them access care - 92% responded with the highest score. Meanwhile, clinicians using Limbic report feeling less rushed during clinical assessments.” A pre-print research paper has just been released, claiming to show that:

…The tool led to a 15% increase in total referrals, which was significantly larger than the 6% baseline increase observed in matched services using traditional self-referral methods during the same time period.

Importantly, the tool was particularly effective for minority groups, which included non-binary (235% increase), bisexual (30% increase), and ethnic minority individuals (31% increase).

This paints a promising picture for the use of AI chatbots in mental healthcare and suggests they may be especially beneficial for demographic groups that experience barriers to accessing treatment in traditional care systems.

More here. Check out this Conversation UK piece on why chatbots won’t ever replace human therapists. And the WHO and World Economic Forum lay out some guidelines and tools for AI in mental health.