- Jane Wakefield
- technology reporter
Would you share your deepest fears with Alexa? Or ask Siri for emotional support after a particularly stressful day?
Chatbots are increasingly being used in smart speakers, websites and apps to answer questions.
And as these systems, powered by artificial intelligence (AI) software, become more sophisticated, they’re starting to provide pretty good and detailed answers.
But will such chatbots be human-like enough to be effective therapists?
Computer programmer Eugenia Kuyda is the founder of Replika, an American chatbot app that, she says, offers users “an AI companion that always listens, talks, is always by your side, and cares.”
It launched in 2017 and now has more than two million active users. Each user has a chatbot, or “Replika”, unique to them, as the AI learns from their conversations. Users can even design their own cartoon avatar for their chatbot.
Kuyda says people using the app range from autistic children, who use it as a way to “warm up before socializing”, to adults who are simply lonely and need a friend.
Others use Replika to practice for job interviews, to talk about politics, or even as a marriage counselor.
And while the app is primarily designed to be a friend and companion, it also claims it can help your mental health, such as by enabling users to “build better habits and reduce anxiety”.
The World Health Organization (WHO) says that “only a fraction of the people in need have access to effective, affordable and quality mental health care”.
And while anyone with concerns about themselves or a relative should visit a medical professional in the first instance, the growth of chatbot mental health therapists may offer welcome support to a great many people.
Dr Paul Marsden, a member of the British Psychological Society, says apps aimed at improving your mental health can help, but only if you find the right one, and then only in a limited way.
“When I looked, there were 300 apps just for anxiety… so how are you supposed to know which one to use?
“They should only be seen as supplements to face-to-face therapy. The consensus is that apps are not meant to replace human therapy.”
At the same time, though, Dr Marsden says he is excited about the power of AI to make therapeutic chatbots more effective. “Mental health support is based on talking therapy, and talking is what chatbots do,” he says.
He highlights the fact that major AI chatbot firms, such as OpenAI, the company behind the headline-grabbing ChatGPT, are opening up their technology to others.
This is allowing mental health apps to use the best AI “with its vast knowledge, improved reasoning ability and proficient communication skills” to power their chatbots, he says. Replika is one provider that already uses OpenAI's technology.
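To make that "opening up" concrete, here is a minimal sketch of how a companion app might build a chat feature on OpenAI's API. It is an illustration only, not Replika's actual implementation: the model choice, the system prompt, and the `companion_reply` helper are all assumptions, and the sketch presumes the official `openai` Python package (v1 or later) with an API key set in the OPENAI_API_KEY environment variable.

```python
# A minimal sketch (not Replika's actual code) of a supportive chatbot
# built on OpenAI's chat completions API.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative system prompt; a real app would craft and test this carefully.
SYSTEM_PROMPT = (
    "You are a supportive companion. Listen, respond with empathy, "
    "and encourage the user to seek professional help for serious concerns."
)

def companion_reply(history: list[dict], user_message: str) -> str:
    """Append the user's message to the conversation and return the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

history: list[dict] = []
print(companion_reply(history, "I've had a really stressful day."))
```

The point is simply that the conversational heavy lifting is delegated to the API; what the app itself controls is the framing prompt and the conversation history it sends with each turn.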
The technology made headlines recently after Luka, the company behind Replika, updated its AI system to prevent sexual exchanges between users and their chatbots.
Not all users were happy with the change, with some venting their frustration on Reddit.
The Italian data watchdog, meanwhile, said the app was being used by people under the age of 18, who were receiving “absolutely age-inappropriate” replies. It added that the app may also “increase the risks for individuals still in a developmental stage or in a state of emotional vulnerability”.
The move could limit Replika's use in Italy, and Luka could face a fine. The company said at the time that matters were “progressing”.
UK online privacy campaigner Jen Persson says global regulation of chatbot therapists needs to be tightened.
“AI companies whose products claim to identify or support mental health, or that are designed to affect your emotional state or mental health, should be classified as health products, and must be subject to quality and safety standards accordingly,” she says.
Kuyda argues that Replika is more like a pet companion than a mental health tool. She adds that it should not be seen as a substitute for help from a human therapist.
“Real-life therapy provides incredible insight into the human psyche, not just through text and words, but through meeting in person, where a therapist can see body language and emotional reactions, and draw on an incredible knowledge of your history,” she says.
Other apps in the mental health space are far more cautious about using AI in the first place. One is Headspace, a meditation app with more than 30 million users, which is approved by the NHS in the UK.
“Our core belief and entire business model at Headspace Health is based on people-led, people-focused care. The connections our members have through live conversations with coaches and therapists, via chat, video or in person, are priceless,” says Headspace chief executive Russell Glass.
He adds that Headspace does use some AI, but does so “very selectively”, while maintaining a “depth of human involvement”. The company does not use AI to chat with users; instead, Glass says, it uses AI to provide users with personalized content recommendations, and to help its human caregivers take notes.
But Dr Marsden says AI-powered therapy chatbots will only continue to improve. “New AI chatbot technology appears to be developing skills for effective mental health support, such as empathy and an understanding of how the human mind works,” he says.
His comments come after a recent study by Cornell University, in New York State, put ChatGPT through a number of tests assessing its ability to understand that other people may think differently. The AI's scores were comparable to those of a nine-year-old child.
Previously, this kind of cognitive empathy was considered unique to humans.