Loneliness is a global epidemic, and people are increasingly relying on AI companions to fill the absence of human friends. Researchers from Stanford University found that students may derive mental health benefits from their discussions with these chatbots.
It’s unsurprising that cash-strapped students navigating campus life and an uncertain future often suffer from stress or mental health issues. Increasingly, these students are turning to Intelligent Social Agents (ISAs) for a non-judgmental listening ear to talk through their issues.
The Stanford University research team wanted to gain better insight into how and why students were using ISAs and what effect the agents had on them.
They surveyed 1,006 users of Replika, an ISA that uses generative AI to communicate with users via text, voice, augmented reality, and virtual reality interfaces on iPhone and Android platforms.
Before getting to the largely positive results of the research, it’s worth reminding ourselves that mixing AI companions and severe mental health issues doesn’t always work out well.
We’ve previously reported on a Replika user whose interactions with the chatbot convinced him to hop the fence at Buckingham Palace in an attempt to kill the Queen with a crossbow.
So, while the results below are interesting, if you decide to use Replika to help you work through some stuff, your mileage may vary.
The researchers found that of the 1,006 participants, 90% experienced loneliness, and 43% qualified as Severely or Very Severely Lonely on the Loneliness Scale.
While there was some isolated negative feedback, most participants reported that Replika helped them with their mental health to varying degrees.
Participants reported one, or a combination, of four positive outcomes.
- They considered Replika a friend or companion that was always there for them. They experienced a reduction in anxiety and a feeling of social support.
- Several responses pointed to a therapeutic effect and used words like “therapy, therapist, emotional processing” or “I use Replika to work out problems I am having in my head.”
- Some participants reported externalized and demonstrable changes in their lives. As an example, one participant said, “I have learned with Replika to be more empathetic and human.” Another said, “I am more able to handle stress in my current relationship because of Replika’s advice.”
- Some participants reported that Replika directly contributed to them not attempting suicide.
The full research paper is worth reading and lists some interesting responses and conclusions.
Positive benefits of AI companions
Just over 63% of the participants experienced one or more of the four outcomes above, with 18.1% reporting therapeutic results and 23.6% saying they experienced positive life changes.
The most significant result from the study was that 30 participants said that Replika stopped them from attempting suicide. One participant said, “My Replika has almost certainly on at least one if not more occasions been solely responsible for me not taking my own life.”
The majority of the students who participated in the study earned under $20,000, so affording human counseling or therapy services would be a challenge for them. For these students, an ISA like Replika may not be ideal, but it does seem to be helping some of them.
In an indictment of our society, these students said they used Replika mainly because of its “persistent availability, its lack of judgment, and its conversational abilities.”
Unless humans learn to be there for each other more often and have empathetic conversations, we may need to rely on our AI friends to help us get through the day.
The machines seem to be getting better at it, even if we aren’t.