The future of AI-powered therapy is here and mostly unregulated

AI-powered therapy bots are gaining popularity, but researchers caution that not every service claiming to be a therapist qualifies as one.

AI chatbot dialog on smartphone screen. (Bigstock/Wrightstudio)


This story is from The Pulse, a weekly health and science podcast.

Find it on Apple Podcasts, Spotify, or wherever you get your podcasts.


Last spring, psychologist and therapist Jessica Jackson got word about a mysterious website that had become notorious among her colleagues.

“There was a company, an anonymous company. So, they weren’t sharing who they were, but they were paying people to upload their therapy sessions,” Jackson said. 

They were paying $50 via Venmo or PayPal to people in therapy who were willing to share 45 minutes of clear audio from their sessions.

No one seemed to know who was making this offer — all of the website’s domain ownership details were kept private. But Jackson and her colleagues had a hunch as to why that audio was worth money.

“We assumed that they were training a large language model on these things,” said Jackson, who chairs the American Psychological Association’s Mental Health Technology Committee. 

She suspected that whoever was buying the therapy session audio was using it to train a chatbot, a robot therapist. 

In principle, Jackson wasn’t offended by the idea behind the website. Recording therapy sessions has long been part of training for human therapists.

“When I was in grad school, we would record sessions,” said Jackson. “Our supervisor would listen to it and give feedback.”

But the recordings she trained with were made only after clients consented to very specific conditions. Their personal information was anonymized, and the audio was available only to other therapists in training. This website seemed more like a free-for-all, missing disclosures about how the client’s information would be used. And the therapist’s, too.

“It became a big thing in the therapist community, because that means that clients were now taping their sessions and not letting their therapist know. And then they were getting paid to upload this session and the therapist did not know that was happening.”


Jackson says audio like this is likely valuable in the race to deploy chatbots to help address the ever-widening mental health crisis. She says the idea of texting with a bot, as opposed to opening up to a real-life person, may be a sign of changing times.

“I think younger demographics tend to be a little bit more open to it,” said Jackson, who consults for several technology companies.  

She says the pandemic helped make people comfortable with the idea of finding help online and disclosing sensitive information to and through machines. 

“It normalized seeking help through technology because everything became virtual,” Jackson said.

In the 1960s, MIT computer scientist Joseph Weizenbaum created ELIZA, a program that engaged people in typed conversation on a machine with less memory than most thumb drives. Despite those early limitations, after a few brief exchanges, Weizenbaum’s secretary famously asked the professor to leave the room so she could type to the computer in private.
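For a sense of how simple that early program was, here is a minimal, hypothetical sketch of an ELIZA-style responder in Python. It only gestures at the general idea — match a keyword in what the user types and reflect it back as an open-ended question; Weizenbaum’s actual program used a far richer script of ranked keywords and reassembly rules.

```python
# Illustrative sketch only; not Weizenbaum's original ELIZA script.
import re
import random

# Each rule pairs a pattern with reflection templates; "{0}" echoes the match.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)", ["How does being {0} make you feel?"]),
    (r"my (.*)", ["Tell me more about your {0}."]),
]
FALLBACKS = ["Please go on.", "Can you say more about that?"]

def respond(text: str) -> str:
    # Reflect the first matching keyword phrase back as a question.
    for pattern, templates in RULES:
        match = re.search(pattern, text.lower())
        if match:
            return random.choice(templates).format(match.group(1))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    # e.g. "Why do you feel anxious about work?"
    print(respond("I feel anxious about work"))
```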

Today, companies are vying to scale up the experience.

As the pandemic slowly subsided, Apple introduced a new Journaling app, encouraging iPhone users to reflect on their day on their phones.

The news received puzzled reactions. There were already plenty of journaling apps out there, so this new feature seemed years behind and lacking real purpose, a hollow repository for your thoughts and feelings. But in the age of mental health chatbots, that repository of personal information could prove extremely valuable as the company introduces “Apple Intelligence” to its devices. 

Now, talking to a chatbot instead of a real human seems like just one more step along a path that could lead technology companies right into the $75 billion psychology and counseling industry. 

And some people don’t think that’s necessarily a bad thing. 

“The excitement is democratization of expertise,” I. Glenn Cohen, a professor and bioethicist at Harvard Law School, said. 

Cohen, a self-described techno-optimist, was not surprised to learn how common it has become for people to seek out these relatively cheap, human-sounding chatbots for therapy.

“[If] you are trying to get access to a therapist in America, let alone in a lower middle income country, the waitlist, the cost, it’s extremely high,” said Cohen. “So if we want to take people’s mental health seriously, and we’re not willing to scale up the supply, or we can’t afford to scale up the supply of therapists, it’s really exciting there might be opportunities to help and engage people’s mental health and help improve it through the use of some amount of automation or artificial intelligence technology.”

But at least for now, Cohen says most of the technology is not ready for prime time. 

Still, dozens of “therapy bots” have emerged online spouting buzzwords and claiming to be “here for you.”

“What worries me is that we already actually have a reported case from Belgium of a man chatting with a general purpose large language model. And essentially, at the end of this conversation, the way it went without guardrails, it advises the man to end his life — and the man ends his life,” Cohen said. 

While Cohen acknowledges the complexity of identifying what actually caused the man’s death, he says the case may provide a bleak window into the future. 

“When people engage and when they’re in vulnerable positions, as many people in mental health crises are, the concern is that if this is not being used in a responsible way, in a way that can determine if somebody needs something more than the LLM, it really has the potential of putting people in risky situations.”

Cohen says therapy chatbots fall within a regulatory loophole. 

While some companies have received FDA approval to deploy their chatbots for cognitive behavioral therapy, many simply label themselves as non-medical wellness apps to legally skirt FDA oversight and the state regulations pertaining to humans offering therapy.

“As a result, I think they fall in this very interesting middle space, which for innovators and entrepreneurs is exciting because it’s a possibility to really explore and to build out. But for those of us who might have concerns about it, it’s something that we want to flag and be worried about and be thoughtful about how we might do better,” said Cohen. 

Psychologist Jessica Jackson thinks there is a role for artificial intelligence as a tool for therapists, like an updated crisis hotline and mental health surveillance tool – the first line of defense fielding calls and guiding people toward professionals who can help. 

But for now, she has started encouraging her colleagues to ask their clients whether they have sought help from chatbots before — to open up a conversation and let their clients know that not everything that calls itself a therapist actually is one, or has any real expertise in mental health issues.

“If you’ve ever looked at the GPT store and then looked up mental health GPTs, anyone can create one,” said Jackson. “There are several companies out there right now who have built startups that are focused on leveraging AI only and call themselves therapists.”  

Jackson questions how these AI “therapists” are being trained.

“There are some data sets that people can use, but they’re not full-on therapy scripts. But also these [chatbots] are [not] being created by clinicians. So how do you know what exactly is the training that’s happening?”

Jackson says people should always ask their therapists for permission before recording their sessions, and remain cautious about uploading versions of these sessions for money without really knowing what that audio will be used for – or how their most private information is stored. 
