Human therapists worry about the limitations of chatbots

The rapidly evolving AI landscape is transforming the world and the ways in which some people work, learn and communicate. The new frontier includes mental health therapy provided by chatbots, and while AI-driven technology is promising, some mental health experts have concerns about the potential pitfalls.

Psychotherapist Elise Gold says her job fills her cup.

“I love the work I do. Why? Because it’s really about the human connection, it’s about heart to heart, eye to eye,” she said.
What You Need To Know

  • AI-powered chatbots like ChatGPT and Wysa are now used for advice or mental health support, a trend that concerns therapists like Elise Gold
  • Gold believes in human-to-human counseling because it allows her to see clients, sense their feelings as they speak and watch their body language
  • There are, however, benefits to AI-powered platforms like Wysa
  • Gold understands why some may turn to chatbots for a quick fix, but she believes their problems are likely far deeper and require more time with a human therapist

Her office is a safe space for her clients, who typically sit on a couch. Gold said she is worried about the recent rise of AI chatbots like ChatGPT and Wysa, tools now being used by people seeking advice or mental health support for depression, anxiety or isolation in a way that mimics interaction with a human therapist.

“So much of what I do, as a psychotherapist, is seeing the person, feeling them,” Gold said. “Certainly, those bots can hear the responses by reading the words that a person might type in, but for me, I might see if their shoulders go up, that their knee is jiggling a million miles a minute.”

There are, however, benefits to AI-powered platforms like Wysa. They offer on-demand support, are free to use and have the potential to expand access to mental health care for people who face logistical or financial barriers to human counseling.

Another AI-driven chatbot service, Woebot Health, stands by its technology. The company says Woebot is meant to be used in addition to clinical care, not as a replacement for it.

Gold said she understands why some people may turn to chatbots for a quick fix. However, she believes their problems are likely far deeper and require more time with a human therapist.

“My sense is there won’t be real healing maybe in that moment it might feel good, maybe, I don’t really know, but my sense is the healing won’t be long-term,” she said. “Being a therapist, being a health care practitioner, is really an act of love, it’s an act of service, it’s meeting somebody where they are, being to being, human to human, heart to heart.”

Woebot warns users up front about the limitations of its service, including how it should not be used in crisis situations.

Gold said she also has ethical concerns, such as a chatbot providing unvetted feedback or giving a user a mental health diagnosis. Her own practice before giving a client a diagnosis is to consult with a colleague or her supervisor.
