Can AI Replace Therapists? And More Importantly, Should It?


“Can machines think?” It’s a question mathematician Alan Turing first posed in 1950, and it became the cornerstone of his experiment, known as “The Turing Test,” in which a human evaluator tries to distinguish a machine’s responses from a person’s. If the machine could convincingly imitate human behavior, it was considered intelligent, something Turing predicted would happen more and more in the decades to come. He didn’t have to wait long: by the 1960s, MIT professor Joseph Weizenbaum had introduced the world to ELIZA, the first chatbot and forebear of modern AI—and ELIZA was programmed to imitate a psychotherapist. Turing’s question feels more prescient than ever now, as we find ourselves at a disconcerting crossroads, with technology advancing and extending its reach into nearly every touchpoint of our lives faster than the guardrails to corral it can be built.

In 2025, Turing’s initial question has evolved into something different: Can machines feel, or understand feelings? Because, as increasing numbers of people turn to AI in lieu of a human therapist, we are asking them to do just that. The technology has indeed come a long way since ELIZA. Now you have options like Pi, which bills itself as “your personal AI, designed to be supportive, smart, and there for you anytime.” Or Replika, which is “always here to listen and talk.” There’s also Woebot, and Earkick, and Wysa, and Therabot; the list goes on if you’re just looking for someone—well, something—to talk to. Some of these chatbots have been developed with the help of mental health professionals, and, more importantly, some haven’t, and it’s hard for the average client to discern which is which.

One reason more people are turning to AI for mental health help is cost: sessions with a human therapist (whether virtual or in-person) can be pricey, and are often either not covered by insurance or require a lot of extra effort to figure out whether they will be. For younger generations, recession-proofing their budgets has meant ditching a real therapist for a bot stand-in. Then there’s the lingering stigma around seeking out mental health help. “Many families, whether it be because of culture or religion or just ingrained beliefs, are passing down stigmatized views about therapy and mental health through generations,” says Brigid Donahue, a licensed clinical social worker and EMDR therapist in L.A. And there’s the convenience factor: this new wave of mental health tools is available on your schedule (in fact, that’s Woebot’s tagline). “Your AI therapist will never go on vacation, never call out or cancel a session, they’re available 24/7,” says Vienna Pharaon, a marriage and family therapist and author of The Origins of You. “It creates this perfect experience where you’ll never be let down. But the truth is you don’t heal through perfection.”

That healing often comes with the ruptures, friction, and tension of a therapy session that isn’t automated. “When you eliminate imperfection and human flaws and the natural disappointments that will occur, we really rob clients of the experience of moving through challenges and conflicts,” says Pharaon. The so-called imperfections of a human therapist can actually be reassuring for many clients. “For anyone who grew up with the intense pressure to be perfect, ‘mistakes’ made by a therapist can actually be corrective,” adds Donahue.




