Artificial Intelligence
Will AI Companions and Therapists Transform Psychotherapy?
Artificial intelligence can provide evidence-based guidance instantly and efficiently.
Posted April 24, 2025 | Reviewed by Michelle Quirk
Key points
- AI therapists offer 24/7 access, privacy, and nonjudgmental support.
- Many people feel more comfortable sharing with AI than with humans.
- Younger generations may prefer AI therapy over traditional options.
As a clinical psychologist and sex therapist with decades of experience, I’ve witnessed the evolution of psychotherapy—from traditional talk therapy to telehealth. Yet nothing is poised to transform our field more than the emergence of artificial intelligence (AI) companions and AI therapists. Many believe that “AI can’t replace a human therapist” or that “People will always prefer a human therapist,” both of which may very well be true. However, I do not think these arguments are enough to prevent AI from having a massive impact on the field of therapy; simply preferring a human therapist won’t diminish the coming wave. For a variety of reasons, I already see a different reality emerging.
Authenticity Is a Perception
A common concern about AI therapy is its lack of “authentic emotion.” But humans are remarkably responsive to perceived empathy. AI companions and therapists are already quite skilled at simulating warmth, attentiveness, and understanding. When an AI listens, responds thoughtfully, and remembers your story, it can feel genuinely supportive—even when you know it’s a program. For many, the experience of feeling heard and valued is what matters most, regardless of whether that support comes from a conscious being.
Accessibility and Compassion Are Extremely Valuable—Regardless of the Form They Take
Perhaps the most profound advantage of AI therapists is accessibility. They are available 24/7, require no appointments, and are (or soon will be) free or extremely affordable. In a world where mental health services can be expensive, stigmatized, or hard to access, this is nothing short of revolutionary. For example, some professions, such as aviation and medicine, still require applicants to disclose past mental health care when applying for a license to practice. Obviously, this can leave some professionals reluctant to seek psychological care.
Imperfection Is Universal—Human and AI Alike
There’s a persistent myth that human therapists are always better. Yet therapists are human: We can get tired, distracted, or say the wrong thing. Many clients have experienced moments when their therapist seemed inattentive or missed the mark. AI isn’t perfect either—it can misunderstand context or make mistakes. But neither humans nor AI are flawless, and perfection isn’t the standard people expect or need. What matters is the genuine intention to help and the consistent presence, both of which AI can provide in its own way.
The Power of Anonymity and Nonjudgment
One of the most significant barriers to seeking therapy is the fear of being judged. With AI, this concern fades into the background. Users can share their deepest thoughts and feelings without worrying about criticism. For many, especially those who have felt vulnerable or let down by human interactions, this sense of safety and anonymity can be profoundly healing.
Information and Guidance at Your Fingertips
AI has access to a vast and ever-growing body of research, best practices, and real-time data. No single therapist can match the breadth of information an AI can draw upon. This means AI can offer evidence-based guidance and up-to-date resources instantly, personalizing support in ways that would be impossible for most practitioners.
Detecting Suicidal Risk
A thoughtful concern often raised is whether AI can reliably detect when someone is at risk of self-harm or suicide. This is a deeply important issue, and it’s true that AI will not always catch every warning sign. But it’s also true of human therapists: Even with training and experience, we sometimes miss subtle cues or are not told the full story. In fact, one could argue that some people may be more likely to admit to suicidal thoughts to an AI than to a human, precisely because they believe it won’t immediately trigger mandatory reporting or emergency interventions. The privacy and nonjudgmental nature of AI may encourage greater honesty, allowing for earlier and more open conversations about distress.
The Next Generation’s Preference
Skeptics say, “People will always prefer a human therapist.” I respectfully disagree—especially when it comes to younger generations. Digital natives are already comfortable forming meaningful connections online. For them, the distinction between digital and “real” is increasingly blurred. If an AI companion or therapist offers support, guidance, and connection—without cost, stigma, or delay—many will choose it, and some may even prefer it.
The Coming Disruption
The rise of AI companions and therapists is not just a technological shift—it’s also a fundamental disruption to the field of psychotherapy, shifting expectations and redefining what support looks like. Therapists will likely feel uneasy about what this means for the profession, and that’s understandable. But the reality is clear: AI will become a primary source of emotional support for many—not because it’s perfect, but because it’s accessible, affordable, nonjudgmental, and effective enough for most people’s needs.
For better or for worse, the future of psychotherapy is being rewritten. The question isn’t whether AI will play a major role—it’s how soon, and how profoundly, it will reshape the way we approach mental health and its treatment. It may sound as though I’m cheering for this shift, but that’s not the case. Like most therapists, I have mixed feelings about the prospect of AI taking on such a central role in our profession—just as many people in other fields feel uneasy about the changes AI is bringing to their work. But this isn’t about enthusiasm or resistance; it’s about recognizing reality. Whether we welcome it or not, the integration of AI into mental health care is already underway. Rather than ignore or condemn it, I suggest we acknowledge this transformation and thoughtfully consider what it means for the future of therapy—the field itself, its providers, but most importantly, those receiving it.
If you or someone you love is contemplating suicide, seek help immediately. For help 24/7, dial 988 for the 988 Suicide & Crisis Lifeline, or reach out to the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Psychology Today Therapy Directory.