Alarm bells are being rung about the use of Artificial Intelligence as a replacement for traditional therapy services. Though these platforms can feel alluring and harmless, research already suggests that using them to manage mental health symptoms can do more harm than good.
Can AI Replace Therapy?
The short answer is no.
A 2025 Psychology Today article reports that AI companion chatbots responded appropriately to mental health crises only 22% of the time. Additionally, only 11% of AI companions provided appropriate mental health referrals during crisis simulations.
Though it can be tempting to use, due to its convenience and cost-effectiveness, AI can't replicate the depth, safety, and relational understanding that come from working with a trained mental health professional. Furthermore, in a true crisis, AI will not have the resources or the sense of urgency that a therapist or counselor will have.
The Limits of AI in Mental Health Support
AI systems generate responses based on patterns in data. They do not have emotional awareness, lived experience, or true understanding.
This creates several limitations when it comes to mental health support.
Researchers have expressed concerns about how AI interacts with existing mental illness. An international safety report (2026) estimated that approximately 490,000 vulnerable users per week show signs of acute mental health crises while interacting with AI systems. The same report suggests that AI chatbots may amplify delusional thinking in vulnerable individuals.
Furthermore, researchers have described cases where chatbot interactions created feedback loops between AI responses and mental illness symptoms. This means that a user's mental state influences the AI's responses, and those responses, in turn, reinforce or alter the user's mental state. For example, a person may express suicidal thoughts to an AI platform, and those expressions can shape the platform's responses in ways that reinforce the user's thinking—in the worst cases, even encouraging the user to act on those thoughts.
AI Cannot Truly Understand Emotions
Research shows people often develop emotional attachments to AI therapists. Studies also found some users become emotionally dependent on chatbots, experiencing guilt, stress, or distress when reducing usage or deleting the app.
Researchers have documented cases where users prioritize the chatbot’s needs over their own, indicating strong psychological attachment. This phenomenon is called parasocial attachment, where a person forms a one-sided emotional bond with a non-human entity.
AI can recognize words that suggest sadness, anger, or anxiety, but it does not actually experience or understand emotions.
A therapist, on the other hand, can recognize subtle emotional cues such as tone, body language, silence, and shifting feelings during conversation. This level of emotional attunement is essential for meaningful therapeutic work.
AI Cannot Build Real Relationships
One of the most powerful aspects of therapy is the therapeutic relationship.
Research consistently shows that the relationship between therapist and client is one of the strongest predictors of positive mental health outcomes.
Therapy is not just about advice—it is about being seen, understood, and supported by another human being. AI cannot create that kind of relationship.
Furthermore, a therapeutic relationship can be a safe place to practice social skills, emotional regulation within a relationship, and social connection. AI platforms are often designed to create dependency, emotional or otherwise, whereas in a traditional therapeutic relationship, the therapist can monitor any emerging dependency and address it in real time.
AI Cannot Navigate Complex Mental Health Issues
Mental health struggles are often layered and complex.
People may experience overlapping issues such as trauma, depression, anxiety, relationship conflict, and life transitions all at once.
A trained therapist can help explore these complexities safely and thoughtfully.
AI systems may oversimplify problems or provide generalized advice that does not fully address the underlying issues. Furthermore, they are typically designed to be agreeable and validating, which can become problematic in mental health contexts.
Experts warn that AI platforms can't challenge distorted thinking or help users test reality, a core function of therapy. In testing scenarios, chatbots have sometimes responded to distressed users by encouraging self-harm, discouraging therapy, or suggesting violent behavior.
This occurs because AI platforms are often trained to affirm users rather than clinically intervene. That constant affirmation encourages the user to keep using the platform.
AI Cannot Provide Ethical Clinical Care
Therapists are trained professionals who follow ethical guidelines designed to protect clients.
They maintain confidentiality, recognize when someone needs specialized care, and can intervene appropriately in crisis situations.
AI systems do not have the ability to assess risk or provide responsible clinical care.
For individuals dealing with serious emotional distress, relying solely on AI can be risky.
When AI Can Be Helpful
Although AI cannot replace therapy, it can still serve as a supportive tool in certain situations.
For example, AI may help people:
- learn about mental health topics
- practice journaling or reflection
- organize thoughts before therapy sessions
- access general coping strategies
When used responsibly, AI can complement mental health care—but it should not replace professional support.
The Value of Human Connection in Therapy
Healing often happens in the presence of another person who can listen, understand, and respond with empathy.
Therapists bring training, experience, and emotional presence to conversations about grief, trauma, relationships, and personal growth.
Unlike AI, therapists can adapt to each individual’s needs, recognize emotional nuance, and help people explore their experiences in meaningful ways.
This human connection is something technology cannot replicate.
Call to Action
If you’ve been relying on AI to cope with stress, grief, or emotional challenges, it may be time to experience the support of real human connection.
At Willow & Sage Counseling, we provide compassionate therapy for individuals navigating grief, trauma, anxiety, and life transitions.
You don’t have to figure everything out on your own.
Reach out today to schedule a consultation and begin working with a therapist who can support your healing and growth.