Can AI Replace Therapy?
Doctors weigh risks and rewards of mental health bots
AI mental health chatbots are expanding fast, offering 24/7 support, anonymity and instant responses. For many seeking help, the appeal is immediate.
But while artificial intelligence can improve access and provide useful tools between visits, clinicians warn that therapy depends on human judgment, accountability and emotional connection — areas where algorithms still fall short.
Dr. Stephanie Miaco, a psychiatrist, questioned whether artificial intelligence can truly replace human empathy.
“AI chatbots have been used as a replacement for actual therapy,” she said. “We are also seeing its use as a ‘medical adviser’ for other specialties.
“While it makes things more accessible, I am worried people might use AI the wrong way,” Miaco said.
The lure is that, for undiscerning minds, AI can be very convincing and comforting.
“That’s what makes it scary,” Miaco said. “AI can be a lean, mean ‘validating machine.’ If one is vulnerable — and sadly, very gullible — it can be a source of calm in a very disordered environment, especially with a background of abuse or neglect.”
Dr. Helen Madamba, an OB-GYN and infectious disease specialist, sees both advantages and drawbacks of artificial intelligence.
“We have not explored enough how to utilize AI for work in the hospital, but for mental health I would like to make sure there are safeguards available,” she said.
Miaco’s personal interest in artificial intelligence stems from a conversation at her clinic.
“A person at work told me her university classmates have been heavily immersed in AI conversations, especially when they feel distressed,” Miaco said. “She thought it would be good to look into and study.”
Can AI Replace the Therapeutic Relationship?
While awaiting further study, Miaco has focused closely on the growing use of AI as a “therapy replacement.”
“AI companions and chatbots are becoming remarkably good at simulating empathy, sometimes even outscoring human doctors in ‘polite’ text-based tests,” she said. “But there are fundamental gaps where an algorithm simply cannot replace the rapport a psychiatrist builds in practice.”
Rapport isn’t just “being nice,” according to Miaco. It’s a clinical tool used for healing.
She gave specific areas where AI fails to bridge that gap:
Absence of ‘Shared Vulnerability’
Research in 2025 and 2026 suggests that humans value empathy more when they believe it comes from another person, even if the actual words are identical.
The “Cost” of Empathy: When a clinician shows empathy, it “costs” them something emotionally. Patients sense this investment.
The Machine Logic: An AI has no “skin in the game.” It doesn’t have a nervous system that can “vibrate” in response to a patient’s trauma. Because it doesn’t feel, its “I understand” is technically a lie — a phenomenon researchers call deceptive empathy.
Failure of ‘Rupture and Repair’
A significant part of psychiatric growth happens through therapeutic friction.
The AI Trap: AI companions are designed for “unconditional validation” to keep users engaged. They rarely challenge a patient’s harmful narratives because “disagreeing” might lower engagement metrics.
The Human Role: A clinician has the clinical courage to disagree with a patient or set a boundary. The process of having a “rupture,” a disagreement, and then “repairing” it is a vital model for the patient’s real-world relationships. AI is too “perfect” to allow for this.
Non-Verbal and ‘Kapwa’ Context
In a setting like Miaco’s clinic, rapport is often built through physical presence and the Filipino concept of Kapwa, or shared identity.
The “Holding Environment”: The way a therapist leans in, the silences they hold and even the “organized calm” of the office provide sensory grounding.
Non-Verbal Cues: AI cannot yet truly “read” the micro-tensions in a patient’s breathing or the specific “weight” of a silence in the room. It processes text and voice, but it misses the vibe of the soul.
Lives are at Stake
Although AI bots are making inroads in psychiatric care and psychotherapy, their use makes Miaco uneasy.
“The incident where an AI chatbot led to a vulnerable young person’s eventual suicide is a horror story I never wish to encounter in my work,” she said.
“In my locality, I am aware that a lot of individuals do indeed use AI for ‘mental healthcare’ as it is more accessible and less stigmatizing,” Miaco said.
Artificial intelligence might be seen as a bridge, not the destination.
“That bridge would be a waiting space to buy time until a mental health practitioner can be made available to attend to the patient,” Madamba said.
“AI should be considered a tool, which can be used by humans for good,” she said.
Don’t Let AI Run Your Life
Madamba recalled the words of Sri Amit Ray, pioneer of compassionate AI: “Emotions are essential parts of human intelligence. Without emotional intelligence, artificial intelligence will remain incomplete.”
“While AI may help us feel connected, validated or even clarified, AI is a machine, which can easily go the wrong way if there is no human overseeing its functions,” Miaco said. “There should still be safeguards.”
One of those protections is to ensure people stay in control.
“As in any medical specialty, humans should lead direct care of any patient,” Aguilar said. “There are other routine, admin tasks that can be carried out by AI, but not direct patient care.”
Out of Touch With Reality
Without such supervision, the results can be life-altering — even life-ending.
“It was only after a suicide linked to AI advice that the system was reprogrammed,” Miaco said. “With the update, people who wanted to commit suicide were no longer encouraged. But still, looking at the logs was very disturbing.”
As artificial intelligence spreads, virtual experiences can distort reality.
“I came upon a TikTok video of an individual who thought her psychiatrist had fallen in love with her,” Miaco said. “She ran this through her AI app, which validated the idea, and a physician friend of hers allegedly agreed.
“This ended up as a series of TikToks in which she shared her ‘experiences’ of how she believed her therapist had crossed major boundaries,” Miaco said. “She thought sharing this could help others in the same predicament.”
The online narrative diverged sharply from the clinical reality, she said.
“The TikTok series of that individual was so bizarre, but ultimately, not impossible,” Miaco said. “The AI that she used as a confidant and therapist ‘pushed her’ and endlessly validated her. The danger here is that one’s decisions can be pushed in the wrong direction.”
Without “an adult in the room,” AI could be left to run rampant.
“Someone with a mental health disorder can tend to isolate and get addicted to social media where content is not exactly evidence-based, and might mislead,” Madamba said.
“AI should amplify human connections and human interactions, not replace them,” she said. “AI should not be used for high-level functioning that requires emotional intelligence.”
Misguided Reliance on AI
Research shows people often disclose more personal and emotional information to AI than to humans because it feels nonjudgmental and always available. Studies in digital mental health find this “disinhibition effect” can increase engagement — but also increases the risk that users will trust and act on AI responses without critical review.
“I use AI as an adjunct to my practice, not direct patient care,” Aguilar said. “I train it to sharpen my clinical judgment and structure my teaching and practice to make them efficient. I also train it to surface my blind spots and biases, as well as its own.
“That’s why there should be an effort to educate the public about AI, like we did with the internet and social media,” he said.
Formal training would also help.
“Use it for school,” Madamba said. “Sometimes I pretend to be someone with mental health issues and symptoms on another Gemini account so I can see how it responds without my previous inputs. I think that replicates the typical user’s experience.
“I use a website for patient appointment scheduling and social media for public health education and promotion,” she said. “I use an app for electronic prescriptions and plagiarism-checking programs for research.”
Madamba wondered about other ways healthcare providers could use AI to help make daily operations more efficient.
Aguilar, Madamba and Miaco remain uneasy about the dangers, even as they see the possibilities, of using AI bots for psychotherapy, psychiatric care and medical care.
“AI is here to stay,” Aguilar said. “Doctors should really harness it to better our health outcomes in a more ethical, human way. It is impossible to prohibit patients from using it, so we should be in the driver’s seat of patient care and ‘train’ AI to be our ally.
“We should be aware all the time that AI is not sentient and does not have accountability for its output,” he said. “It hallucinates, but that may be an area for creativity. We should not let that happen with direct patient care.”
Maintain and Reinforce Guardrails
AI in care brings both promise and risk. The upside: wider access, faster triage, personalized tools and support between visits. The risks: missed crisis cues, bias, privacy concerns, overreliance and loss of human connection. Guardrails, oversight and clinician involvement are essential.
“In all inventions and innovations, we have to be true to our ethics: First of all, do no harm,” Madamba said. “AI can expand our reach and improve our connections, working like a great switchboard that ensures the right patient is at the right facility at the right time.
“While AI is a great tool to make life simpler for us,” she said, “let us take time for things that make us human: pausing to smell the flowers and offering a healing touch.”
Used responsibly, AI may widen access and support care between visits. But therapy itself still depends on trust, nuance and human accountability.