Can a Chatbot Replace Your Therapist?
March 20, 2026
As more people rely on artificial intelligence (AI) in their daily lives, it’s gaining traction among adults looking for help managing anxiety, depression, sleep disorders, stress, relationship issues and other mental health concerns.
“People are turning to AI therapy tools at increasing rates for many reasons,” says Sara Zryl, MA, PsyD, a clinical psychologist and Associate Training Director at University Hospitals Behavioral Health Institute. “These tools can include anything from generative AI chatbots to wellness apps.”
Why More Adults Are Turning to AI
The use of virtual health tools has become increasingly common since the COVID-19 pandemic. Recent advances in AI technology have helped pave the way for more alternatives to traditional healthcare. Using AI for mental health advice is especially popular among young adults, according to recent studies.
According to a recent report, more than a million people use ChatGPT for emotional support each week. These numbers signal an unmet demand for immediate, low-barrier access to support.
“Many people who are turning toward AI are driven by a need for emotional support,” says Dr. Zryl. “Some people also want to connect or feel supported without having to be emotionally vulnerable, face to face, with someone else.”
Other reasons people are embracing AI for mental health include:
- Easy access: No waitlists, referrals or insurance approvals necessary.
- Convenience: AI is available any time, day or night.
- No time constraints: You can interact with chatbots and apps for as long (or as little) as you like.
- Affordability: Help is available at low or no cost.
- Perceived privacy: People believe their data and conversations are private, even though they may not be.
The Risks of AI for Mental Health Support
Using AI-driven tools to support mental health can be helpful in some ways, but there are also risks. How much risk is involved depends on the type of tool you’re using, what you’re hoping it’ll do for you and how you use it.
AI tools should not be used as a substitute for care from a qualified mental health professional, says Dr. Zryl. “One of the main risks is the lack of scientific evidence and necessary regulations to ensure the safety of the person using the tool,” she says. “Completely replacing a human therapist and human connection with an AI chatbot is particularly risky.”
There are also ethical issues to consider, including:
- How transparent the AI therapy tool is about how it works.
- How it protects sensitive data (including personal information that you may want to keep private).
- Whether it may reflect bias. AI that’s trained on unvetted internet data can perpetuate social and cultural biases based on that data.
- How likely it is to provide misinformation.
- Who is responsible if something goes wrong while using the AI tool.
Safety is a major concern, especially for people with serious mental health challenges. “The ability of AI chatbots to safely guide someone who is in a mental health crisis is limited and unpredictable,” says Dr. Zryl. It can be especially dangerous if frequent or severe symptoms are disrupting someone’s daily life, or if they are experiencing suicidality, homicidality or self-harm behaviors, she says. “Relying on AI therapy tools for appropriate intervention and treatment recommendations for these types of problems is very risky.”
Potential Benefits for Adults
Despite the limitations, AI tools offer some significant potential benefits. For people facing high costs, long waitlists or lack of access to a mental healthcare provider, AI offers a quick path to mental health support.
Dr. Zryl adds that though wellness apps and tools should not replace human therapists, they can serve as successful supplements to therapy. “Using AI therapy tools for tracking patterns or symptoms alongside individual or group therapy is a more well-rounded approach,” she says.
AI tools can be used to help track:
- Moods
- Unhelpful thinking patterns
- Sleep
- Self-care
- Well-being
As an example, Dr. Zryl explains that you can track your sleep patterns on a wellness app and share the results with your therapist. “This information can help you and your mental healthcare provider better understand your sleep patterns and help you come up with a plan to approach sleep hygiene or make lifestyle changes to improve emotional health and wellness,” she says.
Wellness apps and other AI tools can also help people learn and build new skills, complete therapy-related “homework” assignments and practice self-care outside of their therapy appointments. “The patient can use a wellness app to take what they’ve learned in treatment and apply it in their day-to-day life,” says Dr. Zryl. She suggests that apps like Insight Timer, Headspace or Calm can be used along with therapy to build meditation skills, track moods and restructure unhelpful thoughts.
The Bottom Line
“AI will likely play a critical role in future healthcare and mental healthcare,” says Dr. Zryl. Your provider can help you decide whether an AI tool may be a helpful supplement to your mental healthcare and guide you toward the safest options.
It’s also important to remember that there are qualities that human therapists offer that AI tools can’t. “There is something unique about meeting with a provider face to face, sitting with that person and developing a human connection,” she says. A human therapist can listen, attend to you, empathize and provide feedback in ways that devices simply cannot. “The benefit of human connection and developing a therapeutic relationship is such a significant variable in the progress and growth I see in the patients I work with.”
Related Links
The mental health professionals at University Hospitals work with patients to manage a wide range of mental health conditions. They are also available to discuss AI tools for mental healthcare and help patients choose safe and effective options to support their treatment.
Tags: Mental Health, Sara Zryl, MA, PsyD