AI in Therapy: Help or Hindrance?

 
Over the past year, conversations about artificial intelligence have started to show up more and more often in therapy spaces, including in my own sessions with clients. Many people have begun experimenting with tools like ChatGPT for things like journaling prompts, emotional check-ins, or even simulated “therapy-style” conversations. It’s easy to see why: AI is available anytime, it listens without judgment, and it can offer calm, structured responses when you’re overwhelmed or unsure what you’re feeling.

For some people, this kind of support can be really comforting. Having a place to organize thoughts, explore emotions, or practice skills between therapy sessions can help them feel less alone or more grounded. AI can also be a way to reflect when you don’t yet have words for what’s going on, and it gives you something to respond to, which can help you start to untangle what you’re feeling.

But there are also limits and risks to using AI in emotional spaces, and it’s important to know what they are so that you can use these tools safely.

When AI Support Misses the Mark

Even though AI can sound warm and empathetic, it doesn’t actually understand emotions or context in the way a human does. It can’t pick up on tone of voice, subtle cues, or the deeper meaning behind what you’re saying. Sometimes it might give advice that sounds reasonable but misses the heart of what’s really happening.

There’s also the risk of leaning on AI in moments when what’s really needed is human connection or professional care. For someone who’s struggling with trauma, grief, or suicidal thoughts, a chatbot can’t provide the kind of attunement or safety that a therapist or trusted person can. And while it can be good at validating words on a screen, it can’t hold the emotional complexity of real relationships or life experiences.

Sometimes, using AI can even backfire. For example, if someone uses it during a spiral of anxious or obsessive thinking, it might accidentally reinforce the loop by responding in ways that keep the rumination going. Or if a person seeks reassurance, it might give endless reassurance instead of helping them find their own confidence or coping skills.

So while AI can be a useful companion, it shouldn’t be a replacement for connection, or a place to pour everything without reflection.

Using AI Wisely Between Sessions

If you would like to use AI to support your therapy work, it can help to keep a few guidelines in mind:

  • Use it as a reflection tool, not a replacement for therapy. Let it help you think through something or name your emotions, but bring what comes up in your AI conversations to your actual therapy sessions.

  • Be mindful of privacy. Even though responses may feel private, they’re not fully confidential in the way therapy is. Avoid sharing identifying details or sensitive information.

  • Notice how it affects you. After using AI, do you feel calmer and clearer, or more confused and detached? Your body’s response can tell you whether it’s helping or not.

  • Let your therapist in on it. If you’re using AI between sessions, talking about how it’s helping or not helping can be part of your work together.

A Note on Therapists Using AI

Some therapists are also experimenting with AI for things like writing notes, organizing information, or summarizing sessions. While this might make documentation easier, AI is still so new that few regulations are in place to adequately address ethical and confidentiality concerns. Therapy notes often include deeply personal information, and right now, most AI tools are not private or secure enough to protect that data.

For that reason, I’m not using AI for my notes or clinical writing, and I won’t unless these systems become fully confidential and ethically sound. If that ever were to be the case, both verbal and written informed consent from my clients would be an important part of the implementation process.

A Final Thought

AI can be a helpful tool, but it’s not a substitute for human understanding. If you’re using it to reflect, learn, or support yourself between sessions, that can be a good thing, especially when it’s paired with the safety of real therapeutic work.

What matters most isn’t the technology itself, but how we use it — thoughtfully, carefully, and in ways that keep real human connection at the center.
