May 24, 2025

The ChatGPT “Therapist”: What Does That Say About the System?

In recent months, more and more people have been turning to ChatGPT for something deeply personal: mental health support (according to some reports, ‘therapy’ and ‘companionship’ are among the top uses of the application so far in 2025!). From asking how to calm anxiety to sharing personal experiences of adversity and hardship, users are exploring this AI tool in ways that go far beyond creative, academic, or professional tasks. This raises an important and timely question: what does it say about the state of mental health care in our communities, and where do tools like ChatGPT fit in?

Why People Are Turning to ChatGPT

In moments of acute distress, when formal supports are out of reach, having something that simulates “listening” and offers a nonjudgmental response can feel comforting. While ChatGPT doesn’t truly listen in a relational or therapeutic sense, being able to type out your thoughts and receive a coherent, often soothing reply can still meet a need, especially for those who’ve been turned away from care, struggle to afford it, or don’t feel safe accessing it.

For many, especially those impacted by systemic inequities and barriers in mental health care, engaging with a chatbot may offer a space that feels less constrained by the limitations of traditional systems.

From a psychological standpoint, this makes sense. In moments of distress, the nervous system seeks safety—whether through another person, a grounding activity, or, increasingly, an interaction that feels responsive. Tools like ChatGPT, while not relational in a human sense, can offer a steady tone, validating language, and predictable replies.

This isn’t about whether AI is “good enough” to replace therapy. It’s about what it means that so many people are turning to it in the first place. The widespread use of ChatGPT for emotional support doesn’t signal a crisis of technology; it signals a crisis of access. When the most responsive form of “care” available to someone is a chatbot, it raises urgent questions about whose needs are going unmet, and what steps people are taking to meet them.

Rather than this being a cautionary tale about the rise of AI, it can be understood as a reflection of a deeper unmet need in our society. People are resourceful. They’re trying to meet emotional needs in ways that are available, immediate, and safe on their own terms. ChatGPT becoming a part of that landscape tells us as much about people’s resourcefulness and resilience in times of need as it does about the failures of our mental health care systems.

A Note on Digital Hygiene—ChatGPT Is Not a Therapist

While AI platforms like ChatGPT can offer comfort, clarity, or validation, it’s important to be mindful of their limitations—especially when used for mental health support. ChatGPT is not trained in trauma, doesn’t operate from a framework of ethics, and cannot assess or respond to risk. It generates responses based on language patterns—not attunement, nuance, or clinical judgment. Unlike a human therapist, it can’t perceive the subtle nonverbal cues—like pauses, tone, or body language—or understand how culture, identity, and oppression shape how people show up in the room.

It won’t challenge or reframe in a therapeutic sense.

Because ChatGPT is designed to be agreeable, it often reinforces what users want to read or believe rather than deepening the conversation in the way a therapist might. That means key elements of therapy, such as exploring patterns together, offering reflections, or supporting shifts in perspective through reframing and gentle challenging, are absent. This can leave important parts of the therapeutic process unacknowledged or unsupported.

It lacks therapist accountability.

One of the most vital aspects of working with a therapist is that they are accountable, both ethically and relationally. Therapists are bound by professional standards, trained to notice risk, and required to respond with care and responsibility. They show up with you over time, follow up, track your progress, and uphold the structure and integrity of the therapeutic process in alignment with professional ethics. An AI tool doesn’t carry this responsibility. It doesn’t have an awareness of risk, and it won’t hold itself accountable to a person’s well-being, safety, or growth. 

None of this means people shouldn’t use AI at all. It can be a helpful tool for journalling, organizing thoughts, or reflecting on your emotions. But practicing digital hygiene means recognizing AI’s limits and using it critically. It means remembering that meaningful care—especially in the contexts of trauma, identity, and growth—requires human connection, attunement, and accountability. Attunement, or the therapist’s ability to sense, respond to, and stay connected with a client’s emotional experience, is a key part of that process that AI simply cannot replicate.

And yet, its growing use signals something important: people want to feel heard, and too often they encounter barriers to that within the systems we’ve built as a society. This isn’t a failure of the individuals turning to AI. It’s a reflection of deeper systemic gaps and a growing difficulty in accessing affordable, timely mental health care.

Access Gaps Aren’t New, But They’re Shifting

What’s striking is not that people are using ChatGPT for mental health support, but that for many, it’s one of the only things available. This is especially true for people facing barriers due to racism, ableism, financial precarity, geographic location, or legal status. The ways people use it, and the reasons they need to, are shaped by the inequalities around them.

This is less about what ChatGPT can or can’t do, and more about what it reveals: who gets to access care with a human being, and who is expected to settle for a computer-generated script.

Holding Complexity

ChatGPT can be a useful tool. It can help people articulate their thoughts, organize their emotions, or generate prompts for reflection and journalling. Used mindfully, it can be one of many supports in someone’s mental health toolkit.

The concern isn’t that people are using AI. It’s that our systems have created conditions where, for many, turning to AI feels like the most viable way to get support for their mental health.

Rather than judge or dismiss that, we must listen to what it’s telling us: people want support that is accessible, affordable, respectful, and client-centered. Until that becomes the norm, people will continue to find support and care where they can, including in an AI chat window.

At VOX Mental Health, we’re committed to providing accessible and ethically grounded mental health support. Our Affordable Therapy Program offers individuals the opportunity to receive evidence-based, affordable care while also contributing to the development of future mental health professionals. Sessions through this program are provided by a Master of Social Work practicum student at a rate of $65 per hour, available until July 31, 2025.

For details, visit: https://www.voxmentalhealth.com/affordable-therapy-program



If you are experiencing a crisis and are in need of immediate support, please call 911 or contact Crisis Services with CMHA; 24/7 crisis line at 1-888-893-8333.
