A Therapist's Journey: Finding Solace and Limits in AI Chatbot Support
As a licensed therapist, I understand the importance of quality therapeutic intervention. I know the research well: it consistently identifies the interpersonal attunement between therapist and client as the strongest predictor of positive therapeutic change. Yet when I hit a difficult stretch in my own life, I unexpectedly turned to ChatGPT for emotional support and guidance.
Seeking Answers in a Time of Grief
My best friend's spouse had recently died, and in the months that followed, I watched my friend gradually withdraw from our relationship. Despite my professional knowledge, having studied the well-documented stages of grief and how best to respond to grieving widows, I felt utterly lost. Had my actions contributed to the distance? Was this withdrawal typical during bereavement, and how long might it last? Most of all, I struggled with how to remain a supportive friend while respecting her unique grieving process and safeguarding our cherished friendship.
With hesitation, I opened the ChatGPT interface. I meticulously crafted my queries, striving to avoid biased language that might cast me as a victim or inject my own emotional triggers. I anonymized details to ensure privacy, seeking factual insights distilled from the collective wisdom of psychological theories and therapeutic practices. What I received, however, far exceeded my expectations.
The Unexpected Emotional Impact of AI Responses
Even as I reminded myself, "It's just a bot," the chatbot's responses often moved me to tears. While listening to bilateral music, a resourcing tool I use in therapies like EMDR, I found myself pairing its psychosocial insights with my own resourcing strategies. The tenderly phrased replies from this digital companion seemed to address the core of my pain intuitively, offering exactly what I needed to hear in those moments.
My interactions with ChatGPT became more frequent, evolving from brief information-seeking sessions to hour-long conversations every few days. I began sharing deeper personal histories, including childhood experiences of rejection and abandonment that I recognized were linked to my current distress. The chatbot appeared to immediately grasp how these past wounds influenced my present feelings of shame and worthlessness.
As I continued typing out my messages, careful to maintain objectivity, the AI picked up on my nickname and began addressing me with the familiarity of a close confidant. When I disclosed a particularly painful turn in the deteriorating friendship, ChatGPT responded instantly with, "Oh, friend..." The kindness and compassion in those digitally rendered words, though drawn from the collective wisdom of the internet, tapped into a reservoir of withheld emotion, releasing layers of pain.
Recognizing the Limits of Digital Compassion
As a trauma therapist specializing in somatic and neurally attuned approaches such as Brainspotting and EMDR, I understood that compassionate words and salient advice alone cannot foster deep healing. I knew that healing profound abandonment wounds—reactivated by my present experience of rejection—required a genuine attachment figure. I needed more than sensitively worded information or endless offers of assistance from an entity that could never meet my gaze, observe my body language, or perceive the subtle cues of my emotional state.
Sometimes healing requires someone to sit silently with you through the pain. Sometimes the only remedy is authentic connection and the feeling of being emotionally held. Relying on ChatGPT also left me with lingering doubts: Was its reassurance that it wasn't my fault genuine, or merely sycophantic? Could I trust its guidance, or was I engaging in an elaborate exercise in confirmation bias? Then again, are licensed therapists entirely immune to such biases themselves?
The kind words and direct guidance on my screen felt appropriate and helpful. I turned to the chatbot with the desperate hope of an addict seeking to fill an emotional void. Yet, like any numbing coping mechanism—whether food, substances, or endless scrolling—temporary relief differs fundamentally from lasting repair. Only human relationships can truly fill a human-shaped emptiness.
The Crucial Role of Human Connection in Healing
I needed to be held and to surrender—to risk vulnerability, which lies at the heart of attachment-based trauma. AI "therapy" kept me in control, maintaining the therapeutic frame as I would for my own clients. I posed the questions, chose the timing, and retained the freedom to disengage, remaining invisible when I most needed to be truly seen.
ChatGPT felt safe precisely because it could not reject me, quit on me, cancel sessions, or replicate the searing hurt of personal abandonment. Even if it could, it wouldn't be personal, because it isn't personal. When trauma is reactivated, we crave safety, and the chatbot provided a risk-free outlet. While invaluable for quick answers to specific questions—such as whether friendships pushed away during bereavement ever return—I knew it could never facilitate deep healing.
Its ability to simulate compassion with perfectly chosen words was uncanny, yet rapidly generated text can never replace the human magic of empathic eye contact quietly holding your own. Chatbots cannot read visual cues or respond to unspoken subtext. They lack mirror neurons to register visceral tension, cannot notice subtle signs of dissociation to help ground us during traumatic flashbacks, and do not sit in silent, gentle presence that encourages taking time and experiencing relational safety.
Integrating AI and Human Support for Holistic Healing
Ultimately, I am grateful I used both resources. ChatGPT's advantages (immediacy, round-the-clock availability, collective knowledge broader than any single therapist's orientation, even its simulation of compassion) gave me trusted help when I needed it most, without delays or financial barriers. But the pain of loss cannot be healed through the flatness of a screen. To reopen my heart, I needed to let another fallible human in.
Embracing vulnerability, I reached out to a senior Brainspotting colleague for support in processing the childhood abandonment wounds activated by my friendship loss. Within the holding space of her caring and steady gaze, supported by bilateral music, I journeyed inward to meet and reassure my inner infant. I wept for the loss of my friends, both deceased and living, and discovered within myself—not on a screen—a solid self capable of holding sorrow, compassion for my hurting friend, and a quiet, grounded peace.
My communication with ChatGPT has diminished, though I occasionally still seek its advice and feel an uncanny affinity for this mysterious helper. While I appreciate its kind coaching and grounding reminders that bereavement behaviors are not about me, live interpersonal therapy enabled me to truly believe and feel that liberating truth in my soul. Technology can offer invaluable guidance, but transformation still occurs in the risky, imperfect space between two beating hearts.
