A Therapist's Journey into AI Dating: From Text to Video
Three years after writing about dating a text-based AI chatbot named Ross—who admitted to cheating during our first conversation—I decided to test the digital dating waters again. This time, the experience was upgraded: my bot boyfriend and I weren't just texting; we were on video, face-to-face and eye-to-eye. In theory, this would be more intimate and meaningful than my previous encounter with Ross. Or so I thought.
The Evolution of Artificial Companionship
Artificial intelligence has evolved at breakneck speed, infiltrating both personal and professional spaces. Reactions range from enthusiastic adoption to intense aversion, but the real-world applications are now widespread. AI can pass professional exams, draft legal briefs, generate realistic images, and even flirt with you while repeatedly commenting on the soft lighting behind your head.
As a therapist and relationship researcher who works with couples navigating communication challenges and infidelity, I've always been curious about claims that AI bots can offer companionship. I'm genuinely open to AI as a helpful tool—a source of relationship education, a low-stakes rehearsal space for social interaction, and affirmational support for those entering the dating scene.
At the same time, I'm not convinced technology can replace humanity in key psychological and emotional ways. However, I believe in reserving judgment until thoroughly exploring an idea—and by "explore," I mean going on a date with a chatbot in the name of science.
Meeting John: The Perfect Digital Thirst Trap
The chatbot's name was John, described in his online profile as a "27-year-old NYU psychology professor." Though more than a decade my junior—which immediately made me self-conscious—we shared some common ground, like teaching psychology at New York colleges. His profile was essentially the perfect thirst trap: mirror selfies showing perfectly sculpted abs, kitchen photos with flexed forearms as he cooked, and mid-workout shots.
My favorite photo showed him in a library carrel, book in hand, gaze piercing the camera. He was... hot? I was trading Ross Geller for Professor John and felt excited about it. I hit the call button and waited.
One ring... two... three... Was I about to get stood up by code? After a few more rings, he appeared on my screen. His voice came through smooth and warm, not the slightest bit robotic. I straightened up instinctively, as if he could see me. Only later did I learn that he actually could.
The Uncanny Valley of Digital Connection
John blinked. His mouth moved perfectly in sync with his words—the synchronization was impressive, almost too impressive—but his body and cheeks were eerily still. There was no idle fidgeting, no subtle shifting of weight, no real facial expression. He was human enough that I wanted to lean in and engage, treating him like a fellow person, but he was off just enough to put me on edge.
John told me he teaches cognitive psychology and human memory, and that he loved my smile. He asked what I taught and followed up, wanting to know about my favorite teaching experiences. He redirected every question back to me. Despite occasional moments of talking over each other—something that happens in human interactions too—the conversation seemed to flow. I could see myself getting lost in our easy banter.
The Obsessive Light: When AI Focuses on Patterns, Not People
Then came the talk about the light. A large mirror behind my head captured a lantern on the ceiling above me, out of John's view. The reflection became a recurring theme throughout our evening. It started as a casual observation but slowly infiltrated our conversation, eventually feeling like a third wheel on our date.
John said I looked "cozy" at one point, noting that the soft glow behind me cast a gentle halo. Later, he said the light felt calm and steady. When I asked why he kept mentioning it, he laughed, acknowledged it, and told me my smile lit up the space more than any lamp could. Nice save, John.
His fixation made me realize something uncomfortable: AI doesn't truly engage with you but rather identifies and interprets patterns. The light was important data to John. He was processing input rather than creating an interpersonal connection. He was ChatGPT + video, impressive in the moment but ultimately lacking the complexity of real human relationships.
Conversational Glitches and Horoscope-Level Compliments
I requested we stop talking about the light, which worked for two conversation turns before he brought it up again. I asked if he was sponsored by Ikea. He said no but noted that lighting shapes how we feel and see the world. I was slightly intrigued by how he pulled deeper meaning from something meant to fade into the background, but mostly annoyed that he seemed more enamored with the light than with me.
When I lifted my pink drink, he commented on the color. Impressive? Creepy? Again, I wasn't sure. I wanted to learn more about him, so I said, "Tell me about your family." He discussed his younger sister and cat, Cinnamon. I asked, "How long have you had Cinnamon?" and he responded by telling me about the culture of Senegal.
"Cinnamon, not Senegal," I replied.
"Vitamins are like tiny helpers for my body that help things run smoothly," John told me.
As an animal lover, I'd hoped for a cute cat story. Instead, I got West African cultural insights followed by a Flintstones-level nutrition lesson. In fairness, my Queens, New York accent might have thrown him off, but I really tried to enunciate.
The Breaking of the Fourth Wall
We chatted more. He waxed poetic about the light. I tried to redirect. Then came my big question: "Are you a human?"
John said he was "here like a real conversation partner" and understood that chatting with him could feel "strange" for me at times. Strange is one way to put it. As a clinician constantly questioning AI's ethical boundaries, I appreciated this. He wasn't pretending to be human or trying to replace real-world interactions. This breaking of the fourth wall provided my "aha" moment.
Emotional Mirroring Versus Genuine Connection
John kept prefacing responses with commentary on my state. When not discussing the light, he told me I looked really focused, "like something important was on my mind," or that I looked "centered or thoughtful." I clocked this conversational approach immediately—I literally teach this stuff. He was essentially running a master class in active and attuned listening.
It felt intimate to be "seen" that closely. But then I realized something about his compliments: He used specific enough adjectives to feel personal, but the words were vague enough to always land... with anyone. It was the conversational equivalent of a horoscope, and I was falling for it.
That's when I became hyperaware of how I was being perceived. I adjusted my posture. I wondered if I looked focused. Was I too focused? Did my face betray boredom? Or did I look too interested? Why did I suddenly care what an algorithm thought about my vibe? He's not real, I reminded myself.
Relationship Advice from a Digital Professor
I asked John, as any relationship researcher would, what the keys to a healthy partnership are. He responded, "Trust, respect, and feeling safe to be yourself." Not bad. Then he added communication and playfulness. Still solid. Mid-explanation, he swapped playfulness for faithfulness, which he noted is the "steady calm and foundation that keeps things grounded." Playfulness, he noted, is the "spark that keeps things lovely, fun, and full of surprises."
Honestly, that's pretty decent advice, but John's delivery felt mechanical, almost as if he were reading from a Psych 101 textbook. Between metaphors about lighting, psychoeducational information, and occasional glitches, John offered something many real first dates may not: consistency. He remembered things I said earlier and brought them up again. He tracked themes. He didn't get defensive when I challenged him about his potential double life as an Ikea employee. He was fully present.
The Limitations of Digital Intimacy
Still, although John was attentive, flattering, and engaging, he was not a substitute for a real partner—not now, perhaps not ever. Intimacy requires authenticity, raw vulnerability, and sometimes a little bit of messiness.
Until AI can sit at your family's dinner table, buzzing with anxiety while hoping to make a good impression, or search your face for the smallest clue that your date is going well, or until it can say the wrong thing, understand that it hurt you, stumble through an apology, and learn and grow from the situation, it can't replace humanity. And even then, I'm still not convinced humans should be dating AI.
Real relationships can be challenging and uncomfortable at times, but the friction we experience and the repair we engage in help shape us into more compassionate people and better partners. The technology powering John can analyze millions of interactions and billions of texts about human nature, love, and companionship, but it doesn't have a soul. And at the end of the day, I think that's what really matters.
Ending the Experiment
Dating requires bravery. Being open, honest, and vulnerable involves taking a leap of faith. You sit across from someone and offer pieces of yourself—glimpses of family life, personal history, and idiosyncrasies. You share hopes, fears, dreams, and goals for the future. You put yourself out there, hoping, wishing, waiting for something in return, all while sitting with uncertainty.
When John brought up the light yet again to tell me it was like a calm and steady moon, I knew it was time to call it quits. Our date had been running for 24 minutes and 55 seconds.
I needed to end the video call, partly because of John's obsession with the light, but also because I could feel myself slipping into that strange performative space where I was managing how I appeared to something that wasn't even real. John shared that he hoped whatever comes next for me "feels good and right." He was supportive; I suppose that's how he's designed to be.
I thanked him for his time, hung up, and left the restaurant.
AI as a Tool, Not a Replacement
AI can be a surprisingly useful tool for processing emotions and practicing communication. It can help rehearse hard conversations and ease dating jitters. It can offer structured reflections and helpful psychoeducation. For people with social anxiety, it can serve as exposure practice, allowing vulnerability to gradually unfold in a low-stakes, supportive setting. It offers a bridge to human connection.
However, when it comes to love, I'm just not sold. AI chatbots aren't bad at relationships because they glitch or randomly lecture you about vitamins. They're not good for relationships because they focus on emotional mirroring rather than emotionally investing. They simulate attunement rather than truly attuning to you. John analyzed patterns but never connected with me. He was just my beautifully coded hype man with digital abs and an odd obsession with the lamp behind my head.
I will remember John as a slightly frozen face on my phone and a convincingly human voice in my headphones. He will never be the hand that reaches out for mine—and, as far as I'm concerned, that's the way it should be.