From a Brutally Honest ChatGPT Prompt to the First RCT on AI Therapy: What I’ve Learned
- Dr. John Lee
- April 8, 2025
- 5 min read
- Updated: May 1

It started for me with a viral TikTok
Like most people, I occasionally get pulled into late-night scrolls. One evening, I stumbled on a TikTok trend where people were using ChatGPT not just for productivity or fun facts—but for self-reflection.
The prompt was simple:
“Tell me something about myself I might not know.”
Curious, I tried it.
The response was surprisingly insightful. But something in me wanted to go deeper. So I followed up with:
“Be as blunt as possible.”
“Be even more blunt.”
“Be as brutally honest as you can, and tell me what I need to hear.”
And that’s when things got real.
The response I got was… unsettling. It didn’t sugarcoat. It didn’t soften the blow. It called out patterns I didn’t want to admit—some I wasn’t even fully aware of. I didn’t know if I felt challenged or exposed. But I knew it hit something true.

A Kind of Honesty You Don’t Usually Get in Therapy
What struck me most was this: no therapist would ever open a session like that. Not even me. We’re trained to build rapport first, to be careful with timing and tone. And honestly, if a therapist came at me with that kind of raw honesty off the bat? I’d probably stop seeing them.
But this wasn’t a person. It was ChatGPT. And somehow, that made it easier to take in.
Still, I didn’t want to end there. So I asked ChatGPT to switch gears:
“Write a letter from my future self. Be compassionate.”
And the tone completely shifted. It reflected back my strengths—resilience, growth, perspective. It acknowledged how far I’ve come and reframed some of my faults in ways that led to self-compassion. Even though I knew the words were generated by a machine, they landed with surprising emotional weight.
That combination—the bluntness and the affirmation—pushed me to reach out to people I trust. “Does this sound like me?” I asked.
What followed were some of the most open, vulnerable conversations I’ve had in a long time. In a weird way, this AI exchange saved me hours of introspection and ranting. It helped me access something that might have taken months to surface in traditional therapy.
It wasn’t really therapy. But it was undeniably therapeutic, if that makes any sense.
And when I recently read this landmark article about AI and mental health, things really came together.

The Study That Changes Everything: Therabot in Action
A recent randomized controlled trial published in the New England Journal of Medicine AI tested a generative AI chatbot called Therabot, a tool trained on tens of thousands of therapist-guided conversations rooted in cognitive behavioural therapy (CBT). Unlike scripted wellness apps, it responds flexibly and in real time, talking like a therapist rather than just parroting affirmations.
Researchers recruited 210 participants struggling with depression, anxiety, or eating-related concerns. Half were given access to Therabot for four weeks; the other half were waitlisted.
The results?
Depression scores (PHQ-9) dropped by 6.13 points on average, vs. 2.63 in the control group
Anxiety scores (GAD-Q-IV) decreased by 2.32 points, vs. 0.13
Weight concerns dropped by nearly 10 points
Participants spent over six hours engaging with Therabot
Therapeutic-alliance ratings were comparable to those reported for working with a human therapist
And this is key: it’s the first RCT to show that a fully generative AI chatbot—no human therapists involved—can reduce symptoms of mental illness in clinical populations.

Why This Matters (Even If You’re Skeptical)
Let me be honest. I’ve heard more than my fair share of bad therapy stories—from clients, friends, even colleagues. Sessions that felt cold or pointless. Therapists who didn’t listen, or worse, who talked down to their clients and dismissed them. Honestly, most people I’ve spoken to have had, on balance, negative experiences with therapists.
At the same time, we’re seeing a rise in therapist training programs with almost no barriers to entry: virtual, essentially pay-to-play education. It’s hard to know who to trust, and for people who’ve had one bad experience, the idea of trying again can feel overwhelming. Sadly, the really great therapists often have long wait lists or have moved on from therapy to other ventures.
So the thought of an AI chatbot offering a judgment-free, always-available alternative? It’s not that far-fetched and not the worst thing in the world, to be frank.
As a psychologist, I also see how much gets lost between sessions. Homework doesn’t get done. Thought records sit untouched. Life gets busy, motivation fades, and sometimes you’re just too exhausted to unpack things on your own.
That’s where I think AI could help—not to replace therapy, but to support it. Imagine a client having honest, guided conversations with an AI between sessions. Imagine being able to see not just a weekly mood score, but actual language, reflections, and thought patterns over time. That’s real descriptive data that can be hugely useful to a client and their therapist.

Try It for Yourself: AI Prompts for Self-Reflection
If you’re curious about exploring this, here are a few prompts you could consider trying in an AI platform:
For Self-Insight:
“What do I need to know about myself that I’m not seeing?”
“What would a self-sabotaging version of me say right now?”
“Based on this journal entry, what cognitive distortions might I be showing?”
“What are some core beliefs that might be driving my behaviour?”
For Decisions and Clarity:
“What fears might be behind my indecision?”
“How would someone with totally different values see this situation?”
“If this were a movie scene, what would the message be?”
For Healing and Compassion:
“Write me a letter from my future self.”
“If I were my own best friend, what would I say to myself right now?”
“Help me rewrite this painful memory with compassion.”
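If you prefer working programmatically, the same prompts can be sent to any chat-based LLM API. Here is a minimal sketch in Python; the helper name, the system-message framing, and the message format are my own illustrative choices (modeled on the common chat-completion style), not anything from the study or article.

```python
# A small helper that wraps one of the self-reflection prompts above,
# plus an optional journal entry, into the chat-message format most
# LLM APIs expect. The prompt texts come from the list above; the
# system framing is illustrative wording of my own.

REFLECTION_PROMPTS = {
    "insight": "What do I need to know about myself that I'm not seeing?",
    "distortions": "Based on this journal entry, what cognitive distortions might I be showing?",
    "future_letter": "Write me a letter from my future self.",
}

def build_reflection_messages(kind: str, journal_entry: str = "") -> list:
    """Return a chat-completion style messages list for a reflection prompt."""
    if kind not in REFLECTION_PROMPTS:
        raise ValueError(f"unknown prompt kind: {kind!r}")
    user_text = REFLECTION_PROMPTS[kind]
    if journal_entry:
        # Attach the journal entry so the model has something concrete to analyze.
        user_text = f"{user_text}\n\nJournal entry:\n{journal_entry}"
    return [
        {"role": "system",
         "content": "You are a reflective journaling companion. Be honest but compassionate."},
        {"role": "user", "content": user_text},
    ]

messages = build_reflection_messages(
    "distortions", "I failed the exam, so I'm clearly useless."
)
print(messages[1]["content"].splitlines()[0])
```

The resulting `messages` list could then be passed to whichever chat endpoint you use (for example, the official `openai` client’s `client.chat.completions.create(model=..., messages=messages)`). Keep the privacy caveats in mind before sending anything personal.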

But Let’s Keep It Real
For all the promise, there are very real limitations:
Privacy – AI platforms are not bound by the same confidentiality rules as licensed therapists. Be cautious about what you share; you never know how the data might be used in the future.
No true empathy – AI can sound supportive, but it can’t feel with you. It won’t pick up on tears in your voice or shifts in your energy.
No deep context – Unless you explain your life story in full, it won’t truly understand the nuance behind what you say. Often we don’t know what to tell an AI platform for it to get the data it needs; a trained therapist knows the right questions to ask.
No accountability – It won’t challenge you when you’re avoiding the hard stuff. A good therapist will.
Easy to manipulate – If you want comfort and validation over growth and challenge, you can lead it wherever you want. Really good therapists don’t let that slide.

Augment, Don’t Replace
The future of mental health isn’t AI vs. therapists. It’s AI with therapists.
Used wisely, generative AI can:
Help people reflect more deeply and gain insights they might not otherwise have
Provide support when no one else is realistically available (such as after normal therapy hours)
Bridge the gap between therapy sessions and provide continuity of care and data collection between sessions
Make mental health support more accessible and less intimidating
But it will never replace what a human brings to the table—empathy, intuition, presence, and ethical responsibility.
The time has come to stop resisting this shift and start shaping it. Let’s use AI to extend care, not replace it. Let’s make it part of the healing process—not the whole thing. That’s where the real opportunity lies.
