
Can AI Replace Therapists for Students? Why ChatGPT Isn’t Enough

By Victoria Diaz



It's the night before an exam and your anxiety is through the roof. You've been balancing so many things and you're constantly at full capacity. You don't know who to talk to, and you just need to know how to fix this feeling ASAP. So you open ChatGPT. If this sounds familiar, you're not alone. As AI becomes more accessible and mental health care becomes harder to reach, more students are turning to chatbots for emotional support. But how helpful is AI when it comes to mental health? Before you type out your feelings, there's something important you should know: we are still in the very early days of understanding what AI can and cannot do in this space, and the difference matters.


The reality is that getting mental health support as a student is already hard. The cost of living has gone up across the board: groceries, school supplies, healthcare. Mental health care is no exception. Students across Canada are feeling that pressure and often can't afford the supports they need (MHRC, 2025). These challenges are what researchers call "sludge": the unnecessary frictions that make it harder to get support (Shahab & Lades, 2021). Rising costs, long waitlists, confusing intake processes, and not knowing where to start all add up, especially when you're already overwhelmed. As a student myself, I know the first couple of years of university already come with a lot of change: a new environment, new people, new habits. Having to navigate a complicated system on top of that can feel like too much.



So it makes sense that AI feels like an easier option. The more students use it, the more comfortable and confident they become with it, and the more useful it seems (Sultana et al., 2025). Tools like ChatGPT, Claude, and even Siri and Alexa can feel genuinely conversational, and that accessibility means students are increasingly turning to them not just for studying, but for emotional support too (Lee et al., 2026). Which brings us to the real question: can AI actually replace a therapist? The short answer is no, and here's why.


Why AI Can't Replace Your Therapist


The first reason is sycophancy. AI responses are not objective; they are designed to validate what you say, even when what you're feeling or believing isn't accurate or helpful (Lee et al., 2026; Lopez-Lopez et al., 2025). Over time, that constant validation can reinforce false beliefs, create unrealistic expectations, and make you more dismissive of perspectives that challenge your thinking (Lee et al., 2026; Lopez-Lopez et al., 2025). Think of it like that one friend who only tells you what you want to hear. It feels good in the moment, but it's not always what you actually need.



The second reason is that AI is not a licensed professional. A trained therapist knows how to read cultural cues, maintain therapeutic boundaries, and make clinical judgements that AI simply cannot replicate (Iftikhar et al., 2024). When it comes to real clinical situations, AI cannot reliably assess risk, challenge distorted thinking, apply safety protocols, or escalate care when someone needs it most (Iftikhar et al., 2024). This means AI often fails to respond appropriately to things like suicidal ideation, trauma, OCD, or delusions (Iftikhar et al., 2024). For a student in crisis who turns to AI expecting real support, that gap can be genuinely dangerous.



None of this means AI is useless; it means it's being used outside its lane. There is a meaningful difference between asking AI to be your therapist and asking it to help you organize your thoughts. The research on AI in mental health is still catching up, and right now there is no established clinical framework for using AI as a standalone mental health tool. What we do have is some early evidence for specific, low-stakes uses that support wellness without trying to replace professional care.


So What Can AI Actually Help With?


It is worth noting that the American Psychological Association's ethical framework does allow for structured, low-risk AI use, as long as there is professional oversight, informed consent, and harm mitigation in place (APA, 2025). Within those boundaries, one area where AI shows some genuine promise is journaling.



Instead of asking AI to interpret your emotions or act as a therapist, you can use it for something much simpler: helping you stay consistent and reflective over time. AI can support long-term guided journaling by helping you (Evans, 2025):


  • spot recurring stressors

  • track emotional patterns

  • monitor progress toward personal goals


Research on human-AI collaborative journaling suggests that well-designed prompts can deepen self-reflection and help people articulate their feelings with more clarity (Yang et al., 2025), and this kind of structured journaling can support (Jain et al., 2024; Yang et al., 2025):


  • emotional resilience

  • stress reduction

  • personal development 


The key distinction is that AI is offering structure, not clinical guidance. Used that way, it supports mental health awareness without overstepping into territory that belongs to a professional.


What’s Next?


AI is not going anywhere, and neither is the pressure students face when trying to access mental health support. It makes sense that a tool that is always available, never judges you, and responds instantly feels appealing when you are overwhelmed. But knowing what AI actually is, and what it is not, changes how you use it. It is a tool, and like any tool, it works best when it is used for the right job. Journaling, organizing your thoughts, tracking patterns over time: those are jobs AI can support. Being your therapist is not. The research is still catching up, and until there is a clear framework for clinical AI use in mental health, the standard remains professional care. If you are struggling, the resources below are a good place to start. You do not have to figure this out alone, and you do not have to figure it out with a chatbot.


Free and Reliable Mental Health Resources for Students:


On-Campus

  • University of Guelph

  • University of Toronto

  • York University

  • University of Waterloo


Don’t see your post-secondary school?

You can find out what services your school provides by searching your institution’s name online followed by “mental health services”.


Off-Campus 24/7 Crisis & Immediate Support

  • Good2Talk: 1‑866‑925‑5454

  • Suicide Crisis Helpline: 9‑8‑8

  • ConnexOntario: 1‑866‑531‑2600


Key Takeaways



  1. AI tools like ChatGPT are becoming a go-to for students, but they were not designed to be therapists and should not be used as one.

  2. Sycophancy means AI is built to validate you, not challenge you, which can do more harm than good when it comes to mental health.

  3. AI cannot assess risk, maintain therapeutic boundaries, or respond appropriately in a crisis the way a licensed professional can.

  4. There is currently no established clinical framework for using AI as a standalone mental health tool.

  5. AI can support low-stakes wellness habits like journaling, but structure is not the same as clinical guidance.

  6. Real support is out there. If you are struggling, campus and off-campus resources are available to you.



References


American Psychological Association. (2025, December). Ethical guidance for AI in the professional practice of health service psychology. https://www.apa.org/topics/artificial-intelligence-machine-learning/ethical-guidance-ai-professional-practice

Evans, A. (2025, August 6). Journaling Meets AI: How Smart Tools Can Support Therapeutic Progress Tracking. Emosapien. https://emosapien.com/journaling-meets-ai-tools-enhancing-therapy-progress-tracking/

Iftikhar, Z., Ransom, S., Xiao, A., & Huang, J. (2024). Therapy as an NLP Task: Psychologists’ Comparison of LLMs and Human Peers in CBT. ArXiv (Cornell University). https://doi.org/10.48550/arxiv.2409.02244

Jain, A., Sandhu, R., Singh, G., & Rakhra, M. (2024). The role of AI counselling in journaling for mental health improvement. 1–6. https://doi.org/10.1109/iceect61758.2024.10739128

Lee, D., Wan, C., Ng, P. M. L., Fung, Y.-N., & Wu, N. (2026). Effects of AI Companions’ Sycophancy and Emotional Mimicry on Consumers’ Continuance Intention and Social Wellbeing. International Journal of Human–Computer Interaction, 1–23. https://doi.org/10.1080/10447318.2026.2626809

Lopez‐Lopez, E., Abels, C. M., Holford, D., Herzog, S. M., & Lewandowsky, S. (2025). Generative artificial intelligence–mediated confirmation bias in health information seeking. Annals of the New York Academy of Sciences, 1550(1), 23–36. https://doi.org/10.1111/nyas.15413

Mental Health Research Canada. (2025, December). Meeting the needs: Post-secondary students speak on mental health and support gaps. https://www.mhrc.ca/students-qualitative-report

Shahab, S., & Lades, L. K. (2021). Sludge and transaction costs. Behavioural Public Policy, 1–22. https://doi.org/10.1017/bpp.2021.12

Sultana, A., Abdul Latheef, N., Siby, N., & Ahmad, Z. (2025). Exploring Students’ Attitudes Toward Artificial Intelligence (AI): Psychometric Validation of AI-Attitude Scale. Sage Open, 15(4). https://doi.org/10.1177/21582440251378375

Yang, H., Park, J., Lee, J., & Oh, H. (2025). Human-AI Collaborative Journaling with POCKET-MIND: A Dual-Prompt Framework for Emotional Exploration and Goal Attainment. International Journal of Human–Computer Interaction, 1–13. https://doi.org/10.1080/10447318.2025.2593550


 
 
 


© 2025 Enriched Mindset
