“It’s okay, I spoke to ChatGPT about it this morning and feel so much better.” Artificial intelligence is steadily weaving its way into our conversations about mental health, whether in the media, in clinics, or in casual conversations with friends. Maybe you’ve heard someone say this recently, or perhaps you were the one saying it. Tools like ChatGPT are being used by an increasing number of people seeking guidance, coping strategies, or simply the comfort of a listening ear in a busy world. But can AI language models truly support psychological wellbeing? And where should we draw the line between talking to AI and talking to a therapist?
One of the clearest advantages of tools like ChatGPT is accessibility. With long waitlists for psychologists and psychiatrists, AI can offer an immediate space for reflection and reassurance. For some, typing out their worries and receiving an empathetic response can bring clarity and grounding. It lowers barriers and can even help people begin to put language to what they’re experiencing. But the question remains: when do conversations with AI shift from breaking down barriers to putting up new barriers to support?
Curious, I (a psychologist at the clinic) asked ChatGPT what it thought about its ability to take on a therapy-like role, prompting it to explain the mechanics it operates under along with its own critique. Three limitations stood out:
1. Reinforcing Maladaptive Beliefs
ChatGPT is designed to respond empathetically, which means it leans heavily on validation. That can feel positive, especially if you’re feeling isolated, but it also risks over-validating unhelpful thoughts and behaviours. For example, if someone says, “I’m a terrible person because I have intrusive thoughts,” ChatGPT might respond with comfort (“You’re not terrible!”) but stop there, without the therapeutic challenge needed to break the cycle and promote learning and change.
2. Encouraging Avoidance or Reassurance-Seeking
Therapy often requires gentle confrontation. This is an important part of helping clients approach fears, test out unhelpful beliefs, and tolerate discomfort. In OCD work, this looks like asking clients to sit with short-term anxiety with the goal of alleviating anxiety in the long term. ChatGPT, however, may collude with avoidance by reinforcing comfort over challenge. For those with OCD, it could even become a form of reassurance-seeking, feeding into compulsions.
3. Missing Nuance and Relational Awareness
Therapy is more than words: it’s relational. A psychologist notices body language, tone, readiness, cultural context, and the subtle relational dynamics occurring in the room. ChatGPT cannot replicate this level of awareness, nor can it adapt to relational cues that are central to effective therapy.
So where does AI fit in?
ChatGPT can provide validation and reflection, but it cannot hold the delicate balance that leads to meaningful change. Effective therapy is dialectical: a balance of validation and challenge. Over-validating may soothe temporarily, but it risks leaving people stuck. At its best, ChatGPT may play the role of a supportive companion; however, it cannot ethically or effectively replace a trained psychologist, who provides the treatment planning, risk management, challenge, and relational depth that make therapy transformative. Simply put, the human difference.
————————————————————————————
Please call us on 9882-8874 to book in with one of our team members today. Alternatively, fill in our contact form here to get in touch.
To subscribe and listen to our podcast “Breaking the Rules: A Clinician’s Guide to Treating OCD”, click on the following links: Spotify, Google Podcasts, and Apple Podcasts. Episodes will be released fortnightly and will simultaneously be published on our webpage here.