7 November, 2025
Australians turn to AI for mental health support amid caution from experts

At 4:30 a.m. one morning, Julian Walker found himself overwhelmed by stress, so much so that he was physically ill. Desperate for immediate support, he turned to artificial intelligence (AI), a resource that has been his solace for the past three years. “I’ve been working my way back from a work injury which left me with post-traumatic stress disorder (PTSD),” Julian, a 39-year-old from Queensland, told the ABC. Working part-time as he aims to regain full-time hours, Julian has attended over 50 psychology sessions focused on his work-related trauma. Yet, he felt the need for a different form of support.

This led Julian to develop “Sturdy,” a customized support system within ChatGPT. “It does not diagnose or treat, that is not what I need,” he explained. Julian’s experience is not unique; he is among hundreds who responded to an ABC call-out on how AI impacts their lives. From professionals to students, many have found AI to be a companion during tough times, appreciating its convenience and cost-effectiveness. However, they emphasize moderation and caution, using AI alongside traditional therapy rather than as a replacement.

The Growing Role of AI in Mental Health

Student counselor Catherine, whose name has been changed for privacy, notes that AI can sometimes be “more effective” than human counselors in areas like memorizing client histories. “Having done face-to-face counseling, I know how difficult it is to remember my clients’ content from one week to the next,” she said, highlighting AI’s constant accessibility as a key advantage. “When you’re dealing with acute stress or anxiety, you need immediate therapeutic support. A human counselor is not typically available all hours of the day. AI can offer that level of accessibility.”

Recently, OpenAI announced updates to ChatGPT to better support individuals in distress. The company is collaborating with over 170 mental health experts to enhance the AI’s capabilities, including expanding access to crisis hotlines and improving the handling of sensitive conversations. University of New South Wales neuroscience professor Joel Pearson sees this as a positive step but urges caution. “OpenAI is not trained to be a therapist,” he said. “Chatbots don’t have to do a degree, pass a test, or anything like that.”

Expert Opinions and Concerns

AI and data science professor Ronnie Das from the University of Western Australia advises users to read OpenAI’s press release carefully. “The problem with previous models was that they could have affirmed or endorsed harmful beliefs. The new model is much safer in that respect.” Both experts express concern over AI-powered companion apps, which allow users to create characters for interaction. Earlier this year, a lawsuit against Character.AI highlighted the potential dangers after a 14-year-old boy died by suicide, having formed a romantic attachment to an AI character.

The Australian government is considering mandatory guardrails for AI in high-risk settings, as outlined in a proposal paper published last year. For those unable to access professional mental health support, AI offers a convenient and cost-effective alternative. “I think it’s bound to happen because people are going to use whatever resources they have available to them to try and get help,” Professor Pearson remarked.

Personal Stories and AI’s Limitations

Emma, who worked in a senior leadership role during a period of institutional crisis, experienced panic attacks and insomnia. Despite seeing a therapist, she turned to Claude, an AI assistant, for its round-the-clock availability. “Claude could review all those unreasonable emails immediately and help me craft calm responses,” she said. “My therapist provided the clinical framework and hard truths. But Claude provided operational support and constant emotional availability.”

Jessica Herrington, a creative technologist and neuroscientist at the Australian National University, emphasizes the importance of directing users to real mental health services during a crisis. “This example shows someone with emotional dependence on AI,” she said, referencing a screenshot of OpenAI’s new mental health feature. “No real help or advice is offered, although there are other examples of this on their site.”

Julian remains mindful of AI’s potential risks, maintaining regular contact with his treating specialist. “I have been very mindful of the fact that AI can actually harm vulnerable people as they use it, so as I have used Sturdy and ChatGPT I have been very mindful in how I use it for support. You have to be smart about it,” he concluded.