At 4:30 AM, Julian Walker found himself overwhelmed by stress, unable to return to sleep. In his search for immediate relief, he turned to artificial intelligence (AI). For the past three years, AI has been his primary source of comfort. “I’ve been working my way back from a work injury which left me with post-traumatic stress disorder (PTSD),” Julian shared with ABC. The 39-year-old from Queensland, now working part-time, has attended over 50 psychology sessions addressing his work-related trauma. However, he felt the need for a different approach.
Julian created a customized support system named “Sturdy” within ChatGPT. “It does not diagnose or treat; that is not what I need,” he explained. Julian’s reliance on AI is not an isolated case. Hundreds of Australians responded to an ABC call-out, sharing how AI affects their lives. From professionals to students, many use AI not just as a therapist but as a friend to help them through tough times. While they value its convenience and affordability, they urge moderation and caution, emphasizing that it should complement, not replace, professional clinical advice.
AI’s Role in Mental Health Support
For student counsellor Catherine, AI offers advantages over human counsellors, particularly in recalling clients’ histories. “Having done some face-to-face counselling during my professional placement, I know how difficult it is to remember my clients’ content from one week to the next,” she noted. The constant accessibility of AI also makes it appealing. “When you’re dealing with acute stress or anxiety, you need immediate therapeutic support,” she added. “A human counsellor is not typically available all hours of the day. AI can offer that level of accessibility.”
Recently, OpenAI announced updates to ChatGPT to better support people in distress. The company collaborated with more than 170 mental health experts who have real-world clinical experience. The updates expand access to crisis hotlines, route sensitive conversations to safer models, and add reminders to take breaks during long sessions. According to University of New South Wales neuroscience professor Joel Pearson, these updates are a positive step, but caution is still necessary. “OpenAI is not trained to be a therapist,” Professor Pearson emphasized. “Chatbots don’t have to do a degree, pass a test, or anything like that.”
Expert Opinions and Concerns
Ronnie Das, a professor of AI and data science at the University of Western Australia, advises reading OpenAI’s press release carefully before trusting the system. “The problem with the previous models was that they could have affirmed or endorsed harmful beliefs. The new model is much safer in that respect,” he explained. Both experts raised concerns about AI-powered companion apps, which let users build characters to interact with. Earlier this year, a lawsuit against Character.AI highlighted the potential risks, following a tragic incident in which a 14-year-old boy died by suicide after forming a romantic attachment to an AI character.
AI is subject to different regulations at the Commonwealth and state levels. Last year, the federal government proposed mandatory guardrails for AI in high-risk settings and is still weighing its next steps in this space. For those unable to access professional mental health support, AI offers a convenient and cost-effective alternative, Professor Pearson noted. “People are going to use whatever resources they have available to them to try and get help,” he said.
Balancing AI and Human Support
Emma, who held a senior leadership role during a crisis, turned to AI when traditional support was unavailable. Experiencing panic attacks and insomnia, she found her GP’s advice insufficient, and her therapist, though helpful, was not available at night. Emma began using Claude, an AI assistant, instead. “Claude being available 24/7 without judgement or fatigue was what drew me to use it more,” she said. “Claude could review all those unreasonable emails immediately and help me craft calm responses.” She emphasized the importance of combining AI with professional therapy for a comprehensive approach.
Jessica Herrington, a creative technologist and neuroscientist at the Australian National University, stressed the importance of directing ChatGPT users to real mental health services during crises. “This example shows someone with emotional dependence on AI,” Dr. Herrington noted. “No real help or advice is offered, although there are other examples of this on their site.”
Julian remains cautious, acknowledging AI’s potential risks. “I have been very mindful of the fact that AI can actually harm vulnerable people as they use it,” he said. “You have to be smart about it.” His use of AI is carefully managed so that it complements traditional support rather than replaces it.
The growing reliance on AI for mental health support reflects a broader trend of integrating technology into personal well-being strategies. As AI tools improve, the challenge lies in balancing accessibility and safety, ensuring users receive the support they need without compromising their mental health.