The chief justice of Australia’s high court has raised concerns about the increasing reliance on artificial intelligence (AI) in legal proceedings, stating that judges are being forced to act as “human filters” for AI-generated legal arguments. Stephen Gageler, speaking at the Australian Legal Convention in Canberra on Friday, warned that the current use of AI in courts has reached an “unsustainable phase.”
Gageler highlighted that both self-represented litigants and trained legal practitioners are using AI to create machine-enhanced arguments, prepare evidence, and formulate legal submissions. This trend, he noted, is placing an unprecedented burden on judges and magistrates, who must now sift through and adjudicate competing machine-generated arguments.
Despite these challenges, Gageler acknowledged the potential benefits of AI in the legal system, particularly in delivering civil justice that is “just, quick, and cheap.” However, he emphasized the need for careful assessment of AI’s role in decision-making processes within courts and tribunals, as the rapid development of AI technology threatens to outpace human capacity to fully understand its risks and rewards.
AI’s Role in Legal Decision-Making
Gageler, who presides over the seven-member high court, expressed concern about the prospect of AI being used in judicial decision-making, describing it as one of the “existential issues” the Australian judicature must address urgently. The pace of AI development, he argued, is surpassing human ability to evaluate its implications, necessitating a re-evaluation of the value of human judgment in the legal process.
To address these concerns, practice guidelines for AI use in law have been issued across most jurisdictions in Australia. Additionally, the Victorian Law Reform Commission is conducting a specialist review to further explore these issues.
Global Challenges and Local Implications
The challenges posed by AI in the legal field are not unique to Australia. Globally, cases of false precedents being cited in court due to AI errors have proliferated. In a notable incident in September, a Victorian lawyer became the first in Australia to face professional sanctions for using AI-generated false citations. The lawyer’s failure to verify the precedents led to the revocation of his ability to practice as a principal lawyer.
Appointed in 2023, Gageler will serve on the high court until reaching the mandatory retirement age of 70 in July 2028. During his address, he also urged judges and magistrates to speak openly about their wellbeing, highlighting the stress and mental health challenges they face due to their demanding roles and constant scrutiny from the media and public.
Addressing Sexual Violence in the Justice System
Gageler also addressed the justice system’s shortcomings in supporting victims of sexual violence. He noted that the system often fails to hold perpetrators accountable and that legal proceedings can be intimidating for complainants. An Australian Law Reform Commission report tabled in federal parliament in March revealed that one in five women and one in 16 men over the age of 15 have experienced sexual violence.
Improving responses to family and sexual violence, Gageler argued, requires the involvement of the entire legal system. “It is incumbent on us all to be aware of the problem and to be part of the solution to the extent we can,” he said, emphasizing the need for systemic change to better support victims and ensure justice is served.
The remarks come as the legal community grapples with the implications of AI technology, balancing its potential benefits against the ethical and practical challenges it presents. As AI continues to evolve, the legal system must adapt to ensure that justice remains fair and accessible for all.