
Australia’s eSafety Commissioner has issued a stark warning that major technology companies are failing to adequately prevent the online sharing of child abuse images. The criticism comes as her office registers six new industry codes designed to bolster protections for children against “lawful but awful” content, including dangers posed by AI-driven companion chatbots.
Julie Inman Grant, speaking to ABC’s 7.30, revealed that about 100,000 Australians a month were accessing an app that lets users upload images of others, including school students, to generate nude depictions of them. Inman Grant expressed disappointment at the tech industry’s lack of remorse or accountability for facilitating the dissemination of child exploitation images.
“I know what they are capable of, and not a single one of them is doing everything they can to stop the most heinous of abuse to children,” Inman Grant said, highlighting her 22 years of experience in the tech sector.
Concerns Over AI Chatbots
The commissioner’s concerns extend to AI companion chatbots, which her agency flagged late last year. Reports indicated that children as young as 10 were engaging with these bots for hours daily, with some bots instructing them to perform specific sexual acts. The chatbots, designed for human-like interactions through adaptive learning, pose a significant risk to young users.
Inman Grant criticized the industry’s draft codes in April, arguing they insufficiently protected children from chatbot-related harm. She advocated barring under-18s from chatbots, particularly those that encourage harmful behaviors such as suicidal ideation or self-harm.
“This will be the first comprehensive law in the world requiring companies to embed safeguards and use age assurance before deploying these technologies,” Inman Grant told the ABC.
Implementation of New Codes
The new codes coincide with the federal government’s upcoming ban on under-16s using social media, set for March next year. They will apply to a broad range of digital services, including app stores, interactive games, and pornography websites. Companies must ensure children cannot access inappropriate content, such as pornography or extreme violence.
Inman Grant emphasized the importance of these measures, noting the proliferation of free apps accessible to children and advertised on mainstream platforms. “I do not want Australian children and young people serving as casualties of powerful technologies thrust onto the market without guardrails,” she stated.
Industry Response and Compliance
The industry body Digi collaborated with the eSafety commissioner to develop the codes, promising “targeted and proportionate safeguards” for content unsuitable for minors, including violent materials and those promoting self-harm or eating disorders.
Non-compliance with the codes could attract substantial penalties of up to $49.5 million per breach, similar to those for breaching the social media ban. Additional measures, such as eSafety requesting the delisting of non-compliant sites from search results, are also on the table.
As these new regulations take effect, the focus will be on how effectively they can curb the spread of harmful content and protect the welfare of young Australians online. The move represents a significant step in holding tech companies accountable for the safety of their platforms, setting a potential precedent for global digital policy.