28 July 2025

Australia's internet landscape faces transformation with new age verification laws

As the old adage goes, “On the internet, nobody knows you’re a dog.” However, in Australia, this anonymity might soon be a thing of the past. The Albanese government has announced new legislation and industry codes that could fundamentally change how Australians access the internet. From December, individuals under the age of 16 will be banned from social media, and new industry codes developed under the Online Safety Act will introduce extensive age verification measures across various online platforms.

The government has touted these changes as a necessary step to protect children online. The eSafety commissioner, Julie Inman Grant, has been a key figure in developing these codes, which will require online services to implement age assurance measures. These could include examining account histories, using facial age estimation technology, and conducting bank card checks. From December, search engines will also be required to verify the age of logged-in account holders through identity documents such as driver's licences.

Implications of the New Codes

The introduction of these codes is expected to have significant ramifications. Search engines, for example, will need to ensure that safe search features are activated for users under 18, filtering out inappropriate content such as pornography. Six additional draft codes are under consideration, which would extend similar age assurance measures to app stores, AI chatbots, and messaging apps.

These measures aim to prevent children from accessing harmful content, including pornography, self-harm material, and violent content. In a recent speech at the National Press Club, Inman Grant emphasized the importance of a layered safety approach that places responsibility at critical points in the tech stack, including app stores and device levels.

“It’s critical to ensure the layered safety approach which also places responsibility and accountability at critical chokepoints in the tech stack, including the app stores and at the device level, the physical gateways to the internet where kids sign-up and first declare their ages,” she said.

Industry Reactions and Concerns

The announcement has sparked mixed reactions. Some welcome the changes, particularly in light of recent events such as Elon Musk's Grok AI app, which included a pornographic chat feature while being rated suitable for ages 12+ on the Apple App Store. This incident prompted child safety groups to call for stricter app store ratings and child protection measures.

However, critics argue that these measures could lead to an overreach of regulatory power. Justin Warren, founder of tech analysis company PivotNine, described the codes as “sweeping changes” that could hand more control over Australians’ online lives to foreign tech companies.

“It looks like a massive over-reaction after years of policy inaction to curtail the power of a handful of large foreign technology companies,” he said. “That it hands even more power and control over Australians’ online lives to those same foreign tech companies is darkly hilarious.”

Meanwhile, Digi, an industry body involved in developing the codes, insists that the measures are targeted and proportionate. Dr. Jenny Duxbury, Digi’s director of digital policy, stated that the codes focus on specific platforms hosting or providing access to unsuitable content for minors.

“The codes introduce targeted and proportionate safeguards concerning access to pornography and material rated as unsuitable for minors under 18, such as very violent materials or those advocating or giving instructions for suicide, eating disorders, or self-harm,” Duxbury said.

Future Challenges and Considerations

As the codes come into effect, companies that fail to comply could face hefty fines of up to $49.5 million, in line with the penalties attached to the social media ban. Non-compliance could also result in the eSafety commissioner requesting that sites be delisted from search results.

John Pane, chair of Electronic Frontiers Australia, warns that many Australians may be unaware of the full implications of these changes, particularly concerning search engines and adult content access. He advocates for legislative changes to the privacy act and AI regulation to establish a duty of care for platforms.

“We believe this approach, through the legislature, is far preferable to using regulatory fiat through a regulatory agency,” Pane said.

Warren remains sceptical about the effectiveness of age assurance technology, noting that the search engine code was finalized before the government's trial of that technology had concluded.

“Eventually, the theory will come into contact with practice,” he remarked.

As these changes unfold, the eSafety commissioner’s office continues to defend the inclusion of age assurance requirements, emphasizing the need for safeguards in search engines as critical gateways for children accessing harmful material.

“Search engines are one of the main gateways available to children for much of the harmful material they may encounter, so the code for this sector is an opportunity to provide very important safeguards,” the office stated.

The coming months will reveal how these regulatory changes will reshape the digital landscape in Australia, as both industry and users adapt to a new era of online safety and accountability.