Social media platforms are resisting Australia’s landmark legislation banning users under 16 from holding accounts, a move closely watched by governments worldwide. According to eSafety Commissioner Julie Inman Grant, the companies have approached the policy “kicking and screaming.” The regulation, which took effect in December, aims to protect children from harmful content and addictive algorithms on platforms such as Instagram, Snapchat, and TikTok.
Her comments come as the UK considers similar legislation, with the House of Lords recently backing a ban for under-16s through an amendment to the government’s schools bill. Campaigners and the Australian government have defended the country’s firm stance as a necessary step to safeguard young users.
Industry Resistance and Compliance Challenges
While broadly agreeing that stronger safety measures are needed, companies such as Meta argue that a blanket ban is not the optimal solution, and some experts share the concern that it may not effectively address the underlying problem. The Australian government has nonetheless declared the policy a success, reporting that 4.7 million accounts identified as belonging to children have been deactivated.
Inman Grant highlighted the lucrative nature of the youth market for social media platforms, emphasizing the addictive qualities of platforms initially designed for adults. She stated, “They’re building a pipeline for the future, and they do not want this to be the first domino,” pointing out the lack of incentive for companies to fully comply with the ban.
Monitoring and Enforcement
With the legislation in effect for over a month, researchers are closely observing shifts in young people’s online behavior. Initial concerns suggested that under-16s might migrate to other platforms, but Inman Grant noted that while there was an initial spike in downloads, sustained usage did not follow.
Under the law, companies face fines of up to A$49.5 million ($33 million, £24.5 million) if they fail to take “reasonable steps” to keep children off their platforms. Inman Grant mentioned that a second series of concern notices is imminent, with Snapchat being a key focus.
“[The policy] is certainly exceeding our expectations, but we are playing the long game here,” she said, emphasizing the need for a dynamic approach to online safety.
Global Implications and Future Developments
Australia’s policy is the strictest globally, setting the age limit at 16, higher than elsewhere, and allowing no exemption for parental consent. The ban currently covers ten platforms: Facebook, Instagram, Snapchat, Threads, TikTok, X, YouTube, Reddit, and the streaming platforms Kick and Twitch. Notably, it excludes dating websites, gaming and messaging platforms such as Roblox and Discord, and AI chatbots, which have recently been criticized for inappropriate interactions with minors.
Social media companies have consistently pushed back against the legislation. Meta has suggested that age verification should instead occur at the app-store level to ease compliance burdens, while Reddit, though complying, has launched a legal challenge in Australia’s highest court, citing concerns over privacy and political rights.
Australia’s Communications Minister Anika Wells has made it clear that the government will not be swayed by legal threats. “We will not be intimidated by big tech. On behalf of Australian parents, we will stand firm,” she stated in parliament.
This development follows a broader global trend of increasing scrutiny and regulation of social media platforms, with countries exploring various measures to protect young users. As more governments consider similar actions, Australia’s experience may serve as a pivotal case study in balancing regulation with digital rights and innovation.