In a groundbreaking move, millions of Australian children awoke on Wednesday to find their access to social media platforms restricted. This unprecedented ban aims to protect those under 16 from the perils of addictive algorithms, online predators, and digital bullying. As the first nation to implement such sweeping measures, Australia’s initiative is being closely monitored by global legislators.
The ban affects ten major platforms, including Instagram, Facebook, Threads, Snapchat, YouTube, TikTok, Kick, Reddit, Twitch, and X. These platforms have pledged to comply by employing age verification technology to identify and suspend accounts belonging to under-16s. However, many of these companies remain skeptical about the ban’s effectiveness in enhancing children’s safety.
Government’s Perspective and Implementation
Australian Prime Minister Anthony Albanese has hailed the ban as a success, citing increased family discussions about social media use. While some children and parents are expected to circumvent the ban, there are no penalties for children or parents who do so. Albanese emphasized the symbolic importance of the law, stating, “We’ve said very clearly that this won’t be perfect… but it’s the right thing to do for society to express its views, its judgment, about what is appropriate.”
Under the new legislation, platforms must demonstrate they have taken “reasonable steps” to deactivate accounts used by under-16s and to prevent new ones from being created, or face fines of up to 49.5 million Australian dollars ($32 million).
Platform Responses and Compliance
Actions Taken by Major Platforms
Snapchat will suspend under-16 users’ accounts for three years, or until they reach the age threshold, whichever comes first. YouTube will automatically sign out account holders on December 10, preserving their data for reactivation at 16. TikTok plans to deactivate all under-16 accounts by the same date, using age verification technology to ensure compliance. Meanwhile, Twitch will block new accounts from December 10 but will delay deactivation of existing accounts until January 9.
Meta has already begun removing accounts of teens under 16 from Instagram, Facebook, and Threads, allowing users to download their content for future reactivation. Reddit will suspend under-16 accounts and block new ones from being created. X, however, has not clarified its compliance strategy and has criticized the legislation as an infringement on free speech.
Exemptions and Criticisms
Notably absent from the ban are platforms like Discord, GitHub, Google Classroom, LEGO Play, Messenger, Pinterest, Roblox, Steam, WhatsApp, and YouTube Kids. The exclusion of Roblox, in particular, has raised eyebrows due to allegations of adult predators targeting children within its games. eSafety Commissioner Julie Inman-Grant noted ongoing discussions with Roblox, which has agreed to introduce new age controls this month.
Age Verification and Public Concerns
The law mandates active age verification, which has sparked privacy concerns among adult users. However, the Age Assurance Technology Trial conducted earlier this year convinced the government that age checks could be implemented without compromising privacy. Platforms are employing methods such as live video selfies, email verification, and official documents to verify ages. Yoti, an age verification company, reports that most users prefer video selfies, which estimate age from facial data points.
Children’s Reactions and Alternative Platforms
In response to the ban, some children are seeking alternative platforms not covered by the legislation. Yope, a photo-sharing app, has gained 100,000 new Australian users, while Lemon8, a TikTok-like platform, has been promoted as a backup option. Both platforms are under scrutiny by the eSafety Commissioner; Lemon8 has agreed to comply with the new laws, while Yope claims exemption because it does not allow messaging with strangers.
The evolving nature of the banned sites list has drawn criticism, with some arguing the government is playing a game of “whack-a-mole” it cannot win. Youth counselors express concern that children may migrate to unregulated digital spaces, potentially exposing them to greater risks.
Future Implications and Monitoring
One of the ban’s objectives is to encourage children to engage more with the real world. Officials plan to measure outcomes such as increased sleep, more in-person social interaction, reduced antidepressant use, and greater participation in outdoor activities. However, they will also monitor unintended consequences, such as potential shifts to darker corners of the web.
To assess the ban’s impact, the eSafety Commissioner will collaborate with six experts from Stanford University’s Social Media Lab. An independent Academic Advisory Group, comprising academics from the United States, the United Kingdom, and Australia, will review the findings. Stanford University has committed to publishing its approach, methods, and findings for public and policymaker scrutiny worldwide.
As Australia embarks on this ambitious social media regulation experiment, the world watches closely to see if the benefits outweigh the challenges and whether other nations will follow suit.