The European Union is closely monitoring Australia’s groundbreaking move to ban social media access for users under 16, a law set to take effect on December 10. This initiative requires major platforms like TikTok, Instagram, and Snapchat to implement measures preventing young users from accessing their services, or face fines of up to A$49.5 million. As it weighs similar restrictions, Europe is also exploring a late-night “curfew” for teenagers, limits on addictive features, and the development of an EU-wide age verification app.
Australia’s pioneering legislation has sparked interest across Europe, with several countries already implementing or proposing their own age-related social media restrictions. The European Parliament recently voted in favor of a non-binding resolution to bar under-16s from social media without parental consent and to ban users under 13 outright. This decision aligns with the sentiments expressed by European Commission President Ursula von der Leyen, who lauded Australia’s bold approach at a United Nations event in September.
Current Restrictions Across Europe
Denmark has committed to prohibiting social media access for users under 15 without parental consent. In November, the European Parliament’s vote reflected a growing consensus to protect minors from addictive features such as infinite scrolling and automatic video playback, which are widely criticized for encouraging excessive use.
Italy already mandates parental consent for users under 14, and both Spain and Norway are moving towards similar regulations. France, having introduced laws in June 2023 requiring age verification and parental consent for users under 15, is poised to extend these measures. French President Emmanuel Macron has even threatened a national ban on under-15s using social media if EU-level progress stalls. Additionally, French legislators have proposed a “digital curfew” for teenagers, and prosecutors have launched an investigation into TikTok over allegations of exposing children to harmful content.
“We all agree that young people should reach a certain age before they smoke, drink, or access adult content. The same can be said for social media.” — Ursula von der Leyen
Developing an EU-Wide Age Verification App
The European Commission is collaborating with Denmark, France, Greece, Italy, and Spain to create an age verification app. This initiative aims to enable users to verify their age without sharing sensitive information with social media platforms, a contrast to Australia’s approach, which places age-assurance responsibilities on tech companies.
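To make the contrast concrete, the sketch below illustrates the general privacy-preserving pattern such an app points toward: a trusted verification service signs a minimal “over 16” claim, and the platform checks only the signature, never receiving identity documents or a date of birth. This is a hypothetical illustration, not the EU app’s actual design; the function names, token format, and use of the third-party Python `cryptography` package are all assumptions for the example.

```python
# Hypothetical sketch of a privacy-preserving age attestation.
# A trusted verification service signs a minimal claim ("over_16" plus an
# expiry); the platform verifies the signature with the service's public key
# and never sees the user's identity documents or date of birth.
# Requires the third-party "cryptography" package (pip install cryptography).

import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Verification service side (holds the private key) ---------------------
service_key = Ed25519PrivateKey.generate()
service_public_key = service_key.public_key()  # published to platforms


def issue_age_token(over_16: bool, ttl_seconds: int = 3600) -> bytes:
    """Issue a signed token asserting only an over-16 flag and an expiry."""
    claim = json.dumps(
        {"over_16": over_16, "exp": int(time.time()) + ttl_seconds}
    ).encode()
    signature = service_key.sign(claim)
    return claim + b"." + signature.hex().encode()


# --- Platform side (holds only the public key) ------------------------------
def platform_accepts(token: bytes) -> bool:
    """Check signature and expiry; the platform never learns who the user is."""
    claim, _, sig_hex = token.rpartition(b".")
    try:
        service_public_key.verify(bytes.fromhex(sig_hex.decode()), claim)
    except InvalidSignature:
        return False
    payload = json.loads(claim)
    return payload["over_16"] and payload["exp"] > time.time()


if __name__ == "__main__":
    token = issue_age_token(over_16=True)
    print(platform_accepts(token))  # True: age confirmed, identity never shared
```

A production system would add safeguards this sketch omits, such as standardized credential wallets and anti-replay protections, but the point stands: the platform only needs proof of an age threshold, not the personal data used to establish it.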
Professor Lisa Given of RMIT University, an expert in age assurance technology, emphasizes the importance of government oversight in ensuring that companies comply with privacy laws, and highlights the privacy risks in how data is stored and shared during the age verification process.
“Having government oversight around the interventions that are going to be used to ensure that companies are complying with privacy legislation is really critical.” — Professor Lisa Given
Australia’s current system has faced criticism, particularly over Snapchat’s practice of verifying ages through selfies or government IDs submitted to third-party providers. The Australian government, however, maintains that its laws include stringent protections for personal data, with severe penalties for non-compliance.
Lessons Australia Could Learn from Europe
While Australia leads with its under-16 social media ban, it could benefit from observing Europe’s strategies. The Australian eSafety Commissioner’s recent collaboration with the UK and the European Commission to share knowledge on age assurance technologies marks a step towards potentially adopting similar measures. Australia’s laws are scheduled for independent review in two years, offering an opportunity to incorporate insights from abroad.
Europe’s focus on disabling addictive features for young users might also influence Australian policies. Professor Given notes the significance of algorithms and reward structures in fostering prolonged engagement, suggesting that addressing these elements is crucial.
“It’s not just the content on the platform or the platform itself. It is the algorithm.” — Professor Lisa Given
Moreover, Europe’s approach of allowing parental consent for 13 to 15-year-olds to access social media could resonate with Australian parents seeking more control over their children’s online activities. Professor Given notes that platforms such as YouTube can still be viewed without logging in, where account-level safety controls do not apply, underscoring the need for parental involvement in children’s digital decision-making.
As the EU and Australia navigate these complex issues, the global landscape of social media regulation continues to evolve. The outcomes of these legislative efforts will likely influence future policies worldwide, shaping the digital experiences of young users for years to come.