An Australian mother, Amanda*, recently faced a setback when Snapchat refused to delete her 14-year-old son’s account, citing his self-declared age of 25 as the reason. This incident highlights the challenges parents face in enforcing social media age restrictions, despite government efforts to regulate underage access.
Parents have been encouraged to report their children’s accounts to social media platforms to ensure compliance with age restrictions. However, as Guardian Australia reports, these platforms, including Snapchat, often do not take action. Amanda, a resident of Tasmania, reported her son’s account, expecting that her complaint as a parent would lead to its removal. Instead, Snapchat acknowledged her report but did not act, citing the declared age of 25 and a lack of behavioral signals indicating the account holder was under 16.
Challenges in Age Verification
The incident with Amanda’s son underscores a broader issue with age verification on social media platforms. A spokesperson for Snapchat explained that the company does not examine message contents, limiting the signals available to verify a user’s age. The company argued that locking accounts based solely on parental reports could expose users aged 16 and older to false claims.
Meanwhile, the eSafety Commissioner in Australia has been receiving similar complaints from other parents. The commission is actively engaging with companies like Snapchat to ensure compliance with age-related regulations. According to a spokesperson, eSafety’s guidelines recommend accessible pathways for reporting underage accounts and suggest a “waterfall” approach to age verification throughout the user journey.
Government and Regulatory Responses
In January, the Australian government highlighted the success of a social media ban that resulted in the removal of nearly five million accounts. eSafety Commissioner Julie Inman Grant had advised parents to report their teenagers’ accounts. However, a recent News Corp survey revealed that 70% of teenagers aged 10 to 16 remained on social media despite the ban.
Snapchat has acknowledged the technical challenges of preventing young people from using its app under the new regulations. The company suggests that better age verification could be implemented at the operating system or app store level. Despite these challenges, Snapchat says it continues to assess reports of underage users and takes action when non-compliance is confirmed.
Implications and Future Directions
Following inquiries by Guardian Australia, Snapchat contacted Amanda to request ID documentation for her son, leading to the account’s shutdown. Amanda expressed frustration, noting that while the government assured parents the ban would simplify enforcement, the burden seems to have shifted back to families.
She questioned whether platforms should be required to implement technologies like facial age estimation or ongoing age verification, rather than relying on parental reports. This sentiment reflects a broader concern about the effectiveness of current age restriction measures and the responsibilities of both parents and platforms.
The eSafety Commissioner has announced a study of more than 4,000 teenagers and parents, to be conducted over the coming years, to evaluate the ban’s success. The study aims to provide insights into the effectiveness of current strategies and inform future policy decisions.
As the debate over age verification on social media continues, the balance between user privacy, parental control, and regulatory enforcement remains a complex issue. The outcomes of ongoing studies and regulatory discussions will likely shape the future landscape of social media access for young users in Australia and beyond.
*Names have been changed