Mark Zuckerberg, CEO of Meta Platforms, faced rigorous questioning in a Los Angeles Superior Court over the company’s handling of child safety on Instagram. The inquiry is part of a high-profile trial addressing social media addiction, where Zuckerberg acknowledged the “very difficult” challenge of enforcing age restrictions on the platform.
During his testimony, Zuckerberg explained that Meta has implemented “proactive tools” to detect and remove accounts of users under the age of 13. However, he conceded that accurately verifying ages remains difficult. “There are a set of people – potentially a meaningful number of people – that lie about their age,” Zuckerberg told the jury, emphasizing the complexity of the issue.
Trial Context and Implications
The trial, which began on February 9, centers on Kaley G.M., a 20-year-old woman who attributes her mental health struggles to prolonged exposure to Instagram and YouTube. Zuckerberg is the second Meta executive to testify, following Instagram head Adam Mosseri. Both executives have been scrutinized over Meta’s youth engagement strategies and safety protocols.
Kaley, identified in court documents as K.G.M., was present for part of Zuckerberg’s testimony. Her lawyer, Mark Lanier, argued that Kaley had an Instagram account at just nine years old, questioning the adequacy of Meta’s age-verification measures. Lanier challenged Zuckerberg on whether it was reasonable to expect a child to understand the platform’s terms and conditions.
Debates Over Responsibility and Legislation
Meta has long contended that age verification should occur before an app is downloaded, placing the onus on Apple and Google’s app stores. This stance has led to lobbying efforts across various states to influence potential legislation that could determine responsibility for user protection.
The trial’s outcome could set a precedent for numerous lawsuits against social media giants like Meta, Google, TikTok, and Snap, with potential damages reaching billions if juries rule against them. TikTok and Snap have already settled confidentially with the plaintiff’s legal team.
Zuckerberg’s Public Persona and Media Training
Lanier also interrogated Zuckerberg about his media training, referencing internal feedback suggesting he should appear more “authentic” and less “robotic.” Zuckerberg denied being coached, stating that feedback was merely advisory. “I think I’m actually well known to be sort of bad at this,” he remarked, acknowledging his awkward public speaking history.
Despite these challenges, Zuckerberg remained consistent in his testimony, focusing on Meta’s efforts to create a valuable platform while refuting Lanier’s interpretations of his statements.
Algorithm Adjustments and Youth Engagement
Lanier highlighted a 2015 memo in which Zuckerberg aimed to “reverse the teen trend” and increase user engagement by 12%. This focus on youth engagement led Meta to adjust its algorithms, mimicking TikTok’s strategy of surfacing content from beyond users’ immediate networks.
Profit versus safety was a recurring theme, with Lanier questioning the decision to lift a ban on photo filters simulating cosmetic surgery effects. Internal emails revealed that despite staff concerns, both Mosseri and Zuckerberg supported the change.
Historical Criticism and Recent Developments
Meta has long faced criticism over its handling of young users’ safety. Internal documents released in 2021 indicated awareness of Instagram’s potential negative impact on teens, particularly girls. Further scrutiny arose during a Federal Trade Commission antitrust trial, which revealed that Instagram’s algorithms had inadvertently connected child “groomers” with minors.
In response, Meta has strengthened privacy protections for teens, introducing “teen accounts” in 2024 that restrict content and interactions for users under 18. More recent changes default content to a “PG-13” standard and limit younger teens’ live-streaming capabilities.
Zuckerberg has previously apologized during congressional testimony over the exploitation of children on social media, underscoring the persistence of these concerns.
The trial, expected to conclude in March, will be pivotal in shaping future regulations and corporate responsibilities regarding youth safety on social media platforms.