Kylie Moore-Gilbert, a political scientist and writer, recently shared a harrowing personal account of how social media algorithms can trap users in a cycle of distressing content. Her experience began during the massacre in Iran on January 8, 2026, when the regime shut down internet access even as a chaotic digital record of violence and despair spread across social media.
Bodies stacked in refrigerated vans, the crack of gunfire, and the cries of fleeing youths filled Moore-Gilbert’s social media feeds. These disturbing images were interspersed with hopeful protests, as families united in the streets, chanting for peace and change. However, the hopeful narrative was overshadowed by the grim reality of the regime’s brutal response.
The Algorithmic Bubble
Moore-Gilbert’s feeds on Instagram and X became inundated with Iran-related content, a shift she attributes to the algorithm’s insatiable quest for engagement. This transformation was not just a reflection of her interests but a demonstration of how algorithms can amplify certain topics to the exclusion of others.
Initially, she felt a duty to bear witness and amplify the atrocities in Iran. However, the relentless exposure to blood-soaked streets and haunting images began to invade her dreams, leaving a lasting impact on her mental well-being. The algorithm had funneled her into a digital echo chamber where nothing existed beyond the horrors of Iran.
The Power and Perils of Social Media
The role of social media in shaping public perception is not a new phenomenon. During the Arab Spring of 2011, platforms like Facebook and Twitter were hailed as tools of democratization, helping young activists organize and challenge authoritarian regimes. However, the optimism of those early days has given way to concerns about the darker side of social media.
According to Pulitzer Prize-winning journalist Anne Applebaum, in her book “Autocracy, Inc.,” the internet is no longer a marketplace of ideas. Instead, it has become a battleground where foreign entities and governments manipulate algorithms to foster radicalism and outrage. Countries like Iran, China, and Russia have been accused of using these tactics to influence public opinion.
As Applebaum writes: “Malign actors have an interest in fostering echo chambers of radicalism and outrage, bombarding users with content that over time comes to saturate our brains in such a way that a single perspective becomes the only perspective.”
The Human Cost of Algorithmic Manipulation
Moore-Gilbert’s experience highlights the human cost of algorithmic manipulation. While she does not believe her feed was curated by an influence campaign, she acknowledges the presence of influencers and commentators who cast doubt on the events in Iran. These individuals, likely ensconced in their own algorithmic bubbles, may view her as being duped by propaganda.
Social media has become a powerful tool for those seeking to influence public opinion, which makes it all the more important to recognize when one is trapped in an ideological echo chamber. Caring deeply about a cause does not prevent one from falling into this trap; in fact, it might make it more likely.
Looking Forward
Iran continues to grapple with the aftermath of the massacre, and it is crucial to amplify the voices of those who have witnessed these horrors. However, on a personal level, it is equally important to recognize when to step back and look away.
Kylie Moore-Gilbert’s experience serves as a cautionary tale about the power of algorithms to shape our perceptions and the need for vigilance in navigating the digital landscape. As social media continues to evolve, the challenge will be to harness its potential for good while mitigating its capacity for harm.
Kylie Moore-Gilbert is an academic in Middle Eastern political science at Macquarie University and the author of “The Uncaged Sky: My 804 Days in an Iranian Prison.” She is a regular columnist for The Age and The Sydney Morning Herald.