18 March, 2026
Study reveals social media algorithms can shift political leanings

A new study published in Nature has revealed that the algorithm used by the social media platform X can shift users’ political opinions towards a more conservative stance. Conducted by Germain Gauthier from Bocconi University in Italy, this rare real-world randomized experiment highlights the significant influence social media algorithms can exert over political attitudes.

The study involved 4,965 active US-based X users who were randomly assigned to one of two groups. The first group used X’s default “For You” feed, which employs an algorithm to select and rank posts based on user engagement likelihood, including posts from accounts they do not follow. The second group used a chronological feed, displaying posts from followed accounts in the order they were posted. The experiment spanned seven weeks in 2023.

Algorithmic Influence on Political Views

Findings from the study indicate that users who switched from the chronological feed to the “For You” feed were 4.7 percentage points more likely to prioritize policy issues favored by US Republicans, such as crime, inflation, and immigration. These users also showed a higher tendency to view the criminal investigation into US President Donald Trump as unacceptable.

Moreover, the study found a shift in users’ attitudes towards the Russia-Ukraine conflict. Users became 7.4 percentage points less likely to view Ukrainian President Volodymyr Zelenskyy positively and scored higher on a pro-Russian attitude index.

The algorithm increased the share of right-leaning content by 2.9 percentage points overall, and 2.5 points among political posts, compared to the chronological feed.

Long-Term Effects and Broader Implications

One of the most concerning findings is the algorithm’s longer-term effects. The study showed that the algorithm nudged users towards following more right-leaning accounts, and these new following patterns persisted even after users switched back to the chronological feed. This suggests that the algorithm’s impact extends beyond immediate effects, reshaping users’ social media landscapes in lasting ways.

This study aligns with previous research, including a 2022 study that found X’s algorithmic systems amplified content from the mainstream political right more than the left in six out of seven countries. An experimental study from 2025 further demonstrated that re-ranking X feeds to reduce exposure to antidemocratic content shifted users’ feelings towards political opponents significantly.

Algorithmic Bias and Social Media Infrastructure

My own research, conducted with colleague Mark Andrejevic, adds to this picture of algorithmic bias. We analyzed engagement data from prominent political accounts during the final stages of the 2024 US election and found a spike in engagement with Elon Musk’s account following his endorsement of Trump. This surge in visibility for right-leaning accounts continued for the remainder of the study period.

The implications of these findings are profound, given X’s global user base of over 400 million. As social media platforms become embedded as infrastructure, they shape society at its foundations, often without users’ conscious awareness. The design and governance of these platforms have real consequences, similar to how Robert Moses’s overpass bridges in New York were designed to exclude certain populations.

The Need for Transparency and Accountability

The study underscores the need for transparency and accountability in social media algorithms. Governments worldwide must mandate genuine transparency over these systems; in Australia, for example, the eSafety Commissioner already has powers to enforce “algorithmic transparency and accountability.” Just as governments intervene when physical infrastructure becomes harmful, similar action is necessary for social media platforms to protect users.

The age of taking platform companies at their word about their algorithms’ design and effects must end. The findings of this study highlight the urgent need for regulatory measures to ensure that social media platforms do not shape political discourse and public opinion in biased ways.