X's Algorithm Rapidly Shifts Users' Political Views to the Right, Study Finds
A study published this month in the scientific journal Nature reveals that X, the platform formerly known as Twitter, can measurably shift users' political views to the right in just a few weeks. The research provides some of the first experimental evidence from a real-world randomized trial on a major social media platform, demonstrating the influence algorithmic feeds can exert on political attitudes.
Experimental Design and Key Findings
The study, led by Germain Gauthier, an assistant professor in the Department of Social and Political Sciences at Bocconi University in Italy, examined 4,965 active U.S.-based X users over seven weeks in 2023. Researchers randomly assigned participants to one of two groups: one using X's default "For You" algorithmic timeline, which includes recommended posts alongside content from followed accounts, and another using a chronological feed showing only posts from followed accounts in the order they were posted.
The results were striking. Users who switched from chronological to algorithmic feeds showed measurable shifts toward right-wing political opinions. Those exposed to the "For You" algorithm were 4.7% more likely to prioritize Republican policy issues such as immigration, inflation, and crime. They also became more likely to view criminal investigations into former President Donald Trump as unacceptable.
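The core comparison behind these numbers is the standard logic of a randomized trial: because participants were randomly assigned to a feed condition, the difference in outcome rates between the two groups estimates the algorithm's causal effect. The sketch below illustrates that calculation on simulated data; the group sizes and response rates are invented for illustration and are not the study's figures.

```python
# Illustrative sketch of a randomized trial's core comparison: the
# treatment effect on a binary outcome is the difference in outcome
# rates between the two randomly assigned groups. All numbers here
# are hypothetical, not taken from the Nature study.
import math
import random

random.seed(0)

def simulate_group(n, p):
    """Simulate n binary survey responses (1 = prioritizes the issue)."""
    return [1 if random.random() < p else 0 for _ in range(n)]

# Assumed group sizes and response rates, for illustration only.
algorithmic = simulate_group(2500, 0.35)    # "For You" feed group
chronological = simulate_group(2465, 0.30)  # chronological feed group

def proportion(xs):
    return sum(xs) / len(xs)

p_t, p_c = proportion(algorithmic), proportion(chronological)
effect = p_t - p_c  # estimated average treatment effect

# Standard error for a difference in two independent proportions.
se = math.sqrt(p_t * (1 - p_t) / len(algorithmic)
               + p_c * (1 - p_c) / len(chronological))

print(f"effect = {effect:+.3f} (± {1.96 * se:.3f} at 95% confidence)")
```

Random assignment is what licenses the causal reading: absent it, a raw difference between feed users would be confounded by who chooses which feed.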
Persistent Effects and International Implications
Perhaps most concerning was the persistence of these changes. The new follow patterns users developed persisted even after they switched back to chronological feeds, indicating that toggling off algorithmic recommendations does not reset users to a neutral state. The algorithmic group also showed increased pro-Kremlin views regarding the war in Ukraine, being 7.4% less likely to view Ukrainian President Volodymyr Zelenskyy positively and scoring higher on a pro-Russian attitude index overall.
"Deeply held concepts like partisan identity are unlikely to move over such short timeframes," Gauthier explained, "but what's striking is that opinions on current politics did shift in just a few weeks. That naturally raises the question of what years of exposure might do."
How the Algorithm Drives Political Shifts
The researchers analyzed how X's algorithm amplifies certain content to drive these political shifts. Conservative-leaning posts were about 20% more likely to appear in algorithmic feeds, while liberal posts were only 3.1% more likely. The "For You" algorithm significantly demoted posts from traditional news organizations while promoting content from political activists.
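The amplification figures above are relative exposure rates: how much more (or less) often a category of content surfaces in the algorithmic feed than it would in a purely chronological one. A minimal sketch of that comparison, using invented impression counts rather than the study's data:

```python
# Hypothetical sketch of the amplification comparison: for each content
# category, compare its share of impressions in algorithmic feeds with
# its share in chronological feeds. Counts are illustrative only.

# (category -> impressions) per feed type, assumed for illustration.
algo_feed = {"conservative": 1200, "liberal": 1031, "news_org": 400}
chrono_feed = {"conservative": 1000, "liberal": 1000, "news_org": 800}

def amplification(category):
    """Relative over- or under-representation in the algorithmic feed.

    +0.20 means the category is 20% more likely to appear there;
    a negative value means the algorithm demotes it.
    """
    algo_share = algo_feed[category] / sum(algo_feed.values())
    chrono_share = chrono_feed[category] / sum(chrono_feed.values())
    return algo_share / chrono_share - 1

for cat in algo_feed:
    print(f"{cat}: {amplification(cat):+.1%}")
```

On these invented counts, conservative posts come out over-represented and news-organization posts under-represented, mirroring the direction (though not the magnitudes) reported in the study.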
"While our study did not address this," Gauthier noted, "one could imagine that removing a large share of traditional news from users' feeds would affect how politically informed they are."
Platform Ownership and Political Context
The findings come amid increased scrutiny of X since Elon Musk's 2022 acquisition of the platform. Musk, an ally of Donald Trump who has invested over $200 million in pro-Trump political action committees, has made no secret of his conservative political views. He has publicly endorsed right-wing parties internationally, including Nigel Farage's Reform UK party in Britain and Germany's Alternative für Deutschland.
Analysis of Musk's own X activity in January revealed that on 26 out of 31 days, he posted content about the white race being under threat, alluded to eugenics, or promoted anti-immigrant conspiracy theories—content similar to what the "For You" algorithm amplifies.
Broader Implications for Social Media
While the study focused specifically on X, Gauthier emphasized that the broader research goal was to determine whether feed algorithms in general can influence political opinions. The sample in this study was 78% white, 52% male, and relatively well-educated, with 58% having completed at least four years of university. Among participants, 46% identified as Democrats and 21% as Republicans.
"Whether you are a conservative or a liberal, these findings should interest you," Gauthier concluded. "They raise deeper questions about how algorithms shape our consumption of political information. It's time people realize that these algorithms shape our societies, and for them to reflect on what kind of influence they are comfortable with, and how we should think about accountability."
The study represents an early but significant step toward quantitatively understanding the societal effects of social media algorithms, with implications that extend far beyond any single platform.
