TikTok, a widely used social media platform with over a billion active users worldwide, has become a key source of news, particularly for younger audiences. This growing influence has raised concerns about potential political biases in its recommendation algorithm, especially during election cycles. A recent preprint study examined this issue by analyzing how TikTok’s algorithm recommends political content ahead of the 2024 presidential election. Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.
TikTok has become a major force among social media platforms, boasting over a billion monthly active users worldwide and 170 million in the United States. It has also emerged as a significant source of news, particularly for younger demographics. This has raised concerns about the platform’s potential to shape political narratives and influence elections.
Despite these concerns, there has been limited research examining TikTok's recommendation algorithm for political bias, especially compared with the extensive body of work on other social media platforms like Facebook, Instagram, YouTube, X (formerly Twitter), and Reddit.
“We previously conducted experiments auditing YouTube’s recommendation algorithms. That study, published in PNAS Nexus, demonstrated that the algorithm exhibited a left-leaning bias in the United States,” said Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.
“Given TikTok’s widespread popularity—particularly among younger demographics—we sought to replicate this study on TikTok during the 2024 U.S. presidential elections. Another motivation was that concerns over TikTok’s Chinese ownership had led many U.S. politicians to advocate for banning the platform, citing fears that its recommendation algorithm could be used to promote a political agenda.”
To examine how TikTok’s algorithm recommends political content, the researchers designed an extensive audit experiment. They created 323 “sock puppet” accounts—fake accounts programmed to simulate user behavior—across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group).
The experiment consisted of two stages: a conditioning stage and a recommendation stage. In the conditioning stage, the Democratic accounts watched up to 400 Democratic-aligned videos, and the Republican accounts watched up to 400 Republican-aligned videos. Neutral accounts skipped this stage. This was done to “teach” TikTok’s algorithm the political preferences of each account.
In the recommendation stage, all accounts watched videos on TikTok’s “For You” page, which is the platform’s main feed of recommended content. The accounts watched 10 videos, followed by a one-hour pause, and repeated this process for six days. Each experimental run lasted one week. The researchers collected data on approximately 394,000 videos viewed by these accounts between April 30th and November 11th, 2024.
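For readers who want a concrete picture of the protocol, the sketch below outlines the two stages in Python. It is an illustrative reconstruction, not the authors’ code: the helper functions `watch_conditioning_video` and `watch_for_you_video` are hypothetical stand-ins for the browser automation the study actually used, and the one-cycle-per-hour pacing is an assumption inferred from the description above.

```python
import time
from dataclasses import dataclass
from itertools import product
from typing import List

@dataclass
class SockPuppet:
    account_id: int
    state: str    # "Texas", "New York", or "Georgia"
    leaning: str  # "Democratic", "Republican", or "Neutral"

# Hypothetical stand-ins for the study's automation layer; they only illustrate the flow.
def watch_conditioning_video(account: SockPuppet, leaning: str) -> None:
    """Placeholder: play one video aligned with the account's assigned leaning."""
    pass

def watch_for_you_video(account: SockPuppet) -> dict:
    """Placeholder: play the next 'For You' video and return its metadata."""
    return {"account": account.account_id, "state": account.state, "leaning": account.leaning}

def run_experiment(accounts: List[SockPuppet],
                   conditioning_videos: int = 400,
                   videos_per_cycle: int = 10,
                   pause_seconds: int = 3600,
                   days: int = 6) -> List[dict]:
    observations = []
    for account in accounts:
        # Conditioning stage: "teach" the algorithm the account's politics (skipped for neutral accounts).
        if account.leaning != "Neutral":
            for _ in range(conditioning_videos):
                watch_conditioning_video(account, account.leaning)
        # Recommendation stage: cycles of 10 "For You" videos separated by one-hour pauses.
        cycles_per_day = 24  # illustrative assumption: one cycle per hour
        for _ in range(days * cycles_per_day):
            for _ in range(videos_per_cycle):
                observations.append(watch_for_you_video(account))
            time.sleep(pause_seconds)
    return observations

# Example: one account per state-and-leaning combination (the study ran 323 in total).
accounts = [SockPuppet(i, s, l) for i, (s, l) in
            enumerate(product(["Texas", "New York", "Georgia"],
                              ["Democratic", "Republican", "Neutral"]))]
```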
To analyze the political content of the recommended videos, the researchers downloaded the English transcripts of videos when available (22.8% of unique videos). They then used a system involving three large language models—GPT-4o, Gemini-Pro, and GPT-4—to classify each video. The language models answered questions about whether the video was political, whether it concerned the 2024 U.S. elections or major political figures, and what the ideological stance of the video was (pro-Democratic, anti-Democratic, pro-Republican, anti-Republican, or neutral). The majority vote of the three language models was used as the final classification for each question.
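The majority-vote step is straightforward to express in code. The sketch below is a hypothetical illustration, assuming a generic `query_model(model_name, question, transcript)` wrapper around the three model APIs; the exact prompts and answer parsing used in the study are not reproduced here.

```python
from collections import Counter
from typing import Dict, List, Optional

STANCES = ["pro-Democratic", "anti-Democratic", "pro-Republican", "anti-Republican", "neutral"]

def majority_vote(labels: List[str]) -> Optional[str]:
    """Return the label chosen by at least two of the three models, else None."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= 2 else None

def classify_transcript(transcript: str, query_model) -> Dict[str, Optional[str]]:
    """Ask each model the study's three questions and keep the majority answer per question."""
    models = ["gpt-4o", "gemini-pro", "gpt-4"]
    questions = {
        "is_political": "Is this video political? Answer yes or no.",
        "is_election_related": "Does it concern the 2024 U.S. elections or major political figures?",
        "stance": "What is the video's ideological stance? Choose one of: " + ", ".join(STANCES) + ".",
    }
    return {key: majority_vote([query_model(m, q, transcript) for m in models])
            for key, q in questions.items()}
```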
The analysis uncovered significant asymmetries in content distribution on TikTok. Republican-seeded accounts received approximately 11.8% more party-aligned recommendations than Democratic-seeded accounts, while Democratic-seeded accounts were exposed to approximately 7.5% more opposite-party recommendations on average. These differences were consistent across all three states and could not be explained by differences in engagement metrics like likes, views, shares, comments, or followers.
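To make the reported asymmetries concrete, the following sketch shows one way aligned- and opposite-party shares could be tallied from the classified recommendations. The mapping of stances to “aligned” and “mismatch” (for example, treating anti-Republican content as aligned for Democratic-conditioned accounts) is an assumption made for illustration, not necessarily the paper’s exact definition.

```python
from collections import Counter

ALIGNED = {("Democratic", "pro-Democratic"), ("Democratic", "anti-Republican"),
           ("Republican", "pro-Republican"), ("Republican", "anti-Democratic")}
MISMATCH = {("Democratic", "pro-Republican"), ("Democratic", "anti-Democratic"),
            ("Republican", "pro-Democratic"), ("Republican", "anti-Republican")}

def recommendation_rates(records):
    """records: iterable of (account_leaning, video_stance) pairs for partisan videos."""
    counts, totals = Counter(), Counter()
    for leaning, stance in records:
        totals[leaning] += 1
        if (leaning, stance) in ALIGNED:
            counts[leaning, "aligned"] += 1
        elif (leaning, stance) in MISMATCH:
            counts[leaning, "mismatch"] += 1
    return {leaning: {"aligned": counts[leaning, "aligned"] / totals[leaning],
                      "mismatch": counts[leaning, "mismatch"] / totals[leaning]}
            for leaning in totals}

# Toy data only; the study compared these shares between Republican- and
# Democratic-conditioned accounts (e.g., ~11.8% more aligned content for Republican accounts).
toy = [("Republican", "pro-Republican"), ("Republican", "anti-Democratic"),
       ("Democratic", "pro-Democratic"), ("Democratic", "pro-Republican")]
print(recommendation_rates(toy))
```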
“We found that TikTok’s recommendation algorithm was not neutral during the 2024 U.S. presidential elections,” explained Talal Rahwan, an associate professor of computer science at New York University Abu Dhabi. “Across all three states analyzed in our study, the platform consistently promoted more Republican-leaning content. We showed that this bias cannot be explained by factors such as video popularity and engagement metrics—key variables that typically influence recommendation algorithms.”
Further analysis showed that the bias was primarily driven by negative partisanship content, meaning content that criticizes the opposing party rather than promoting one’s own party. Both Democratic- and Republican-conditioned accounts were recommended more negative partisan content, but this was more pronounced for Republican accounts. Negative-partisanship videos were 1.78 times more likely to be recommended as an ideological mismatch relative to positive-partisanship ones.
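The 1.78 figure is a ratio of mismatch rates. Below is a minimal sketch of that calculation, assuming each classified video carries a partisanship tone and a flag indicating whether it was recommended to an opposite-leaning account; the field names are illustrative, not the paper’s.

```python
def mismatch_rate_ratio(videos):
    """videos: iterable of dicts with 'tone' ('negative' or 'positive' partisanship)
    and 'mismatched' (True if the video reached an opposite-leaning account)."""
    rates = {}
    for tone in ("negative", "positive"):
        subset = [v for v in videos if v["tone"] == tone]
        rates[tone] = sum(v["mismatched"] for v in subset) / len(subset)
    # A ratio of ~1.78 would mean negative-partisanship videos cross ideological
    # lines 1.78 times as often as positive-partisanship ones.
    return rates["negative"] / rates["positive"]
```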
“We observed a bias toward negative partisanship in TikTok’s recommendations,” Zaki noted. “Regardless of the political party—Democratic or Republican—the algorithm prioritized content that criticized the opposing party over content that promoted one’s own party.”
The researchers also examined the top Democratic and Republican channels on TikTok by follower count. Republican channels had a significantly higher mismatch proportion, meaning their videos were more likely to be recommended to accounts with an opposite political leaning. Notably, videos from Donald Trump’s official TikTok channel were recommended to Democratic-conditioned accounts nearly 27% of the time, while Kamala Harris’s videos were recommended to Republican-conditioned accounts only 15.3% of the time.
Finally, the researchers analyzed the topics covered in partisan videos. Topics stereotypically associated with the Democratic party, like climate change and abortion, were more frequently covered by Democratic-aligned videos. Topics like immigration, foreign policy, and the Ukraine war were more frequently covered by Republican-aligned videos. Videos on immigration, crime, the Gaza conflict, and foreign policy were most likely to be recommended as ideological mismatches to Democratic-conditioned accounts.
To build on this work, future research could explore how TikTok’s algorithm behaves across different election cycles, investigate how misinformation is distributed within partisan content, and compare TikTok’s political content recommendations with those of other major platforms. Additionally, studies incorporating real user data alongside automated experiments could provide a more comprehensive understanding of how individuals experience political content on TikTok. Given the platform’s growing role in shaping public discourse, continued scrutiny of its recommendation system will be essential for assessing its impact on political knowledge and voter decision-making.
“We want to address fundamental questions about the neutrality of social media platforms,” Rahwan said.
The study, “TikTok’s recommendations skewed towards Republican content during the 2024 U.S. presidential race,” was authored by Hazem Ibrahim, HyunSeok Daniel Jang, Nouar Aldahoul, Aaron R. Kaufman, Talal Rahwan, and Yasir Zaki.