A new study published in Frontiers in Social Psychology suggests that the popular social media platform TikTok may be manipulated to conceal content critical of the Chinese government while amplifying narratives aligned with the Chinese Communist Party (CCP). The research, which involved three separate studies, found that TikTok users were exposed to significantly less content critical of China compared to users of other platforms like Instagram and YouTube. Additionally, the study found that heavier TikTok users tended to have more positive views of China’s human rights record and were more likely to consider China a desirable travel destination.
Authoritarian regimes such as those in Russia and Iran have increasingly used social media to manipulate information and advance their strategic interests. China, in particular, has developed sophisticated strategies to control narratives and influence public opinion through digital platforms. This phenomenon, often referred to as “networked authoritarianism,” involves state actors using subtle tactics like algorithmic manipulation and strategic content curation to shape narratives on popular social media platforms.
These tactics are particularly effective because, unlike overt censorship, they are largely invisible to users and therefore difficult to detect. Given that TikTok is owned by the Chinese company ByteDance, concerns have been raised that it may be susceptible to influence from the CCP, either directly or through algorithmic adjustments. The study aimed to assess whether TikTok’s content curation patterns differed from those of other social media platforms and, if so, whether these differences aligned with Chinese government interests.
“China is a communist country. The CCP has a vast propaganda apparatus and a long record of enforcing conformity to its preferences,” said study author Lee Jussim, a distinguished professor of psychology at Rutgers University and author of The Poisoning of the American Mind. “Many before us have raised concerns about whether the CCP exerts undue influence on TikTok to further its goals. We decided to empirically assess whether this was the case regarding issues about which the CCP was likely to be sensitive.”
To address this, the researchers conducted three studies.
In Study 1, Jussim and his colleagues employed a user journey methodology to simulate the experience of new users on TikTok, Instagram, and YouTube, specifically focusing on content related to sensitive issues for the Chinese Communist Party (CCP). They created 24 new accounts on each platform, designated as belonging to 16-year-old users from the United States. The researchers then used these accounts to search for four keywords: “Uyghur,” “Xinjiang,” “Tibet,” and “Tiananmen.” For each search, they collected the first 300 videos served by the platform’s algorithm, recording data such as the video’s URL, upload date, and the time spent viewing it.
After collecting the video data, two independent analysts coded each video as “pro-CCP,” “anti-CCP,” “neutral,” or “irrelevant” based on a pre-defined coding system. Content explicitly critical of the Chinese government’s actions or policies was coded as “anti-CCP,” while content promoting positive narratives about China or supporting the Chinese government was coded as “pro-CCP.” Videos unrelated to politics or the search terms were classified as “irrelevant,” and those presenting factual information without explicit criticism or support were coded as “neutral.”
The researchers found significant differences in the type of content served on TikTok compared to Instagram and YouTube. Notably, TikTok searches yielded far less anti-CCP content than searches on the other platforms. For instance, only 2.5% of search results for “Uyghur” on TikTok were coded as anti-CCP, compared to 50% on Instagram and 54% on YouTube. Similarly, only 8.7% of “Tiananmen” search results on TikTok were anti-CCP, in contrast to 51% on Instagram and 58% on YouTube.
Additionally, TikTok searches produced a higher proportion of irrelevant content across most search terms compared to the other platforms, which aligned with the researchers’ “distraction hypothesis” – the idea that sensitive topics might be obscured by a flood of unrelated content.
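For readers who want a concrete sense of how such percentages are derived, the sketch below is a rough illustration only, not the authors’ code: it tallies the share of each coded category (pro-CCP, anti-CCP, neutral, irrelevant) per platform and search term from a list of coded video records. The field names ("platform", "query", "label") are assumptions made for this example.

```python
# Minimal sketch (assumed record fields): computing the share of each coded
# content category for every platform/search-term combination.
from collections import Counter, defaultdict

def category_shares(coded_videos):
    """coded_videos: iterable of dicts such as
    {"platform": "TikTok", "query": "Uyghur", "label": "anti-CCP"}."""
    counts = defaultdict(Counter)
    for video in coded_videos:
        counts[(video["platform"], video["query"])][video["label"]] += 1

    shares = {}
    for key, counter in counts.items():
        total = sum(counter.values())
        shares[key] = {label: n / total for label, n in counter.items()}
    return shares

# Example: share of anti-CCP results for "Uyghur" searches on a given platform.
# shares = category_shares(records)
# print(shares[("TikTok", "Uyghur")].get("anti-CCP", 0.0))
```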
Study 2 built on the findings of Study 1 by examining user engagement metrics, specifically the number of likes and comments, for pro-CCP and anti-CCP content across TikTok, Instagram, and YouTube. The researchers analyzed whether the prevalence of pro- and anti-CCP content aligned with user engagement data, which typically drives content amplification on social media platforms. They hypothesized that if TikTok’s algorithm was unbiased, the ratio of pro-CCP to anti-CCP content should be similar to the ratio of likes and comments for each type of content. Conversely, if TikTok was suppressing anti-CCP content, the ratio of pro-CCP to anti-CCP content would be much higher than the engagement ratios.
The researchers found that, across all platforms, users engaged far more with anti-CCP content than with pro-CCP content, as indicated by likes and comments. However, TikTok was the only platform that produced vastly more pro-CCP content than anti-CCP content in its search results. Specifically, on TikTok, users liked or commented on anti-CCP content nearly four times as much as they liked or commented on pro-CCP content, yet the search algorithm produced nearly three times as much pro-CCP content.
This discrepancy between user engagement and content served suggests that TikTok’s algorithm may be suppressing anti-CCP content despite its popularity among users. In contrast, Instagram and YouTube showed patterns more aligned with engagement metrics, indicating that their algorithms were likely driven more by commercial considerations than by potential propaganda objectives.
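The logic of this comparison is simple arithmetic: measure how much more of one type of content a platform serves against how much more engagement the other type receives. The following sketch is not the study’s analysis; the input numbers are illustrative values chosen only to echo the reported TikTok pattern (roughly three times as much pro-CCP content served, nearly four times as much engagement with anti-CCP content).

```python
# Minimal sketch (illustrative numbers, not study data): comparing the
# prevalence of pro- vs. anti-CCP content with the engagement each receives.
def ratios(pro_videos, anti_videos, pro_engagement, anti_engagement):
    """pro_videos / anti_videos: counts of pro- and anti-CCP videos served.
    pro_engagement / anti_engagement: total likes + comments on each type."""
    prevalence_ratio = pro_videos / anti_videos          # how much more pro content is served
    engagement_ratio = anti_engagement / pro_engagement  # how much more users engage with anti content
    return prevalence_ratio, engagement_ratio

prevalence, engagement = ratios(pro_videos=300, anti_videos=100,
                                pro_engagement=10_000, anti_engagement=38_000)
print(f"Served {prevalence:.1f}x more pro-CCP content; "
      f"users engaged {engagement:.1f}x more with anti-CCP content.")
```

Under an engagement-driven algorithm, these two ratios should point in the same direction; the study’s claim is that on TikTok they point in opposite directions.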
“The CCP exploits TikTok to protect and advance its interests,” Jussim told PsyPost.
In Study 3, Jussim and his colleagues conducted a survey of 1,214 American adults to investigate the relationship between social media usage and perceptions of China. Participants were recruited through CloudResearch’s Prime Panels service, and the sample was stratified to match U.S. census data on demographic categories. The survey assessed participants’ daily time spent on various social media platforms (Facebook, Instagram, TikTok, X, Reddit, and YouTube), their perceptions of China’s human rights record, and their views on China as a travel destination.
Participants rated the human rights records of ten countries, including China, on a scale from 1 (extremely poor) to 10 (extremely good), and indicated whether they believed China was one of the most desirable travel destinations in the world.
The researchers found a positive correlation between time spent on TikTok and favorable views of China. Specifically, the more time users reported spending on TikTok, the more positively they rated China’s human rights record and the more likely they were to agree that China is a desirable travel destination. These relationships held even when controlling for time spent on other social media platforms and demographic variables such as age, gender, ethnicity, and political affiliation. The relationship between TikTok use and favorable views of China was stronger than the relationships observed for other social media platforms, suggesting a unique association between TikTok usage and perceptions of China.
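One straightforward way to estimate such an association while holding other platforms and demographics constant is an ordinary least squares regression. The sketch below uses statsmodels with hypothetical column names and is offered only as an illustration of the kind of model described, not as the authors’ analysis code.

```python
# Minimal sketch (hypothetical column names): regressing China human-rights
# ratings on daily TikTok time while controlling for time on other platforms
# and demographic variables, as described in the text.
import pandas as pd
import statsmodels.formula.api as smf

def tiktok_association(df: pd.DataFrame):
    """df columns (assumed): china_hr_rating (1-10), tiktok_mins,
    facebook_mins, instagram_mins, youtube_mins, age, gender, ethnicity,
    party_id."""
    model = smf.ols(
        "china_hr_rating ~ tiktok_mins + facebook_mins + instagram_mins"
        " + youtube_mins + age + C(gender) + C(ethnicity) + C(party_id)",
        data=df,
    ).fit()
    # The coefficient on tiktok_mins is the association between TikTok time
    # and ratings of China's human rights record, net of the listed controls.
    return model.params["tiktok_mins"], model.pvalues["tiktok_mins"]
```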
“I was most surprised at how strong the relationship was, in the third study, between TikTok use and ratings of China’s human rights record,” Jussim said. “The more time people spent on TikTok, the more positively they rated China’s human rights record.”
However, he noted that “our third study was a survey, not an experiment. So we know that the more time people spent on TikTok, the more positive views they held regarding China’s human rights record. Although it is possible that time spent on TikTok caused their views, our study demonstrated only the correlation, not a causal relationship.”
Additionally, while the study identified patterns suggesting potential algorithmic bias, it could not determine whether this bias results from direct interference by the Chinese government, self-censorship by TikTok’s parent company, or other factors.
Future research could examine whether similar patterns exist across different user demographics and geographic regions. Experimental studies could also assess whether exposure to different types of content on TikTok directly influences political attitudes over time. Additionally, the researchers suggest investigating whether other social media platforms, including those owned by American companies, may also exhibit biases in content distribution related to geopolitical issues.
Despite these caveats, the findings contribute to a growing body of evidence that highlights the potential for social media platforms to serve as vehicles for state-sponsored narratives.
“One of America’s great strengths is that it is an open society,” Jussim said. “But that great strength also means that it is perennially vulnerable to manipulative propaganda promoted by those who wish to advance their own interests at the expense of our own, and to sow discord and weaken America. Exposing how malicious foreign actors do this is a labor of love. See also our other recent paper on how foreign funding of U.S. institutions of higher education predicts erosion of support for free speech and academic freedom, and increases in antisemitism.”
The study, “Information manipulation on TikTok and its relation to American users’ beliefs about China,” was authored by Danit Finkelstein, Sonia Yanovsky, Jacob Zucker, Anisha Jagdeep, Collin Vasko, Ankita Jagdeep, Lee Jussim, and Joel Finkelstein.