Are online quizzes secretly changing your vote? Surprising study uncovers an “opinion matching effect”

A new study published in PLOS One suggests that online quizzes designed to help people determine their political alignment may be influencing their opinions and voting preferences without their knowledge. Researchers found that some of these quizzes, which claim to match users with political candidates or parties based on their responses, produce biased results that favor one side over another. In an experiment with eligible United States voters, the study showed that such biased recommendations could significantly sway voting preferences—all while participants remained unaware of any manipulation.

The internet has introduced powerful new methods of influence, some of which can shape public opinion and decision-making in ways that users do not consciously recognize. Political quizzes that match users to candidates or parties have become a popular feature on various websites, promising to help voters make informed choices. However, if the algorithms behind these quizzes are designed in a biased way—either intentionally or inadvertently—they could be subtly steering people toward a particular political preference.

Robert Epstein of the American Institute for Behavioral Research and Technology and his colleagues wanted to investigate whether such bias exists in real-world quizzes and whether a controlled experiment could demonstrate how these quizzes influence voter preferences. They were also interested in whether participants would notice any manipulation or if the effect would remain undetected.

The researchers first conducted a study to examine whether political quizzes available online produced results that systematically favored certain political parties or ideologies. To do this, they used automated scripts to simulate users taking these quizzes multiple times. These simulated users selected their responses randomly, ensuring that no specific political preference was reflected in their answers. The expectation was that if a quiz were truly neutral, its recommendations would be evenly distributed among all possible political affiliations over many trials.
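For readers curious what such an automated audit might look like in practice, here is a minimal Python sketch. It is not the authors' actual code: the quiz itself is replaced with a toy stub, and the party names, question count, and run count are purely illustrative. The procedure is the point: answer the quiz at random many times, tally which party is recommended, and use a chi-square goodness-of-fit test to check whether the tallies are consistent with a uniform distribution.

```python
# Minimal sketch of a quiz-bias audit (illustrative only; not the study's code).
import random
from collections import Counter
from scipy.stats import chisquare  # assumption: SciPy is available

PARTIES = ["Democratic", "Republican", "Libertarian", "Green"]
NUM_QUESTIONS = 20   # illustrative
NUM_RUNS = 1000      # illustrative

def take_quiz_randomly():
    """Hypothetical stand-in for one automated run of a real quiz.

    Answers every question at random (agree = +1 / disagree = -1) and returns
    the party recommended by a toy scoring rule. A real audit would submit the
    random answers to the live quiz and record its actual recommendation.
    """
    answers = [random.choice([-1, 1]) for _ in range(NUM_QUESTIONS)]
    scores = {
        party: sum(random.choice([-1, 1]) * a for a in answers)
        for party in PARTIES
    }
    best = max(scores.values())
    # Break ties at random so the party order does not skew the tally.
    return random.choice([p for p, s in scores.items() if s == best])

# Tally the recommendations over many random runs.
counts = Counter(take_quiz_randomly() for _ in range(NUM_RUNS))
observed = [counts.get(p, 0) for p in PARTIES]
expected = [NUM_RUNS / len(PARTIES)] * len(PARTIES)  # uniform if the quiz is neutral

stat, p_value = chisquare(observed, f_exp=expected)
print(dict(counts))
print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")
# A small p-value would indicate the recommendations deviate from uniform,
# i.e., the quiz favors some parties over others even for random answers.
```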

The results of this analysis revealed significant bias in some of the quizzes. One quiz, hosted by a website called My Political Personality, was designed to match users with one of four political parties: Democratic, Republican, Libertarian, or Green. If the quiz were unbiased, each party should have been recommended roughly an equal number of times. However, the researchers found that the quiz disproportionately recommended the Democratic Party at twice the expected rate while never recommending the Green Party at all.

Another quiz, hosted by the well-known Pew Research Center, claimed to classify users into one of nine political categories based on their answers. However, the researchers found that some categories were recommended far more often than others. Most notably, users were never categorized as “Progressive Left,” even after hundreds of trials. The researchers concluded that these quizzes, whether intentionally or unintentionally, contained statistical biases that could subtly influence the political opinions of users who took them.

For their second study, the researchers conducted a controlled experiment to determine whether a biased political quiz could actually shift voting preferences. They recruited 773 eligible voters in the United States and randomly assigned them to different groups. Participants were first asked about their opinions of two Australian political candidates, Scott Morrison and Bill Shorten. Because these candidates were not well known in the United States, participants were unlikely to have strong preexisting opinions about them, making them ideal subjects for testing the influence of a biased quiz.

After providing their initial opinions, participants completed a political quiz that was designed to appear as though it would match them with the candidate who best aligned with their views. However, the results of the quiz were manipulated: some participants were falsely told that they strongly matched with Morrison, while others were told they strongly matched with Shorten. A control group received neutral results, with both candidates being presented as equally compatible.

After taking the quiz and receiving manipulated results, participants were asked again about their voting preferences. The researchers found that the number of participants who said they would vote for the quiz’s favored candidate increased dramatically—by as much as 95% in some groups. This means that participants who initially had no preference for either candidate became significantly more likely to support the candidate they were told was the best match for them.
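To illustrate what an increase of that size means with hypothetical numbers (not figures reported in the paper): if 20 of 100 participants initially said they would vote for a candidate and 39 said so after receiving a biased match, that would be a 95% increase.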

Interestingly, while voting preferences shifted substantially, participants’ general opinions about the candidates changed only slightly. This suggests that the quiz primarily influenced voting decisions rather than deeply held beliefs. Perhaps most notably, none of the participants who had been given biased results reported any awareness of manipulation. This indicated that the effect operated beneath the level of conscious awareness, making it a powerful yet invisible tool for shaping voter behavior.

The researchers coined the term "opinion matching effect" (OME) to describe this phenomenon, in which individuals are more likely to align with a candidate or party when they are told their views match, even if that match is fabricated or manipulated.

They emphasize that this form of influence differs from other methods of online persuasion because it occurs in a setting where users expect to receive objective, personalized guidance. Unlike traditional political advertisements or campaign messages, which people often approach with skepticism, opinion-matching quizzes present themselves as neutral tools for self-discovery. As a result, users may be more trusting of the recommendations they receive, making them particularly susceptible to the subtle nudging of biased results.

While the findings highlight a powerful and largely invisible form of influence, the study has some limitations. The researchers focused on short-term shifts in voting preferences, but it remains unclear how long these effects last. If someone takes a biased quiz months before an election, their views may revert over time. However, if a voter encounters such a quiz shortly before casting their ballot, the effect could have a more immediate impact.

Epstein and his colleagues concluded their study with a stark warning: "we hope this study will serve as a reminder to scientists, public policy makers, and interested members of the general public that the internet is very much out of control. The content of print media has been constrained in various ways since not long after the printing press was invented, but there are still virtually no constraints on the kind of content that can be posted online."

“This means, among other things, that new means of manipulation that the internet has made possible can be used, and almost certainly are being used, to impact the thinking and behavior of billions of people in potentially destructive or self-destructive ways without their knowledge or consent. [The opinion matching effect matters] because it is a powerful tool for shifting people’s opinions and voting preferences which appears to be completely invisible to users. If we can discover this, so can bad actors.”

The study, "The 'opinion matching effect' (OME): A subtle but powerful new form of influence that is apparently being used on the internet," was authored by Robert Epstein, Yunyi Huang, Miles Megerdoomian, and Vanessa R. Zankich.