Conservatives exhibit greater metacognitive inefficiency when judging belief-discordant news, study finds

A new study published in the Journal of Experimental Psychology: General reveals an asymmetry in how well people across the political spectrum can gauge the accuracy of their own judgments about political misinformation. While both liberals and conservatives show some insight into whether their truth judgments are right or wrong, conservatives exhibit a notable weakness when faced with information that contradicts their political beliefs.

In other words, when confronted with news that goes against their political views, conservatives’ confidence in their judgments does not align well with their actual accuracy. This mismatch, or “metacognitive inefficiency,” suggests that conservatives are less aware of when they are wrong when the information contradicts their ideological commitments.

While much research has been done on people’s ability to distinguish between true and false political information, less attention has been paid to how aware people are of their own accuracy—or lack thereof—in making these judgments. This study sought to fill that gap by investigating whether people’s confidence in their truth judgments is justified, particularly in politically charged contexts.

“My interest in this topic stemmed from recent research showing that people tend to be metacognitively confused about politicized science,” said study author Michael Geers, a postdoctoral fellow at the Max Planck Institute for Human Development. “Specifically, for some knowledge such as climate change and COVID-19, people are unaware of the accuracy and fallibility of their own knowledge, which has been dubbed metacognitive blind spots. This made me wonder if all population subgroups are equally prone to such metacognitive blind spots.”

To explore this, the researchers used data from a longitudinal study involving 1,191 participants in the United States. The participants were recruited by YouGov and surveyed over six months, between February and July 2019. The sample was carefully matched to the broader U.S. population using stratified sampling, ensuring that it included a diverse range of ages, genders, education levels, and political ideologies.

Every two weeks, the researchers identified 20 political news stories that had gone viral, garnering significant engagement on social media platforms, particularly Facebook and Twitter. The selection was evenly split between true and false stories. The researchers then crafted concise statements summarizing the key claims of these articles, which were presented to the participants for evaluation.

For each statement, participants rated whether they believed it was true or false and then indicated how confident they were in their judgment. This two-step process allowed the researchers to measure both the accuracy of the participants’ judgments (i.e., whether they correctly identified true or false statements) and their metacognitive insight (i.e., whether their confidence accurately reflected the correctness of their judgments).
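The article does not spell out the exact statistical model, but one common way to quantify this kind of metacognitive insight is to ask how well a person's confidence ratings separate their correct from their incorrect judgments. The sketch below is a minimal illustration of that idea using made-up responses; the function name, data, and scoring rule (a simple type-2 AUROC) are illustrative assumptions, not the authors' analysis code.

```python
# Hypothetical sketch (not the authors' code): quantify metacognitive insight as
# how well confidence ratings separate correct from incorrect truth judgments.
# The summary used here is a type-2 AUROC: the probability that a randomly chosen
# correct judgment carries higher confidence than a randomly chosen incorrect one.

def type2_auroc(correct, confidence):
    """correct: list of 0/1 flags (judgment right or wrong);
    confidence: list of confidence ratings on any ordinal scale."""
    hits = [c for ok, c in zip(correct, confidence) if ok]        # confidence when right
    misses = [c for ok, c in zip(correct, confidence) if not ok]  # confidence when wrong
    if not hits or not misses:
        return None  # undefined if someone is always right or always wrong
    wins = sum(1.0 if h > m else 0.5 if h == m else 0.0
               for h in hits for m in misses)
    return wins / (len(hits) * len(misses))

# Made-up responses: 1 = judged the statement correctly, 0 = got it wrong,
# paired with confidence ratings from 1 (guessing) to 7 (certain).
correct    = [1, 1, 0, 1, 0, 1, 0, 1]
confidence = [6, 7, 2, 5, 4, 6, 3, 4]
print(round(type2_auroc(correct, confidence), 2))  # higher = better insight
```

A score near 0.5 would mean confidence carries no information about being right or wrong, while a score near 1.0 would mean high confidence reliably accompanies correct judgments; lower scores for belief-discordant items would correspond to the reduced metacognitive efficiency the study reports.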

To determine the political slant of these statements, the researchers employed crowdworkers from Amazon Mechanical Turk. These crowdworkers, who identified as either Democrats or Republicans, rated how each statement would affect their feelings toward their political ingroup and outgroup if the statement were true. Based on these ratings, the statements were categorized as favoring either the Democratic or Republican party, or as neutral if no clear partisan benefit was identified.
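As a rough illustration of that categorization step, the toy function below assigns a partisan slant based on how much a statement, if true, would be seen as benefiting each party. The rating scale, threshold, and names are hypothetical assumptions made for the sketch; the study's actual coding procedure may differ.

```python
# Hypothetical sketch of the categorization logic described above (thresholds,
# parameter names, and scale are assumptions, not the authors' coding scheme).

def partisan_slant(dem_benefit, rep_benefit, margin=0.5):
    """dem_benefit / rep_benefit: mean ratings of how much the statement, if true,
    would benefit each party (e.g., on a -3..+3 scale). margin is an assumed
    cutoff below which the statement is treated as politically neutral."""
    gap = dem_benefit - rep_benefit
    if gap > margin:
        return "favors Democrats"
    if gap < -margin:
        return "favors Republicans"
    return "neutral"

print(partisan_slant(dem_benefit=2.1, rep_benefit=-1.4))  # favors Democrats
print(partisan_slant(dem_benefit=0.2, rep_benefit=0.0))   # neutral
```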

Geers and his colleagues found that participants across the political spectrum generally demonstrated a good level of metacognitive insight, meaning that their confidence tended to align well with their ability to distinguish true from false information.

However, when it came to political statements that contradicted their ideological beliefs — referred to as discordant information — an asymmetry emerged. Conservatives and Republicans were found to have significantly lower metacognitive efficiency in these cases, indicating that they were less aware of whether they were right or wrong when judging the truthfulness of statements that challenged their political views.

“The key takeaway is that insight into political knowledge is asymmetrical,” Geers told PsyPost. “While generally, both liberals and conservatives tend to know when they are right or wrong, this metacognitive insight is substantially impaired for conservatives judging political information that challenges their ideological commitments.”

This asymmetry was particularly pronounced among individuals with extreme conservative views, who showed the greatest difficulty in assessing the correctness of their own judgments when confronted with discordant information. This suggests that, while people generally have a good sense of how well they can detect misinformation, that self-awareness falters among conservatives when the information is ideologically challenging.

“It is interesting that conservatives show such low metacognitive insight for statements at odds with their ideology,” Geers said. “Notably, our analyses control for people’s level of knowledge. So, while conservatives already have a hard time judging these discordant statements, they really seem to be unaware of how well they’re doing—above and beyond what is to be expected based on their level of knowledge.”

But the study, like all research, includes some caveats. One limitation is that the study focused primarily on political misinformation in the United States and on statements that were selected based on their virality. This means the findings may not fully generalize to other countries, cultural contexts, or to different types of information, such as non-political misinformation.

“We don’t know whether our results generalize to other knowledge domains beyond politics,” Geers noted. “Similarly, future research may investigate whether political preferences are equally linked to metacognitive insight in countries other than the United States.”

The study, “The Political (A)Symmetry of Metacognitive Insight Into Detecting Misinformation,” was authored by Michael Geers, Helen Fischer, Stephan Lewandowsky, and Stefan M. Herzog.