People trust journalists less when they debunk misinformation

When journalists confirm that a claim is true, most people tend to trust them. However, when those same journalists correct a false claim, they face a much harder battle to earn the trust of their audience. This was the key finding from a recent study published in Communication Research, which examined how people perceive fact-checks and the journalists who provide them.

Public confidence in the media has been declining for decades, and this trend has accelerated in recent years. This widespread distrust is partially driven by concerns that journalistic coverage is biased, sensationalized, and often dishonest. To combat these perceptions, many news organizations have ramped up their efforts in fact-checking, hoping that by rigorously verifying information, they can regain public trust.

Fact-checking, however, has not been without controversy. Critics argue that it can be biased or even an attempt by journalists to control public perception of the truth. Given the ongoing skepticism toward the media and the mixed results of previous studies on the effectiveness of fact-checking, the researchers behind this study wanted to explore whether the public actually trusts these fact-checks.

“I wanted to cover something new on the topic of why it’s hard to correct misinformation. I saw an opportunity to take kind of a classic idea from linguistics, that people intuitively understand that you shouldn’t be negative, so being negative (and a correction or debunking is saying someone else is wrong, so it’s negative) carries with it a risk of suspicion,” said study author Randy B. Stein, an associate professor of marketing at Cal Poly Pomona.

The researchers conducted two studies to test their hypothesis that people trust journalists more when they confirm claims rather than correct them. In the first study, 691 participants were presented with a series of eight political or economic claims. These claims ranged from statements about homelessness rates to the prevalence of fentanyl in overdose deaths. Participants were asked to rate how likely they thought each claim was to be true on a scale from 1 to 9.

After this initial evaluation, they were randomly assigned to see a fact-check for one of the claims. The fact-check either confirmed the claim as true or corrected it as false. These fact-checks were real, taken from a political fact-checking website known for its non-confrontational style (checkyourfact.com). After reading the fact-check, participants were asked to rate their level of distrust in the journalist who wrote it. They also rated how surprising they found the fact-check’s outcome, how much evidence they felt was needed to believe it, and whether they thought the fact-check was exploitative.

The second study followed a similar procedure but focused on different types of claims. This time, 691 participants evaluated marketing claims about products, such as whether certain food products were humanely sourced or whether a particular cooking hack was effective. Again, participants rated their initial belief in each claim before being shown a journalistic report that either confirmed or corrected it. Unlike in the first study, the reports were not explicitly labeled as fact-checks, allowing the researchers to test whether the distrust effect would hold without that label.

The findings from both studies were consistent. Participants showed significantly more distrust toward journalists who corrected false claims than toward those who confirmed true claims.

“People are generally trusting of journalists, and trust of confirmatory articles was pretty high,” Stein told PsyPost. “So despite the stereotype, it’s not true that people don’t trust fact-checks and journalists at all, but there is more suspicion centered around corrective/debunking articles. (I want to add we use ‘corrections’ in the sense of any article that points out that some claim is false, like ‘correcting misinformation that is out there’ or a fact-check, not in the sense of an outlet issuing a correction of an error they made in a previous article.)”

“It’s a really robust effect,” he added. “We found it in two separate contexts, and we found it among both liberals and conservatives. Conservatives are more distrusting overall, but both show more distrust towards corrections.”

This effect was observed even when the correction aligned with participants’ prior beliefs. In other words, people tended to trust journalists who confirmed something they already suspected was true, yet remained skeptical of journalists who debunked a claim they already suspected was false.

“We got the effect even for claims people suspected weren’t true – i.e., ‘I know this thing is probably false, but I’m still a bit suspicious of the person pointing it out,’” Stein said.

The researchers identified three main reasons for this increased distrust toward corrections:

Surprise: Corrections were generally seen as more surprising than confirmations. This is likely because negative statements, such as corrections, are less common and therefore more unexpected in communication.

Need for Evidence: Participants felt that corrections required more evidence to be believed compared to confirmations. This indicates a higher level of skepticism when journalists declare something to be false.

Exploitation: Corrections were perceived as more exploitative, suggesting that participants were more likely to think that the journalist was using the situation to push an agenda or gain attention when they corrected a claim rather than confirmed it.

Interestingly, the study also found that even when corrections successfully changed participants’ beliefs about a claim, participants still harbored more distrust toward the journalist who provided the correction. This suggests that corrections can be effective at altering beliefs while simultaneously damaging the credibility of the journalist who delivers them.

“The deck is really stacked against journalists (or anyone) trying to correct misinformation (again, or anything),” Stein said. “The research suggests that we intuitively hold negative communications to a different standard, demanding more evidence to believe them and doubting the motives of people who make them.”

While the study provides valuable insights, it has limitations that warrant further exploration. Notably, it examined only claims that participants initially found plausible. The researchers argue that this is appropriate because it mirrors the types of claims most likely to be the subject of misinformation. However, it remains unclear whether the effect would hold for more outlandish or easily debunked claims.

“We can reasonably assume that people have no issues with debunkings of things they already strongly believe are untrue and are invested in,” Stein noted.

The researchers suggest that future studies could explore ways to mitigate the distrust associated with corrections. For example, providing more detailed information, avoiding attention-grabbing “debunking” language, or framing corrections in more positive terms could potentially counteract this negativity bias and increase trust.

The study, “Whose Pants Are on Fire? Journalists Correcting False Claims are Distrusted More Than Journalists Confirming Claims,” was authored by Randy Stein and Caroline E. Meyersohn.