
We Believe Information We Like: A New Take on Confirmation Bias

You’ve probably heard of confirmation bias before. It’s one of those technical psychology terms that made its way into public use. People use confirmation bias to explain all sorts of seemingly irrational ways of thinking.

The gist of confirmation bias is that people tend to seek out and pay more attention to information when it’s consistent with what they already think is true. So if someone already believes that milk is healthy to drink, he’ll pay more attention to pro-milk information than to information claiming milk isn’t that healthy. This makes people’s beliefs pretty hard to change: confirming information gets incorporated into the belief, and disconfirming information just doesn’t.

Last year, in the throes of the election, I wrote about similar research in psychology showing how people look to discredit information that challenges their views while quickly accepting information that’s consistent with them.

A new study, though, raises a question about confirmation bias and challenges the way it’s typically treated.

Desirability Bias vs. Confirmation Bias

The problem with existing confirmation bias studies is that they usually lump two key factors together: what people believe is true and what people wish to be true are often the same thing in these studies. So it’s hard to say whether people believe some information more because it’s consistent with their current beliefs (“confirmation bias”) or because it’s consistent with what they want to be true (“desirability bias”).

For example, people tend to think they have many positive qualities, and they also tend to want to have positive qualities. So if you tell someone, “You’re super!”, she’s apt to believe it either because it’s consistent with how she sees herself already (confirmation bias) or because it’s consistent with how she wants to see herself (desirability bias).

Believing the Polls in the 2016 Election

To figure out which of these biases is more potent, researchers ran a study during the last few months of the 2016 United States presidential election. What an electrifying time!

Things were tense on all sides. Many polls were showing that Hillary Clinton would win the presidency, but some polls suggested that Donald Trump would win. This made it an interesting chance to pit desirability and confirmation against one another. Although many people believed that Clinton would win, some of those people wanted her to win and others didn’t. Likewise, plenty of people believed that Trump would win, some wishing he would and others wishing he wouldn’t.

So the researchers recruited more than 800 people to participate in a simple study.[1] They asked everybody how confident they were that either Clinton or Trump would win the election as well as which candidate they wanted to win.

Then everyone read a brief article about recent polling numbers, but each participant randomly received one of two versions of the article. One version presented polling data emphasizing that Clinton was likely to win, and the other version gave evidence emphasizing that Trump was likely to win. After reading this, everyone again reported how confident they were that either Clinton or Trump would win the election.
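As a rough illustration of how a design like this gets quantified, here’s a minimal sketch in Python. The field names and the scoring function are my own assumptions, not the authors’ actual materials or analysis; the idea is simply that belief updating can be measured as the signed change in confidence toward whichever candidate the assigned article favored.

```python
# Illustrative sketch only: hypothetical fields, not the study's real code or data.
from dataclasses import dataclass

@dataclass
class Participant:
    desired_winner: str      # "Clinton" or "Trump": who they want to win
    article_favored: str     # which candidate the randomly assigned article favored
    pre_confidence: float    # confidence (0-100) that Clinton will win, before the article
    post_confidence: float   # the same question, asked again after the article

def belief_update(p: Participant) -> float:
    """Signed change in confidence toward the candidate the article favored."""
    change = p.post_confidence - p.pre_confidence
    # If the article favored Trump, movement toward Trump shows up as a drop
    # in "Clinton will win" confidence, so flip the sign.
    return change if p.article_favored == "Clinton" else -change
```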

The Showdown

To see how this setup helps address our question, consider one scenario in this study. What happens when a Trump supporter who believes Clinton will win learns that the polls point to a Trump victory? A classic confirmation bias aficionado would say that this person will reject the new information and continue to believe that Clinton will win. After all, the polls contradict the person’s belief. Desirability bias, on the other hand, predicts that this person will accept the new information and update his belief even though it disconfirms what he originally thought.
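To make the logic of that comparison explicit, here’s a small self-contained sketch (again in Python, with hypothetical names chosen for illustration) that classifies a participant’s situation by whether the assigned article matches what they want and whether it matches what they already believe:

```python
# Hypothetical illustration of the 2x2 logic, not the authors' analysis code.

def classify(desired_winner: str, believed_winner: str, article_favored: str) -> str:
    """Label which cell of the design a participant falls into."""
    desirable = "desirable" if article_favored == desired_winner else "undesirable"
    confirming = "confirming" if article_favored == believed_winner else "disconfirming"
    return f"{desirable}/{confirming}"

# The scenario above: a Trump supporter who believes Clinton will win
# and reads polling evidence that points to a Trump victory.
print(classify("Trump", "Clinton", "Trump"))  # -> "desirable/disconfirming"

# A pure confirmation bias predicts the biggest belief updates in the
# "confirming" cells; a pure desirability bias predicts them in the
# "desirable" cells, regardless of whether the article confirms the prior belief.
```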

This study found overwhelming support for the desirability bias, and it happened for Trump and Clinton supporters alike. People updated their beliefs more when they learned information that was consistent with their desired outcome. In fact, people updated their beliefs the most when the article gave evidence that their preferred candidate would win and disconfirmed their initial belief.

We Believe What We Want to Be True

So it seems that the desirability bias pulls a lot of weight. Even when information disconfirms what we currently believe, if it supports what we wish to be true, we’re happy to change our belief.

Does this mean it’s time to throw away confirmation bias? This new study seems to show that what we’ve been calling confirmation bias may have actually been desirability bias all along. But we shouldn’t be too quick to toss out the idea that people can prefer information that coheres with established beliefs.

Although the authors don’t discuss it, I was reminded of the research on self-verification theory. The idea is that when people have low self-esteem, they actually prefer to get negative feedback about themselves because it’s consistent with how they already see themselves. Surely in a case like this, people would desire a more positive self-image, but they are nonetheless motivated by a need to confirm their current self-views. Time will tell how desirability and confirmation biases play out in cases like these.

Footnotes

1.  They did a little work up front to make sure they got roughly even numbers of people who wanted/believed each candidate would win.
