As with most things about this 2016 election, something got me thinking about social psychology. Just 11 days before the election, the FBI released a statement saying that they were re-opening the case regarding Hillary Clinton’s emails. Some “new evidence” was deemed relevant, and apparently it was worth a public declaration.
Not surprisingly, people had a field day with it. It ignited new suspicions, rallied those who oppose Clinton, and fueled a fire built on the belief that these emails contained evidence so damning that this would be the final straw. Over the weekend, though, it all came to a head with FBI Director James Comey’s statement: “Based on our review, we have not changed our conclusions that we expressed in July.” Just two days before the election, the uproar was met with this simple disconfirmation of many people’s suspicions.
I wondered, though, how much this new story did to reverse people’s beliefs. Was the FBI’s actual judgment enough to put out the fire that their previous week’s announcement started? (Well, the election is today, so we’ll see…)
After all, research in psychology has shown that people are really good at holding onto their beliefs even after the basis of those beliefs is discredited.
Keeping the Belief
Social psychologists use the term “belief perseverance” to refer to times when people hold tightly to their initial beliefs even when new information directly contradicts them. In other words, beliefs persevere in spite of clear evidence to the contrary. Specifically, belief perseverance is what happens when new information discredits the basis for forming the belief at all.
As a relevant example, let’s say the news reports a story that Libertarian nominee Gary Johnson has recently been accused of insider trading and is therefore untrustworthy. A belief has been formed! But then two days later, the news outlet reports that their sources turned out to be false and there was never any legitimacy to the claim from the beginning.
A rational person in this situation would realize that since the first story had nothing to do with reality, the belief formed about Johnson should be discarded. Instead, even though the basis of the belief was firmly and completely discredited, people often come away from an experience like this holding onto at least some of the original belief.
Evidence for Belief Perseverance
Consider one study that tested belief perseverance. In the study, people had the fairly morbid job of classifying 25 suicide notes as real or fictitious. As they did this, they were given pre-scripted feedback. Regardless of how they categorized the notes, they were either told that they were correct most of the time or incorrect most of the time, which led them to form beliefs about their ability to discern real notes from fake ones.
At the end of the study, the experimenters carefully explained that the accuracy feedback was pre-scripted and unrelated to performance on the note classification task. Even so, the people who had been told they did well on the task continued to believe they were better at judging real vs. fake notes than the people who had been told they didn’t do well.
This can be an issue in courtrooms as well. Imagine someone provides an eyewitness testimony in a trial that strongly suggests that the person in question is guilty, but it later comes to light that the testimony was made up. According to belief perseverance, discrediting the testimony may not do much to change the jury’s verdict. In fact, some research has shown this to be the case in mock trial settings. Similarly, other studies show that jurors sometimes have difficulty fully avoiding the use of evidence that the judge deemed inadmissible when making their final verdicts.
Why It’s Hard to Shake a Belief: The Power of Explanation
Belief perseverance relies on the power of explanation. We often form beliefs by creating compelling explanations for why something is true. So if you tell me that I did really well at distinguishing real suicide notes from fake ones, I’m unlikely to take that at face value. Instead, I start to create a compelling narrative for how I was able to do so well at that task (“I have an eye for detail,” “I’m good at scrutinizing written communication,” “I’ve been successful in the past at similar tasks,” etc.).
At this point, when you tell me that the original information was fake, you’ve only discredited the event that inspired my explanation. You haven’t discredited all of the reasons I came up with on my own to explain why I’m good at this.
In one study, people read a bunch of information, and their job was to figure out whether that information suggested a link between certain personal characteristics and people’s behavior. In fact, this information was specifically designed so that people ended up believing that there was a relationship between risk-taking and being a successful firefighter. Weird, I know, but it’s going somewhere….
After everyone figured out the connection between being a risk-taker and being a successful firefighter, the experimenters broke the news to some of the study’s participants: all the information they read was completely made up. (The other participants were left believing the information was real.) As you would expect from belief perseverance, even when participants knew that the original information was fake, they held onto the belief they had formed about the relationship between risk-taking and success as a firefighter.
Here’s the most important part, though. The researchers had also asked everyone to write down their reasoning behind the beliefs they formed. The people who spontaneously generated more intricate explanations held onto their initial beliefs more firmly. As other studies have shown, when experimenters directly ask people to come up with an explanation for a relationship, they hold onto those initial beliefs more than people who never had to explain their beliefs.
Could I Discredit This Article?
This seems to happen all the time. We come to believe something, and then we’re faced with evidence to the contrary. The source wasn’t reliable, new evidence comes to light, the data were made up… When that happens, we could do the rational thing and “undo” our original belief because we never should have had it in the first place. Or we can say, “Listen, the facts might not be there, but we ended up convincing ourselves anyway.”
The same could be said about Hillary Clinton’s emails and the short span between the FBI re-opening and re-closing the case against her. Does the FBI’s recent decision ease the suspicions of those fervently fighting against Clinton, who originally saw the FBI’s move as a victory? Likely not. If anything, the whole incident may have inspired people to come up with their own new explanations for their distrust of Clinton. That the email debacle turned out to be largely baseless can’t hold a candle to people’s own self-persuasion.