Did you hear? We’re in the midst of a war against the “liberal media.” These days, one of the key themes of Donald Trump’s campaign is an attack against the press. It’s even gotten to the point that Trump’s supporters are actively antagonizing journalists.
Sure, journalists have opinions (and they’re allowed to express them), but it does seem that people only cry “bias!” when a bit of news clashes with their current opinion.
A key theme in social psychology is that we’re biased by our pre-existing beliefs and desires. Somewhat ironically, our own biases bias us to see bias. In other words, when something disagrees with our beliefs, we ramp up our motivation to find fault in it. But when something confirms our beliefs? We’re content to just accept that information without another thought.
And if you look real close…you might just see those biases in our current election. Or, you know…any election.
We Don’t Trust Information We Don’t Like
If you love red wine, you’ll be happy to accept the conclusions of a study claiming that “drinking wine makes you live longer.” But if the same news agency reported on a study that claimed “drinking wine cuts your life short,” you might start to doubt the quality of the research study.
One study tested how far this would go. Psychologist Ziva Kunda concocted an article about how consuming caffeine is bad for your health. Specifically, though, the article claimed that caffeine consumption was unhealthy for women, according to recent research.
Kunda gave this article to people who were already heavy caffeine consumers and to people who didn’t consume much caffeine at all, and she asked them how convinced they were by the evidence.
In general, people were moderately convinced by the evidence…except for one group. Women who were heavy caffeine consumers were the most skeptical of the evidence. In other words, the people who had the most to lose if the article were right were the least likely to believe it.
When Science Disagrees With Us, It’s “Bad Science”
Kunda’s study showed that people tend to be less convinced when a conclusion isn’t what we want, but why? Is it just a knee-jerk reaction to disagreement? Or is it really that we treat the information differently?
Following up on the caffeine study, Kunda once again gave people an article about a recent study showing that caffeine leads to a particular disease. As before, heavy caffeine consumers were less convinced by the article than low caffeine consumers were. Kunda also had a second version of the article, which reported the results of a study showing that caffeine can help prevent a disease. This article produced the opposite pattern–low caffeine consumers were less convinced by the “caffeine is good” article than the heavy caffeine consumers were.
More importantly, though, everyone not only said whether they believed the article but also rated how good they thought the research study was. That is, did it seem to use sound scientific methods?
Heavy caffeine users thought the (anti-caffeine) scientific study itself was poorly done, compared to the low caffeine users. In other words, they were motivated to find fault in the evidence that threatened their beloved caffeine. The low caffeine users, on the other hand, were less critical of the study.
It’s Not Just Caffeine News…
Of course, this would be a pretty boring finding if it only pertained to caffeine science. Other studies, though, demonstrate how this “fair weather” skepticism crops up elsewhere.
In a classic experiment, researchers showed people the results of two different scientific studies. One of the studies found that capital punishment is an effective way to reduce crime, but the other study showed that capital punishment actually increases crime.
The participants then evaluated the quality of each of the research studies. The result was biased skepticism in action. People were more critical of whichever study found the opposite of their pre-existing belief. For example, proponents of capital punishment thought that the “anti-capital punishment” study was conducted more poorly than the “pro-capital punishment” study, while opponents of capital punishment judged the two studies exactly the other way around.
The scary part about this is what you find when you look at people’s beliefs after reading the two studies. Even though people read about two studies with conflicting conclusions, it only served to make them further entrenched in their original beliefs. Proponents became more in favor of capital punishment and opponents became more against it…even though they all read about the same two studies.
The “Hostile Media Effect”
Okay, I know I opened this post talking about “media bias” and then I talked about how people don’t like science experiments. My point is more generally that people are skeptical of information that disagrees with them but more blindly accepting of information that agrees with them. This happens on both sides of the aisle. The capital punishment study showed that people on both sides of the issue were susceptible to bias.
With media coverage, this can result in people thinking the press is biased against them. Communications scholars call this the hostile media effect. A bunch of studies have shown that when people think the media are biased, they usually think the media are biased against their position.
This can happen when media actually do a good job of offering balanced coverage. Take a presidential campaign, for instance. No, really. Take it away from me–I’m sick of it. News agencies will be reporting on a range of stories–some that favor one candidate and some that favor another candidate. In the big soupy mess of our own psychological biases, we end up being disgusted with the coverage that opposes our candidate while quietly accepting the coverage that does us no harm. In the end, we focus on the negatives and see media bias around every corner–and it’s always against us.
One study analyzed people’s interactions with media during the 1992 presidential race between Bill Clinton and George Bush. The researchers analyzed more than 6,500 pieces of newspaper media as well as responses from more than 1,300 people living in areas served by those newspapers.
Their analysis of the newspapers themselves showed no strong evidence of media bias. Instead, newspapers were pretty neutral, offering an article favorable to Clinton on one page and an article favorable to Bush on another.
Nevertheless, when they asked people whether their newspaper was pro-Clinton or pro-Bush, people’s own party attachments swayed their judgment. Strong Republicans saw newspapers as having a pro-Democrat bias, and strong Democrats saw them as having a pro-Republican bias. This is even true when you look at people’s perceptions of the very same newspapers!
This Article is Biased (Unless You Agree with Me)
It seems that people have a hard time escaping their treasured beliefs and opinions. I’ll desperately cling to my coffee addiction no matter what you say, and I’ll find reasons to criticize any evidence that coffee is bad for me. Oh, but coffee’s actually good for me? That’s some good science.
There’s some sense to it, though. We build our whole lives around these core beliefs. Changing them could bring our whole sense of the world crashing down. So we trick ourselves into living in a world that perfectly protects our precious beliefs and opinions. Our minds are quick to discount any challenges and accept any support.
In the end, you might still say that I’m just some liberal hippie who’s been brainwashed by the mainstream media. Maybe you’re right…but I’d rather just say you’re biased.