Fake news: It’s all in your brain
Confirmation bias can be helpful — or wreak havoc
Both the American Dialect Society and Collins Dictionary named “fake news” as their word of the year for 2017. A term that wasn’t part of the popular lexicon even 18 months ago has spread like a deadly infection.
One of the runners-up was “echo chamber,” referring to a sort of bubble where people live with exposure only to their own opinions and those who subscribe to the same ideas. Fake news often bounces around echo chambers with no one to dispute its veracity.
We all have a few people in our social media networks who share ridiculous things, such as a crazy aunt who genuinely believes the government covers up extraterrestrial contacts or a distant cousin who claims Sandy Hook victims were actually paid actors. But most of us have also encountered well-informed, sane people who share blatantly incorrect propaganda.
Why does this happen? After all, a quick Google search often verifies whether something is real or a hoax. The answer lies in our brains and a little-known phenomenon called “confirmation bias.” It works like this: When we see new information, we try to decide whether or not to believe it. Generally, if the new information confirms beliefs we already hold, we buy into it automatically (and hence retweet and share). But if the new information contradicts what we already know, we’re most likely to discard it in order to maintain cognitive consistency. When information agrees with our beliefs, it takes no time to accept it; when information disagrees, it takes many, many contrary facts before we even consider changing our minds.
Confirmation bias is evolutionarily efficient and generally helpful. Reinforcing what we already know helps us make good decisions quickly. And affirmation is critical to our survival; it keeps us on the right path. It is for this reason scientists have demonstrated time and again that we prefer familiar words, paintings, shapes, even sounds — the more we are exposed to something, the more we like it.
If you’ve ever disliked a song you heard on the radio but then found yourself enjoying it after it plays a few more times, you’ve experienced what psychologists call the mere-exposure effect. This effect explains why test subjects rate a familiar face as happier and better looking than a stranger’s face (even if both are showing the same expression). The more time we spend with people, the more attractive they become to us. From an evolutionary perspective, there is little downside to enjoying, seeking out and reinforcing the value of what’s familiar, as the familiar is often the safest. So, by and large, it’s a good thing when an exposure effect creates confirmation bias.
But when problems get more complex, confirmation bias wreaks havoc. It impedes our ability to process information that goes against what we think is true and makes it all but impossible to change our own minds. It contributes to racism, sexism and bigotry.
Confirmation bias makes us vulnerable to false claims that confirm what is familiar but may be wrong. It also makes us suspicious of other people promoting falsehoods that don’t mesh with what we believe. The result is intense polarization.
With confirmation bias influencing our news, there are really only two options. You can try to be hyper-vigilant with fact-checking and genuinely listen to those who disagree (i.e., go against human nature), or you can shut it out entirely. In my last column, I admitted to the latter and have dramatically reduced my news intake from sources that may be transient, biased or uninformed. In the era of fake news, it’s the only way we’ll make it out of our echo chambers alive.
Jeff Stibel is vice chairman of Dun & Bradstreet, a partner of Bryant Stibel and an entrepreneur who also happens to be a brain scientist. Follow him on Twitter at @stibel. The views and opinions expressed in this column are the author’s and do not necessarily reflect those of USA TODAY.