Arkansas Democrat-Gazette

Fight for truth

Strategies to tackle misinformation

- GLEB TSIPURSKY

Whenever you hear something repeated, it feels more true. In other words, repetition makes any statement seem more true. So anything you hear will feel more true each time you hear it again.

Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists like myself call this the “illusory truth effect.”

Illusory truth is one consequence of a phenomenon called “cognitive fluency,” meaning how easily we process information. Much of our vulnerability to deception in all areas of life revolves around cognitive fluency.

Unfortunately, misinformation that plays on this vulnerability can swing major elections, such as the 2016 presidential election. Fortunately, we can take a number of steps to address misinformation and make our public discourse and political system more truthful.

Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it and the more we dislike and distrust it.

By contrast, the more we like certain data and are comfortable with it, the more we feel that it’s accurate. This intuitive feeling in our gut is what we use to judge what’s true and false.

Yet no matter how often you have heard that you should trust your gut and follow your intuition, that advice is wrong. When evaluating information in areas where you lack expert-level knowledge, you should not trust your gut, because of mental errors that scholars call “cognitive biases.”

The illusory truth effect is one of these mental blind spots; there are over 100 altogether. These blind spots affect all areas of our lives, from health and politics to relationships and even shopping.

Besides illusory truth, you need to watch out for “confirmation bias.” That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts.

Cognitive fluency deserves part of the blame. It’s much easier to build neural pathways to information we already possess than to process facts that contradict our beliefs, so we tend to actively reject the latter.

The more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That’s why research demonstrates that higher education correlates with more polarized beliefs around scientific issues that have religious or political value overtones, such as stem cells or climate change.

Our minds like to interpret the world through stories, meaning explanatory narratives that link cause and effect in a clear and simple manner. Such stories are a balm to our cognitive fluency, as our minds constantly look for patterns that explain the world around us in an easy-to-process manner. That leads to the “narrative fallacy,” where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions.

Ever wonder why politicians tell so many stories? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, clear, compelling narratives that seem to make sense and tug at our heartstrings.

Now, here’s something that’s actually true: The world doesn’t make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: They’re much more likely to contain the truth than the easy-to-process stories.

To fix our brains, one of the most effective strategies is to build up a habit of automatically considering alternative possibilities to any claim you hear, especially claims that feel comfortable to you. Be especially suspicious of repeated claims that favor your side’s positions without any additional evidence, which play on the illusory truth effect and confirmation bias combined.

Another effective strategy involves cultivating a mental habit of questioning stories in particular. Remember, it’s very easy to cherry-pick stories to support whatever position the narrator wants to advance. Instead, look for hard numbers, statistical evidence, and peer-reviewed research to support claims.

You can also make a personal commitment to the 12 truth-oriented behaviors of the Pro-Truth Pledge by signing the pledge at ProTruthPledge.org. Peer-reviewed research has shown that taking the Pro-Truth Pledge is effective for changing people’s behavior to be less vulnerable to misinformation.

These quick mental habits will address the most fundamental flaws that make our minds prone to accepting misinformation.

Dr. Gleb Tsipursky, who lived in Little Rock during a year-long research fellowship, is a cognitive neuroscientist and behavioral economist who researches defenses against misinformation and co-founded the Pro-Truth Pledge. This op-ed is excerpted from “Pro Truth: A Pragmatic Plan to Put Truth Back Into Politics” by Tsipursky and Tim Ward.
