The Daily Telegraph

Why it’s easy to fall prey to confirmation bias

- Linda Blair is a clinical psychologist and author of Siblings: How to Handle Rivalry and Create Lifelong Loving Bonds. To order for £10.99, call 0844 871 1514 or visit books.telegraph.co.uk

Will a no-deal Brexit cause chaos for Britain, or are the dangers being overblown? Is climate change real? We like to think we’re reasonable and open-minded, even when it comes to our most passionate beliefs. But research shows that this is rarely true. Once we’ve formed a belief, we believe that it is valid and well-founded; but, in reality, if any information challenges our views, we subconsciously ignore or devalue it.

This was demonstrated in a landmark 1979 study at Stanford by Charles Lord and colleagues. They presented undergraduates with two (fictional) studies, both with convincing statistics. One study claimed to prove capital punishment works as a deterrent, while the other concluded it had no effect on crime rates.

Before and after seeing the studies, participants were asked to give their views. Those who believed capital punishment is an effective deterrent rated the study supporting their views as credible, while rating the one that challenged their opinion as unconvincing. Those who were opposed to capital punishment reached the opposite conclusions.

This phenomenon has become known as “confirmation bias”. Further studies went on to demonstrate that our beliefs are rarely based on a deep understanding of a particular issue. When it comes to belief, it seems we remember facts that support our world view, but ignore or reject information that runs counter to our opinions – and the more passionately we feel about an issue, the more this is so.

Not only do strong beliefs inhibit learning, they also cause us to make mistakes. Dan Kahan and colleagues at Yale gathered views on gun control from over 1,000 participants, then asked them to solve a mathematical problem. Some had to figure out whether a new skin cream was effective for treating rashes, while others were given the same statistics to decide whether a law banning concealed handguns reduced crime rates.

In the skin cream example, participants who were better at maths solved the problem more quickly and were right more often. However, when the subject was handguns, if the data challenged their opinion, people took longer to reach their conclusion and made more mistakes.

Given that belief so easily sabotages reason, how can you form opinions about emotive issues in a more balanced way? Deliberately seek out individuals who hold opinions different from your own and try to listen to their arguments non-judgmentally. Read as much material as you can that challenges your views.

Explain in detail not only what you believe, but why you believe it. In their book The Knowledge Illusion, Steve Sloman and Philip Fernbach cite a study that asked participants to rate their understanding of how everyday items such as a lavatory or a zipper work, then to explain in detail the mechanisms involved, then once again to estimate how well they understood them. Afterwards, participants rated their understanding more realistically.
