Rotman Management Magazine

POINT OF VIEW

by Tali Sharot

AT THE END OF TODAY, four million blogs will have been posted, 80 million Instagram photos uploaded, and 600 million tweets released into cyberspace. That’s nearly 7,000 tweets per second. Why do we spend so many precious moments every day sharing information?

There are probably many reasons, but it appears that the opportunity to impart our knowledge to others is internally rewarding. A study conducted at Harvard showed that when people have an opportunity to share their pearls of wisdom with others, the reward centre in the brain is very strongly activated. Put simply, we feel a burst of pleasure when we share our thoughts, and that drives us to communicate. It’s a nifty feature of our brains because it ensures that ideas are not buried with the person who first had them, and as a society, we can benefit from having access to the minds of many. But for that to happen, sharing is not enough. We need to cause a reaction in others. What determines whether you affect the way people behave and think, or whether you are ignored?

As a scientist, I used to think the answer was data. Good data, coupled with logical thinking, is bound to change minds, right? So I went out to get said data. My colleagues and I conducted dozens of experiments to figure out what causes people to change their decisions and update their beliefs.

We peeked into people’s brains, we recorded bodily responses, and we observed behaviour. You can imagine my dismay when all of these experiments pointed to the fact that people are not, in fact, driven by facts. People do adore data, but facts and figures often fail to change beliefs and behaviours. The problem with an approach that prioritizes information is that it ignores what makes us human: our desires, our fears, our emotions, our prior beliefs, and our hopes.

Take climate change, for example. My research colleagues Cass Sunstein, Sebastian Bobadilla Suarez, Stephanie Lazzaro and I wanted to know whether we could change the way people think about climate change with hard science. First, we asked our volunteers whether they believed in man-made climate change and whether they supported the Paris Agreement. Based on their answers, we divided them into Strong Believers and Weak Believers. We then told everyone that experts estimated that by 2100, the temperature would rise by six degrees, and asked them to give us their own personal estimate.

Not surprisingly, the Weak Believers gave an estimate that was lower than the Strong Believers’. But then came the real test. We told half of the participants that the experts had re-assessed the data and concluded that things are much, much better than previously thought: that the temperature would only rise by one to five degrees, and to ‘please give us your own estimate.’ We then told the other half of the group that the experts had re-assessed the data and concluded that things are much, much worse than previously thought: that the temperature would rise by seven to 11 degrees, and to ‘please give us your own estimate.’ The question was, would people use this information to change their beliefs?

Indeed, they did, but mostly when the information fit their preconceived notions. When the Weak Believers heard that the experts were saying ‘things are not as bad as previously thought’, they were quick to change their estimate in that direction. But they didn’t budge when they learned that the experts were saying that things are actually much worse than previously predicted. The Strong Believers showed the opposite pattern: when they heard that the experts were saying that things are much more dire than they thought, they changed their estimate in that direction; but they didn’t move much at all when they learned that the experts were saying that things were not really that bad.

In summary, when you give people information, they are quick to adopt data that conforms to their preconceived notions, but they will often look at counter-evidence with a critical eye. This can cause polarization, which will continue to expand as people get more and more information.

What goes on inside our brains when we encounter disconfirming opinions or information? Andreas Kappes, Ann Harvey, Terry Lohrenz, Read Montague and I invited volunteers into the lab in pairs, and we simultaneously scanned their brains using two MRI machines while they were making decisions about real estate and communicating those assessments to one another. What we found was that when the pair agreed about something, each person’s brain closely tracked the opinion of the other and everyone became more confident. When the pair disagreed, the other person was simply ignored, and the brain failed to encode their assessment. In other words, opinions are taken to heart and closely encoded by the brain mostly when they fit with our own.

Is this true for all brains? Well, if you see yourself as highly analytical, brace yourself: people who have better quantitative skills seem to be more likely to twist data at will. In one study, 1,000 volunteers were given two data sets, one looking at a skin treatment and the other at gun control laws. They were asked to look at the data and conclude whether the skin treatment was reducing skin rashes, and whether the gun laws were reducing crime. What the researchers found was that people with better math skills did a better job at analyzing the skin treatment data than the people with worse math skills.

No surprise there. However, here’s the interesting part: the people with better math skills did worse at analyzing the gun control data. It seems that people were using their intelligence not necessarily to reach more accurate conclusions, but rather to find fault with data that they were unhappy with. The question then becomes, why have we evolved a brain that is happy to disregard perfectly good information when it doesn’t fit our existing beliefs? Why hasn’t this glitch been corrected over the course of evolution?

Well, the brain assesses a piece of data in light of the information it already stores, because on average, that is in fact the correct approach. For example, if I were to tell you that I saw a pink elephant flying in the sky, you would conclude that I’m delusional or lying, as you should. When a piece of data doesn’t fit a belief that we hold strongly, that piece of data, on average, is in fact wrong. However, if I were to tell a three-year-old that I saw a pink elephant flying in the sky, most likely she would believe me, because she has yet to form strong beliefs about the world.

There are four factors that determine whether a piece of evidence will alter your belief: your current belief, your confidence in that belief, the new piece of evidence, and your confidence in that piece of evidence. And the further away that piece of evidence is from your current belief, the less likely it is to change it. This is not irrational, but it does mean that strongly held false beliefs are very hard to change.
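To make the interplay concrete, here is one hypothetical way to write it down; this is an illustrative sketch, not a formula from the research described here. Treat the updated belief as a confidence-weighted average of the old belief and the new evidence:

    b_{\text{new}} = \frac{c_b \, b + c_e \, e}{c_b + c_e}

where b is the current belief, c_b is your confidence in it, e is the new evidence, and c_e is your confidence in that evidence. If c_e itself shrinks as the evidence moves further from the current belief (the pink-elephant effect described above), say c_e \propto e^{-|e - b|}, then distant evidence receives almost no weight, and a strongly held belief barely moves.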

There is one exception: when the counter-evidence is exactly what you want to hear. For example, when people are told that others see them as being much more attractive than they see themselves, they are happy to change their self-perception. Or, if you learn that your genes suggest that you’re much more resistant to disease than you thought, you’ll be quick to change your beliefs.

What about politics? In 2016, 900 American citizens were asked to predict the results of the presidential election by putting a little arrow on a scale that went from Clinton to Trump. If they thought Clinton was highly likely to win, they put the arrow right next to Clinton; if they thought it was 50/50, they put it in the middle; and so on. Then they were also asked, ‘Who do you want to win?’ Half of the volunteers wanted Trump to win, and half wanted Clinton to win; but at the time, the majority of both the Trump supporters and the Clinton supporters believed that Clinton would win.

Then a new poll was introduced predicting a Trump victory, and everyone was asked again, ‘Who do you think is going to win?’ Did the new poll change their predictions? Indeed, it did. But mostly, it changed the predictions of the Trump supporters, who were elated to hear that a new poll was suggesting a Trump victory. The Clinton supporters didn’t change their predictions that much, and many of them ignored the new poll altogether.

So, how do we change people’s beliefs? Opinions do not remain stagnant; they evolve. What, then, can we do to facilitate change?

The secret is to go along with how our brains work, not against them. As indicated, the brain tries to assess new evidence in light of the knowledge it already stores, and when that piece of evidence doesn’t fit, it is usually either ignored or substantially altered. Unless, of course, it is exactly what we want to hear. So perhaps instead of trying to break an existing belief, we can attempt to implant a new belief altogether, and highlight the positive aspects of the information we are offering.

Here’s an example: many parents who refuse to vaccinate their kids because of the alleged link to autism are not convinced by science suggesting that there is no link between the two. What to do? Instead of trying to break that belief, a group of researchers offered these parents more information about the benefits of the vaccine: true information about how it actually prevents kids from encountering deadly diseases. And it worked.

When we are trying to change opinions, we need to consider the other person’s mind. What are their current beliefs? What are their motivations? When someone has a strong motive to believe something, even a hefty sack of evidence to the contrary will fall on deaf ears. We need to present the evidence in a way that is convincing to the other person, not necessarily in the way that would be most convincing to us. In short, the best approach is to identify common motives and then use those to implant new beliefs.

Tali Sharot is a Professor of Cognitive Neuroscience in the Department of Experimental Psychology at University College London and author of The Influential Mind: What the Brain Reveals About Our Power to Change Others (Henry Holt and Co., 2017). This article is based on a presentation she made at the World Economic Forum.
