The Press

Biased? That’s just stupid

Most people think they are smart and can’t understand why what should be good decisions often turn out bad. This is because they are unaware that they are making up their minds based on flawed habitual thinking patterns, writes Jonathon Harper.

- Jonathon Harper teaches an adult education course on practical psychology in Wellington and is working on a PhD research proposal on thinking biases. He is also a professional musician and freelance journalist. He lived in Christchurch for eight years dur

What we colloquially call dumb thinking usually results in us making the wrong choices and can have disastrous consequences. Many otherwise smart people have answered a phone call from ‘‘Microsoft’’ asking them to adjust their computer, and then paid a bogus bill for non-existent help.

Others have given away their bank and internet account passwords to crooks in response to bogus ‘‘phishing’’ emails with genuine company logos.

We all make many errors while driving our cars and most of us end up causing several dents or smashes. Each year nearly 13,000 Kiwis are injured as a result. A greater number of injuries are caused by preventable accidents at home, according to our hospital A&E departments.

Otherwise sensible people smoke or drink themselves to death. For those in charge of ocean liners, aeroplanes and nations, bad thinking can lead to hundreds and even millions of deaths.

So, how can smart people be stupid?

It’s simple, stupid! Two commonplace examples: the surgeon who forgets to wash his hands before an operation, and the driver on a foggy day who thinks the car in front is at a safe distance – and hits it when it stops suddenly.

It has been proven that operating surgeons who use checklists (like airline pilots) have fewer patient deaths due to dumb thinking caused by inattention. When their minds are full of complex technical details, mundane but important things can be overlooked. Written checklists in these situations are a clearly proven winner.

Part of the problem is that our working memory can hold only about seven items at once. Recent research in Christchurch has shown that being stressed (in this case through earthquakes) is likely to lead to poor decisions, probably because of feeling drained and distracted.

A dramatic illustration of just how powerful and dangerous these biases can be is the curious case of a widespread medical ‘‘cure’’: bloodletting. After being accepted as effective for over three millennia, it was ‘‘proven’’ effective by a United States court case in 1799, in which leading physicians testified and a journalist was successfully sued for libel. By the mid-1800s, unbiased scientific research had revealed the horrible truth. One study carried out at Edinburgh University in 1816 suggested about 24 per cent of a bloodletting doctor’s patients at the time died from the bloodletting. Given the almost universal nature of the practice, that translates to millions of needless deaths.

It took some very powerful biases to ensure the survival of this most horrible medical ‘‘treatment’’. There is plenty of good evidence those same lethal biases are still just as strong today. A belief that anecdotes and testimonials are reliable was a major cause of the massive delusion that bloodletting cured people.

Research by memory experts such as Elizabeth Loftus in the US and Maryanne Garry at Victoria University has revealed our memories are not like video recorders. We select memories which support our beliefs and ignore and discount recollections that prove us wrong. We construct false memories that are indistinguishable from true memories as a result of suggestion or forgetting the source of the ‘‘memory’’.

Thinking illusions can be like optical illusions. We use shortcuts called heuristics to obtain quick answers without conscious thought. A classic example is to quickly and mentally solve this simple puzzle.

A bat and ball cost $1.10 in total. The bat costs $1 more than the ball.

How much (please calculate quickly in your head) did the ball cost?

. . . No, it is not 10 cents, but that is almost right. Try it on a friend. Algebra was invented to solve these kinds of problems.
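If you want to see the algebra at work, a few lines of Python (my own sketch, not from the article) solve and check the puzzle:

```python
# Let x be the ball's price. The bat costs x + 1.00, and together they
# cost 1.10, so: x + (x + 1.00) = 1.10, which gives x = 0.05.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive answer, 10 cents, fails the check: a 10-cent ball and a $1.10 bat add up to $1.20.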

Going back to the case of the driver in the fog – the driver’s brain automatically assumed that the fuzzier an object looks, the farther away it is. But when fog rather than distance is causing the fuzziness, the brain does not adjust, so we ‘‘see’’ (perceive) the car in front as farther away than it really is. That is why there are many more major pile-ups on motorways on such days.

Here is a quiz that highlights common thinking illusions:

1. Imagine some cards with a number on one side, while the other side is either red or black. You are trying to test the hypothesis that, if one side is red, the other side will have an even number. You are not going to turn over any cards that will not test the hypothesis. Which of these four cards will you turn over?
a) A red card
b) A card showing an even number
c) A card showing black
d) A card showing an odd number

2. You meet someone on the street in Te Awamutu who is quietly spoken, methodical, modest and wearing glasses. You know this person is either a politician, rugby commentator, farmer, rock star or librarian. Which of these five is the person most likely to be?

3. How good a driver are you compared with the rest of New Zealand’s drivers? (If you don’t drive, make that skill at crossing the road safely as a pedestrian.)
a) Way below average
b) Worse than average
c) Better than average
d) Well above average

4. Would you drive an extra 200m to buy a Lotto ticket at a very lucky New Zealand Lotto shop where four major prizes were won over the last few years?

ANSWERS

1. a) Yes, turn this over to see whether the hypothesis is confirmed or disproved. b) No, because if the other side is red it will confirm the hypothesis, but if it is black you learn nothing; you can’t disprove the hypothesis. c) No, the hypothesis is only about cards with a red side. d) Yes, this has the potential to disprove the hypothesis (if the other side is red).

Most people get this wrong because we are wired to look for evidence that confirms we are right and to ignore evidence that could prove us wrong. This is called the confirmatory bias.

If you still can’t understand it, imagine you are a police officer looking for underage drinkers. You see four people: one is drinking, one is not drinking, one is under age and one is not under age. It’s easy to see now which two you approach. It is the same question as the cards, but easier, because we know police look for rule breakers. Most of us don’t.

2. The correct answer is not librarian but farmer. This is called the representativeness bias, because we look for the answer that best represents what we are looking for while ignoring the ‘‘base rates’’ – there are few librarians in Te Awamutu, or in New Zealand, compared with farmers. Screening programmes for rare illnesses produce what seem like far too many false alarms for similar reasons.

3. Unless you have recently attended a defensive driving course or had some specialist training, the chances are you are an average driver. Yet about 75 per cent of people who are asked this (and many similar questions, such as ones about self-esteem) rate themselves better than average – we are generally significantly over-confident about our abilities and status.

4. I hope not, because driving carries a risk and you have nothing to gain. This is called the gambler’s fallacy. If a roulette wheel lands on red 15 times in a row, many gamblers bet heavily on the next spin, believing the streak must change the odds. But the wheel has no memory: the chance of black on the next spin is unchanged, still about 50 per cent.

If you got any of the quiz wrong, you have a normal brain. Like optical illusions, even when we know we are wrong, it still ‘‘looks right’’. The quiz covers common thinking illusions. Some of the many more include:
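Both of these answers can be checked mechanically. The Python sketch below is my own illustration, not from the article: it lists which cards can actually falsify the hypothesis in question 1, then simulates a fair red/black wheel for question 4 (a real roulette wheel also has a green zero, so black is in fact slightly under 50 per cent).

```python
import random

# Question 1 (the card-selection task): the hypothesis "if one side is
# red, the other side is even" can only be falsified by a card that might
# hide a red/odd pairing -- the visible red card and the visible odd card.
cards = ["red", "even", "black", "odd"]
informative = [c for c in cards if c in ("red", "odd")]
print(informative)  # ['red', 'odd']

# Question 4 (gambler's fallacy): look only at spins that follow five
# reds in a row. The wheel has no memory, so black still comes up about
# half the time even after a long red streak.
random.seed(1)
spins = [random.choice("RB") for _ in range(200_000)]
after_run = [spins[i + 5] for i in range(len(spins) - 5)
             if spins[i:i + 5] == list("RRRRR")]
frac_black = sum(s == "B" for s in after_run) / len(after_run)
print(round(frac_black, 2))  # close to 0.5
```

However long the run of reds you condition on, the simulated frequency of black on the next spin stays near one half – which is exactly what the gambler’s fallacy denies.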

Sunk cost illusion – refusing to abandon a useless project, like failing to walk out on a boring film.

The halo effect – thinking famous stars must be good at everything or wanting to be seen with them.

In-group bias – the false belief that one’s own in-group is superior to other similar groups.

Attribution error – thinking our failures are due to circumstances while others’ are due to their mistakes.

Odd-match illusion – thinking there must be a ‘‘reason’’, like being psychic, for random coincidences.

Affirming the consequent – eg, if sexual abuse causes nightmares we assume nightmares prove abuse.

Anchoring heuristic – a survey might ask: are you extremely, very or mildly satisfied with our political representative or party? Despite there being no option to express dissatisfaction, respondents anchor on the choices offered.

To make smarter choices with everyday decisions and sudden emergencies, it is probably best to trust your gut feelings and immediate reactions.

They are usually right, and there just isn’t time to always use the more reliable but relatively time- and energy-consuming analysis.

On the other hand, for important decisions that require careful reflection, ask yourself:

How do I know the ‘‘facts’’ of the matter? What are my sources?

Have I taken on board the difference between being sure and being right?

Is there a way I could prove myself wrong? Have I listened to differing viewpoints?

Am I keeping calm as I debate the options with myself and others?

Am I relying on anecdote rather than research?

For making a major decision, did I make a list of pros and cons and weigh each one?

Do I have an exit strategy in case I turn out to be wrong?

Never make an important decision when hungry or tired. Your self-control will be low and you are more likely to take an easy, lazy option. This is demonstrated in the 2011 book Willpower by psychologists Roy Baumeister and John Tierney.

The history of bloodletti­ng has taught us what a world without science is like. We end up prey to our many thinking illusions that are not self-correcting even after thousands of years.

We can think and do lethally dumb things. Whatever the reason for the development of our big brains, it does seem that, when we use them properly, we can really get somewhere with fewer mistakes.

