Correlation does not imply causation

Campaign Middle East

Epidemiology is evidence-based research: statistical evidence from data leading to a decision. If we gather enough data, the thinking goes, we must eventually come to the right conclusion. But there’s another view: that correlation does not imply causation. In other words, the data may be right but the interpretation may be wrong.

The problem is that the mind treats data as a conclusion instead of an input.

For instance, in one study, the data showed women who had taken hormone replacement therapy had a lower incidence of breast cancer than women who hadn’t.

The obvious conclusion was that HRT lowered the risk of breast cancer.

But, actually, that was what a lazy reading of the data implied. A more careful reading of the data showed that the women who took HRT were all from a higher socioeconomic class.

They could afford HRT because they were better off, so they were also able to afford private healthcare, gym memberships and a generally healthier diet.

Which may be the real reason they had a lower incidence of breast cancer.

An even more careful reading of the data would have revealed something else.

Among the women in the higher socioeconomic bracket, those who took HRT actually had a slightly higher risk of breast cancer.

So the data, interpreted two different ways, actually showed two different results.
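The two readings described above are a classic instance of Simpson’s paradox, and the reversal can be sketched in a few lines of code. The counts below are invented purely for illustration – they are not the figures from the actual HRT study:

```python
# Invented, illustrative counts only -- NOT the real study's figures.
# (cases of breast cancer, total women) by socioeconomic stratum and HRT use.
data = {
    ("higher", "hrt"):    (30, 1000),
    ("higher", "no_hrt"): (25, 1000),
    ("lower",  "hrt"):    (5, 100),
    ("lower",  "no_hrt"): (90, 1000),
}

def pooled(treatment):
    """Lazy reading: pool all women by HRT use, ignoring class."""
    cases = sum(c for (s, t), (c, n) in data.items() if t == treatment)
    total = sum(n for (s, t), (c, n) in data.items() if t == treatment)
    return cases / total

def stratum(ses, treatment):
    """Careful reading: compare like with like, within one stratum."""
    cases, total = data[(ses, treatment)]
    return cases / total

# Pooled, HRT looks protective; within the higher-SES stratum,
# it looks slightly riskier -- same data, opposite conclusions.
print(f"pooled:     HRT {pooled('hrt'):.1%} vs no HRT {pooled('no_hrt'):.1%}")
print(f"higher SES: HRT {stratum('higher', 'hrt'):.1%} "
      f"vs no HRT {stratum('higher', 'no_hrt'):.1%}")
```

The pooled comparison is confounded by class: the better-off women were both more likely to take HRT and healthier for unrelated reasons, so only the within-stratum comparison speaks to the drug itself.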

This is the problem with data: people avoid the discomfort of thinking.

We want to jump to the easiest, most obvious conclusion. What Buddhists call “the lazy mind”. This is particularly prevalent in advertising and marketing.

That’s what made John Webster different.

He used data as exactly what it was: information, not conclusion.

Each time, he would interrogate the data beyond the obvious surface reading.

For instance, his initial campaign for Sugar Puffs featured a small character called the Honey Monster, based on Sesame Street’s Cookie Monster.

It loved the honey in Sugar Puffs and broke things if it couldn’t get any.

It bombed in research – mums and kids hated the campaign.

Most people would start again, but John looked further into the data.

He found mums hated it because it was wilfully destructive – kids hated it because it was small and whiny, not much of a monster.

So John made it much bigger for the kids, and clumsy rather than naughty for the mums.

He also made it affectionate and gave it the line: “Tell ’em about the honey, Mummy.”

With a different interpretation of the data, that campaign ran for nearly 30 years.

Paul Bainsfair often used the following example to explain the misuse of data.

In an experiment to discover how grasshoppers hear, 100 grasshoppers were used.

A loud noise was made next to each grasshopper – in each case, the grasshopper jumped.

The next step was to remove the hind legs of every single grasshopper.

When this was done, the experiment was repeated – a loud noise was made next to each grasshopper.

This time, not one of the grasshoppers jumped – the results of the experiment were 100 per cent consistent.

The experiment proved that grasshoppers hear through their hind legs.

The point being that data is neither good nor bad; it is just information.

Without a brain to interpret it correctly, it’s useless.
