
Big data is blind faith


In 1952, the Boston Symphony Orchestra was worried about falling standards due to nepotism. They thought conductors were choosing their own students over the best musicians.

So they decided auditions would take place with a curtain between the conductor and applicants.

If the conductor couldn’t see who was playing, they could only judge on ability. But the results were disappointing. Pretty much the same young men were picked again. So the musicians were asked to repeat the audition, but take their shoes off first. When they did, the results were very different. This time, half the selected musicians were female – previously, hardly any women had been chosen.

They thought they were being fair by not seeing the applicants but, subconsciously, they could still tell their sex by the sound of their shoes.

What they were listening to wasn’t the music but their own subconscious bias.

Since 1952, blind auditions have become common and half of the top 250 orchestras are now largely composed of female musicians.

Subconscious bias also plays a large part in our era of faith in big data and algorithms. Alongside another bias: quantification bias. This is the tendency to value the measurable over the immeasurable.

Cathy O’Neil is a mathematician and data scientist; she wrote Weapons of Math Destruction.

She says: “Algorithms don’t make things fair, they repeat past practices – they automate the status quo.”

She says the reason for this is: “Algorithms are simply opinions embedded in code.

“People think algorithms are objective, true and scientific – but this is a marketing trick.

“People trust and fear algorithms because they trust and fear mathematics.”

She summarises: “Algorithms are not objective – the people who build them impose their own agenda on the algorithms.”

Tricia Wang is an alumna of Harvard’s Berkman Klein Center for Internet & Society.

She says: “Relying on big data alone increases the chance that we’ll miss something by giving us the illusion that we know everything.”

She addresses the question: why is big data not helping us make better decisions?

She says: “Big data suffers from a context loss because big data doesn’t answer the question ‘why?’”

Big data is a $122bn industry in the US, where Wang advises companies on the use of technology.

She says: “Algorithms need to be audited, because quantifying is addictive.

“People have become so fixated on numbers that they can’t see anything outside of it.”

That seems to be the problem with big data and algorithms in general.

As O’Neil says, “An algorithm is just data plus a definition of success.

“The data is gathered from the past, and whatever data is used is decided by the person building the algorithm.

“As is the definition of success.”

So, far from being an objective measure, an algorithm is subjectivity plus more subjectivity.
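To see that point literally, here is a minimal sketch in Python. Everything in it is invented for illustration – the hiring records, the groups and the ‘stayed two years’ measure of success – but it shows how past data plus a chosen definition of success simply replays the past:

```python
# Invented example: "an algorithm is just data plus a definition of success".

# The data, gathered from the past. The bias is already baked in:
# only group A was ever hired.
past_hires = [
    {"group": "A", "stayed_two_years": True},
    {"group": "A", "stayed_two_years": True},
    {"group": "A", "stayed_two_years": False},
]

# The builder's chosen definition of success: staying two years.
def succeeded(record):
    return record["stayed_two_years"]

# The "algorithm": rate an applicant by how often people like them
# succeeded in the past data.
def score(group):
    history = [r for r in past_hires if r["group"] == group]
    if not history:
        return 0.0  # never hired before, so no "evidence" of success
    return sum(succeeded(r) for r in history) / len(history)

print(score("A"))  # ~0.67 - looks like a safe bet
print(score("B"))  # 0.0 - never hired, so never recommended: the status quo, automated
```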

The data used isn’t decided by a machine; neither is the definition of success. Both are decided by flawed, biased human beings. There’s nothing wrong with being biased – we all are.

The only thing that’s wrong is not being aware of the bias, and not admitting it.

Because the results of the algorithms are cranked out by a machine, we think those hidden biases are facts.
