Geelong Advertiser

Machines lie

- Peter JUDD. Peter Judd is newsroom operations manager for News Corp and a former editor of the Geelong Advertiser.

IF you’re looking for a female CEO, don’t try and find one in Google’s image search.

Punch in CEO and watch the white males rise to the top as they do in real life.

When I did this yesterday, the first eight places were men and the first woman was NSW Premier, Gladys Berejiklian.

The next woman was a Getty Images stock photo used on dozens of websites.

This wouldn’t surprise more than 8000 devotees of artificial intelligence who spent days gnawing their white knuckles at Long Beach, California, earlier this month.

Keynote speaker Prof Kate Crawford put it bluntly: “If our systems keep producing biased results, if people are unfairly kept in jail, or they can’t get insurance, or receive incorrect medical treatment, then people will no longer trust these tools.”

She then projected the Google “CEO” search results for the US on a big screen and pointed out the first female exec.

“Can you kinda guess who it would be?” teased Crawford. “She’s right down there on the end.

“It’s CEO Barbie. Seriously. Not a great look.”

Is it bias? Yes, and no. If only 8 per cent of global CEOs are women, then you might say the results reflect the current state of society.

But if machine learning algorithms continue to make decisions based on statistical norms and not our shared goals, then expect unfairness to be embedded in our society like never before.

Crawford cites research in which women were recommended lower-paying jobs by employment programs using smarter data-matching technology.

Just like Google’s image search, statistical bias has slithered into AI-assisted decision-making at critical moments in a person’s life. Buying a car. Getting a job. Dealing with bureaucracy. We all have weighted numbers against our identities.

At the heart of the problem is the way a machine learns how to classify us.

Unlike a human child, when a machine first learns anything about us, it works from a collection of training data, not from wise teachers.

You might expect that data to be accurate and objective. It’s not.

Historical data captures life as we know it up until now, not life as we want it to be.

If you don’t like the way a machine is determining housing loans, then you might want to tweak the code with some moral guidance. But whose moral guidance? Humans rarely agree on anything, so getting them to agree on how to tune the moral compass of an artificial intelligence will be a bridge too far.

“This is actually a really hard decision,” Crawford says. “It is not a straightforward question and it has a lot of political implications.”

Take, for example, the Stanford University study earlier this year that could accurately distinguish between gay and heterosexual people using facial recognition technology.

“Gay men had narrower jaws and longer noses, while lesbians had larger jaws,” the researchers reported, and their model identified gay men with more than 80 per cent accuracy. The study caused an uproar. For Crawford, this is an issue for ethics in classification because homosexuality is still criminalised in 78 countries, “some of which apply the death penalty”.

The possibilities for “extreme vetting” using technologies such as this are frightening.

If an AI could predict the future, I reckon it would be seriously worried about 2018.

It’s stacking up as a year of reckoning.

Expect crusaders from both the conservative and liberal movements to turn the scientific community into a political and moral battlefield.

It will be tenfold more virulent than the global warming debate — because the harmful consequences are tangible and immediate.

Bad technology, clickbait research and political expediency will undermine public confidence in a field that promises so much.

And, I fear, science will have lost its faithful.

NSW Premier, Gladys Berejiklian.
