Vancouver Sun

NEW TOOL AIMS TO FIX AI BIAS

Accenture wants to root out a big concern that is holding businesses back from the technology

- JAMES MCLEOD

It doesn’t have the catchiest name in the world, but Accenture says its “AI Fairness Tool” does what it says on the tin, and it could address a big problem for businesses looking to use artificial intelligence.

AI is perhaps the hottest trend in tech right now, and among the people who spend their time thinking about artificial intelligence issues, bias is a huge topic.

“Every client conversation goes to responsible AI. Every client conversation,” said Jodie Wallis, managing director for artificial intelligence at Accenture. “I think a lot of the early experimentation and the rush to get these things out are in areas that are less susceptible to bias.

“I think that most organizations are holding back on deploying solutions where there might be a bias issue.”

“Artificial intelligence” can be a hazy term covering a lot of different technologies, but much of it boils down to machine learning, where computer programs ingest huge amounts of training data to discern patterns and then make predictions about what to do when confronted with similar scenarios in the future.

For example, an online retailer could feed all their customer transaction information into a machine learning system to generate recommendations for products that a person is likely to want in the future.
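To make that idea concrete, here is a minimal, hypothetical Python sketch of that kind of pattern learning: it tallies which products were bought together in past transactions and recommends whatever most often co-occurs with what a customer already owns. The product names and transaction data are invented for illustration, and real recommendation systems are far more sophisticated.

from collections import Counter
from itertools import combinations

# Hypothetical transaction history: each entry is one customer's past basket.
transactions = [
    {"tent", "sleeping bag", "camp stove"},
    {"tent", "sleeping bag", "headlamp"},
    {"running shoes", "water bottle"},
    {"tent", "camp stove", "headlamp"},
]

# "Training": learn which pairs of products tend to appear together.
co_occurrence = Counter()
for basket in transactions:
    for a, b in combinations(sorted(basket), 2):
        co_occurrence[(a, b)] += 1
        co_occurrence[(b, a)] += 1

def recommend(owned_items, top_n=3):
    # Predict what a customer might want next, based on the learned patterns.
    scores = Counter()
    for item in owned_items:
        for (a, b), count in co_occurrence.items():
            if a == item and b not in owned_items:
                scores[b] += count
    return [product for product, _ in scores.most_common(top_n)]

print(recommend({"tent"}))  # e.g. ['sleeping bag', 'camp stove', 'headlamp']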

What makes these systems powerful is that by chewing through huge amounts of data, they can suss out subtle patterns that no human could ever discern. But because of the complexity and the sheer volume of information, it can be incredibly difficult to understand why AI systems make the predictions they do.

This probably doesn’t matter too much in the case of an online retailer making product recommendations, but if a bank is using an AI system to predict who’s likely to default on a mortgage, the stakes are much higher.

And if the training data for the AI system contains subtle biases based on ethnicity, the AI algorithm will produce mortgage recommendations that are skewed to disadvantage some racial groups over others.

Wallis said that even if you deliberately exclude ethnicity, sex, age and other obvious sources of unfair bias from your data, the machine learning system might seize on some other variable, or combination of variables, that correlates closely with gender or race, injecting unfair bias into the system.
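Accenture hasn’t published an example of this, so the following is a purely hypothetical Python illustration of the proxy problem Wallis describes: the protected attribute never appears as a column the model can use, but an innocuous-looking field (a made-up postal code) lines up almost perfectly with it, so decisions keyed to that field reproduce the historical bias anyway. All of the data is invented.

applicants = [
    # (postal_code, group, approved_in_historical_data)
    ("A1", "group_x", True),  ("A1", "group_x", True),
    ("A1", "group_x", True),  ("A1", "group_y", True),
    ("B2", "group_y", False), ("B2", "group_y", False),
    ("B2", "group_y", False), ("B2", "group_x", False),
]

def approval_rate(rows):
    return sum(r[2] for r in rows) / len(rows)

# A model trained only on postal code would learn "approve A1, reject B2" ...
for code in ("A1", "B2"):
    rows = [r for r in applicants if r[0] == code]
    print(code, "approval rate:", approval_rate(rows))

# ... and because postal code is a near-perfect stand-in for group membership,
# the outcomes stay skewed by group even though group was never an input.
for group in ("group_x", "group_y"):
    rows = [r for r in applicants if r[1] == group]
    print(group, "approval rate:", approval_rate(rows))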

Wallis said that the AI Fairness Tool looks for these patterns in the data to root out bias, and then it tests the algorithm many, many times to figure out if there are any other subtle forms of bias hiding in the system.
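Accenture hasn’t released the internals of the tool, but a crude sketch of that kind of audit, in Python and under invented assumptions, might compare a model’s approval rates across groups and flag any group whose rate falls well below the best-off group’s (a “disparate impact” style check). The model, data, field names and 0.8 threshold below are all hypothetical.

def audit_fairness(model, applicants, get_group, threshold=0.8):
    # Flag any group whose approval rate is less than `threshold` times the
    # highest group's approval rate.
    counts, approvals = {}, {}
    for person in applicants:
        group = get_group(person)
        counts[group] = counts.get(group, 0) + 1
        approvals[group] = approvals.get(group, 0) + (1 if model(person) else 0)
    rates = {g: approvals[g] / counts[g] for g in counts}
    best = max(rates.values())
    flagged = {g: r for g, r in rates.items() if best > 0 and r / best < threshold}
    return rates, flagged

# A toy "model" that unknowingly keys off a proxy field:
toy_model = lambda person: person["postal_code"] == "A1"
people = [
    {"postal_code": "A1", "group": "group_x"},
    {"postal_code": "A1", "group": "group_x"},
    {"postal_code": "B2", "group": "group_y"},
    {"postal_code": "B2", "group": "group_y"},
]
rates, flagged = audit_fairness(toy_model, people, get_group=lambda p: p["group"])
print(rates)    # {'group_x': 1.0, 'group_y': 0.0}
print(flagged)  # {'group_y': 0.0} -> the audit flags a skewed outcome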

Wallis said that this kind of tool for auditing algorithm fairness isn’t exactly new — Facebook and Google and some of the other tech giants have talked about using this kind of thing for their own AI systems.

“These tools have existed for a while, but we don’t think there’s any that are being made available as a service that anybody at any company can take advantage of,” she said.

She said smaller companies are a lot more cautious about wading into AI because executives are worried that a biased, racist or sexist algorithm could do huge reputational damage.

Eventually, Wallis said she expects that the government will establish rules and oversight for all of this, but hopefully it won’t be because somebody messed up spectacularly and prompted heavy-handed regulation.

“I hope that in Canada our corporations create the responsible AI programs that demonstrate to the government that, as a collective, we know what we’re doing, and how to manage it,” Wallis said.

“I’m hoping they’ll get ahead of it, they’ll demonstrate what a good, responsible AI program looks like, and that’ll form the basis of regulation.”

Right now, this AI Fairness Tool is on a slow rollout. Accenture is currently working to deploy a prototype version with several clients, initially focused on the government and banking sectors.

JUSSI NUKARI/AFP/GETTY IMAGES Accenture has created an “AI Fairness Tool” that looks for patterns in data to eliminate bias, and then it tests the algorithm many times to figure out if there are any other subtle forms of bias hiding in the system.
