Hiding in the algorithm: the battle to root out prejudice

Margi Murphy examines why automated decisions have failed to remove bias from the financial sector

The Daily Telegraph - Business

On a hot weekend last June, Sarah Jane Carsten walked to the Hertz airport kiosk, having flown home for a friend’s wedding. Without a credit card, Carsten’s information was run through an automated system to see if she was a risk. Despite her good credit score and healthy bank balance, the 26-year-old lawyer was denied the prepaid sedan.

An expensive Uber journey later, she returned the next morning and asked the manager what had happened. The conversation became tense: he held up a computer printout, which stated that it was “unable to supply specific reasons why we have denied your request to pay by debit card”, and, Carsten says, asked her “if I could read”.

She pointed to the ticket and asked him to show precisely where it explained the denial, accidentally knocking over a bottle of water on the counter, at which point the manager “took several steps back from the counter as if he was afraid”.

Carsten, frustrated, suggested that he was “afraid of black people”. After that, the manager told her he would be refusing her any services and called the airport police.

“My first reaction was pure embarrassment, and the second was anger. It felt as if the whole system was a scam,” she says. Since then, Carsten has tried in vain to find out what information the automated system used to decide she was a risk: was it because she was a woman, because she was black, or simply a glitch?

It is this lack of transparency that led the Government to appoint researchers to understand ingrained algorithmic bias across the financial services sector.

The report, which had been due to be published by the Centre for Data Ethics and Innovation in April, has been delayed because of the coronavirus pandemic.

British consumers have been exposed to automated decision-making for years. On paper, it made sense to hand over to machines decisions about how likely someone is to reoffend, excel in a certain role or repay a loan in a timely fashion, eliminating from the equation the subconscious prejudices held by humans.

But in the years since, the very historical biases we were trying to fight have been creeping back in.

“Something needs to be done,” says Paul Resnick, a professor at the University of Michigan School of Information. “There needs to be a regime of transparency.”

Scientists have warned for years that algorithms may not discriminate, but the data they use can be biased. For example, if a bank has a history of lending to white men and its model is trained on those records, the algorithm is more likely to lend to that group. Algorithms are accurate, but they are not always fair.
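The mechanism is simple enough to reproduce in miniature. The sketch below, written in Python with entirely invented numbers rather than any bank’s real records, trains a standard logistic-regression model on a synthetic lending history in which one group was approved less often at the same credit score; the trained model then recommends approvals at different rates for the two groups, even though their scores are statistically identical.

```python
# A minimal sketch with invented data, not any bank's actual system: if past
# approvals skewed against one group, a model trained on that history learns
# to reproduce the skew even when creditworthiness is identical across groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

group = rng.integers(0, 2, n)          # 0 = historically favoured group, 1 = not
score = rng.normal(650, 50, n)         # credit scores distributed identically

# Past human decisions: same score, lower approval odds for group 1.
p_approve = 1 / (1 + np.exp(-(score - 640) / 20)) * np.where(group == 1, 0.6, 1.0)
approved = (rng.random(n) < p_approve).astype(int)

# Train on the biased history, with group membership (or a proxy) as a feature.
X = np.column_stack([score, group])
model = LogisticRegression().fit(X, approved)

# Audit on fresh applicants whose scores are drawn from the same distribution.
test_score = rng.normal(650, 50, 5_000)
for g in (0, 1):
    X_test = np.column_stack([test_score, np.full(5_000, g)])
    rate = model.predict_proba(X_test)[:, 1].mean()
    print(f"group {g}: predicted approval rate {rate:.2f}")
```

The point is not the particular model but the pattern: the skew in the history becomes a skew in the predictions, without anyone writing a discriminatory rule.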

“When you interact with systems that are making automated choices about you on your behalf, they make mistakes,” says Prof Resnick.

They make fewer mistakes, on average, than a human would, but the sheer scale at which automated decisions are made means the total number of mistakes is higher, and they do not get corrected, nor prompt retraining, in the way a human’s might.
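Some purely illustrative arithmetic, with invented figures rather than anything taken from the report, shows how a lower error rate can still mean more mistakes once the volume of decisions is large enough:

```python
# Invented figures, purely to illustrate the scale effect described above:
# a lower per-decision error rate, applied to vastly more decisions,
# can still produce far more mistakes in absolute terms.
human_decisions, human_error_rate = 1_000, 0.05          # hypothetical manual reviewer
machine_decisions, machine_error_rate = 1_000_000, 0.01  # hypothetical automated system

print("human mistakes:  ", round(human_decisions * human_error_rate))      # 50
print("machine mistakes:", round(machine_decisions * machine_error_rate))  # 10000
```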

“It may not be clear whether it’s a random mistake, or whether it’s something that is being unfair to you based on a characteristic,” Prof Resnick adds.

“I think that’s one of the things that makes it really frustrating. Was I turned down for this loan because I’m black? Sometimes you were, sometimes you weren’t.”

The UK’s anti-discrimination laws offer protection whether the discrimination is human or algorithmic.

However, academics and officials now fear that the banks themselves may never know the scale of the problem, because their systems have already been trained on incomplete or unrepresentative data. Further, those systems may consist of a hodgepodge of third-party software fuelled by disparate data brokers.

“Of course bias is there, it is just incredibly difficult to perceive,” says Genie Barton, who sits on the research board of the International Association of Privacy Professionals, based in New Hampshire.

Last year, a man and a woman of equal financial standing were given completely different credit limits when applying for the new Apple Card. The man, David Hansson, a software engineer, described the card as a “sexist program” and New York state opened an investigation. How did he find out? The pair were married. In fact, despite Hansson receiving a limit 20 times higher than his wife’s, she actually had the better credit score.

The Financial Ombudsman receives complaints about algorithmic decisions, which can be viewed online; some of them reveal the complications of the systems used by the biggest high street names.

Ironically, British banks feel that new European privacy laws have made it more difficult to eliminate bias, because they are barred from collecting data on race, gender or disability to test whether their systems are fair.
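The bind is easy to state concretely. Even the most basic fairness check, comparing approval rates across groups, cannot be computed without knowing which group each applicant belongs to. The sketch below uses hypothetical records and a hypothetical “ethnicity” field, precisely the kind of data the banks say the privacy rules discourage them from holding:

```python
# A minimal demographic-parity check on hypothetical records. The protected
# attribute ("ethnicity" here) is exactly the data banks say they are barred
# from collecting, so in practice this comparison often cannot be run at all.
from collections import defaultdict

applications = [  # invented records, not real customer data
    {"ethnicity": "white", "approved": True},
    {"ethnicity": "white", "approved": True},
    {"ethnicity": "black", "approved": False},
    {"ethnicity": "black", "approved": True},
]

totals, approvals = defaultdict(int), defaultdict(int)
for app in applications:
    totals[app["ethnicity"]] += 1
    approvals[app["ethnicity"]] += int(app["approved"])

for group, count in totals.items():
    print(f"{group}: approval rate {approvals[group] / count:.0%}")
```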

Britain has become a pioneer in “sandboxing”, a programme run by the Financial Conduct Authority that lets start-ups and large companies such as Barclays experiment with synthetic data and run hypothetical models, which can be audited for issues such as bias. However, the programme is voluntary.
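The appeal of synthetic data is that experiments like the audit above can be run without touching real customer records. A rough sketch of the idea, with invented fields and distributions rather than anything specified by the FCA:

```python
# Rough sketch of synthetic test data: fabricate applicant records with
# plausible statistics so a candidate model can be exercised and audited
# for bias without exposing real customer information. Fields are invented.
import random

random.seed(1)

def synthetic_applicant():
    return {
        "age": random.randint(21, 70),
        "income": round(random.lognormvariate(10.2, 0.5), 2),
        "credit_score": max(0, min(999, int(random.gauss(600, 120)))),
        "sex": random.choice(["F", "M"]),  # kept so audits can compare groups
    }

sandbox_data = [synthetic_applicant() for _ in range(10_000)]
print(sandbox_data[0])
```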

Lord Clement-Jones, the Liberal Democrat peer who chairs the Lords artificial intelligence committee, says he is frustrated at the lack of response from the industry despite encouraging programmes. Now, more than ever, he says, finance needs to prove it is working on deep-rooted issues that keep coming back to haunt us.

“Are we simply repeating the prejudices of the Seventies? That was exactly the question we asked in the House of Lords two years ago,” he says. “And it has still not been answered.”

Computer systems may not discriminate, but the data that they use can be biased, scientists have warned
