Amazon gets rid of ‘sexist’ hiring tool

Daily Dispatch - News - JAMES COOK

AMAZON has scrapped a “sexist” internal tool that used artificial intelligence to sort through job applications.

The programme was created by a team at Amazon’s Edinburgh office in 2014 as a way to sort through CVs and pick out the most promising candidates.

However, it taught itself to prefer male candidates over female ones, members of the team said.

They noticed it was penalising CVs that included the word “women’s”, as in “women’s chess club captain”. It also reportedly downgraded graduates of two all-women’s colleges.

The problem stemmed from the fact that the system was trained on CVs submitted to the company over a 10-year period, most of which came from men.
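Amazon has not published the system, but the failure mode is easy to reproduce. The sketch below, using scikit-learn on invented toy CVs (nothing here is Amazon’s data or code), shows how a screener trained on historically skewed hire/reject labels learns a negative weight for the token “women’s”:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy historical data (entirely invented): past CVs and whether the
# applicant was hired. The outcomes skew against CVs mentioning "women's".
cvs = [
    "chess club captain software engineer",
    "software engineer open source contributor",
    "women's chess club captain software engineer",
    "women's coding society lead software engineer",
]
hired = [1, 1, 0, 0]

# Treat each word as a feature, keeping apostrophes so "women's" survives.
vec = CountVectorizer(token_pattern=r"[a-z']+")
X = vec.fit_transform(cvs)

model = LogisticRegression().fit(X, hired)

# The classifier was never told about gender, but the learned weight for
# "women's" is negative: it has encoded the bias in the historical labels.
idx = vec.vocabulary_["women's"]
print("weight for \"women's\":", model.coef_[0][idx])
```

Note that gender is never an input to the model; the bias arrives entirely through the correlation between the token and the historical hiring labels.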

The AI was tweaked in an attempt to fix the bias. However, last year, Amazon lost faith in its ability to be neutral and abandoned the project.

Amazon recruiters are believed to have used the system to look at the recommendations when hiring, but did not rely on the rankings. Currently, women make up 40pc of Amazon’s workforce.

Stevie Buckley, co-founder of UK job website Honest Work, used by companies like Snapchat to recruit for technology roles, said: “The basic premise of expecting a machine to identify strong job applicants based on historic hiring practices at your company is a sure-fire method to rapidly scale inherent bias and discriminatory recruitment practices.”

The danger of inherent bias in the use of algorithms is a common problem in the technology industry. Algorithms are not told to be biased, but can become unfair through the data they use.

Jessica Rose, a technical manager at education start-up FutureLearn, said: “The value of AI as it’s used in recruitment is limited by human bias. Developers and AI specialists carry the same biases as talent professionals, but we’re often not asked to interrogate or test for these during the development process.”
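One way to act on Rose’s point would be to fold bias checks into the development test suite. The fragment below is a hypothetical pytest-style counterfactual test, assuming a trained classifier and vectoriser like the toy pair sketched earlier; the 0.05 tolerance is an arbitrary choice, not an industry standard:

```python
# Hypothetical counterfactual test. `model` and `vec` are assumed to be
# pytest fixtures supplying the classifier and vectoriser under test
# (e.g. the toy scikit-learn pair from the earlier sketch).
def score(model, vec, text: str) -> float:
    """Model's probability of the positive ('hire') class for one CV."""
    return model.predict_proba(vec.transform([text]))[0, 1]

def test_gendered_token_barely_moves_score(model, vec):
    base = "chess club captain software engineer"
    variant = "women's " + base
    assert abs(score(model, vec, base) - score(model, vec, variant)) < 0.05
```

Run against the toy model above, this test fails, which is precisely the early warning Rose says is missing from the development process.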

Last month, IBM launched a tool designed to detect bias in AI. The Fairness 360 Kit allows developers to see clearly how their algorithms work and which data is used. Amazon declined to comment.
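Disparate impact, one of the simplest metrics such toolkits report, is just the ratio of selection rates between two groups. A minimal plain-Python sketch follows (this is not IBM’s API, and the numbers are invented for illustration):

```python
# Disparate impact: selection rate of the unprivileged group divided by
# that of the privileged group. All figures below are invented.
def selection_rate(selected: int, total: int) -> float:
    return selected / total

women = selection_rate(selected=12, total=100)  # 12pc of female applicants advanced
men = selection_rate(selected=30, total=100)    # 30pc of male applicants advanced

disparate_impact = women / men
print(f"disparate impact: {disparate_impact:.2f}")  # 0.40

# The "four-fifths rule" used in US employment guidance flags ratios below 0.8.
if disparate_impact < 0.8:
    print("warning: selection rates suggest adverse impact")
```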
