Who’s Your Next Job Interviewer?

The Inequality of Facial Analysis AI

Ivan Manokha: Departmental Lecturer in International Political Economy, University of Oxford / www.theconversation.com. Images © iStockphoto.com

Artificial intelligence and facial analysis software are becoming commonplace in job interviews. The technology, developed by US company HireVue, analyses the language and tone of a candidate’s voice and records their facial expressions as they are videoed answering identical questions.

It was used in the UK for the first time in September but has been used around the world for several years. Some 700 companies – including Vodafone, Hilton and Urban Outfitters – have tried it out.

Certainly, there are significant benefits to be had from this. HireVue says the speed of its information processing cuts hiring time by 90%. But there are important risks we should be wary of when outsourcing job interviews to AI.

The AI is built on algorithms that assess applicants against its database of about 25,000 pieces of facial and linguistic information, compiled from previous interviews of “successful hires” – those who have gone on to be good at the job. The 350 linguistic elements include criteria like a candidate’s tone of voice, their use of passive or active words, sentence length, and the speed at which they talk. The thousands of facial features analysed include brow furrowing, brow raising, how wide or narrow the eyes open, lip tightening, chin raising and smiling.
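HireVue’s actual model is proprietary, but the mechanism described above – scoring a candidate by how closely their features resemble those of past “successful hires” – can be sketched in a few lines of Python. Everything here (the feature names, the similarity measure, the numbers) is a hypothetical illustration, not HireVue’s method; it simply shows why a scorer trained on past hires favours candidates who resemble the historical profile.

```python
# Illustrative sketch only: not HireVue's actual (proprietary) model.
# A candidate is scored by similarity to the average feature vector
# of past "successful hires".

def average(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def similarity_score(candidate, successful_hires):
    """Higher score = closer to the historical 'successful hire' profile.
    Uses negative Euclidean distance to the mean profile."""
    profile = average(successful_hires)
    dist = sum((c - p) ** 2 for c, p in zip(candidate, profile)) ** 0.5
    return -dist

# Hypothetical features: [speech speed, smile frequency, active-word ratio]
past_hires = [[0.8, 0.9, 0.7], [0.7, 0.8, 0.9]]
conventional = [0.75, 0.85, 0.8]   # resembles the past hires
unconventional = [0.3, 0.2, 0.4]   # equally able, different style

# The candidate who resembles past hires always scores higher,
# regardless of either candidate's actual ability.
print(similarity_score(conventional, past_hires) >
      similarity_score(unconventional, past_hires))
```

The point of the sketch is structural rather than numerical: whatever the features are, a similarity-to-past-hires score can only reward resemblance to whoever was hired before, which is the bias-reproduction problem the rest of the article describes.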

The fundamental issue with this, as is often pointed out by critics of AI, is that this technology is not born in a perfect society. It is created within our existing society, which is marked by a whole range of different kinds of biases, prejudices, inequalities and discrimination. The data on which algorithms “learn” to judge candidates contains these existing sets of beliefs.

As UCLA professor Safiya Noble demonstrates in her book Algorithms of Oppression, a few simple Google searches show this happening. For example, when you search the term “professor style”, Google Images returns almost exclusively middle-aged white men. You get similar results for a “successful manager” search. By contrast, a search for “housekeeping” returns pictures of women.

This reflects how algorithms have “learnt” that professors and managers are mostly white men, while those who do housekeeping are women. And by delivering these results, algorithms necessarily contribute to the consolidation, perpetuation and potentially even amplification of existing beliefs and biases. For this very reason, we should question the intelligence of AI. The solutions it provides are necessarily conservative, leaving little room for innovation and social progress.

“Symbolic Capital”

As French sociologist Pierre Bourdieu emphasised in his work on the way that inequalities are reproduced, we all have very different economic and cultural capital. The environment in which we grow up, the quality of the teaching we had, the presence or absence of extracurricular activities and a range of other factors have a decisive impact on our intellectual abilities and strengths. This also has a big impact on the way we perceive ourselves – our levels of self-confidence, the objectives we set for ourselves, and our chances in life.

Another famous sociologist, Erving Goffman, called it a “sense of one’s place”. It is this ingrained sense of how we should act that leads people with less cultural capital (generally from less privileged backgrounds) to keep to their “ordinary” place. This is also reflected in our body language and the way we speak. So there are those who, from an early age, have a stronger confidence in their abilities and knowledge. And there are many others who have not been exposed to the same teachings and cultural practices, and as a result may be more timid and reserved. They may even suffer from an inferiority complex.

All of this will come across in job interviews. Ease, confidence, self-assurance and linguistic skills become what Bourdieu called “symbolic capital”. Those who possess it will be more successful – whether or not those qualities are actually best, or bring something new to the job.

Of course, this is something that has always been the case in society. But artificial intelligence will only reinforce it – particularly when AI is fed data of the candidates who were successful in the past. This means companies are likely to hire the same types of people that they have always hired.

The big risk here is that those people are all from the same set of backgrounds. Algorithms leave little room for subjective appreciation, for risk-taking, or for acting upon a feeling that a person should be given a chance.

In addition, this technology may lead to the rejection of talented and innovative people who simply do not fit the profile of those who smile at the right moment or have the required tone of voice. And this may actually be bad for businesses in the long run, as they risk missing out on talent that comes in unconventional forms.

More concerning is that this technology may also inadvertently exclude people from diverse backgrounds, and give more chances to those who come from privileged ones. As a rule, they possess greater economic and social capital, which allows them to obtain the skills that become symbolic capital in an interview setting.

What we see here is another manifestation of the more general issues with AI. Technology that is developed using data from our existing society, with its various inequalities and biases, is likely to reproduce them in the solutions and decisions that it proposes.
