Gulf News

Hiring: Fight embedded biases in algorithms

Nowhere is this more rampant than in recruitment processes that rely on automated systems


Algorithms make many important decisions for us, such as our creditworthiness, our best romantic prospects and whether we are qualified for a job. Employers are increasingly using them during the hiring process in the belief that they are both more convenient and less biased than humans. However, this belief is misguided.

In the past, a job applicant could walk into a clothing store, fill out an application and even hand it straight to the hiring manager. Nowadays, her application must make it through an obstacle course of online hiring algorithms before it might be considered. This is especially true for low-wage and hourly workers.

The situation applies to white-collar jobs too. People applying to be summer interns and first-year analysts at Goldman Sachs have their résumés digitally scanned for keywords that can predict success at the company. And the company has now embraced automated interviewing as well.

The problem is that automated hiring can create a closed-loop system. Advertisements created by algorithms encourage certain people to send in their résumés. After the résumés have undergone automated culling, a lucky few are hired and then subjected to automated evaluation, the results of which are looped back to establish criteria for future job advertisements and selections.

This system operates with no transparency or accountability built in to check that the criteria are fair to all job applicants.

Biases build up

As a result, automated hiring platforms have enabled discrimination against job applicants. In 2017, the Illinois attorney-general opened an investigation into several automated hiring platforms after complaints that a résumé-building tool on Jobr effectively excluded older applicants. The platform had a drop-down menu that prevented applicants from listing a college graduation year, or the year of a first job, before 1980.

Similarly, a 2016 class-action lawsuit alleged that Facebook Business tools “enable and encourage discrimination by excluding African-Americans, Latinos and Asian-Americans but not white Americans from receiving advertisements for relevant opportunities”. Facebook’s former Lookalike Audiences feature allowed employers to show job advertisements only to Facebook users who were demographically identical to their existing workers, thus replicating the racial or gender disparities at their companies.

In March, Facebook agreed to make changes to its ad platform to settle the lawsuit.

Getting away with it

But this is just the tip of the iceberg. Under US federal law, employers have wide discretion to decide which qualities are a “cultural fit” for their organisation. This allows companies to choose hiring criteria that could exclude certain groups of people and to hide this bias through automated hiring.

For example, choosing “lack of gaps in employment” as a cultural fit could hurt women, who disproportionately take leave from the workplace to tend to children and ailing family members.

Automated hiring has now evolved past simple résumé parsing and culling. According to one lawsuit, a college student with a near-perfect SAT score and a diagnosis of bipolar disorder found himself rejected over and over for minimum-wage jobs at supermarkets and retail stores that were using a personality test modelled after a test used to diagnose mental illness.

How do we make sure that automated hiring platforms do not worsen employment discrimina­tion?

The first step would be to pass laws that let plaintiffs bring suits when they have experienced bias in an automated hiring system. Federal law requires a plaintiff to prove either disparate treatment (that is, “smoking gun” evidence of intentional discrimination) or disparate impact (statistical proof that a group of applicants, for example racial minorities or white women, were disproportionately rejected for employment).

It’s hard for applicants, though, to get either type of proof because employers control the data in hiring platforms.

Onus on employers

We should change the law to allow a third method for plaintiffs to bring suit, under the “discrimination per se” doctrine. This new doctrine would shift the burden of proof to the employer.

So when a plaintiff using a hiring platform encounters a problematic design feature, such as one that checks for gaps in employment, she should be able to bring a lawsuit on the basis of discrimination per se. The employer would then be required to provide statistical proof, from internal and external audits, to show that its hiring platform is not unlawfully discriminating against certain groups.

We need a federal law that would mandate data retention for all applications on hiring platforms and that would require employers to conduct internal and external audits, so that no groups of applicants are disproportionately excluded. The audits would also ensure that the criteria being used are actually related to job tasks.

Auditing

This idea has precedent in federal law: Occupational Safety and Health Administration audits are recommended to ensure safe working conditions for employees. Employers that subject their automated hiring platforms to external audits should also receive a certification mark that would favourably distinguish them in the labour market.

This type of auditing and certification system recognises that job applicants should be able to make informed choices about which hiring platforms they will trust with their information.

Unions can help to ensure that automated hiring platforms are fair. Through collective bargaining, unions could work with employers to determine what criteria are actually relevant for determining job fit. Unions can also make sure that applicant data retained by automated hiring platforms is protected, and that it is not sold or transferred to workers’ detriment.

To be sure, human decision-making is clouded by bias. But so is automated decision-making, especially given that human biases can be introduced at any stage of the process, from the design of the hiring algorithm to how results are interpreted.

We cannot rely on automated hiring platforms without adequate safeguards to prevent unlawful employment discrimina­tion. We need new laws and mandates to achieve that goal.
