The Guardian (USA)

Finding it hard to get a new job? Robot recruiters might be to blame

- Hilke Schellmann

Martin Burch had been working for the Wall Street Journal and its parent company Dow Jones for a few years and was looking for new opportunities. One Sunday in May 2021, he applied for a data analyst position at Bloomberg in London that looked like the perfect fit. He received an immediate response, asking him to take a digital assessment.

It was strange. The assessment showed him different shapes and asked him to figure out the pattern. He started feeling incredulous. “Shouldn’t we be testing my abilities on the job?” he asked himself.

The next day, a Monday, which happened to be a public holiday in the UK, he got a rejection email. He decided to email a recruiter at Bloomberg. Maybe the company made a mistake?

What Burch discovered offers insight into a larger phenomenon that is baffling experts: at a time of record job openings in both the UK and the US, why do many people still have to apply to sometimes hundreds of jobs, even in sought-after fields like software development, while many companies complain they can’t find the right talent?

Some experts argue that algorithms and artificial intelligence now used extensively in hiring are playing a role. This is a huge shift, because until relatively recently, most hiring managers would handle applications and resumes themselves. Yet recent findings have shown that some of these new tools discriminate against women and use criteria unrelated to work to “predict” job success.

While companies and vendors are not required to disclose if they use artificial intelligence or algorithms to select and hire job applicants, in my reporting I have learned that this is widespread. All the leading job platforms – including LinkedIn, ZipRecruiter, Indeed, CareerBuilder, and Monster – have told me they deploy some of these technologies.

Ian Siegel, the CEO of ZipRecruiter, said that artificial intelligence and algorithms have already conquered the field. He estimates that at least three-quarters of all resumes submitted for jobs in the US are read by algorithms. “The dawn of robot recruiting has come and went and people just haven’t caught up to the realization yet,” he said.

A 2021 survey of recruiting executives by the research and consulting firm Gartner found that almost all reported using AI for at least one part of the recruiting and hiring process.

Yet it is not foolproof. One of the most consequential findings comes from Harvard Business School professor Joe Fuller, whose team surveyed more than 2,250 business leaders in the US, UK and Germany. Their motives for using algorithmic tools were efficiency and saving costs. Yet 88% of executives said that they know their tools reject qualified candidates.

Despite the prevalence of the technology, there have been just a few famous cases of misfires. A few years back, Amazon discovered that its resume screening tool was biased against women. The algorithm was trained on resumes of current employees, who skewed male, reflecting a gender disparity in many tech fields. Over time, the tool picked up on male preferences and systematically downgraded people with the word “women” on their resumes, as in “women’s chess club” or “women’s soccer team.” Amazon’s engineers tried to fix the problem, but they couldn’t, and the company discontinued the tool in 2018.

“This project was only ever explored on a trial basis, and was always used with human supervision,” said Amazon spokesperson Brad Glasser.

AI vendors that build these kinds of technologies say that algorithm-based tools democratize the hiring process by giving everyone a fair chance. When a company is drowning in applications, human recruiters may read only a fraction of them. An AI analyzes all of them, along with any assessments, and judges every candidate the same way.

Another benefit, these vendors say, is if employers choose to focus on skills and not on educational achievements like college degrees, applicants from diverse backgrounds who are often overlooked can get to the next stage of the process.

“At the end of the day, we don’t want people to be hired into roles that are going to drain them and not utilize their strengths. And so it’s really not about rejecting people, it’s about ‘screening in’ the right people,” said Caitlin MacGregor, CEO of Plum, which built the assessment Burch found so puzzling. MacGregor said the company’s clients have increased their diversity and retention rates since they started to use Plum. She said the assessments helped home in on applicants’ “potential”.

But job candidates who have the necessary experience worry they’re being unfairly weeded out when companies focus on elusive factors like potential or personality traits.

“This was the first time in my life, in my career, where I was sending out resumes and there was nothing,” said Javier Alvarez, 57, a distribution and sales manager from Monrovia, California, who sent out his resume more than 300 times on sites like LinkedIn and Indeed for jobs he said he was qualified for. No job offer materialized, and he started to wonder if he was being automatically excluded in some way – perhaps because of his age or salary requirements. “I felt hopeless. I started to doubt my abilities.”

Ronnie Riley, a 29-year-old event planner from Canada, had a gap of several years in their resume because of an illness. Riley applied to more than 100 event planning jobs and some administrative assistant positions in December 2021, and over 70 jobs in January, but ended up with a total of five interviews and no job offers. They worry the gap is the reason. “It just seems it’s discounting a whole bunch of people that could be perfect for the job,” they said.

Fuller’s research has helped provide answers to how exactly automatic rejections occur. One reason, he found, is that job descriptions too often include too many criteria and skills. Many employers add new skills and criteria to existing job descriptions, building a long list of requirements. Algorithms end up rejecting many qualified applicants who may be missing just a couple of skills from the list.
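As a rough illustration (not any vendor’s actual code), a screener built this way reduces to a hard threshold: count how many of the listed skills a resume contains and reject anything that falls short. The skill list, threshold and function names below are hypothetical.

```python
# Hypothetical sketch of the hard skill-matching filter Fuller's research
# describes: a long requirements list, and rejection for anyone missing
# more than a couple of entries. Names and threshold are illustrative only.

REQUIRED_SKILLS = {
    "sql", "python", "tableau", "excel", "statistics",
    "data modeling", "etl", "stakeholder management",
}

def passes_skill_filter(resume_skills: set[str], max_missing: int = 1) -> bool:
    """Reject any candidate lacking more than `max_missing` listed skills."""
    missing = REQUIRED_SKILLS - {s.lower() for s in resume_skills}
    return len(missing) <= max_missing

# A strong candidate missing just two of the eight requirements is rejected:
candidate = {"SQL", "Python", "Tableau", "Excel", "Statistics", "ETL"}
print(passes_skill_filter(candidate))  # False
```

The longer the requirements list grows, the more otherwise qualified people fall below the cutoff.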

One executive Fuller spoke with said their company’s tool had been rejecting qualified candidates because they scored low in one important category, even when they got a near-perfect score in all the other important categories. The company found that it was left with job applicants who received mediocre scores across the board. (Longer job descriptions may also deter more female applicants, Fuller believes, since many women apply to jobs only when they fulfill most of the requirements.)

Another reason qualified candidates are rejected by automated systems is so-called knockout criteria. In Fuller’s research, almost 50% of the executives surveyed acknowledged that their automated systems outright reject job applicants whose resumes show a work gap longer than six months. These applicants never get in front of a hiring manager, even if they are the most qualified candidates for the job.

“The six-month gap is a really insidious filter,” said Fuller, since it’s probably built on the assumption that a gap signifies something ominous, when it may simply represent military deployments, pregnancy complications, caregiving obligations or illness.
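A knockout rule like this amounts to a few lines of logic that run before any human sees the application. The sketch below is a hypothetical reconstruction of such a filter, assuming employment history arrives as dated spans; the six-month cutoff mirrors the one Fuller’s executives described.

```python
from datetime import date

# Hypothetical reconstruction of a "knockout" gap filter: any gap longer
# than roughly six months between consecutive jobs rejects the resume
# outright, regardless of the reason for the gap.

MAX_GAP_DAYS = 183  # roughly six months

def has_knockout_gap(employment: list[tuple[date, date]]) -> bool:
    """Return True if any gap between consecutive jobs exceeds the limit."""
    spans = sorted(employment)  # order jobs by start date
    for (_, prev_end), (next_start, _) in zip(spans, spans[1:]):
        if (next_start - prev_end).days > MAX_GAP_DAYS:
            return True
    return False

# A nine-month gap, e.g. for an illness, knocks the candidate out:
history = [(date(2017, 1, 1), date(2019, 6, 30)),
           (date(2020, 4, 1), date(2021, 12, 1))]
print(has_knockout_gap(history))  # True
```

Nothing in the logic asks why the gap exists, which is precisely Fuller’s objection.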

Experts contacted by the Guardian also described automatic resume screeners making mistakes similar to the infamous Amazon example, rooted in learning biases from an existing dataset. This hints at how these programs could end up enforcing the kinds of racial and gender biases observed with other AI tools, such as facial recognition tech and algorithms used in health care.

John Scott is the chief operating officer of APMetrics, an organization that helps companies identify talent, and is often brought in by larger companies to check whether new technologies the company wants to buy from a vendor are fair and legal. Scott has examined multiple resume screeners and recruiting tools and discovered problems in all of them. He found biased criteria unrelated to work, such as the name Thomas and the keyword “church”, used to “predict” success in a job.

Mark Girouard, an employment lawyer in Minneapolis, found that the name Jared and having played lacrosse in high school were used as predictors of success in one system.

Martin Burch, the London jobseeker, discovered he had been weeded out in a different way.

He contacted a human recruiter at Bloomberg and asked her to look at his CV. His experience lined up with the job description, and Bloomberg was a direct competitor of his employer, making his background all the more valuable, he thought. But the problem turned out to be the pattern-finding and personality test he had taken, which was created by Plum.

A recruiter at Bloomberg replied: “I can see that your applicatio­n was rejected due to not meeting our benchmark in the Plum assessment that you completed. Unfortunat­ely on that basis we are not able to take your applicatio­n any further.” Burch felt stunned that he had indeed been rejected by a piece of code.

He retained a lawyer, and in communicat­ions with Bloomberg asked for a human review of his applicatio­n.

Bloomberg informed Burch that the role he applied for was no longer available and he wouldn’t be able to be considered for it.

Bloomberg did not return emails and calls asking for comment.

As adoption of AI tools in hiring expands, lawmakers are starting to take a closer look. In the UK, the government is planning new regulation of algorithmic decision making. In the US, a recent local law requires employers to inform job seekers, upon request, how their application materials are screened by AI. And congressional lawmakers have introduced bills that would regulate AI in hiring at a national level, including the Algorithmic Accountability Act of 2022, but have faced hurdles getting them passed.

Burch decided to file an official claim with the Information Commissioner’s Office, an independent organization that upholds privacy laws in the UK. In February the office reprimanded Bloomberg, writing: “From reviewing the information provided, it is our decision that there is more work for you to do. As such, we now expect you to take steps to address any outstanding issues with the individual.”

Burch has since accepted £8,000 ($9,864) in compensation from the company. He says he also fought to demonstrate a point: “I am trying to prove to them that it’s probably weeding out good candidates so they should probably stop using it.”

Plum’s CEO Caitlin MacGregor declined to comment on Burch’s case directly, citing privacy concerns, but she stands behind her product: “I should not be interviewing somebody that is a 35, regardless of how much experience they have. There is somewhere else that they are going to be their own 95 [percent] match.”

How to write a resume in the age of AI

Instead of trying to stand out, make your resume machine-readable: no images, no special characters such as ampersands or tildes. Use the most common template. Use short, crisp sentences – declarative and quantitative, said Ian Siegel, CEO of the job platform ZipRecruiter.

List licenses and certifications on your resume.

Make sure your resume matches the keywords in the job description, and use online resume scanners to compare the two and see if you are a match for the role.

For entry-level and administrative jobs, consider stating that you are competent in Microsoft Office suite applications even if it’s not in the job description, said Harvard Business School professor Joe Fuller.
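As a rough sketch of the kind of keyword overlap those online scanners check, here is a hypothetical comparison function; real tools are more sophisticated, and the tokenizer, threshold and example strings below are assumptions for illustration.

```python
import re

# Hypothetical illustration of keyword matching between a resume and a job
# description, in the spirit of the advice above. Real scanners weight
# terms and handle synonyms; this simple overlap score is an assumption.

def keywords(text: str) -> set[str]:
    """Lowercase word tokens, ignoring very short filler words."""
    return {w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 3}

def match_score(resume: str, job_description: str) -> float:
    """Fraction of the job description's keywords found in the resume."""
    wanted = keywords(job_description)
    return len(wanted & keywords(resume)) / len(wanted) if wanted else 0.0

job = "Seeking analyst with Python, Tableau and strong communication skills"
resume = "Data analyst: Python dashboards in Tableau, clear communication"
print(f"{match_score(resume, job):.0%}")  # 50%
```

Echoing the job description’s exact wording, as Siegel suggests, is what pushes a score like this upward.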

Supporters say that algorithm-based tools democratize the hiring process by giving everyone a fair chance. Photograph: Ivan Chiosea/Alamy
