The Denver Post

The end of the résumé?

Hiring is in the midst of a technological revolution with algorithms, chatbots

By Alexia Elejalde-Ruiz

The last time Chuck Blatt searched for a job, about 10 years ago, he relied on a thoughtful cover letter, a résumé printed on nice paper and good rapport during a face-to-face interview.

Now, he said, “that is all out the window.”

Since Blatt, 50, left his job as vice president of a painting and construction company in March, he’s spent nearly every day in front of the computer in his Chicago home applying for jobs via automated processes.

He uploads his job history with the click of a button. He records videos of himself answering automated interview questions. He takes the lengthy online personality tests employers use to screen candidates.

Blatt, who is seeking a marketing position, says technology makes it easier to apply for more jobs. But other parts of the high-tech hiring process leave him uneasy.

“I have been turned down for positions that I thought I would be perfect for,” Blatt said, and it is often impossible to know why. “There is no feedback because there is no one to talk to.”

Technology is transforming hiring, as employers inundated with applications turn to sophisticated tools to recruit and screen job candidates. Many companies save time with video interviews or résumé filters that scan for keywords, and those at the leading edge are using artificial intelligence in a variety of ways: chatbots that schedule interviews and answer applicant questions; web crawlers that scour mountains of data to find candidates who aren’t actively job hunting; and algorithms that analyze existing employee data to predict an applicant’s future success.
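At the simplest end of that spectrum, a keyword résumé filter can amount to little more than a word count. The Python sketch below is a hypothetical illustration, not any vendor’s actual product; the keywords and threshold are invented:

```python
import re

# Hypothetical illustration of a keyword resume filter; real
# applicant-tracking systems are more elaborate, but the principle
# of counting matches against a required list is the same.
REQUIRED_KEYWORDS = {"marketing", "seo", "analytics", "campaign"}
MIN_MATCHES = 3  # resumes matching fewer keywords are rejected

def keyword_score(resume_text: str) -> int:
    """Count how many required keywords appear in the resume."""
    words = set(re.findall(r"[a-z]+", resume_text.lower()))
    return len(REQUIRED_KEYWORDS & words)

def passes_filter(resume_text: str) -> bool:
    return keyword_score(resume_text) >= MIN_MATCHES

resume = "Led digital marketing campaign strategy and SEO analytics."
print(passes_filter(resume))  # True: all four keywords matched
```

A candidate who describes identical experience in different words scores zero, and the filter records a count rather than a reason, which is part of why rejected applicants like Blatt get no feedback.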

Advocates of AI-enhanced hiring claim it reduces turnover by bringing on candidates who are a better fit. They also say a data-driven approach removes bias inherent in human decision-makers who, for example, might favor candidates who graduated from their alma mater.

But critics warn of the opposite effect: that some applicants could be unfairly weeded out.

Cathy O’Neil, a mathematician and author of the 2016 book “Weapons of Math Destruction,” worries that algorithms developed to predict whether an applicant will be a good fit, based on the types of employees who have been successful before, could perpetuate implicit biases.

“If in the past you promoted tall white men or people who came from Harvard, that will come through in the algorithm,” O’Neil said. “Algorithms just look for patterns.”
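O’Neil’s point can be made concrete with a toy example. In the Python sketch below, the data and the “model” are invented: the model simply replays historical promotion rates, so a skew in who was promoted in the past becomes a skew in who is recommended now.

```python
# Invented historical data: (school, was_promoted). The labels record
# who was promoted in the past, not who was actually capable.
past_hires = [
    ("harvard", True), ("harvard", True), ("harvard", True),
    ("state_u", False), ("state_u", True), ("state_u", False),
]

def predicted_fit(school: str) -> float:
    """'Predict' fit by replaying the historical promotion rate."""
    outcomes = [promoted for s, promoted in past_hires if s == school]
    return sum(outcomes) / len(outcomes)

for school in ("harvard", "state_u"):
    print(school, round(predicted_fit(school), 2))
# harvard 1.0
# state_u 0.33  -> otherwise identical candidates score differently
```

A real model is far more sophisticated, but if school, or any proxy for it, correlates with past promotions, the same pattern comes through.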

The scoring is invisible, so even human resources departments don’t know why an applicant might have been rejected, making it difficult for anyone to challenge the process, she said.

There is also concern that algorithms and filters could quietly screen older people out, although that’s a concern with human recruiters as well. Blatt said that he has removed his college graduation date from his LinkedIn profile, plus all of his experience from the 1990s, so as not to advertise his age.

Much of the technology used in the hiring process shows great promise for helping employers cut costs associated with high turnover, said Natalie Pierce, co-chair of the Robotics, AI and Automation Industry Group at Littler Mendelson, a law firm that represents management. One client, a department store that couldn’t retain cosmetics department employees, discovered through analytics that it had mistakenly assumed that hiring gregarious employees would lead to greater sales, when in fact the best salespeople were problem-solvers who invested time helping customers.

By changing the type of person it hired, the store was “greatly able to reduce training costs and attrition and increase the amount of commissions going to employees,” Pierce said.

But employers have to be careful. Algorithms designed to identify candidates similar to current high performers could screen out groups of people who are protected by anti-discrimination laws.

At a public meeting held by the Equal Employment Opportunity Commission to discuss the issue in 2016, a chief analyst at the federal agency described how an algorithm might find patterns of absences among employees with disabilities.

Even if the algorithm does not intentionally screen out people with disabilities, the impact could be discriminatory and therefore violate federal law, said Barry Hartstein, co-chair of Littler’s diversity practice.

“This is an area that the regulators are recognizing is the wave of the future,” he said.
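One conventional test for the kind of disparate impact Hartstein describes is the EEOC’s four-fifths rule of thumb: a selection rate for any group below 80 percent of the highest group’s rate is generally treated as evidence of adverse impact. The Python sketch below applies that check to invented screening numbers:

```python
# Adverse-impact check using the EEOC's four-fifths rule of thumb.
# The applicant and selection counts are invented for illustration.
applicants = {"under_40": 200, "over_40": 100}  # who applied
selected   = {"under_40": 60,  "over_40": 15}   # who passed the screen

rates = {g: selected[g] / applicants[g] for g in applicants}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    status = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: rate {rate:.0%}, ratio {ratio:.2f} -> {status}")
# under_40: rate 30%, ratio 1.00 -> ok
# over_40: rate 15%, ratio 0.50 -> possible adverse impact
```

The arithmetic is trivial; the difficulty is that applicants almost never see these numbers.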

The government has not filed any lawsuits based on an employer’s use of high-tech screening tools or algorithms, said Carol Miaskoff, associate legal counsel at the EEOC. But the agency is watching the trend, and employers need to be aware if the tech tools they use to hire and promote are prone to discrimination, she said.

Proving hiring discrimination is difficult because applicants rarely know for sure why they didn’t get the job, and deconstructing an algorithm presents an additional challenge, Miaskoff said. But an indicator could be the composition of the employee group used to train the algorithm, she said.

“It should be carefully constructed so that it is diverse by gender, race, age and disability,” she said.

The potential legal issues echo concerns about the growing popularity of personality tests, which have come under fire for potentially disadvantaging people with mental health issues.

Attorney Roland Behm, whose son, Kyle, was denied jobs after completing personality tests, filed charges with the EEOC against several national retailers. He said such tests are part of the same trend of using analytics to make hiring more efficient.

“More and more goes on behind the curtain,” Behm said. “From an employee perspective, you don’t know if what’s happening is appropriate or legal because you don’t know those things are happening.”

Kyle Behm, at the time an engineering student who had been diagnosed with bipolar disorder, learned that he had failed a personality test at a grocery store in 2012 only because a friend who worked there told him.

Some tech firms offering AI-enhanced recruiting services say they explicitly clean their data of bias.

Pymetrics creates custom recruitment algorithms based on how top employees at each client company score on online games that measure 90 different traits, such as attention or altruism. Applicants play the online games and are evaluated based on how they score on the desired qualities.
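Pymetrics does not publish its method, but the general approach it describes, scoring applicants against a profile built from top employees, can be sketched generically. Everything in the Python below, from the trait names to the numbers, is invented:

```python
import math

# Invented trait scores for a client's top employees:
# (attention, altruism, risk tolerance), each on a 0-1 scale.
top_employees = [
    [0.90, 0.60, 0.40],
    [0.80, 0.70, 0.50],
    [0.85, 0.65, 0.45],
]
# Average the top performers into a single target profile.
target = [sum(col) / len(col) for col in zip(*top_employees)]

def cosine_similarity(a, b):
    """Similarity between two trait vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

applicant = [0.70, 0.80, 0.30]  # scores from the applicant's games
print(round(cosine_similarity(applicant, target), 3))  # ~0.975
```

Note that this inherits exactly the risk O’Neil describes: whatever made the top employees “top” in the past, including any bias in who got those jobs, is baked into the target profile, which is why firms in this space say they scrub their data of bias.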

The biggest reason companies use Pymetrics is to improve the fit and diversity of their teams, CEO Frida Polli said.

Chuck Blatt, left, of Chicago, meets in June with other job-seekers during a networking and support group for executives that meets Monday mornings at JVS, a counseling center in Skokie, Ill. Blatt, 50, is seeking employment through a variety of online recruitment tools as well as networking and traditional, face-to-face interviews. (Chris Walker, Chicago Tribune)
