The Guardian (USA)

Job-hunting is stressful and humiliating enough. Now robots judge our résumés

- Jessa Crispin

I have been looking for a full-time job for over a year now. I have applied for jobs I am qualified for, jobs I am overqualified for, jobs in my city and jobs that would require a lengthy commute, jobs I would like to do and jobs I’m certain I would absolutely hate. I have applied to jobs in my field, jobs one field over, jobs I have never considered until now, jobs recommended by friends, jobs recommended by the website that has my information on file, jobs I could do in my sleep. In all this time, I have not even made it as far as the interview process.

While a few of these jobs simply ask for a résumé and cover letter, mostly when I’ve been applying for work I’m asked to enter my information into a form on a website. I must choose from a drop-down menu my education level; I must type out exactly my work timeline with precise dates. If I’m emailing someone a résumé, I figure I might have a shot. With the forms, however, I know it’s hopeless even as I’m doing it. My résumé will be sorted out and rejected before anyone even takes a look at it, for one simple reason: I did not graduate college.

Algorithms are increasingly used by employers and headhunting firms to find the “best” and most qualified candidates. Before your potential future employer even has a chance to see your application for a job opening, there is a good chance your application has been rejected by a computer for specific criteria and will never be seen by a person. Some of these algorithms were put in place to try to break through human unconscious bias – to give a better shot to people with names that do not scream “white man”, or to address the problem of thin, attractive people doing better in job searches than those who do not meet conventional beauty standards.

Employers like these sorting applications, then, because it gives them the sheen of pure objectivity. Opportunities are simply offered to the most qualified. How can a computer be prejudiced? It would probably not surprise you to learn, however, that algorithms, which are created by humans, also recreate human bias. The working class; single mothers; people with chronic health issues; people who have spent time in prison or rehab facilities – all are more likely to have gaps in their work history. And while there are countless websites that offer tips on how to explain those gaps or overcome a lack of references or credentials during an interview, that explanation doesn’t matter if you can’t even get your application or résumé in front of a human. And because many of these processes are not transparent, it can be difficult to challenge the algorithm’s assessment or even know what part of your application is setting off the rejection.

These changes also affect those looking for work that is generally understood to be in demand. The New York Times recently ran a story on doctors who couldn’t find jobs, even during a pandemic. Many could not get interviews, despite applying for dozens of positions, due to “gaps” in their applications. Applicants were rejected by algorithms for things like taking too long to complete their education, or being out of work for too long. The reasons for those deficiencies in their résumés were pretty predictable, from caretaking responsibilities to financial concerns.

Most of us have moments in our lives that need explanation. There are gaps in our histories, moments when we somehow just slid right out of other people’s and our own expectations for how things were going to go, times when we looked up from the ditch thinking, well, how on Earth did any of this happen? These things leave their mark, not just on our psyches but also on the material world and our reputations via our credit scores, our rental histories, our work timelines, the Google results that come up when someone searches our name.

After you’ve done the hard work of making your way back and repairing the relationships and the deficits you abandoned for your sojourn through the wilderness, it turns out these official histories are the least forgiving.

I went off to college with every expectation that I would graduate and this would lead to the beginning of a coherent and stable employment future. Instead, I got about a year in, and a tangled knot of complications – familial, emotional, financial, etc – fully blocked that path and I left school. I intended to come back, but I frequently found employment with organizations that required me to be there unpredictable hours, including last-minute schedule shifts, making it nearly impossible to balance school and work. So it never happened, and as a result I’ve been without full-time employment for about 15 years, with the precarious financials and work history to match.

The pandemic and related lockdowns have forced a lot of people out of work. The service industry – not the locus of long-term stable employment even in the best of years – has been hit hard, many people with long-haul Covid have had to go on disability benefits, and others have had to take time off work or reduce hours in order to meet caretaking responsibilities for loved ones. The question, when all of this is over, is: will these workers find themselves increasingly disadvantaged when trying to find employment again? Will they be sorted out by some program as unacceptable and unreliable because they took a year off to nurse an ill parent or school their children? Those who were able to hold on to jobs should be understood to be lucky, not better. But try explaining that to a computer program.

Jessa Crispin is a Guardian US columnist


‘Employers like these sorting applications because it gives them the sheen of pure objectivity. But algorithms, which are created by humans, recreate human bias.’ Photograph: UK Stock Images Ltd/Alamy
