Los Angeles Times

Check ‘bossware’ tools for bias, U.S. agency head warns


The head of the U.S. agency charged with enforcing civil rights in the workplace says artificial intelligence-driven “bossware” tools that closely track the whereabouts, keystrokes and productivity of workers can also run afoul of discrimination laws.

Charlotte Burrows, chair of the Equal Employment Opportunity Commission, told the Associated Press that the agency is trying to educate employers and technology providers about their use of these surveillance tools as well as AI tools that streamline the work of evaluating job prospects.

And if they aren’t careful with, say, draconian schedule-monitoring algorithms that penalize breaks for pregnant women or Muslims taking time to pray, or if they allow faulty software to screen out graduates of women’s or historically Black colleges, they can’t blame AI when the EEOC comes calling.

“I’m not shy about using our enforcement authority when it’s necessary,” Burrows said. “We want to work with employers, but there’s certainly no exemption to the civil rights laws because you engage in discrimination some high-tech way.”

The federal agency put out its latest set of guidance Thursday on the use of automated systems in employment decisions such as whom to hire or promote. It explains how to interpret a key provision of the Civil Rights Act of 1964 known as Title VII that bars job discrimination based on race, color, national origin, religion or sex, which includes bias against gay, lesbian and transgender workers.

Burrows said one important example involves widely used resume screeners and whether they can produce a biased result if they are based on biased data.

“What will happen is that there’s an algorithm that is looking for patterns that reflect patterns that it’s already familiar with,” she said. “It will be trained on data that comes from its existing employees. And if you have a nondiverse set of employees currently, you’re likely to end up with kicking out people inadvertently who don’t look like your current employees.”

Amazon, for instance, abandoned its own resume-scanning tool to recruit top talent after finding it favored men for technical roles — in part because it was comparing job candidates against the company’s own male-dominated tech workforce.

Other agencies, including the Department of Justice, have been sending similar warnings for the last year, with previous sets of guidance about how some AI tools could discriminate against people with disabilities and violate the Americans with Disabilities Act.

In some cases, the EEOC has taken action. In March, the operator of tech job-search website Dice.com settled with the agency to end an investigation over allegations it was allowing job posters to exclude workers of U.S. national origin in favor of immigrants seeking work visas. To settle the case, the parent company, DHI Group, agreed to rewrite its programming to “scrape” for discriminatory language such as “H-1Bs Only,” a reference to a type of work visa.

Much of the EEOC’s work involves investigating the complaints filed by employees who believe they were discriminated against. And though it’s hard for job applicants to know if a biased hiring tool resulted in them being denied a job, Burrows said there is “generally more awareness” among workers about the tools that are increasingly being used to monitor their productivity.

Those tools have ranged from radio frequency devices that track nurses, to minute-by-minute monitoring of the tightly controlled schedules of warehouse workers and delivery drivers, to tracking keystrokes or mouse clicks as many office employees began working from home during the pandemic. Some might violate civil rights laws, depending on how they’re being used.

Burrows noted that the National Labor Relations Board is also looking at such AI tools.
