Techlife News



Facebook is showing different job ads to women and men in a way that might run afoul of anti-discrimination laws, according to a new study. University of Southern California researchers who examined the ad-delivery algorithms of Facebook and LinkedIn found that Facebook’s were skewed by gender beyond what can be legally justified by differences in job qualifications.

Men were more likely to see Domino’s pizza delivery driver job ads on Facebook, while women were more likely to see Instacart shopper ads.

The trend also held in higher-paying engineering jobs at tech firms like Netflix and chipmaker Nvidia. A higher fraction of women saw the Netflix ads than the Nvidia ads, which parallels the gender breakdown in each company’s workforce.

The researchers found no evidence of similar bias in the job ads delivered by LinkedIn.

Study author Aleksandra Korolova, an assistant professor of computer science at USC, said it might be that LinkedIn is doing a better job at deliberately tamping down bias, or it might be that Facebook is simply better at picking up real-world cues about gender imbalances from its users and perpetuating them.

“It’s not that the user is saying, ‘Oh, I’m interested in this.’ Facebook has decided on behalf of the user whether they are likely to engage,” she said. “And just because historically a certain group wasn’t interested in engaging in something, doesn’t mean they shouldn’t have an opportunity to pursue it, especially in the job category.”

Facebook said in a statement that it has been taking meaningful steps to address discrimination in its ads.

“Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report,” it said.

Facebook promised to overhaul its ad targeting system in 2019 as part of a legal settlement.

The social network said then it would no longer allow housing, employment or credit ads that target people by age, gender or zip code. It also limited other targeting options so these ads don’t exclude people on the basis of race, ethnicity and other legally protected categories in the U.S., including national origin and sexual orientation. Endlessly customizable ad targeting is Facebook’s bread and butter, so any limits placed on that process could hurt the company’s revenue.

The ads users see can be tailored down to the most granular details — not just where people live and what websites they visited recently, but whether they’ve gotten engaged in the past six months or share characteristics with people who have recently bought new sneakers, even if they have never expressed interest in sneakers themselves.

But even if advertisers can’t do the targeting themselves, the study shows what critics have stressed for years: that Facebook’s own algorithms can discriminate, even when job advertisers have no discriminatory intent.

“We haven’t seen any public evidence that they are working on the issues related to their algorithms creating discrimination,” Korolova said.

Since it isn’t possible to show every user every advertisement that is targeted at them, Facebook’s software picks the ones it deems relevant. If more women show interest in certain jobs, the software learns to show women more of those sorts of ads.
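The feedback loop the study describes can be sketched with a deliberately simplified toy model. This is not Facebook’s actual system; the click-through rates and the proportional-to-clicks delivery rule below are invented purely for illustration. The point is that even a small engagement gap compounds: if one group clicks a job ad slightly more often, and the system allocates future impressions in proportion to observed clicks, that group ends up seeing nearly all of the ads.

```python
def next_share(share_women, ctr_women=0.06, ctr_men=0.05):
    """One round of a toy feedback loop: next round's impression share
    for women is proportional to the clicks each group generated this
    round. The click-through rates (CTRs) are invented for illustration."""
    clicks_women = share_women * ctr_women
    clicks_men = (1 - share_women) * ctr_men
    return clicks_women / (clicks_women + clicks_men)

# Start by showing the ad to both groups equally.
share = 0.5
for _ in range(50):
    share = next_share(share)

# A 0.06 vs. 0.05 CTR gap is enough: after 50 rounds, almost all
# impressions go to one group.
print(round(share, 4))
```

Real ad-delivery systems are far more complex, but this amplification dynamic is what the researchers argue needs auditing: the skew emerges from the optimization itself, with no discriminatory targeting by the advertiser.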

LinkedIn said the study’s findings align with its internal review of job-ad targeting.

“However, we recognize that systemic change takes time, and we are at the beginning of a very long journey,” the company said in a statement.

U.S. laws allow ads to be targeted based on qualifications but not on protected categories such as race, gender and age. But anti-discrimination laws are largely complaint-driven, and no one can complain about being deprived of a job opportunity if they didn’t know it happened to them, said Sandra Wachter, a professor at Oxford University focused on technology law.

“The tools we have developed to prevent discrimination had a human perpetrator in mind,” said Wachter, who was not involved in the USC study. “An algorithm is discriminating very differently, grouping people differently and doing it in a very subtle way. Algorithms discriminate behind your back, basically.”

While Domino’s and Instacart have similar job requirements for their drivers, Domino’s delivery workforce is predominantly male, while Instacart’s is more than half female. The study, which compared driver ads run in North Carolina against demographic data from voter records, found that Facebook’s algorithms appeared to be learning from those gender disparities and perpetuating them.

The same trend also occurred with sales jobs at retailer Reeds Jewelers, which more women saw, and the Leith Automotive dealership, which more men saw.

The researchers call for more rigorous auditing of such algorithms and for examination of other factors, such as racial bias. Korolova said external audits like the USC study can only do so much without access to Facebook’s proprietary algorithms, but regulators could require some form of independent review to check for discrimination.

“We’ve seen that platforms are not so good at self-policing their algorithms for undesired societal consequences, especially when their business is at stake,” she said.

