New York Daily News

MUST NIX RACIST ADS

Feds force Facebook to end bias in how it markets homes

- BY MOLLY CRANE-NEWMAN

It didn’t matter if you were Black or white — unless you were looking for a new apartment on Facebook.

What users looked like helped determine whether or not Facebook showed them ads for available housing until at least 2019, Justice Department officials said Tuesday in announcing a first-of-its-kind settlement with the social network’s parent company for running a biased ad-delivery algorithm.

Manhattan U.S. Attorney Damian Williams said the Justice Department had reached an unprecedented lawsuit settlement with Facebook’s parent, Meta Platforms, Inc., that will require the company to revamp technology that blocked ads from reaching users based on their race, gender, zip code and other characteristics.

“Because of this groundbreaking lawsuit, Meta will — for the first time — change its ad delivery system to address algorithmic discrimination,” said Williams.

If Meta doesn’t change the system to the Justice Department’s satisfaction, Williams said, his office will “proceed with the litigation.”

Meta has until December to abandon its housing ad system and create a new one that’s not racist, sexist or classist and whose technology the government must approve before implementation.

The new algorithm must incorporate a self-policing element, and Meta must agree to submit to ongoing reviews.

Mark Zuckerberg’s company also agreed to pay a civil penalty of $115,054, the maximum allowed by law.

In a blog post, Roy Austin, Meta’s deputy general counsel, said the company’s new “variance reduction system” technology will ensure users are not discriminated against along racial lines or other characteristics protected by the 1968 Fair Housing Act.

Austin said the algorithm will also be used to ensure that ads related to employment and credit reach everyone who wants to see them.

The settlement resolves a lawsuit filed Tuesday that grew out of a discrimination charge and civil suit brought against Facebook by the Department of Housing and Urban Development in March 2019.

According to the complaint, Facebook collects data on its users’ appearances in myriad ways.

One is its popular tool inviting people to create their own cartoon-like “avatar” — which helped the algorithm collect information about users’ race.

After users input details like skin color, eye, nose and lip shapes and hairstyle, the site prompts them to open the selfie camera to determine the closest-matching facial features.

“This information concerning the user’s physical appearance becomes part of Facebook’s enormous set of user data,” reads the complaint.

By excluding people from seeing ads based on their race, gender and other characteristics, Facebook violated the Fair Housing Act, the feds say.

Self-designed tools called “Lookalike Audience” and “Special Ad Audience” were intended to help businesses broaden the number of people who saw their ads, but in fact excluded people based on gender and race, officials said.

Under the settlement, Meta will discontinue both tools.

Demetria McCain, principal deputy assistant secretary of the Department of Housing and Urban Development, said companies like Facebook play as important a role as housing providers in the modern age.

“Parties who discriminate in the housing market, including those engaging in algorithmic bias, must be held accountable,” said McCain. “This type of behavior hurts us all.”

The lawsuit is the DOJ’s first challenging discrimination by an algorithm under the FHA, which prohibits discrimination based on race, gender, religion and other characteristics when renting, selling or financing housing.

Facebook, run by Mark Zuckerberg (left), asked users to make an avatar (above) that looked most like themselves.
