Hartford Courant

Facebook, US in deal to stop discriminatory housing ads

By Larry Neumeister

NEW YORK — Facebook will change its algorithms to prevent discriminatory housing advertising, and its parent company will subject itself to court oversight to settle a lawsuit brought by the U.S. Department of Justice.

In a news release this week, U.S. government officials said Meta Platforms Inc., formerly known as Facebook Inc., reached an agreement to settle the lawsuit, which was filed the same day in Manhattan federal court.

According to the release, it was the Justice Department’s first case challenging algorithmic discrimination under the Fair Housing Act. Facebook will now be subject to Justice Department approval and court oversight for its ad targeting and delivery system.

U.S. Attorney Damian Williams called the lawsuit “groundbreaking.” Assistant Attorney General Kristen Clarke called it “historic.”

Ashley Settle, a Facebook spokesperson, said in an email that the company was “building a novel machine learning method within our ads system that will change the way housing ads are delivered to people residing in the U.S. across different demographic groups.”

She said the company would extend its new method for ads related to employment and credit in the U.S.

Williams said Facebook’s technology has in the past violated the Fair Housing Act online “just as when companies engage in discriminatory advertising using more traditional advertising methods.”

Clarke said “companies like Meta have a responsibility to ensure their algorithmic tools are not used in a discriminatory manner.”

According to the terms of the settlement, Facebook will stop using an advertising tool for housing ads that the government said employed a discriminatory algorithm to locate users who “look like” other users based on characteristics protected by the Fair Housing Act. By Dec. 31, Facebook must stop using the tool, once called “Lookalike Audience,” which relies on an algorithm that the U.S. said discriminates on the basis of race, sex and other characteristics.

Facebook also will develop a new system over the next half-year to address racial and other disparities caused by its use of personalization algorithms in its delivery system for housing ads, it said.

If the new system is inadequate, the settlement agreement can be terminated, the Justice Department said. Per the settlement, Meta also must pay a penalty of just over $115,000.

The announcement comes after Facebook agreed in March 2019 to overhaul its ad-targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement with a group including the American Civil Liberties Union, the National Fair Housing Alliance and others.

The changes announced then were designed so advertisers who wanted to run housing, employment or credit ads would no longer be allowed to target people by their age, gender or ZIP code.

Photo: Facebook parent company Meta agreed Tuesday to court oversight of the social media giant’s ad-targeting system due to discrimination concerns. (NOAH BERGER/GETTY-AFP 2021)
