Cupertino Courier

Assembly bill targets automated racial bias in housing decisions

By Louis Hansen, lhansen@bayareanewsgroup.com

Bay Area homebuyers face many challenges, but communities of color often face an additional hurdle: racially biased computer algorithms.

A new study by the Greenlining Institute found that automated decisions significantly affect minority renters and homebuyers, from higher mortgage rates to rejected rental applications.

A new measure, Assembly Bill 13, calls for transparency and a review of algorithms used by businesses for a broad range of life-changing decisions, from hiring and promotions to loan applications and hospital treatment. The proposal would require companies to identify and try to correct biases against minority groups and establish a state task force to advise businesses.

“Everywhere, they’re being used and used in ways that can affect wealth,” said Vinhcent Le of the Greenlining Institute, a social justice and public policy center.

The Bay Area’s persistent housing shortage has helped drive up home prices and lift home values for owners, while lower- and middle-class workers have seen more of their wages consumed by rent.

The competition for Bay Area housing and the corresponding stress on family budgets have been intense. At least 4 in 10 renter households in Santa Clara, San Mateo, Alameda and Contra Costa counties pay more than the recommended 35% of their family incomes toward housing, according to a 2019 study by the Bay Area Council.

Housing and equality advocates say biased algorithms make home searches more difficult for members of vulnerable communities.

The growing concern over the hidden decision-making of computers has led to several studies and efforts to shed light on how they work. A typical algorithm draws on many data points, such as age, ZIP code, income and any number of details gleaned from online user data, and factors them into its decisions.

Many companies use software to predict and make decisions on rental applications, mortgages and loan terms. One study cited by the Greenlining Institute estimated that 90% of landlords use machine-learning programs to screen applicants.

Algorithms also are used to determine federal funding for communities, sort resumes, influence hiring and firing decisions, and determine levels of hospital care. Artificial intelligence has been used to direct police patrols and identify potential hot spots, leading to criticism that machine learning simply reinforces biases against minority communities.

Bad data selection that underrepresents minority communities, along with the implicit biases of the programmers designing the algorithms, can lead automated systems to work against certain communities, Le said. “It’s human decision-making that causes a lot of these computerized biases,” he said.
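To make that mechanism concrete, here is a minimal, hypothetical sketch, not drawn from the bill or the Greenlining study: a screening score that never looks at race can still disadvantage one group when it weights a ZIP code whose demographics were shaped by past segregation. The ZIP codes, numbers and scoring rule below are invented for illustration only.

```python
# Illustrative sketch only: a ZIP-code weight acting as a proxy for group
# membership in an automated screening score. All values are hypothetical.
import random

random.seed(0)

# Hypothetical population: group membership correlates with ZIP code,
# and ZIP code correlates with income, reflecting historical segregation.
applicants = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    in_favored_zip = (random.random() < 0.8) == (group == "A")
    zip_code = "95014" if in_favored_zip else "95112"
    income = random.gauss(90_000 if zip_code == "95014" else 60_000, 15_000)
    applicants.append({"group": group, "zip": zip_code, "income": income})

def screening_score(app):
    """Hypothetical scoring rule: never uses race directly, but weights
    the ZIP code, which correlates with group membership."""
    score = 30.0 if app["zip"] == "95014" else 0.0   # neighborhood weight
    score += min(app["income"] / 2_000, 50.0)        # income weight, capped
    return score

def approved(app):
    return screening_score(app) >= 60.0

# Approval rates by group: the ZIP-code weight produces a gap even though
# "group" never appears anywhere in the scoring rule.
for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    rate = sum(approved(a) for a in members) / len(members)
    print(f"group {g}: approval rate {rate:.0%}")
```

Run as written, the sketch reports a much higher approval rate for group A than for group B, which is the kind of disparate outcome the bill would ask companies to identify and try to correct.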
