Assembly bill targets automated racial bias in housing decisions
Bay Area homebuyers face many challenges, but communities of color often confront an additional hurdle: racially biased computer algorithms.
A new study by the Greenlining Institute found that automated decisions significantly affect minority renters and homebuyers, from higher mortgage rates to rejected rental applications.
A new measure, Assembly Bill 13, calls for transparency and a review of the algorithms businesses use for a broad range of life-changing decisions, from hiring and promotions to loan applications and hospital treatment. The proposal would require companies to identify and try to correct biases against minority groups, and it would establish a state task force to advise businesses.
“Everywhere, they’re being used and used in ways that can affect wealth,” said Vinhcent Le of the Greenlining Institute, a social justice and public policy center.
The Bay Area’s persistent housing shortage has helped drive up home prices and lift home values for owners, while lower- and middle-class workers have seen more of their wages consumed by rent.
The competition for Bay Area housing and the corresponding stress on family budgets have been intense. At least 4 in 10 renter households in Santa Clara, San Mateo, Alameda and Contra Costa counties pay more than the recommended 35% of their family incomes toward housing, according to a 2019 study by the Bay Area Council.
Housing and equality advocates say biased algorithms make home searches more difficult for members of vulnerable communities.
The growing concern over the hidden decision-making of computers has led to several studies and efforts to shed light on how they work. The typical algorithm draws on many data points — age, ZIP code, income and any number of details gleaned from online user data — and weighs them in its decisions.
Many companies use software to predict outcomes and make decisions on rental applications, mortgages and loan terms. One study cited by the Greenlining Institute estimated that 90% of landlords use machine-learning programs to screen applicants.
Algorithms also are used to determine federal funding for communities, sort resumes, influence hiring and firing decisions, and determine levels of hospital care. Artificial intelligence has been used to direct police patrols and identify potential hot spots, leading to criticism that machine learning simply reinforces biases against minority communities.
Bad data selection, the underrepresentation of minority communities, and the implicit biases of the programmers who design the algorithms can all lead automated systems to work against certain communities, Le said. “It’s human decision-making that causes a lot of these computerized biases,” he said.
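For readers curious about the mechanics, the following is a minimal, hypothetical Python sketch — not drawn from the bill, the Greenlining study, or any real lender — of how this can happen. The data, the “redlined ZIP code” flag and the numbers are all invented for illustration; the point is that a model trained on biased historical approvals can penalize a neighborhood even though race is never one of its inputs.

```python
# Hypothetical sketch of "proxy discrimination": a lending model never sees
# race, but a ZIP-code feature correlated with race reproduces past bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic applicants: income (in $1,000s) and a flag for living in a
# historically redlined ZIP code. The flag says nothing about ability to pay.
income = rng.normal(70, 20, n)
redlined_zip = rng.random(n) < 0.3

# Simulated *historical* approvals encode human bias: at identical incomes,
# past underwriters denied applicants from redlined ZIP codes more often.
logit = 0.08 * (income - 60) - 1.5 * redlined_zip
approved = rng.random(n) < 1 / (1 + np.exp(-logit))

# Train on that biased history. Race is never a feature -- only income and ZIP.
X = np.column_stack([income, redlined_zip.astype(float)])
model = LogisticRegression().fit(X, approved)

# Score two applicants with the SAME income; only the ZIP flag differs.
same_income = np.array([[70.0, 0.0], [70.0, 1.0]])
p_outside, p_inside = model.predict_proba(same_income)[:, 1]
print(f"approval odds outside redlined ZIP: {p_outside:.2f}")
print(f"approval odds inside redlined ZIP:  {p_inside:.2f}")
# The model "learns" the old pattern and scores the redlined-ZIP applicant
# lower -- an illustration of Le's point that human decisions drive
# computerized bias.
```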