Houston Chronicle

Offers on lines of credit spark inquiry

By Taylor Telford

What started with a viral Twitter thread metastasized into a regulatory investigation of Goldman Sachs’ credit card practices after a prominent software developer called attention to differences in Apple Card credit lines for male and female customers.

David Heinemeier Hansson, a Danish entrepreneur and developer, said in a series of tweets last week that his wife, Jamie Hansson, was denied a credit line increase for the Apple Card despite having a higher credit score than he does.

“My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does,” Hansson tweeted.

Hansson detailed the couple’s efforts to bring up the issue with Apple’s customer service, which resulted in a formal internal complaint. Representatives repeatedly assured the couple that there was no discrimination, citing the algorithm that makes Apple Card’s credit assessments. Jamie Hansson’s credit limit was ultimately bumped up to equal his, but he said this failed to address the root of the problem.

Hansson’s tweets caught the attention of Linda Lacewell, superintendent of the New York State Department of Financial Services, who announced Saturday that her office would investigate the Apple Card algorithm over claims of discrimination.

“This is not just about looking into one algorithm,” she wrote in a Medium post. “DFS wants to work with the tech community to make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate.”

With the spread of automation, more and more decisions about our lives are made by computers, from credit approval to medical care to hiring choices. The algorithms — formulas for processing information or completing tasks — that make these judgments are programmed by people and thus often reproduce human biases, unintentionally or otherwise, resulting in less favorable outcomes for women and people of color.

Past versions of Google Translate have struggled with gender bias in translations. Amazon was forced to jettison an experimental recruiting tool in 2017 that used artificial intelligence to score candidates: because it had been trained mostly on men’s resumes, the algorithm penalized resumes that included the word “women’s” and downgraded candidates who attended women’s colleges. A study published last month in Science found that racial bias in a widely used health care risk-prediction algorithm made black patients significantly less likely than white patients to get important medical treatment.

“It does not matter what the intent of the individual Apple reps are, it matters what the algorithm they’ve placed their complete faith in does,” Hansson tweeted. “And what it does is discriminate.”

Dozens of people shared similar experiences after Hansson’s tweets went viral.

“In all cases, we have not and will not make decisions based on factors like gender,” Goldman Sachs spokesman Andrew Williams said in a statement.
