Offers on lines of credit spark inquiry
What started with a viral Twitter thread metastasized into a regulatory investigation of Goldman Sachs’ credit card practices after a prominent software developer called attention to differences in Apple Card credit lines for male and female customers.
David Heinemeier Hansson, a Danish entrepreneur and developer, said in a series of tweets last week that his wife, Jamie Hansson, was denied a credit line increase for the Apple Card, despite having a higher credit score than he does.
“My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does,” Hansson tweeted.
Hansson detailed the couple’s efforts to raise the issue with Apple’s customer service, which resulted in a formal internal complaint. Representatives repeatedly assured the couple that there was no discrimination, citing the algorithm that makes Apple Card’s credit assessments. Jamie Hansson’s credit limit was ultimately raised to match her husband’s, but David Hansson said this failed to address the root of the problem.
Hansson’s tweets caught the attention of Linda Lacewell, superintendent of the New York State Department of Financial Services, who announced Saturday that her office would investigate the Apple Card algorithm over claims of discrimination.
“This is not just about looking into one algorithm,” she wrote in a Medium post. “DFS wants to work with the tech community to make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate.”
With the spread of automation, more and more decisions about our lives are made by computers, from credit approval to medical care to hiring choices. The algorithms — formulas for processing information or completing tasks — that make these judgments are programmed by people and thus often reproduce human biases, unintentionally or otherwise, resulting in less favorable outcomes for women and people of color.
Past versions of Google Translate have struggled with gender bias in translations. Amazon was forced to jettison an experimental recruiting tool in 2017 that used artificial intelligence to score candidates, because the prevalence of male applicants in the training data led the algorithm to penalize resumes that included the word “women’s” and to downgrade candidates who attended women’s colleges. A study published last month in Science found that racial bias in a widely used health care risk-prediction algorithm made black patients significantly less likely than white patients to get important medical treatment.
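The mechanism behind cases like Amazon’s can be sketched in a few lines of Python. Everything below is invented for illustration — the records, the “proxy” feature, and the simple frequency rule — and has no relation to Apple’s, Goldman’s, or Amazon’s actual systems. The point is that even when a protected attribute such as gender is withheld from a model, a correlated stand-in feature learned from biased historical data can quietly reproduce the disparity.

```python
from collections import defaultdict

# Toy historical records: (gender, proxy_feature, approved).
# gender is NEVER shown to the model; only proxy_feature is.
# In this invented history, proxy_feature happens to correlate with gender.
history = [
    ("M", 1, True), ("M", 1, True), ("M", 0, True), ("M", 1, False),
    ("F", 0, False), ("F", 0, False), ("F", 1, True), ("F", 0, False),
]

# "Train" a naive frequency rule on proxy_feature alone.
counts = defaultdict(lambda: [0, 0])  # proxy -> [approved, total]
for _, proxy, approved in history:
    counts[proxy][0] += approved
    counts[proxy][1] += 1

def approval_rate(proxy):
    approved, total = counts[proxy]
    return approved / total

# Average predicted approval rate per gender: the gap reappears
# even though gender was never an input to the rule.
rate_by_gender = {}
for g in ("M", "F"):
    rows = [r for r in history if r[0] == g]
    rate_by_gender[g] = sum(approval_rate(p) for _, p, _ in rows) / len(rows)

print(rate_by_gender)  # men score higher on average, via the proxy alone
```

Removing the sensitive column, in other words, is not enough; that is one reason regulators like DFS want to audit outcomes rather than take “the algorithm doesn’t see gender” as an answer.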
“It does not matter what the intent of the individual Apple reps are, it matters what the algorithm they’ve placed their complete faith in does,” Hansson tweeted. “And what it does is discriminate.”
Dozens of people shared similar experiences after Hansson’s tweets went viral.
“In all cases, we have not and will not make decisions based on factors like gender,” Goldman Sachs spokesman Andrew Williams said in a statement.