The National - News

Bias oversight that gives women a far smaller bite of Apple’s credit card

- NIMA ABU WARDEH Nima Abu Wardeh is a broadcast journalist, columnist, blogger and founder of SHE Strategy. Share her journey on finding-nima.com

Bias is life’s algorithm. The billions spent on addressing this, in an attempt to change people’s mindsets and behaviours, will not do away with bias. Because we’re human, the things we do, create and touch are inherently biased too.

That includes credit lines – or even being allowed to open a bank account in some countries. Apple fell foul of this in spectacular fashion this month. Instead of being in the news for the UK release of its first credit card – initially launched in the US in August – attention was focused on the (huge) difference in the credit lines awarded to men compared with women.

The controversy began with a series of tweets from David Heinemeier Hansson, a high-profile tech entrepreneur, that stated the card was “sexist” because it gave him 20 times more credit than his wife, even though they file joint tax returns and her credit score is higher. That led to a barrage of comments and complaints on social media and Apple co-founder Steve Wozniak’s public pondering of the machinations of his company’s credit card.

Mr Wozniak’s tweet reply that he gets 10 times the credit limit of his wife on the Apple card – even though the pair have no separate cards, accounts or assets – was the icing on the bias cake.

Subsequently, Mr Wozniak said “algos obviously have flaws” and called on the government to get involved with regulation. “These sort of unfairnesses bother me and go against the principle of truth,” he said.

Goldman Sachs, the issuing bank for the Apple Card, said: “In all cases, we have not and will not make decisions based on factors like gender.”

The irony is that the Equal Credit Opportunity Act passed in the US in 1974 seeking to protect women’s rights is the reason for this. Before then, banks required single, widowed or divorced women to bring a man along to co-sign any credit application, regardless of their income.

The act made it unlawful for a creditor to discriminate against an applicant, with respect to any aspect of a credit transaction, on the basis of race, colour, religion, national origin, sex, marital status, age or receipt of public assistance. It turns out this protection does not hold up when it comes to machine learning algorithms.

When questioned over the Apple credit card “behaviour”, Goldman stated that the algorithm doesn’t use gender as an input. How could the bank discriminate if it doesn’t know which customers are women and which are men?

Here’s how: a gender-blind algorithm can still end up biased against women as long as it draws on any input that correlates with gender. Knowing what products a person buys, where they shop or how they live can lead to bias, because these can indicate someone’s gender. The data betraying gender then betrays the person, as their information is used against them – the algorithm does its magic and decides not to offer them the same credit line as their husbands.
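The mechanism is easy to demonstrate. Here is a minimal sketch, using entirely invented data and a made-up `shop_category` feature, of how a model that never sees gender can still reproduce a gender gap through a proxy input that happens to correlate with gender:

```python
import random

random.seed(0)

# Hypothetical illustration only: none of this reflects Apple's or
# Goldman's actual model. We invent applicants whose shopping category
# correlates with gender 80% of the time.
def make_applicant():
    gender = random.choice(["F", "M"])
    if random.random() < 0.8:
        shop_category = "A" if gender == "F" else "B"
    else:
        shop_category = "B" if gender == "F" else "A"
    return gender, shop_category

def credit_limit(shop_category):
    # The "model" sees only the proxy feature; gender is never an input.
    return 5_000 if shop_category == "A" else 20_000

applicants = [make_applicant() for _ in range(10_000)]

avg = {}
for g in ("F", "M"):
    limits = [credit_limit(s) for gg, s in applicants if gg == g]
    avg[g] = sum(limits) / len(limits)

# Despite being "gender-blind", the average limit offered to women
# comes out far below the average offered to men.
print(avg)
```

Because the proxy tracks gender most of the time, the gender gap in outcomes survives even though gender was deliberately excluded from the inputs.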

There’s a saying in business that what isn’t measured can’t be managed. The fact that financial businesses are prohibited from using information such as gender or race in algorithmic decisions may make the bias problem worse: because businesses don’t collect this information, they cannot even test whether their algorithms discriminate.

For example, Amazon had to pull an algorithm used in hiring owing to gender bias. IBM and Microsoft were embarrassed by facial recognition algorithms better at recognising men than women, and white people than those of other races.

If calling out biased products and people, and running training and workshops, isn’t going to change things, what can we do about it? I believe the key is giving people on the receiving end of bias the tools to speak up, stand up and make their case.

The outcry has led to New York regulators opening a discrimination investigation into Goldman’s credit card practices. Good luck to them and to the women of the world who have to show up and deal with each new bias coming their way. If only liability were biased too – but it isn’t. Funny that.

Illustration by Gary Clement
