The Post

Pandering to biases


An email from my insurer introduced me to their new Virtual Assistant. "Aimee" will be there to answer my questions 24/7, just like Josie at ASB, Kiri at Vodafone and Sophie at Air New Zealand. Among the growing army of virtual customer service representatives, you'll be hard pushed to find a Mark, John or Frank.

Female voices are used because research shows they are considered more helpful, polite and cordial. Businesses say it's a simple case of catering to customer preferences, but these preferences are based in problematic gender biases. Female dominance of the VA sector reflects the much broader issue of gender bias in AI. Algorithms are programmed by humans and in teaching machines to make decisions, we are also inadvertently teaching them our racial, age and gender biases.

It's ironic that the companies investing in this technology are the same ones that have expansive policies to stamp out gender discrimination and improve outcomes for women in the workplace.

No HR department would employ exclusively female sales assistants because research showed it would be more pleasing for customers, so why are brands pandering to these biases in designing digital experiences? Equality is great but selling stuff is better.

There are many ways to encourage the move to digital, all of them more innovative than pandering to outdated stereotypes. Alessandra Nixon, Grey Lynn
