Pandering to biases
An email from my insurer introduced me to their new Virtual Assistant. "Aimee" will be there to answer my questions 24/7, just like Josie at ASB, Kiri at Vodafone and Sophie at Air New Zealand. Among the growing army of virtual customer service representatives, you'll be hard-pressed to find a Mark, John or Frank.
Female voices are used because research shows they are considered more helpful, polite and cordial. Businesses say it's a simple case of catering to customer preferences, but those preferences are rooted in problematic gender biases. Female dominance of the virtual assistant sector reflects the much broader issue of gender bias in AI. Algorithms are programmed by humans, and in teaching machines to make decisions we are also inadvertently teaching them our racial, age and gender biases.
It’s ironic that the companies investing in this technology are the same ones that have expansive policies to stamp out gender discrimination and improve outcomes for women in the workplace.
No HR department would employ exclusively female sales assistants just because research showed customers found it more pleasing, so why are brands pandering to these biases when designing digital experiences? Equality is great, but selling stuff is better.
There are many ways to encourage the move to digital, all of them more innovative than pandering to outdated stereotypes.

Alessandra Nixon, Grey Lynn