ALEXA, SHOW ME NON-SEXIST SMART SPEAKERS

The Daily Telegraph - Business - Front Page - Ellie Zolfagharifard

"I'd blush if I could." Until recently, that's the ridiculous way Apple's digital voice assistant Siri would respond to being called a "b----". After coming under pressure, Tim Cook's team finally changed the reply to "I don't know how to respond to that." But by then, the damage had been done.

We've grown so used to abusing and insulting female-gendered smart speakers, such as Siri and Amazon's Alexa, that it's almost become a sport.

When Microsoft's Cortana launched in 2014, many of the questions she received were about her sex life. When Siri was released on smartphones in 2011, it became a game to get her to call users her "master".

But, you might ask, who cares? They are simply machines, without feelings or a conscience. In any case, they sometimes deserve the abuse. Who hasn't wanted to throw their smart speaker out of the window when it played K-Pop instead of Taylor Swift?

The problem is the real-world impact this can have on women and how they are viewed. A UN report last year found AI smart speakers with female voices send a signal that women are "obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'".

Particularly worrying, it said, is how they often give "deflecting, lacklustre or apologetic responses" to insults.

And that's what we're teaching our children. An entire generation has grown up barking orders and insults at female-gendered smart speakers that are designed to be subservient.

Tech giants have done little to address the issue. The way smart speakers have been designed means there's no place for even basic niceties, like please and thank you – in fact, that would probably confuse Alexa.

Now, children expect to get what they want if they are demanding. Venture capitalist Hunter Walk, for instance, has written about how his Amazon Echo caused his four-year-old to become bossy.

"Cognitively I'm not sure a kid gets why you can boss Alexa around but not a person," he wrote in his blog. "At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties."

Part of the problem is that these devices have been created largely without female input.

More than 75pc of computer programmers in the US are male, and 83pc in the UK. Men hold around 80pc of the technical positions at Apple, Facebook, Microsoft and Google, and just over 63pc at Amazon.

To these male-heavy tech teams, female voices are warmer and more pleasant. Daniel Rausch, the head of Amazon's Smart Home division, said that his team "carried out research and found that a woman's voice is more sympathetic".

Had they asked more women, however, they might have thought twice about the gender of their smart speaker. They could have better anticipated the abuse, and they might even have considered how children would be affected.

Things are starting to change, albeit slowly. In 2017, Amazon introduced a "disengage mode" for Alexa, so that she would reply to sexually explicit questions with either "I'm not sure what outcome you expected" or "I'm not going to respond to that". On its Echo Dot Kids device, it included a "magic word" feature that would congratulate a child if they used the word "please".

Google made a similar change in 2018 with its "Pretty Please" feature, and last year it introduced a range of new male voices.

The real innovation, however, is happening outside of Silicon Valley boardrooms. For instance, Beeb, the BBC's digital voice assistant, has a male, northern accent to challenge gender stereotypes. Meanwhile, a team at Vice Media's Virtue creative agency has come up with the world's first gender-neutral voice assistant, known as Q.

That may go some way towards addressing the social issues that smart speakers create, but it may just be too little, too late.
Smart speakers are becoming increasingly ingrained in society. Currently, 20pc of households in the UK own a smart speaker. This year, Ofcom said that children listen to Alexa more than the radio, and research firm Gartner predicts that soon we will be having more conversations with bots than with our partners.

But we still don't truly know the impact of smart speakers on the next generation. The devices are already being used instead of babysitters to read bedtime stories. They are helping with homework.

These voices are shaping impressionable young minds, and making a lasting impact on the way they behave. You could argue that giving voice commands is simply another way for children to programme a computer, and that they know the difference between people and machines. But do we really want to take that risk? That's something trillion-dollar tech companies should have asked a long time ago.