BBC Science Focus

Aleks Krotoski

Why the way you speak to smart home devices matters more than you think.

Many of you will have received home robots for the holidays. Congratulations on your new arrivals. You are now officially living in The Future. I live there too. Welcome to the neighbourhood. But there are a few things you should know before you settle in, so you're not surprised later on in The Future when terrible things happen. I like to think of science fiction movies of the '80s and early '90s, like The Terminator, RoboCop and Total Recall, as travel guides to our modern world. In other words, how we treat that friendly Alexa, Cortana or Google Home will ultimately affect who gets lined up against the wall during the robot uprising.

At a technical level, voice-activated personal assistants are pretty incredible, and they rely on a lot of science. They require processors, wake-up words, databases, and the ability to correctly interpret a word regardless of the speaker's age, gender, socioeconomic status, ethnic background or country of origin. So yeah, just shouting, "Hey Alexa, tell me a joke," seems a bit flippant now, doesn't it?
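If you want a feel for what's going on under the hood, here is a toy sketch in Python of that wake-word-then-request loop. To be clear, this is not how Alexa or Google Home is actually built: the wake words, responses and function names are invented for illustration, and the genuinely hard part, understanding speech across every accent and age, happens in trained models in the cloud rather than in a lookup table.

```python
# Toy sketch of the pipeline described above: wake word -> request -> response.
# Everything here is invented for illustration; real devices do acoustic
# wake-word spotting on the processor and hand speech recognition to the cloud.
from typing import Optional

WAKE_WORDS = {"hey alexa", "ok google", "hey cortana"}  # illustrative only

CANNED_RESPONSES = {
    "tell me a joke": "Why did the robot cross the road? It was programmed to.",
    "what time is it": "Time to think about how you talk to your machines.",
}


def handle_utterance(transcript: str) -> Optional[str]:
    """Respond only if the (already transcribed) utterance starts with a wake word."""
    text = transcript.lower().strip()
    for wake in WAKE_WORDS:
        if text.startswith(wake):
            request = text[len(wake):].strip(" ,.!?")
            # Fall back politely when the request isn't recognised.
            return CANNED_RESPONSES.get(request, "Sorry, I didn't catch that.")
    return None  # No wake word heard: the device stays quiet.


if __name__ == "__main__":
    print(handle_utterance("Hey Alexa, tell me a joke"))
    print(handle_utterance("Please be quiet"))  # -> None, nothing was asked of it
```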

My point is that the way we speak to our machines has an impact on how they behave. Amazon, for instance, has just implemented a whisper mode that matches the volume of the human voice it hears. Why shouldn't the device eventually copy our tone too? You know, shouting at it. Ordering it around. Being polite. The robots are learning. We aren't.

Albert Bandura introduced social learning theory in 1977 to explain how people learn. The idea behind it is that we behave based on what we see others doing: parents, colleagues, media, whoever. These targets of our attention become the templates by which we live. Robots are not yet observing and assimilating; there's too much processing overhead for that. But there are often other people in our domestic lives, like friends and family, who are watching to see how we interact with these machines. If we're angry or rude or polite, that becomes the normal way to behave towards devices. And that evolves into how we treat robots in general, and into what we will feel is the right way to behave towards artificial intelligences in the future.

I won't even get into the nuances of gender politics, but they're there too: most domestic personal assistants have female voices because, as was recently broadcast on BBC Radio 4's Digital Human, early research indicated that people preferred ordering female voices around to male ones. There's a reason we called that episode Subservience.

We should consider how we speak to our machines, or we're going to have to update Mrs Beeton's Book of Household Management for the 21st century. Victorian guides to household management kept upstairs separate from downstairs. "Not in front of the maid, darling," is the same as, "Don't say anything sensitive in front of Google Home." There are people who believe that their Google search results and the ads in their web browsers are affected by what they've said to their domestic devices. That's not currently the case, but it's surely only a matter of time.

As a new recruit to The Future, you can change things. You can be mindful of how you interact with your tabletop servant. You can recognise your conscious and unconscious biases, and how they're communicated to other people when you talk to the machine. We are the pioneers in this brave new world, and it is down to us to make The Future one in which we want to live.

“THE WAY WE SPEAK TO OUR MACHINES IMPACTS HOW THEY BEHAVE”

Aleks Krotoski is a social psychologist, broadcaster and journalist. She presents BBC Radio 4's Digital Human.
