The Denver Post

Alexa, tell us how much you’re recording

By Geoffrey A. Fowler

We’re learning an important lesson about cutting-edge voice technology: Amazon’s Alexa is always listening. So are Google’s Assistant and Apple’s Siri.

Putting live microphones in our homes has always been an out-there idea. But tech companies successfully marketed talking speakers such as the Amazon Echo and Google Home to millions by assuring us that they record only when we give a “wake word.”

The term “wake word” turns out to be a misnomer: these devices are always “awake,” passively listening for the command to activate, such as “Alexa,” “OK Google” or “Hey Siri.” The problem is they’re far from perfect about responding only when we want them to.
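To make the mechanics concrete, here is a minimal sketch, in Python, of that always-awake pattern: the device keeps only a short rolling buffer of audio and throws it away unless a wake-word detector scores a sound above a confidence threshold; only then does recording begin. The buffer size, the threshold and the letter-overlap “detector” below are hypothetical stand-ins for the trained acoustic models real devices run on-device.

```python
from collections import deque

# A minimal sketch of the "always awake" pattern. The buffer size, the
# threshold and the toy letter-overlap scorer are hypothetical stand-ins
# for the trained acoustic models real smart speakers run on-device.

BUFFER_SECONDS = 3    # hypothetical rolling pre-record buffer, in chunks
WAKE_THRESHOLD = 0.8  # hypothetical confidence cutoff

def wake_score(sound: str) -> float:
    """Toy detector: fraction of positions matching the word 'alexa'."""
    target = "alexa"
    hits = sum(1 for a, b in zip(sound.lower(), target) if a == b)
    return hits / max(len(sound), len(target))

rolling_buffer = deque(maxlen=BUFFER_SECONDS)

# Simulated stream: household noise, then a near-miss, then the real word.
for second, sound in enumerate(["dishes", "tv chatter", "alexi", "alexa"]):
    rolling_buffer.append(sound)
    score = wake_score(sound)
    if score >= WAKE_THRESHOLD:
        # Recording only "starts" here, but near-misses can land here too.
        print(f"t={second}s: woke on {sound!r} (score {score:.2f}); "
              f"recording begins, with buffered context {list(rolling_buffer)}")
    else:
        print(f"t={second}s: heard {sound!r} (score {score:.2f}); discarded")
```

In this toy run, “alexi” scores just above the cutoff and triggers recording, which is the same failure mode as a TV line that merely sounds like the wake word.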

The latest and most alarming example to date: A family in Portland, Ore., last month found its Echo had recorded a private conversation and sent it to a random contact.

Privacy is the one aspect of Alexa that Amazon can’t afford to screw up.

Amazon, in a statement, made it sound like the Portland, Ore., case involved a sequence of events you might expect in a “Seinfeld” episode. It said the Echo woke up when it heard a word that sounded like Alexa. “The subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list.”

Amazon also said the incident was rare and that it is “evaluating options to make this case even less likely.”

But how often do these devices go rogue and record more than we’d like them to? Neither Google nor Amazon immediately responded to my questions about false positives for their “wake words.” But anyone who lives with one of these devices knows it happens.

As a tech columnist, I’ve got an Echo, Google Home and Apple HomePod in my living room — and find at least one of them starts recording, randomly, at least once per week. It happens when they pick up a sound from the TV, or a stray bit of conversation that sounds enough like one of their wake words.

The Amazon Alexa app will play back stored recordings — including ones the device made after mishearing its “wake word.”

Separating a command out from surrounding home noise — especially loud music — is no easy task. Amazon’s Echo uses seven microphones and noise-canceling tech to listen for its wake word.

Over-recording isn’t just an Amazon problem. Last year, Google faced a screw-up in which some models of its Home Mini were found to be recording everything and had to be patched. Last month, researchers reported they were able to make Siri, Alexa and Google’s Assistant hear secret audio instructions undetectable to the human ear.

So what should you do about this? You can mute these devices, which in the case of the Amazon Echo physically disconnects the microphone — until you’re ready to use it. But that partly defeats the usefulness of a computer you can just holler at when your hands are otherwise occupied.

Another approach is to turn off some more-sensitive functions in the Alexa app, including making product purchases via voice. You can also turn off the “drop in” feature that lets another Echo automatically connect to start a conversation.

You can also dig deeper into what’s being recorded. Prepare to be a bit horrified: Amazon and Google keep a copy of every single conversation, both as a nod toward transparency and to help improve their voice-recognition and artificial intelligence systems. In the Alexa app and on Google’s user activity site, you can listen to and delete these past recordings. (Apple also keeps Siri recordings, but not in a way you can look up — and it anonymizes them after six months.)

The nuclear response is to unplug your smart speaker entirely until the companies come clean about how often their voice assistants over-listen — and what they’re doing to stop it.

Revelations that an Amazon Echo smart speaker inadvertently sent a private conversation to an acquaintance show the risks that come with using new technologies. (Mark Lennihan, Associated Press file)
