Milwaukee Journal Sentinel

Amazon Alexa is not spying

- John Kruzel. The Journal Sentinel’s PolitiFact Wisconsin is part of the PolitiFact network.

Once the stuff of science fiction, voice-activated virtual assistants like the Amazon Echo, Google Home and Apple HomePod now reside in millions of American homes, tweaking thermostats, streaming music and scheduling appointments.

While some see these devices as helping hands, others view them as Trojan horses in the age of digital surveillance.

“It is outrageous that the Amazon Echo is recording every conversation in a person’s home and transmitting it to the cloud,” Rep. Ro Khanna, D-Calif., tweeted May 26.

“This is exactly why we need an internet bill of rights! Didn’t we fight a revolution to prevent exactly this kind of surveillance?”

Is Khanna correct about the scope of smart speakers’ electronic eavesdropping?

We decided to take a closer look.

Does Amazon ‘record every conversation’?

Amazon’s voice-controlled Alexa products are considered “always-on” devices, but that doesn’t mean they record customers’ conversations.

The devices constantly listen for a user to say a “wake word,” which triggers the assistant to begin recording voice data and responding to commands. Wake words include “Alexa,” “OK Google” and “Hey Siri.”

The Amazon Echo, one of the online retail giant’s smart speaker product lines, uses seven microphones to listen for its wake word.

According to Washington Post tech columnist Geoffrey Fowler, the Echo records a second-long snippet of ambient sound which it “constantly discards and replaces” until a wake word starts the recording process.

(Khanna said his claim was based on Fowler’s piece.)
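
To illustrate the mechanism Fowler describes, here is a minimal sketch, in Python, of how an always-on but wake-word-triggered recorder could work. It is purely illustrative: the interfaces it assumes (mic, detector, uploader and their methods) are hypothetical stand-ins, not Amazon’s actual code, and the one-second buffer simply mirrors Fowler’s description.

from collections import deque

BUFFER_SECONDS = 1        # rolling window, mirroring the "second-long snippet" Fowler describes
CHUNKS_PER_SECOND = 10    # assumed chunk rate for this sketch (~100 ms of audio per chunk)

def listen_loop(mic, detector, uploader):
    """Keep only a short rolling buffer of audio until a wake word is heard."""
    # Old audio is automatically discarded as new audio arrives; nothing is kept long-term.
    buffer = deque(maxlen=BUFFER_SECONDS * CHUNKS_PER_SECOND)
    while True:
        chunk = mic.read_chunk()                 # hypothetical: capture ~100 ms of audio as bytes
        buffer.append(chunk)                     # overwrites the oldest chunk in the window
        if detector.heard_wake_word(buffer):     # hypothetical on-device wake-word detector
            command_audio = record_command(mic)  # start recording only after the trigger
            uploader.send(command_audio)         # only this post-wake-word audio leaves the device

def record_command(mic, max_seconds=8):
    """Capture the user's command after the wake word fires (hypothetical helper)."""
    chunks = [mic.read_chunk() for _ in range(max_seconds * CHUNKS_PER_SECOND)]
    return b"".join(chunks)

The point of the sketch is that audio leaves the device only after the detector fires; a detector that misfires is exactly the failure mode described next.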

At least, that’s how it works in theory. In practice, the wake word triggering mechanism has a track record that is far from perfect.

In one highly publicized incident, a Portland, Ore., family’s Alexa captured a private conversation after the voice-controlled device mistook what it heard for its wake word. It later sent the audio recording to someone in Seattle whose number was stored in the family’s contact list.

Amazon described the chain of events as “an extremely rare occurrence” and issued the following statement:

“Echo woke up due to a word in background conversation sounding like ‘Alexa.’ Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘(contact name), right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”

The Washington Post’s Fowler, who has an Echo, Google Home and Apple HomePod, said his devices go rogue on a regular basis.

“At least one of them starts recording, randomly, at least once per week,” he wrote. “It happens when they pick up a sound from the TV, or a stray bit of conversati­on that sounds enough like one of their wake words.”

False positives aside, technology experts told us it’s against Amazon policy to constantly record customers’ private conversations, as Khanna claimed.

“There’s no proof or confirmation from Amazon that Echo products record ‘every’ conversation in a person’s home,” said Tiffany Li, a privacy attorney at Yale Law School’s Information Society Project. “Indeed, Amazon has publicly stated that the Alexa products only record after hearing users say wake words.”

But Li noted that Amazon has been less than forthcoming about the circumstances surrounding its recording practices.

“It is possible that more data is being recorded than consumers know or that Amazon is willing to publicly admit,” she said. “Amazon is not very transparent on privacy practices related to Alexa/Echo products.”

An Amazon spokesman declined to comment on this story, but pointed us to Amazon’s frequently asked questions page.

Notwithstanding Amazon’s lack of candor, Li said Khanna’s claim is “probably not accurate.”

When does Amazon send conversations ‘to the cloud’?

Only voice data that’s recorded after a wake word is detected is sent to the cloud. So Khanna’s claim creates a false impression that private conversations are being secretly routed to Amazon’s computers.

“While the device is indeed always listening (there’s no way for it to respond to the wake word otherwise), it is not always transmitting to the cloud,” said Daniel Kahn Gillmor, a senior staff technologist for the ACLU’s Speech, Privacy, and Technology Project.

Still, Gillmor expressed reservations about the degree of control Amazon maintains over the devices after they’re installed in customers’ homes.

“The code in that device is under the control of Amazon, and it’s basically up to Amazon (not to the owner of the device) to make sure that it’s not transmitting to the cloud,” he said. “Clearly, Amazon isn’t making those decisions correctly all the time.”

Our ruling

Khanna said, “Amazon Echo is recording every conversation in a person’s home and transmitting it to the cloud.”

Amazon’s Alexa technology is designed to capture voice data only after a specific voice command, called a wake word, triggers a recording mechanism.

Despite some instances where private conversations were accidentally recorded and uploaded to the cloud, Khanna’s claim greatly overstates things.

We found no evidence to suggest the device records every conversation and sends it to the cloud.

It does record conversations when it hears the wake word, and in some cases the device has misinterpreted speech when people didn’t actually say the wake word.

We rate this Mostly False.
