The Sunday Guardian

Are digital assistants becoming a security threat for users?

- NISHANT ARORA

At a time when digital assistants in smart devices at home or in the office are talking to us like never before, some users have begun to worry: is Alexa or Google Home listening to and recording personal conversations beyond the “wake” word?

There are multiple triggers for such concerns, the latest being an Amazon voice assistant user in Germany who received 1,700 audio files belonging to a complete stranger.

A woman in the US state of Oregon was in shock last year when the Amazon Echo device at her Portland home recorded a private conversation and then shared it with one of her husband’s employees in Seattle.

Amazon later clarified that Alexa mistakenly heard a series of commands and sent the recording as a voice message to one of the husband’s employees.

The threat is very real, with more and more Indians hooked on always-on, Internet-connected smart home devices.

In a recent Forrester report titled “Secure The Rise Of Intelligent Agents”, Amy Demartine and Jennifer Wise argue that the current introductory versions of intelligent agents include Alexa, Cortana, Google Assistant and Siri. Security, however, is not part of the equation, and unless security pros get involved, the implications are more worrisome for businesses than for ordinary users.

“Alexa doesn’t currently authenticate or authorise individuals who access it, leaving a company’s Alexa skills unprotected from anyone who can remember another user’s commands,” reads the report.

“A hacker has already developed a method to install malware on a pre-2017 Amazon Echo that streams the microphone to any remote computer, accesses the owner’s Amazon account, and installs ransomware,” the Forrester report added.

Apple logs and stores Siri queries but they are not associated with an Apple ID or email address, and the company deletes the association between queries and their numerical codes after six months.

Amazon and Google devices, however, save query histories until the customer deletes them, and Microsoft Cortana users must manage their own data retention preferences in the Cloud and on their devices.

According to Puneesh Kumar, Country Manager for Alexa Experiences and Devices, Amazon India, the threat of Alexa recording all your conversations is not real, as the company has built layers of privacy protections into all of its Echo devices.

“It includes a mute button involving a hardware press that electrically disconnects the microphones and cameras, clear visual indicators when utterances are being captured and streamed, as well as the ability to see and delete voice recording history for their devices,” Kumar told IANS.

Echo speakers use on-device keyword spotting to detect the “wake” word and only the “wake” word. When the “wake” word is detected, the light ring around the top of the device turns blue to indicate that Alexa is streaming audio to the Cloud.
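In broad strokes, the flow described above can be sketched in a few lines of Python. This is purely a conceptual illustration, not Amazon’s actual implementation: the wake word, the frame format and the function name are all hypothetical, and real keyword spotting works on raw audio with a trained acoustic model rather than on text.

```python
# Conceptual sketch of on-device wake-word spotting (NOT Amazon's code):
# audio is inspected locally and discarded until the keyword is heard;
# only audio after detection would be streamed to the cloud.

WAKE_WORD = "alexa"  # hypothetical keyword for illustration

def frames_to_stream(transcribed_frames):
    """Yield only the audio frames that follow a detected wake word."""
    streaming = False
    for frame in transcribed_frames:
        if not streaming:
            # Local spotter: nothing leaves the device in this branch.
            if WAKE_WORD in frame.lower():
                streaming = True   # the light ring would turn blue here
        else:
            yield frame            # only now is audio sent onward

# Everything before the wake word stays on the device.
frames = ["dinner at eight", "alexa", "what's the weather"]
print(list(frames_to_stream(frames)))  # → ["what's the weather"]
```

The key design point the article makes is that the detection step runs entirely on the device, so ordinary room conversation never reaches the network path at all.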

“At any time, you can turn the microphone off by pushing the microphone button on the top of the device and this creates an electrical disconnect to the mic, which will turn on a red ring to visually indicate that the device is muted,” informed Kumar.

According to Amazon, the voice utterances spoken to the device may be used to deliver and improve its services.

Users can, if needed, delete specific voice recordings associated with their accounts by going to History in Settings in the Alexa App, drilling down to a specific entry and tapping the delete button. They can also delete all voice recordings associated with their account for each of their Alexa-enabled products. IANS

