Houston Chronicle

WITH A LASER, RESEARCHERS SAY THEY CAN HACK ALEXA, GOOGLE HOME OR SIRI

By Nicole Perlroth

SAN FRANCISCO — Since voice-controlled digital assistants were introduced a few years ago, security experts have fretted that systems like Apple’s Siri and Amazon’s Alexa were a privacy threat and could be easily hacked.

But the risk presented by a cleverly pointed light was probably not on anyone’s radar.

Researchers in Japan and at the University of Michigan said Monday that they had found a way to take over Google Home, Amazon’s Alexa or Apple’s Siri devices from hundreds of feet away by shining laser pointers, and even flashlights, at the devices’ microphones.

In one case, they said they opened a garage door by shining a laser beam at a voice assistant that was connected to it. They also climbed 140 feet to the top of a bell tower at the University of Michigan and successfully controlled a Google Home device on the fourth floor of an office building 230 feet away. And by focusing their lasers using a telephoto lens, they said, they were able to hijack a voice assistant more than 350 feet away.

Opening the garage door was easy, the researchers said. With the light commands, the researchers could have hijacked any digital smart systems attached to the voice-controlled assistants.

They said they could have easily switched light switches on and off, made online purchases or opened a front door protected by a smart lock. They even could have remotely unlocked or started a car that was connected to the device.

“This opens up an entirely new class of vulnerabilities,” said Kevin Fu, an associate professor of electrical engineering and computer science at the University of Michigan. “It’s difficult to know how many products are affected because this is so basic.”

The computer science and electrical engineering researchers — Takeshi Sugawara at the University of Electro-Communications in Japan; and Fu, Daniel Genkin, Sara Rampazzi and Benjamin Cyr at the University of Michigan — released their findings in a paper this week.

Genkin was also one of the researchers responsible for discovering two major security flaws, known as Meltdown and Spectre, in the microprocessors inside nearly all the world’s computers last year. Shares of chipmaker Intel briefly dropped 5% on news of their discovery.

The researchers, who studied the light flaw for seven months, said they had discovered that the microphones in the devices would respond to light as if it were sound. Inside each microphone is a small plate called a diaphragm that moves when sound hits it.

That movement can be replicated by focusing a laser or a flashlight at the diaphragm; the microphone converts the movement into electrical signals, and the rest of the system then responds the way it would to sound.
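In signal-processing terms, the trick amounts to amplitude modulation: the waveform of a spoken command rides on top of the laser’s steady output power, so the light striking the diaphragm fluctuates the way sound pressure would. The short sketch below only illustrates that principle and is not the researchers’ actual tooling; the audio file name, bias power and modulation depth are hypothetical values.

# Illustration only: maps a recorded voice command onto a laser intensity
# envelope, the basic idea behind the light-injection attack described above.
# The audio file, bias power and modulation depth are hypothetical.
import numpy as np
from scipy.io import wavfile

LASER_BIAS_MW = 5.0        # steady ("DC") laser power, hypothetical
MODULATION_DEPTH = 0.8     # fraction of the bias used to carry the audio

def audio_to_laser_power(wav_path: str) -> np.ndarray:
    """Return a per-sample laser power envelope (milliwatts) for the command."""
    rate, samples = wavfile.read(wav_path)
    samples = samples.astype(np.float64)
    peak = np.max(np.abs(samples))
    if peak > 0:
        samples /= peak                      # normalize audio to [-1, 1]
    # A laser can only emit non-negative power, so the audio rides on a bias:
    # power(t) = bias * (1 + depth * audio(t))
    return LASER_BIAS_MW * (1.0 + MODULATION_DEPTH * samples)

if __name__ == "__main__":
    envelope = audio_to_laser_power("voice_command.wav")   # hypothetical file
    print(f"{envelope.size} samples, peak {envelope.max():.2f} mW")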

The researchers said they had alerted Tesla, Ford, Amazon, Apple and Google to the light vulnerability. The companies all said they were studying the conclusions in the paper released this week.

The researchers said most microphones would need to be redesigned to remedy the problem, and simply covering the microphone with a piece of tape wouldn’t solve it. Fu said the microphones on several digital assistants had dirt shields that didn’t block the light commands.

Security researchers have a long history of revealing stunning vulnerabilities in internet-connected devices. Experts have often cautioned that while those weaknesses can be surprising, they are often worst-case scenarios that can be exploited only in the rarest circumstances. And there is no clear indication that the light vulnerability detailed this week has been used by hackers.

This is not the first discovery of a surprising vulnerability in digital assistants. Researchers in China and the United States have demonstrated that they can send hidden commands that are undetectable to the human ear.

With a tsunami of internet-connected devices coming onto the market, however, the researchers said the discovery was a reminder to consumers to remain vigilant about security.

“This is the tip of the iceberg,” Fu said. “There is this wide gap between what computers are supposed to do and what they actually do. With the internet of things, they can do unadvertised behaviors, and this is just one example.”

An Amazon spokeswoman said that the company had not heard of anyone other than the researchers using the light-command hack and that its digital assistant customers could rely on a few easy safety measures. For one, they can set up voice PINs for Alexa shopping or other sensitive smart-home requests. They can also use the mute button to disconnect power to the microphones.

There is also a common-sense solution to the light vulnerabil­ity: If you have a voice assistant in your home, keep it out of the line of sight from outside, Genkin said. “And don’t give it access to anything you don’t want someone else to access,” he added.

Photo: Haruka Sakaguchi / New York Times. Researchers in Japan and at the University of Michigan said they have found a way to take over Google Home, Amazon’s Alexa or Apple’s Siri devices by shining laser pointers at the devices’ microphones.
Photo: Grant Hindsley / New York Times. Dave Limp, senior vice president of Amazon Devices and Services, introduces new Alexa accessories at Amazon’s headquarters in Seattle on Sept. 25. Amazon suggests setting up voice PINs for Alexa shopping or other sensitive smart-home requests.
