Sun Sentinel Broward Edition

Hackers can hijack your smart speaker with a laser pointer

- By Taylor Telford

Laser pointers are great for taunting cats and inflicting irritation. But they’re also quite effective at hacking Alexa, Siri or Google Assistant, researchers say — even from hundreds of feet away.

Microphones in smart devices translate sound into electrical signals, which communicate commands to the device. But as researchers at the University of Michigan and the University of Electro-Communications in Tokyo have discovered, microphones will respond the same way to a focused light pointed directly at them. It’s a surprising vulnerability that would allow an attacker to secretly take over many popular voice-controlled devices with nothing more than a $13.99 laser pointer and some solid aim.

“It’s possible to make microphones respond to light as if it were sound,” Takeshi Sugawara, one of the lead researchers on the study, told Wired. “This means that anything that acts on sound commands will act on light commands.”
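The core idea can be sketched in a few lines of code. In broad strokes, an attacker amplitude-modulates a voice command onto a laser’s intensity, and the microphone registers the fluctuating light as if it were sound pressure. The sketch below is purely illustrative and is not the researchers’ actual tooling; the bias and modulation figures are hypothetical, and light intensity must stay non-negative, which is why the audio rides on a DC bias.

```python
import numpy as np

SAMPLE_RATE = 16_000       # Hz, typical for voice audio
BIAS_CURRENT = 30.0        # mA, hypothetical laser-diode DC bias
MODULATION_DEPTH = 10.0    # mA, hypothetical peak swing around the bias

def audio_to_laser_drive(audio: np.ndarray) -> np.ndarray:
    """Map audio samples in [-1, 1] to a non-negative drive current.

    Light intensity cannot go negative, so the command signal rides
    on a DC bias: drive = bias + depth * audio.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT + MODULATION_DEPTH * audio

# Example: a 440 Hz test tone standing in for a spoken command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 440 * t)
drive = audio_to_laser_drive(tone)
assert drive.min() >= 0.0  # drive current stays physically valid
```

Because the microphone’s diaphragm tracks the intensity envelope, the device “hears” the original audio waveform, which is why any spoken command can be delivered this way.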

Since many voice-command systems don’t require authentication, an attacker wouldn’t need a password or PIN to take over a device with a light command; they just need to be in the object’s line of sight. In a paper released Monday, researchers detailed how they could easily commandeer smart speakers, tablets and phones without being in the same building, just by pointing a laser through a window. In one case, they took over a Google Home on the fourth floor of an office building from the top of a bell tower at the University of Michigan, more than 200 feet away. And they say the trick could theoretically be used to buy things online undetected, operate smart switches in homes and serve endless other unsettling purposes.

“Once an attacker gains control over a voice assistant a number of other systems could be open to their manipulation,” a breakdown of the study on the University of Michigan’s website says. “In the worst cases, this could mean dangerous access to e-commerce accounts, credit cards, and even any connected medical devices the user has linked to their assistant.”

Researchers spent seven months testing the trick on 17 voice-controlled devices enabled with Alexa, Siri, Facebook Portal and Google Assistant, including Google Home, Echo Dot, Fire Cube, Google Pixel, Samsung Galaxy, iPhone and iPad. They successfully levied attacks using ordinary laser pointers, laser drivers, a telephoto lens and even a souped-up flashlight.

The researchers weren’t sure exactly why these microphones respond to light as they do to sound; they didn’t want to speculate and are leaving the physics for future study. They notified Google, Amazon, Apple, Tesla and Ford about the vulnerability.

Spokespeople for Google and Amazon said the companies are reviewing the research and its implications for the security of their products but said risk to consumers seems limited. An Amazon spokeswoman pointed out that customers could safeguard Alexa-enabled products with a PIN, or use the mute button to disconnect the microphone.

Apple did not immediately respond to requests for comment.

There are no known instances of someone using light commands to hack a device, researchers said, but eliminating the vulnerability would require a redesign for most microphones. The researchers did find limitations to the stealth of a light-command attack, however. With the exception of infrared lasers, lasers and other lights are visible to the naked eye and could easily be noticed by someone near the device. Voice-command devices also generally give audible responses, though an attacker could turn down the device’s volume to continue operating it undetected.

For now, researchers say the only foolproof way to protect against light commands is to keep devices out of sight from windows, away from prying eyes — and prying laser beams.

JOHN BRECHER/FOR THE WASHINGTON POST
