Hacked robot vacuums can spy
A team of researchers has demonstrated that popular robotic household vacuum cleaners can be hacked to act as remote microphones.
The researchers collected information from the laser-based navigation system in a popular vacuum robot and applied signal processing and deep-learning techniques to recover speech and identify television programs playing in the same room as the device.
Nirupam Roy, an assistant professor in the University of Maryland’s Department of Computer Science, led the team of researchers.
They demonstrated the potential for any device that uses light detection and ranging (Lidar) technology to be manipulated for collecting sound, despite not having a microphone.
The work, a collaboration with assistant professor Jun Han of the National University of Singapore, was presented at the Association for Computing Machinery’s Conference on Embedded Networked Sensor Systems (SenSys 2020).
“We welcome these devices into our homes, and we don’t think anything about it,” said Roy, who holds a joint appointment in the University of Maryland Institute for Advanced Computer Studies (UMIACS).
“But we have shown that even though these devices don’t have microphones, we can repurpose the systems they use for navigation to spy on conversations and potentially reveal private information.”
The Lidar navigation systems in household vacuum bots shine a laser beam around a room and sense the reflection of the laser as it bounces off nearby objects. The robot uses the reflected signals to map the room and avoid collisions as it moves through the house.
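The ranging step described above can be illustrated with a short sketch. One common Lidar principle is time-of-flight: the sensor measures how long the laser pulse takes to return and converts that delay into a distance (some consumer vacuum units use triangulation instead; this sketch assumes time-of-flight, and the example timing value is hypothetical).

```python
C = 299_792_458  # speed of light in m/s

def lidar_distance(round_trip_seconds):
    """Convert a pulse's round-trip time into a distance.

    The pulse travels to the object and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_seconds / 2

# A reflection arriving ~20 nanoseconds after emission puts the
# object roughly 3 meters away.
distance_m = lidar_distance(20e-9)
```

Sweeping such measurements around the room, point by point, is what lets the robot build its floor map and steer around obstacles.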
Privacy experts have suggested that the maps made by vacuum bots, which are often stored in the cloud, pose potential privacy breaches that could give advertisers access to details such as home size, an indicator of income level, and other lifestyle-related information.
Roy and his team wondered if the Lidar in these robots could also pose potential security risks as sound recording devices in users’ homes or businesses.
Sound waves cause objects to vibrate, and these vibrations cause slight variations in the light bouncing off an object. Laser microphones (used in espionage since the 1940s) are capable of converting those variations back into sound waves.
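The recovery idea behind a laser microphone can be sketched in a few lines: the raw reflected-intensity trace mixes slow variations (sensor motion, ambient light) with tiny, fast fluctuations caused by sound-driven vibration, and a simple high-pass step separates the two. This is only a minimal illustration of the principle, not the researchers' pipeline; the sample rate, signal values, and moving-average filter here are all assumptions for the demo.

```python
import math

def recover_vibrations(intensity, window=50):
    """Crude high-pass filter: subtract a moving average so the slow
    baseline drift is removed, leaving the fast fluctuations that
    carry the vibration (audio) signal."""
    n = len(intensity)
    out = []
    for i in range(n):
        lo = max(0, i - window)
        hi = min(n, i + window + 1)
        baseline = sum(intensity[lo:hi]) / (hi - lo)
        out.append(intensity[i] - baseline)
    return out

# Simulated sensor trace (hypothetical values): a slow linear drift
# plus a faint 300 Hz tone standing in for speech, sampled at 8 kHz.
fs = 8000
trace = [0.5 + 0.3 * (i / fs)
         + 0.002 * math.sin(2 * math.pi * 300 * i / fs)
         for i in range(fs)]

audio = recover_vibrations(trace)
```

After filtering, the large drift is gone while the small oscillation survives; a real attack would then apply far more sophisticated signal processing and the deep-learning models mentioned above to turn such a residue into intelligible speech.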