A team of academics released a new study this week showing how a smart vacuum cleaner can be turned into a microphone capable of recording nearby conversations. Named LidarPhone, the technique works by hijacking the vacuum's built-in LiDAR laser navigation component and repurposing it as a laser microphone.
Laser microphones are well-known surveillance tools that date back to the Cold War, used to record conversations from afar. Operators pointed a laser at a distant window, measured how the glass vibrated, and decoded those vibrations to recover the conversations taking place inside the room.
Academics from the University of Maryland and the National University of Singapore took this old idea and applied it to the LiDAR unit of a Xiaomi Roborock robot vacuum.
Certain conditions must be met
A LidarPhone attack is not simple, and several conditions must be met. First, the attacker has to use malware or a tainted update process to modify the vacuum's firmware and take control of the LiDAR component.
This is necessary because vacuum LiDARs rotate continuously while operating, a behavior that limits the number of data points an attacker can collect from any single object.
Through the infected firmware, the attacker stops the LiDAR from rotating and instead focuses it on one nearby object at a time, recording how the object's surface vibrates in response to sound waves.
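Conceptually, once the laser is held still, the sensor's stream of per-sample reflection readings becomes a time series that can be treated like an audio waveform. The sketch below is a hypothetical illustration of that idea, not the researchers' actual pipeline; the sampling rate, the `readings_to_waveform` helper, and the simulated data are all assumptions for demonstration.

```python
# Hypothetical sketch: treating a stationary LiDAR's reflected-intensity
# samples as an audio waveform. Real sensor interfaces and rates differ.
import math

SAMPLE_RATE_HZ = 1800  # assumed sampling rate while the laser is held on one spot

def readings_to_waveform(readings):
    """Convert raw intensity readings into a zero-centred, normalised signal."""
    mean = sum(readings) / len(readings)
    centred = [r - mean for r in readings]      # remove the static reflection (DC offset)
    peak = max(abs(c) for c in centred) or 1.0
    return [c / peak for c in centred]          # scale into [-1, 1]

# Simulated readings: a faint 440 Hz vibration riding on a large static reflection
readings = [1000 + 2 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE_HZ)
            for n in range(SAMPLE_RATE_HZ)]     # one second of samples
waveform = readings_to_waveform(readings)
```

The point of the normalisation step is that the vibration of interest is tiny compared with the static reflection, so the raw readings are useless as audio until the constant component is stripped away.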
In addition, because a vacuum's LiDAR is far less precise than a purpose-built laser microphone, the researchers note that the collected laser readings must be uploaded to the attacker's remote server for further processing, where the signal is amplified and cleaned up until the audio becomes intelligible to a human.
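As a rough illustration of what such server-side clean-up might involve, the snippet below applies a simple moving-average high-pass filter to suppress slow drift and then boosts the remaining signal. This is a minimal sketch under assumed parameters (`window`, `gain` are invented for the example), not the filtering the researchers actually used.

```python
# Hypothetical server-side clean-up step: subtract slow drift via a
# moving average (a crude high-pass filter), then apply fixed gain.
def highpass_and_amplify(signal, window=31, gain=4.0):
    """Remove the local moving average from each sample, then amplify."""
    half = window // 2
    out = []
    for i, s in enumerate(signal):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        drift = sum(signal[lo:hi]) / (hi - lo)   # local average ~ slow drift
        out.append(gain * (s - drift))           # keep and boost fast changes
    return out

# A constant (pure drift) input is filtered away almost entirely
flat = highpass_and_amplify([5.0] * 100)
```

In practice this kind of step would be one of several; real pipelines typically also resample, band-pass to the speech range, and denoise before the audio is played back or fed to a classifier.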
Nevertheless, despite all these constraints, the researchers said they were able to record and recover audio data from the LiDAR navigation component of the Xiaomi robot vacuum in their tests.
They tested the LidarPhone attack against various objects, varying both the distance between the robot and the object and the distance between the sound source and the object.
The academics also said the technique could be used to identify speakers by gender, or even to infer their political orientation from the music played during news broadcasts captured by the vacuum's LiDAR.
But while the LidarPhone attack sounds like a serious invasion of privacy, users have little to fear for now. The attack requires many conditions to align, and there are far easier ways to spy on someone than replacing the firmware of a robot vacuum, such as tricking them into installing malware on their phone.
For now, the LidarPhone attack is academic research, best seen as input for strengthening the security and design of future smart robot vacuums.