Like many other technological breakthroughs, digital assistants were created to make everyday life easier. Lately, however, the privacy of their users has been called into question, after it was revealed that companies were listening in on recordings of user interactions. And as if that weren't enough, malicious actors are constantly looking for new ways to manipulate these devices.
The most recent addition to the list of ways a digital assistant can be exploited is "Light Commands". A team of researchers managed to fool digital assistants by aiming a specially crafted laser beam at various popular devices, including the Google Home, Amazon Echo, iPhone XR, Google Pixel 2, and Facebook Portal Mini.
The researchers found that "by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio." This in turn allowed the researchers to send almost any command they liked to the devices.
The attack exploits a vulnerability in the MEMS microphones that many of these devices use to pick up voice commands.
By now, most of us know that microphones are sensitive to sound waves. But as the researchers explain, under the right conditions, they can respond to light as well.
And while aiming a laser precisely at an assistant is a delicate process, a prospective attacker could carry out the attack from as far as 115 meters away, through the window of a house.
It should be noted, however, that the voice recognition feature that identifies the user by their voice was disabled during the Light Commands tests. Still, according to the researchers, assistants only check the trigger phrase (such as "OK, Google" or "Alexa") against the owner's voice, not the entire command. So the attack could still succeed in practice.
Also, a complete Light Commands setup costs about $600, so it is not something just anyone can easily assemble.
Protecting yourself from such an attack means keeping an eye on your device, because you will not hear anything unusual. You may, however, notice a beam of light falling on it, or the device activating on its own.
The researchers published their findings in a new research paper and have also proposed some mitigation techniques: for example, adding barriers that reduce the amount of light reaching the microphones, or using sensor fusion to require sound on multiple microphones, since a laser can only target one of them.
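The sensor-fusion mitigation can be sketched in a few lines: genuine sound reaches all of a device's microphones at comparable levels, while a laser illuminates only one, so a command heard by a single microphone can be rejected. The function and threshold below are illustrative assumptions, not the researchers' actual implementation.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation) of sensor fusion:
# accept audio only if every microphone hears a comparable signal level.
# A laser can hit just one microphone, so single-mic commands are rejected.

def looks_like_real_sound(mic_signals, ratio_threshold=0.5):
    """Return True if all microphones report comparable RMS levels.

    mic_signals     -- 2D array, one row of samples per microphone
    ratio_threshold -- hypothetical minimum ratio between the quietest
                       and loudest microphone
    """
    levels = np.sqrt(np.mean(np.square(mic_signals), axis=1))  # RMS per mic
    if levels.max() == 0:
        return False  # pure silence: nothing to accept
    return levels.min() / levels.max() >= ratio_threshold

rng = np.random.default_rng(0)
speech = rng.standard_normal(1024)

# Genuine sound: all four microphones pick up roughly the same waveform.
genuine = np.stack([speech * g for g in (1.0, 0.9, 1.1, 0.95)])

# Laser injection: only one microphone "hears" anything at all.
injected = np.zeros((4, 1024))
injected[0] = speech

print(looks_like_real_sound(genuine))   # True
print(looks_like_real_sound(injected))  # False
```

A real device would compare waveforms more carefully (e.g. cross-correlating channels), but even this crude level check captures the asymmetry the mitigation relies on.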