A team of cybersecurity researchers has discovered a technique named Light Commands which can remotely inject inaudible and invisible commands into voice-controlled devices.
The attack works simply by shining a laser at the targeted device instead of using spoken words, which means smart voice assistants such as Siri and Alexa can be at risk. The five-member team behind the discovery consists of researchers from universities in Japan and Michigan.
Injecting commands into voice-controllable systems
They define it as an attack capable of covertly injecting commands into voice-controllable systems from long distances. The goal of the research was to show how an attacker can inject arbitrary audio signals into the target microphone by aiming amplitude-modulated light at the microphone’s aperture.
Light Commands lets an attacker strike from several meters away, covertly triggering the attack simply by modulating the amplitude of laser light to produce an acoustic pressure wave. It exploits a vulnerability in the MEMS microphones embedded in widely used voice-controllable systems, which unintentionally respond to light as if it were sound.
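To make the modulation idea concrete, the Python sketch below shows how an audio command waveform might be mapped onto a laser diode's drive current so that the light intensity carries the audio signal. This is a conceptual illustration only, not the researchers' actual tooling; the bias and modulation-depth values are assumptions.

```python
# Conceptual sketch (not the researchers' tooling): amplitude-modulating a
# laser's intensity with an audio waveform, the core idea behind Light Commands.
# I_BIAS and MOD_DEPTH are illustrative assumptions, not measured values.
import numpy as np

SAMPLE_RATE = 48_000   # audio sample rate in Hz
I_BIAS = 0.200         # assumed DC bias current for the laser diode (A)
MOD_DEPTH = 0.050      # assumed peak modulation current around the bias (A)

def am_laser_drive(audio: np.ndarray) -> np.ndarray:
    """Map an audio command (normalized to [-1, 1]) onto a laser drive current.

    The emitted light intensity tracks the drive current, so the MEMS
    microphone effectively "hears" a signal proportional to the original audio.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return I_BIAS + MOD_DEPTH * audio

# Example: a 1 kHz test tone standing in for a recorded voice command.
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 1000 * t)
drive = am_laser_drive(tone)
print(drive.min(), drive.max())  # stays within the assumed operating range
```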
Devices can be attacked from a distance of 110 meters
According to the research paper, by encoding an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they were receiving genuine audio.
After many experiments, they concluded that successful command injection is possible at distances of more than 100 meters, even through clear glass windows.
This is an important discovery because smart voice assistants are built into mobile phones, tablets, and other smart devices, such as Google Home, Nest Cam IQ, Amazon Alexa and Echo, Facebook Portal, and Apple Siri, all of which are vulnerable to this new light-based signal injection attack.
Vehicles such as Tesla and Ford models can also be targeted
The research covers the following aspects of signal injection attacks on microphones based on the photoacoustic effect:
- Showing how an attacker can inject arbitrary audio signals into the target microphone by aiming amplitude-modulated light at the microphone’s aperture.
- Demonstrating how this effect leads to a remote voice-command injection attack on voice-controllable systems. Examining various products that use Amazon’s Alexa, Apple’s Siri, Facebook’s Portal, and Google Assistant, the researchers show how light can be used to obtain full control over these devices at distances of up to 110 meters and from two separate buildings.
- Showing that user authentication on these devices is often lacking or non-existent, allowing the attacker to use light-injected voice commands to unlock the target’s smart lock-protected front doors, open garage doors, shop on e-commerce websites at the target’s expense, or even locate, unlock and start various vehicles (e.g., Tesla and Ford) that are connected to the target’s Google account.
- Concluding with possible software and hardware defenses against the attack (one such defense is sketched below).
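Among the software defenses the researchers discuss is having the assistant ask the user a simple randomized question before executing a sensitive command, since a light-based attacker generally cannot hear the device's spoken challenge. The Python sketch below illustrates that idea; the speak() and listen() functions are hypothetical placeholders, not any vendor's API.

```python
# Minimal sketch of one software defense discussed by the researchers:
# challenge the user with a randomized spoken prompt before executing a
# sensitive command. speak()/listen() are hypothetical stand-ins for a real
# assistant's text-to-speech and speech-to-text pipeline.
import secrets

SENSITIVE_COMMANDS = {"unlock front door", "open garage door", "start car"}

def speak(text: str) -> None:
    print(f"[assistant says] {text}")              # stand-in for text-to-speech

def listen() -> str:
    return input("[user says] ").strip().lower()   # stand-in for speech-to-text

def handle_command(command: str) -> None:
    if command.lower() in SENSITIVE_COMMANDS:
        # A laser can inject the command itself, but the remote attacker
        # typically cannot hear the randomized challenge, so they cannot
        # inject the correct answer.
        challenge = str(secrets.randbelow(90) + 10)
        speak(f"To confirm, please repeat the number {challenge}.")
        if listen() != challenge:
            speak("Confirmation failed. Command ignored.")
            return
    speak(f"Executing: {command}")

handle_command("unlock front door")
```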