Researchers from Tokyo and the University of Michigan have found that laser pointers can be used to “hijack” smart speakers.

In a paper titled “Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems,” the researchers found that voice-enabled devices could be tricked into following voice commands by beaming a laser at their microphones.

The team tested the effect of laser pointers on smart speakers running Google Assistant, Apple Siri, and Amazon Alexa. They found that these devices interpreted the modulated light of the laser as sound.

“We have identified a semantic gap between the physics and specifications of MEMS (micro-electro-mechanical systems) microphones, where such microphones unintentionally respond to light as if it was sound,” they wrote. “Exploiting this effect, we can inject sound into microphones by simply modulating the amplitude of a laser light.”
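To make the amplitude-modulation idea concrete, here is a minimal Python sketch of how an audio command could be mapped onto a laser intensity signal. This is an illustrative reconstruction, not the researchers' actual tooling; the function name, sample rate, and bias/depth values are all hypothetical choices for this example.

```python
import numpy as np

SAMPLE_RATE = 44_100  # samples per second (hypothetical choice)

def amplitude_modulate(audio: np.ndarray, bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a laser intensity in [0, 1].

    The laser is driven around a constant DC bias, and the audio
    signal moves the intensity up and down around that bias. A MEMS
    microphone struck by the beam responds to those intensity changes
    as if they were changes in sound pressure.
    """
    audio = audio / max(1e-9, float(np.max(np.abs(audio))))  # normalize
    intensity = bias + depth * audio
    return np.clip(intensity, 0.0, 1.0)

# Example: a 1 kHz test tone standing in for a recorded voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = np.sin(2 * np.pi * 1000 * t)
drive_signal = amplitude_modulate(tone)
# drive_signal would feed the modulation input of a laser diode driver.
```

In the attack described in the paper, the waveform would be a recorded voice command rather than a test tone, and the resulting intensity signal would drive the laser aimed at the target microphone.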

Privacy threat

The effect produced “an attack that is capable of covertly injecting commands into voice-controllable systems” at distances of 230 to 350 feet. In one instance, the team successfully commanded a Google Home device in a room in another building to open a garage door simply by shining a light that had the “OK Google” command encoded in it.

The list of devices that were tested and found to be vulnerable to light commands includes Google Home; Google Nest Cam IQ; multiple Amazon Echo, Echo Dot, and Echo Show devices; Facebook's Portal Mini; the iPhone XR; and the sixth-generation iPad.

The researchers said they have already notified Tesla, Ford, Amazon, Apple, and Google about the weakness. They said that mitigating the flaw would require a redesign of most microphones. Lead author Takeshi Sugawara said one possible way to eliminate the vulnerability would be to add an obstacle that blocks the line of sight to the microphone's diaphragm.
