How Alexa, Google Assistant can be hacked using lasers/flashlight
Google Assistant, Amazon Alexa, and Apple's Siri have long drawn flak, primarily over concerns about how they store and use our private conversations. Now, researchers from Japan and the University of Michigan have amplified those fears, claiming that anyone can hack these 'smart' AI assistants with nothing more than a laser or a flashlight. Here's all you need to know about it.
MEMS mic vulnerability exploited to compromise voice assistants
In a recent blog post, the research team flagged a vulnerability in the MEMS (micro-electro-mechanical systems) microphones used by home assistant products. They said this little-known flaw lets attackers encode inaudible, invisible commands into a beam of light (a laser or flashlight) and use it to hijack or control Siri, Alexa, or Google Assistant from hundreds of feet away.
How smart speakers are tricked into taking 'light commands'
"In addition to sound, microphones also react to light aimed directly at them," the researchers explained, noting that "by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio."
In tests, they hacked assistants from 350 feet away
The researchers demonstrated the hack and claimed they were able to use lasers to compromise Google Home and other smart home assistants from as far as 350 feet away. In one case, they controlled a Google Home device positioned in an office building from the top of the University of Michigan's bell tower, some 230 feet away.
Now, this is a major point of concern
According to the researchers, hackers can easily apply this technique to compromise smart home devices that are otherwise marketed as a way to make your home more secure. For instance, they could use light-injected commands to tell the assistant to open your home's smart-lock-protected front door. That said, it is not an easy attack: the attacker needs a direct line of sight to the target device.
Currently, there is no way to tackle this issue
The researchers say the technique can compromise any device that uses a MEMS microphone, and there is currently no way to stay protected against it. However, they claim to be working with researchers at Google, Apple, and Amazon to develop appropriate defense mechanisms that can be incorporated into future models of their smart home assistants. Until then, keep your assistant indoors and away from stray lasers!