Google Assistant may be vulnerable to attacks via subsonic commands



  • A new study claims that Google Assistant, and other voice-command AI services like Alexa and Siri, may be vulnerable to subsonic commands.
  • The study says that while these commands cannot be heard by humans, they can be detected by Google Assistant, Siri and Alexa.
  • In theory, cybercriminals could use these commands to order these services to purchase products, launch websites and more.

We have already seen that voice-based AI services like Google Assistant can be triggered accidentally by something as simple as a TV commercial. Now a new study claims that Google Assistant, along with rivals like Apple’s Siri and Amazon’s Alexa, could be vulnerable to sound commands that humans cannot even hear.

According to The New York Times, the research was conducted by teams at Berkeley and Princeton University in the US, along with China’s Zhejiang University. The researchers say they have found a way to cancel out the sounds that Google Assistant, Siri and Alexa would normally recognize and replace them with audio that the human ear cannot detect, but that the machine learning software powering these digital assistants can still pick up and act on.

So what does that mean? In theory, the researchers claim, cybercriminals could use these subsonic commands to cause all sorts of havoc. They could embed audio in a YouTube video or website that causes Google Assistant to order products online without your consent, launch malicious sites and more. If a speaker like Google Home is connected to smart home devices, these kinds of stealth commands could conceivably shut down your security cameras, turn off your lights and unlock your door.

The good news is that there is no evidence these kinds of subsonic commands are being used outside the university research facilities that discovered them in the first place. When asked for comment, Google said Assistant already has features to defeat such commands. Apple and Amazon also said they have taken steps to address these concerns. Hopefully, all three companies will continue to develop security measures against these kinds of threats.
