Researchers prompt Alexa to respond to a command that is inaudible to humans

5/24/2018 Joseph Park, ECE ILLINOIS

Beyond nefarious applications, this acoustic communication channel could jam spy microphones and watermark music during a live concert.


According to the research firm Ovum, smartphones and smart speakers that implement digital assistants such as Amazon's Alexa or Apple's Siri will outnumber people by 2021. According to Juniper Research, more than half of all American households will have at least one smart speaker by then.

With the increasing prevalence of digital assistants, artificial intelligence will inevitably play a significant role in our daily lives. However, ECE ILLINOIS Professor [profile:croy] and his team of researchers, including PhD candidate [profile:nroy8], PhD student [profile:sshen19], and ECE ILLINOIS Assistant Professor [profile:haitham], discovered a serious security flaw in these smart devices. This work won the MobiSys 2017 Best Paper Award.

In Project BackDoor, the team of researchers created sounds that are inaudible to the human ear but audible to microphones. Using the BackDoor ultrasonic speaker array, they sent commands that prompted Amazon's Alexa to respond to a seemingly unheard question.
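The effect can be illustrated with a simple numerical sketch. This is not the researchers' code, and the quadratic term below is only an assumed toy model of microphone hardware non-linearity: two purely ultrasonic tones at frequencies f1 and f2 carry no audible content on their own, but a slight non-linearity in the recording chain produces a "shadow" component at the difference frequency f2 - f1, which can fall squarely inside the audible band that a voice assistant listens to.

```python
import numpy as np

# Two ultrasonic tones, both above the ~20 kHz limit of human hearing.
fs = 192_000             # sample rate high enough to represent ultrasound
t = np.arange(fs) / fs   # one second of samples
f1, f2 = 40_000, 41_000  # ultrasonic carrier frequencies (Hz)

x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# Toy microphone model: a mild quadratic non-linearity, y = x + a*x^2.
# The x^2 cross term 2*sin(f1)*sin(f2) contains cos(2*pi*(f2 - f1)*t),
# i.e. a 1 kHz tone that was never transmitted as audible sound.
a = 0.05
y = x + a * x ** 2

def magnitude_at(signal, freq):
    """Magnitude of the FFT bin at `freq` (1 Hz resolution here)."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    return spectrum[int(freq)]

print(f"1 kHz component, ideal linear mic: {magnitude_at(x, f2 - f1):.6f}")
print(f"1 kHz component, non-linear mic:   {magnitude_at(y, f2 - f1):.6f}")
```

Running this shows essentially zero energy at 1 kHz for the ideal linear microphone and a clear 1 kHz component once the non-linearity is applied; modulating a voice command instead of a single tone follows the same principle.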

The researchers demonstrated these ultrasound attacks from 25 feet away. Although the commands could not penetrate walls, the team was able to "control smart devices through open windows from outside a building," according to the New York Times.

According to Roy Choudhury, applications of such a channel range from the benign, such as jamming spy microphones or live-watermarking music at a concert, to the malicious, such as acoustic denial-of-service (DoS) attacks on phone calls or inaudible command attacks on voice-enabled devices.

Roy Choudhury and Hassanieh are both affiliated with the CSL. Read more from the New York Times.



This story was published May 24, 2018.