Researchers in the United States have developed a new ultrasonic acoustic attack that can covertly give hackers remote access to many smart devices by turning a device’s own microphone and voice assistant against its owner.
The attack works by using ultrasonic acoustic signals that are inaudible to humans but can be picked up by voice assistants on smart devices, a type of cyberattack commonly referred to as a “SurfingAttack” or “DolphinAttack.”
With a SurfingAttack, a hacker can modulate voice commands into silent, near-ultrasonic signals, allowing them to issue commands to a smart device, all while a user is blissfully unaware their device has been hijacked.
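The core signal-processing idea can be illustrated in a few lines. The sketch below is not the researchers’ actual attack code; it simply shows the general technique these attacks rely on, amplitude-modulating an audible command onto a near-ultrasonic carrier so that the transmitted energy sits above the range of human hearing (the carrier frequency and modulation depth here are illustrative values):

```python
import numpy as np

def modulate_near_ultrasonic(command, fs, carrier_hz=21000.0, depth=0.8):
    """Amplitude-modulate a baseband audio 'command' onto a near-ultrasonic
    carrier. Illustrative only: a real attack must also account for the
    speaker's frequency response and the microphone nonlinearity that
    demodulates the signal back into the audible band."""
    t = np.arange(len(command)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    x = command / (np.max(np.abs(command)) + 1e-12)  # normalize to [-1, 1]
    return (1.0 + depth * x) * carrier

# Demo: a 400 Hz tone standing in for a recorded voice command.
fs = 48000                       # sample rate high enough to carry 21 kHz
t = np.arange(fs) / fs           # one second of audio
command = np.sin(2 * np.pi * 400 * t)
modulated = modulate_near_ultrasonic(command, fs)

# The modulated signal's energy now sits around the 21 kHz carrier,
# not at the audible 400 Hz of the original command.
spectrum = np.abs(np.fft.rfft(modulated))
peak_hz = np.argmax(spectrum) * fs / len(modulated)
print(round(peak_hz))  # 21000
```

Because the strongest spectral component lands near 21 kHz, the playback is inaudible to most adults, yet a microphone whose amplifier behaves nonlinearly can recover the original command.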
It has been previously demonstrated that SurfingAttacks can be carried out using “line-of-sight” ultrasonic acoustic attacks, with a hacker using a nearby ultrasound speaker to target a smart device’s voice assistant.
However, in their recent research, Dr. Guenevere Chen, an associate professor from the University of Texas at San Antonio, her doctoral student Qi Xia, and Dr. Shouhuai Xu, a professor from the University of Colorado Colorado Springs, showed that ultrasonic acoustic attacks can also be carried out remotely, with a hacker potentially being thousands of miles away from their intended victim.
Instead of injecting inaudible frequencies through a nearby ultrasonic speaker, this new type of cyberattack, dubbed a “Near-Ultrasound Inaudible Trojan,” or NUIT, uses a device’s own microphone and speaker to remotely seize control of popular voice assistants.
Speaking with The Register, Chen and Xia demonstrated how virtually any technology equipped with smart voice assistants, including smartphones, computers, speakers, televisions, garage door openers, or even front door locks, is vulnerable to two types of NUIT attacks.
With a NUIT-1 attack, the researchers showed how near-ultrasonic frequencies played through the speaker of a smart device could allow a hacker to seize control of the microphone and voice assistant on the same device.
In a similar ultrasonic acoustic attack, NUIT-2 uses a smart device’s speaker to attack the microphone and voice assistant on a different nearby device.
In either attack, the malicious signal can be surreptitiously embedded in various ways, such as a device app, a YouTube video, or a voicemail. Using traditional social engineering or spear phishing, hackers can trick a victim into unwittingly playing the inaudible signal through their device’s speaker.
“And once the victim plays this clip, voluntarily or involuntarily, the attacker can manipulate your Siri to do something, for example, open your door,” explained Xia.
Testing out the ultrasonic acoustic attack, researchers found that, to varying degrees, Apple’s Siri, Google’s Assistant, Microsoft’s Cortana, and Amazon’s Alexa were all vulnerable to NUIT attacks.
Of the two attacks, NUIT-1 proved to be the more difficult to carry out undetected because it required the same device to both “hear” and silently “respond” to the ultrasonic signals.
With NUIT-1, only Apple’s Siri voice assistant could successfully receive silent instructions and provide inaudible responses. However, these end-to-end silent attacks were limited to the iPhone X, XR, and 8.
The 2021 MacBook Pro and 2017 MacBook Air could fall victim to the initial ultrasonic acoustic attack. However, the devices still provided audible responses, potentially alerting a victim that something nefarious could be afoot.
Google’s Assistant, Microsoft’s Cortana, and Amazon’s Alexa were also vulnerable to NUIT-1’s inaudible attack signals. However, like the MacBooks, the devices tested using these popular voice assistants, including the Samsung Galaxy S8, S9, A10e, and Amazon’s first-generation Echo Dot, all provided audible responses to attack prompts.
The NUIT-1 attack was unsuccessful against the Apple Watch 3, Google Pixel 3, Galaxy Tab S4, LG ThinQ V35, and Google Home 1.
The NUIT-2 attack, which uses a SurfingAttack on the speaker of one device to control another smart device, proved to be a far more formidable foe.
Of 17 smart devices tested, only two were impervious to NUIT-2’s silent attack and subsequent issuing of inaudible commands.
The Dell Inspiron 15 could be successfully attacked by NUIT-2. However, the device still gave audible responses to the silent prompts.
Of all the smart devices, only the nearly 10-year-old Apple iPhone 6 Plus was invulnerable to both NUIT-1 and NUIT-2 attacks.
Researchers say it’s likely that the iPhone 6 Plus could not be hijacked by the ultrasonic acoustic attack because the device uses a low-gain amplifier, whereas the more recent iPhones tested use a high-gain amplifier.
The determining factor for whether a device was susceptible to the NUIT-1 attack appeared to be based on how close a device’s speaker and microphone are to each other.
“In part, this highlights a design flaw with smartphones where the speaker and microphone are located next to each other,” Chen told The Register. “This is a hardware design problem, not a software problem.”
The Debrief recently reported on a new classified program launched by the Defense Advanced Research Projects Agency (DARPA), aimed at developing new methods for quickly identifying vulnerabilities in commercial cyber-physical systems, like those exploited by ultrasonic acoustic attacks.
Researchers hope their recent work will draw attention to how technology meant to make life easier, such as digital voice assistants or smart home devices, can also be used for harm by industrious criminals.
Additionally, manufacturers can use this recent research to develop tools to defend against ultrasonic acoustic attacks.
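One simple defensive idea, offered here as a hypothetical sketch rather than anything the researchers propose, is to screen incoming audio for energy concentrated above the normal speech band before passing it to a voice assistant. Ordinary spoken commands carry almost no energy near 16–22 kHz, while a near-ultrasound attack signal carries almost nothing below it (the 16 kHz cutoff below is an assumed threshold):

```python
import numpy as np

def near_ultrasound_ratio(audio, fs, cutoff_hz=16000.0):
    """Fraction of signal energy at or above cutoff_hz. A deliberately
    simplistic screen: audio whose energy is concentrated in the
    near-ultrasound band is unlikely to be ordinary speech."""
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
    total = spectrum.sum() + 1e-12
    return spectrum[freqs >= cutoff_hz].sum() / total

fs = 48000
t = np.arange(fs) / fs
speech_like = np.sin(2 * np.pi * 300 * t)    # energy in the voice band
attack_like = np.sin(2 * np.pi * 21000 * t)  # near-ultrasound carrier

print(near_ultrasound_ratio(speech_like, fs) < 0.01)  # True
print(near_ultrasound_ratio(attack_like, fs) > 0.99)  # True
```

A real mitigation would be more involved, since legitimate audio can contain some high-frequency content, but a band-energy check of this kind illustrates why the attack signals are, in principle, detectable in software even when the underlying weakness is in the hardware.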
Chen, Xia, and Xu will publish their research and demonstrate the NUIT attacks at the USENIX Security Symposium in August.
Tim McMillan is a retired law enforcement executive, investigative reporter and co-founder of The Debrief. His writing typically focuses on defense, national security, the Intelligence Community and topics related to psychology. You can follow Tim on Twitter: @LtTimMcMillan. Tim can be reached by email: firstname.lastname@example.org or through encrypted email: LtTimMcMillan@protonmail.com