Simple 'Dolphin' flaw trivialises hacking Siri and Alexa

By James Walker     Sep 7, 2017 in Technology
A simple but serious vulnerability in the design of modern smartphone assistants could allow attackers to pull off silent hacks. Researchers found digital assistants including Siri and Alexa will respond to ultrasound commands inaudible to humans.
The discovery was made by scientists at China's Zhejiang University. As Fast Company reports, the "DolphinAttack" works by converting regular voice commands into ultrasonic counterparts.
Ultrasound frequencies are too high-pitched for the human ear to hear, but the microphones inside smartphones, smart speakers and other connected devices can still register them. Because the underlying sound data remains the same once captured, the assistant software doesn't notice any difference. It runs the command as normal.
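The conversion described above can be sketched as amplitude modulation: a baseband voice recording is shifted up onto an ultrasonic carrier, so all of the signal's energy sits above the range of human hearing. The sketch below is an illustrative reconstruction in NumPy, not the researchers' actual tooling; the sample rate, carrier frequency and `to_ultrasound` helper are all assumptions.

```python
import numpy as np

FS = 192_000          # sample rate high enough to represent ultrasound (assumption)
CARRIER_HZ = 25_000   # ultrasonic carrier, above human hearing (~20 kHz)

def to_ultrasound(voice: np.ndarray, fs: int = FS, fc: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * fc * t)
    # Keep a DC offset so the envelope (the voice) survives modulation
    # and can later be recovered by the receiving microphone.
    return (1.0 + voice) * carrier

# Toy "voice": a 400 Hz tone standing in for a spoken command
t = np.arange(FS) / FS
voice = 0.5 * np.sin(2 * np.pi * 400 * t)
signal = to_ultrasound(voice)

# All spectral energy now sits around the 25 kHz carrier
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(peak_hz)  # the dominant frequency is well above 20 kHz
```

A speaker playing `signal` emits nothing a human can hear, yet a microphone that registers ultrasound still receives the embedded command.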
In testing, the researchers succeeded in forcing a range of digital assistants to run different commands. They could tell Siri to call a phone number, get Google Now to look up restaurant info or ask Alexa for the latest weather.
More sinisterly, the trivial attack could be used to hijack smart home devices, purchase products from Amazon or even take control of cars. The researchers used the inaudible commands to change the navigation directions on an Audi Q3, indicating how widespread the problem is. Although Siri and Alexa are dominating headlines about the discovery, the flaw seems to apply to every commercial digital assistant.
READ NEXT: Will AI really prompt World War III?
The team concentrated on five distinct attacks: spying on a target using hardware sensors, injecting fake information onto the device, disabling wireless functionality, adjusting hardware settings and visiting a malicious website. The latter attack could be used to force the download of regular malware.
"We have tested these attacks on 16 [voice control system] models including Apple iPhone, Google Nexus, Amazon Echo, and automobiles," the researchers wrote in their paper. "Each attack is successful on at least one [speech recognition] system. We believe this list is by far not comprehensive. Nevertheless, it serves as a wake-up call to reconsider what functionality and levels of human interaction shall be supported in voice controllable systems."
Right now, there is no indication that any of the affected assistants are being updated to patch the vulnerability. This could be done in either of two ways. Device makers could check what frequency a command was issued at and block anything outside the normal vocal range. Alternatively, microphone input outside the audible range could be ignored entirely, ensuring the problem is fixed for every app on the device.
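The second fix, ignoring out-of-band microphone input, amounts to low-pass filtering the samples before they ever reach the speech recognizer. A minimal sketch, assuming a 192 kHz capture rate; the `strip_ultrasound` helper and the FFT-based filter are illustrative assumptions, not any vendor's actual implementation:

```python
import numpy as np

FS = 192_000        # capture sample rate (assumption)
CUTOFF_HZ = 20_000  # rough upper bound of human hearing

def strip_ultrasound(samples: np.ndarray, fs: int = FS, cutoff: float = CUTOFF_HZ) -> np.ndarray:
    """Zero out all spectral content above the audible range so an
    ultrasonic command never reaches the speech recognizer."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), 1 / fs)
    spectrum[freqs > cutoff] = 0
    return np.fft.irfft(spectrum, n=len(samples))

t = np.arange(FS) / FS
audible = 0.5 * np.sin(2 * np.pi * 400 * t)        # legitimate speech band
ultrasonic = 0.5 * np.sin(2 * np.pi * 25_000 * t)  # injected DolphinAttack energy

cleaned = strip_ultrasound(audible + ultrasonic)

# The audible component survives; the ultrasonic component is removed.
residual = float(np.max(np.abs(cleaned - audible)))
print(residual < 1e-6)
```

Because the filtering happens at the audio-input layer, every app on the device is protected without any change to the assistant software itself.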
Although the flaw is simple and very widespread, it's unlikely it will be used for any serious attacks. To successfully issue a command, an attacker would have to be physically close to you so the device could hear the signal. Most smartphone assistants also require you to unlock your device before allowing potentially sensitive commands, giving you at least some protection against DolphinAttack.