Siri, Google Assistant, Alexa Said to Be Vulnerable to New 'Dolphin Attack'

Highlights
  • This hack uses ultrasound frequencies to attack AI assistants
  • Alexa, Siri, Google Assistant, Samsung S Voice are vulnerable to it
  • Researchers advise OEMs to stop supporting frequencies above 20kHz

Almost every other day, researchers find new ways to break into smartphones, gadgets, and smart devices. Now, a group from China's Zhejiang University has found a way to use ultrasound frequencies to take over smartphones, open malicious websites, and even gain access to any connected smart devices. The hack, known as the Dolphin Attack, sends frequencies above 20kHz to a nearby smartphone to target Siri, Google Assistant, Alexa, and other voice-based virtual assistants.

Microphones on a smartphone can pick up frequencies above 20,000Hz, which humans cannot hear. Voice assistants process these frequencies and act on the commands they carry, all without the owner's knowledge. Taking advantage of this, the researchers modulated human voice commands onto ultrasound frequencies and played them back using inexpensive components: a regular smartphone, an amplifier, an ultrasonic transducer, and a battery.
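To make the mechanism concrete, here is a minimal sketch (not taken from the researchers' report) of how a voice command could be amplitude-modulated onto an ultrasonic carrier. The 192kHz sample rate, the 25kHz carrier, and the 400Hz tone standing in for a recorded voice command are all illustrative assumptions.

```python
import numpy as np

# Illustrative parameters only; the report does not specify these values.
fs = 192_000          # sample rate high enough to represent ultrasound
carrier_hz = 25_000   # carrier above the ~20kHz limit of human hearing

t = np.arange(0, 1.0, 1 / fs)          # one second of samples
command = np.sin(2 * np.pi * 400 * t)  # stand-in for a recorded voice command

# Classic amplitude modulation: shift the audible command up around the
# ultrasonic carrier. Non-linearities in a phone's microphone circuitry can
# demodulate this back into the audible band, which is what the attack exploits.
modulated = (1 + 0.8 * command) * np.cos(2 * np.pi * carrier_hz * t)
modulated /= np.abs(modulated).max()   # normalise before sending to a transducer
```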

This method is said to work on assistants such as Siri, Alexa, Google Assistant, Samsung S Voice, and Bixby, and on devices including the iPad, MacBook, Amazon Echo, and even an Audi Q3. What makes this attack scary is that speech recognition on every device mentioned in the report picks up these frequencies easily, and the attack works even with the usual security measures in place. Furthermore, the attacker doesn't need physical access to the device to carry out the hack. To demonstrate its effectiveness, the researchers changed the navigation destination on an Audi Q3, and successfully commanded smartphones to perform tasks like 'open dolphinattack.com' and 'open the back door'.

However, for this hack to work, certain preconditions must be met. Firstly, the target smartphone has to be within five to six feet of the transmitter, and no further. Also, in the case of Siri and Google Assistant, the assistant has to already be activated. Furthermore, the user is alerted almost immediately, as these assistants play a tone or respond aloud to the commands. In short, the device has to be very near, the phone has to be unlocked with the voice assistant active, and the owner has to be away from the phone or distracted. Together, that makes for an unlikely scenario, but one a determined attacker could engineer.

In any case, the researchers say the simplest fix is for device makers to programme their AI assistants to ignore frequencies above 20kHz, or to filter out any frequencies that humans cannot hear. Until then, keep your voice assistant deactivated if you're really paranoid.
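For illustration, that suggested defence amounts to low-pass filtering the microphone feed before speech recognition ever sees it. Below is a hedged sketch using SciPy; the 192kHz sample rate, the filter order, and the strip_ultrasound helper name are assumptions for this example, not anything from the report or any vendor's actual firmware.

```python
import numpy as np
from scipy.signal import butter, sosfilt

fs = 192_000  # assumed capture rate; real devices vary

# 8th-order Butterworth low-pass with a 20kHz cutoff: frequencies humans
# cannot hear are attenuated before the audio reaches the recogniser.
sos = butter(8, 20_000, btype="low", fs=fs, output="sos")

def strip_ultrasound(samples: np.ndarray) -> np.ndarray:
    """Hypothetical pre-processing step: drop content above ~20kHz."""
    return sosfilt(sos, samples)
```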
