Voice-based virtual assistants are susceptible to hacking, like most online tools. Researchers at Michigan State University's College of Engineering now claim to have discovered a new way of hacking into a person's device by engaging virtual assistants like Apple's Siri and Google Assistant through ultrasonic soundwaves. The research paper was presented by scientists from the university's Department of Computer Science and Engineering at the Network and Distributed System Security Symposium on February 24. The researchers used ultrasonic vibrations to give commands to the virtual assistants, since a smartphone's microphone can pick up sounds inaudible to the human ear.
The study, named SurfingAttack, was led by Qiben Yan, an assistant professor in Michigan State University's Department of Computer Science and Engineering. Yan told TechXplore that the research team used inaudible vibrations, sent through any hard, flat surface (such as a tabletop), to activate voice assistants up to 30 feet away.
The researchers placed mobile phones on a hard surface and sent ultrasonic waves across it. The waves engaged the virtual assistant, since a smartphone's microphone can detect these inaudible sounds. Hackers could use this to secretly control the voice assistants on smartphones, which are usually activated with wake words like 'OK Google' or 'Hey Siri'. Once the assistant is awake, the attacker can issue any command they please. This is dangerous because it opens up a range of malicious uses, such as placing a fraudulent call to a friend or family member, cancelling meetings, or activating smart home appliances.
To demonstrate the attack, the researchers placed a piezoelectric transducer under a table or charging station, which converted electrical signals into ultrasonic vibrations. The technique worked on 15 of the 17 smartphone models they tested, including the iPhone 5, iPhone 5S, iPhone 6, iPhone X, Google Pixel, Pixel 2, Pixel 3, Xiaomi Mi 5, Xiaomi Mi 8, Xiaomi Mi 8 Lite, Samsung Galaxy S7, Galaxy S9, and the Honor View 8. "This research exposes the insecurity within smartphone voice assistants," a co-author told TechXplore. The researchers also advised people to be wary of public charging stations.
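The paper does not publish attack code, but the general principle behind these ultrasonic attacks is well documented: a voice command is amplitude-modulated onto an inaudible carrier, and the nonlinearity of the phone's microphone demodulates it back into the audible band. Below is a minimal, illustrative Python sketch of that modulation step; the file names, sample rate requirement, and 25kHz carrier are assumptions for illustration, not values from the study.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical recording of a wake word plus command, captured at a high
# sample rate (>= 96 kHz) so an ultrasonic carrier stays below Nyquist.
rate, voice = wavfile.read("ok_google_command.wav")
voice = voice.astype(np.float64)
if voice.ndim > 1:
    voice = voice.mean(axis=1)          # mix stereo down to mono
voice /= np.max(np.abs(voice))          # normalize to [-1, 1]

carrier_hz = 25_000                     # above the ~20 kHz limit of human hearing
t = np.arange(len(voice)) / rate
carrier = np.cos(2 * np.pi * carrier_hz * t)

# Classic amplitude modulation: the output contains only ultrasonic
# frequencies, but a microphone's nonlinear response recovers the
# baseband voice command from it.
modulated = (1.0 + voice) * carrier
modulated /= np.max(np.abs(modulated))  # avoid clipping

wavfile.write("ultrasonic_payload.wav", rate,
              (modulated * 32767).astype(np.int16))
```

In SurfingAttack, a signal like this is not played through a speaker but driven through the piezoelectric transducer, so the vibrations travel through the table itself, which is what allows the attack to work discreetly from the underside of a surface.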
This is not the first time such a vulnerability has surfaced. It was first discovered almost three years earlier, in 2017, by researchers at China's Zhejiang University. The attack was dubbed 'DolphinAttack', since dolphins use ultrasound to navigate. In that attack too, the researchers used frequencies inaudible to the human ear to engage virtual assistants on smartphones and smart speakers.