Electronics (Apr 2023)

Inaudible Attack on AI Speakers

  • Seyitmammet Saparmammedovich Alchekov,
  • Mohammed Abdulhakim Al-Absi,
  • Ahmed Abdulhakim Al-Absi,
  • Hoon Jae Lee

DOI
https://doi.org/10.3390/electronics12081928
Journal volume & issue
Vol. 12, no. 8
p. 1928

Abstract

The modern world does not stand still. We used to be surprised that technology could speak, but voice assistants have now become real family members. They do not simply set the alarm clock or play music; they communicate with children, help solve problems, and sometimes even take offense. Since all voice assistants incorporate artificial intelligence, when communicating with the user they take into account changes in the user’s location, the time of day and day of the week, search query history, previous orders in online stores, and so on. However, the voice assistants built into modern smartphones and smart speakers pose a threat to their owners’ personal data, since their main function is to capture audio commands from the user. AI voice assistants and smart speakers such as Siri, Google Assistant, and Google Home are generally harmless in themselves, but as they become more versatile they, like any other product, can be exploited for nefarious purposes. There are many known attacks that malicious actors can use against voice assistants. We show experimentally that a laser beam can control Google Assistant, smart speakers, and Siri. The attacker needs no physical contact with the victim’s equipment and no interaction with the victim: as long as the attacker’s laser can reach the smart speaker’s microphone, it can deliver commands. In our experiments, we achieved a successful attack that transmits inaudible commands by aiming a laser at the microphone from up to 87 m away. We also discovered that Android and Siri-enabled devices can be attacked through the charging port via the built-in voice assistant module.
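
The laser attack described above relies on amplitude-modulating an audio command onto a laser’s drive current, so that the target’s MEMS microphone converts the fluctuating light intensity back into what the assistant hears as speech. The following is a minimal Python sketch of that signal-injection principle only, not the authors’ actual setup; the sample rate, bias current, and modulation depth are hypothetical placeholder values.

    # Illustrative sketch of light-injection signal preparation.
    # NOT the paper's implementation; all hardware values are hypothetical.
    import numpy as np

    FS = 16_000          # audio sample rate in Hz (assumed)
    I_BIAS = 200.0       # laser diode DC bias current in mA (hypothetical)
    I_SWING = 150.0      # peak modulation depth in mA (hypothetical)

    def laser_drive_current(command_audio: np.ndarray) -> np.ndarray:
        """Map audio samples to a laser drive-current waveform.

        i(t) = I_BIAS + I_SWING * s(t), with s(t) normalized to [-1, 1].
        The laser's intensity then tracks the audio, which the victim's
        MEMS microphone demodulates as if it were an acoustic command.
        """
        s = command_audio / (np.max(np.abs(command_audio)) + 1e-12)
        i = I_BIAS + I_SWING * s
        return np.clip(i, 0.0, None)  # diode current cannot go negative

    # Stand-in for a recorded voice command: a 1-second 440 Hz test tone.
    t = np.arange(FS) / FS
    demo_command = 0.8 * np.sin(2 * np.pi * 440 * t)
    drive = laser_drive_current(demo_command)
    print(drive.min(), drive.max())  # check the diode's operating range

In a real attack chain, the drive waveform would feed a laser driver aimed at the speaker’s microphone port; the sketch stops at waveform generation, which is the part the abstract’s mechanism description makes clear.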

Keywords