Can You Hear It? Backdoor Attacks via Ultrasonic Triggers

Cited by: 34
Authors
Koffas, Stefanos [1 ]
Xu, Jing [1 ]
Conti, Mauro [1 ,2 ]
Picek, Stjepan [1 ,3 ]
Affiliations
[1] Delft Univ Technol, Delft, Netherlands
[2] Univ Padua, Padua, Italy
[3] Radboud Univ Nijmegen, Nijmegen, Netherlands
Source
PROCEEDINGS OF THE 2022 ACM WORKSHOP ON WIRELESS SECURITY AND MACHINE LEARNING (WISEML '22) | 2022
Keywords
Backdoor Attacks; Inaudible Trigger; Neural Networks;
DOI
10.1145/3522783.3529523
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This work explores backdoor attacks on automatic speech recognition systems in which we inject inaudible triggers. By doing so, we make the backdoor attack hard for legitimate users to detect and, consequently, potentially more dangerous. We conduct experiments on two versions of a speech dataset and three neural networks and explore the performance of our attack with respect to the duration, position, and type of the trigger. Our results indicate that less than 1% of poisoned data is sufficient to deploy a backdoor attack and reach a 100% attack success rate. We observe that short, non-continuous triggers result in highly successful attacks. Still, since our trigger is inaudible, it can be arbitrarily long without raising any suspicion, making the attack more effective. Finally, we conduct our attack on actual hardware and show that an adversary can manipulate inference in an Android application by playing the inaudible trigger over the air.
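The abstract describes data poisoning with an ultrasonic trigger: a small fraction of training clips receive an inaudible tone and are relabeled to the attacker's target class. A minimal NumPy sketch of that idea is below; all parameters (21 kHz tone, 0.5 s duration, amplitude, 1% poisoning rate) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

SAMPLE_RATE = 44_100  # Hz; must be high enough to represent ultrasonic content


def ultrasonic_trigger(freq_hz=21_000, duration_s=0.5, amplitude=0.1):
    """Generate a sine tone above the audible range (parameters are illustrative)."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq_hz * t)


def poison(waveform, trigger, position=0):
    """Mix the trigger into a benign waveform starting at a sample offset."""
    out = waveform.copy()
    end = min(position + len(trigger), len(out))
    out[position:end] += trigger[: end - position]
    return out


def poison_dataset(samples, labels, target_label, rate=0.01, rng=None):
    """Trigger and relabel a small fraction of the dataset to the target class."""
    rng = rng or np.random.default_rng(0)
    trig = ultrasonic_trigger()
    n_poison = max(1, int(rate * len(samples)))
    idx = rng.choice(len(samples), size=n_poison, replace=False)
    poisoned = [s.copy() for s in samples]
    new_labels = list(labels)
    for i in idx:
        poisoned[i] = poison(poisoned[i], trig)
        new_labels[i] = target_label
    return poisoned, new_labels
```

A model trained on the poisoned set behaves normally on clean audio but predicts `target_label` whenever the tone is present, which is what makes the inaudibility of the trigger the key property here.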
Pages: 57-62
Page count: 6