Universal Physical Camouflage Attacks on Object Detectors

Cited by: 97
Authors
Huang, Lifeng [1 ,2 ]
Gao, Chengying [1 ]
Zhou, Yuyin [3 ]
Xie, Cihang [3 ]
Yuille, Alan L. [3 ]
Zou, Changqing [4 ,5 ]
Liu, Ning [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou, Peoples R China
[2] Guangdong Key Lab Informat Secur Technol, Guangzhou, Peoples R China
[3] Johns Hopkins Univ, Dept Comp Sci, Baltimore, MD 21218 USA
[4] Max Planck Inst Informat, Berlin, Germany
[5] Univ Maryland, College Pk, MD 20742 USA
Source
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR) | 2020
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/CVPR42600.2020.00080
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we study physical adversarial attacks on object detectors in the wild. Previous works mostly craft instance-dependent perturbations, and only for rigid or planar objects. In contrast, we propose to learn an adversarial pattern that effectively attacks all instances belonging to the same object category, referred to as the Universal Physical Camouflage Attack (UPC). Concretely, UPC crafts camouflage by jointly fooling the region proposal network and misleading the classifier and the regressor into outputting errors. To make UPC effective for non-rigid or non-planar objects, we introduce a set of transformations that mimic deformable properties. We additionally impose an optimization constraint so that the generated patterns look natural to human observers. To fairly evaluate the effectiveness of different physical-world attacks, we present the first standardized virtual database, AttackScenes, which simulates the real 3D world in a controllable and reproducible environment. Extensive experiments suggest the superiority of the proposed UPC over existing physical adversarial attacks, not only in virtual environments (AttackScenes) but also in real-world physical environments.
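The abstract describes optimizing one universal pattern by minimizing a joint detector loss, averaged over random transformations that mimic deformation, under a naturalness constraint. A minimal toy sketch of that loop is below; the linear scorers `W_rpn`, `W_cls`, and `W_reg`, the scaling-based transform, and the L-infinity "naturalness" ball are all illustrative stand-ins, not the paper's Faster R-CNN heads or its actual transformation set.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 64  # flattened pattern size (toy dimensionality)

# Hypothetical linear stand-ins for the three heads the attack fools
# jointly: RPN objectness, classification, and box regression.
W_rpn = rng.normal(size=D)
W_cls = rng.normal(size=D)
W_reg = rng.normal(size=D)
W_joint = W_rpn + W_cls + W_reg

def joint_loss(pattern):
    """Scalar stand-in for the joint objective the attack drives down
    (objectness + classification + regression terms combined)."""
    return float(W_joint @ pattern)

pattern = np.zeros(D)  # one universal pattern shared by all instances
lr = 0.01
for step in range(200):
    # Expectation over random transforms (a crude EOT-style surrogate
    # for the deformation/viewpoint transforms): each sample rescales
    # the pattern, so its gradient is scale * W_joint.
    grad = np.mean([rng.uniform(0.7, 1.3) * W_joint for _ in range(8)],
                   axis=0)
    pattern -= lr * grad
    # Naturalness constraint (sketch): keep the perturbation inside a
    # small L_inf ball so the pattern stays visually subdued.
    pattern = np.clip(pattern, -0.3, 0.3)
```

After the loop, `joint_loss(pattern)` is well below the zero-pattern baseline, illustrating the mechanism only; the real method optimizes through deep detector heads with perspective and non-rigid warps.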
Pages: 717-726
Page count: 10
References: 47