Instance-Aware Distillation for Efficient Object Detection in Remote Sensing Images

Times Cited: 3
Authors
Li, Cong [1 ,2 ]
Cheng, Gong [1 ,2 ]
Wang, Guangxing [1 ,2 ]
Zhou, Peicheng [3 ]
Han, Junwei [1 ,2 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Automat, Xian 710129, Peoples R China
[2] Northwestern Polytech Univ Shenzhen, Res & Dev Inst, Shenzhen 518057, Peoples R China
[3] Xidian Univ, Sch Telecommun Engn, State Key Lab Integrated Serv Networks, Xian 710071, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2023, Vol. 61
Funding
U.S. National Science Foundation;
Keywords
Remote sensing; Object detection; Detectors; Feature extraction; Task analysis; Behavioral sciences; Sensors; Knowledge distillation; object detection; remote sensing images; NETWORK;
DOI
10.1109/TGRS.2023.3238801
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline Classification Code
0708; 070902;
Abstract
Practical applications call for object detection models that achieve high performance at low computational overhead. Knowledge distillation shows favorable potential in this setting by transferring knowledge from a cumbersome teacher model to a lightweight student model. However, previous distillation methods are hampered by the massive amount of misleading background information in remote sensing images and neglect the relationships between different instances. In this article, we propose an instance-aware distillation (InsDist) method to derive efficient remote sensing object detectors. Our InsDist combines feature-based and relation-based knowledge distillation to make the most of instance-related information when transferring knowledge from the teacher to the student. On the one hand, we propose a parameter-free masking module to decouple instance-related foreground from instance-irrelevant background in multiscale features. On the other hand, we construct the relationships between different instances to enhance the learning of intraclass compactness and interclass dispersion. The student comprehensively imitates both the features and the relationships of the teacher, which proves considerably effective on complex remote sensing images. In addition, our InsDist can be easily built on mainstream object detectors with negligible extra cost. Extensive experiments on two large-scale remote sensing object detection datasets, DIOR and DOTA, show that InsDist obtains noticeable gains over other distillation methods for both one-stage and two-stage detectors, as well as for both anchor-based and anchor-free detectors. The source code will be publicly available at https://github.com/swift1988/InsDist.
Pages: 11
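
As a rough, hypothetical illustration of the two ingredients the abstract describes (foreground-masked feature imitation and pairwise relation distillation), the PyTorch-style sketch below uses invented helper names, a simple box-based mask, and random tensors; it is not the authors' released implementation, and the exact masking rule, losses, and weighting follow the paper and the code at the URL above.

    # Hypothetical sketch only: foreground-masked feature imitation plus
    # pairwise relation distillation, in the spirit of the abstract.
    import torch
    import torch.nn.functional as F

    def box_mask(boxes, h, w, stride):
        """Binary foreground mask on an H x W feature map from ground-truth
        boxes given in image coordinates (x1, y1, x2, y2)."""
        mask = torch.zeros(h, w)
        for x1, y1, x2, y2 in boxes:
            xa, ya = int(x1 // stride), int(y1 // stride)
            xb, yb = int(x2 // stride) + 1, int(y2 // stride) + 1
            mask[max(ya, 0):min(yb, h), max(xa, 0):min(xb, w)] = 1.0
        return mask

    def feature_imitation_loss(f_student, f_teacher, mask):
        """Squared imitation error restricted to foreground locations,
        normalized by the number of foreground cells and channels."""
        m = mask.to(f_student.dtype).unsqueeze(0)        # (1, H, W)
        diff = (f_student - f_teacher) ** 2 * m          # (C, H, W)
        return diff.sum() / (m.sum() * f_student.shape[0] + 1e-6)

    def relation_distillation_loss(e_student, e_teacher):
        """Match pairwise cosine-similarity matrices of per-instance embeddings,
        encouraging the student to mimic the teacher's inter-instance structure."""
        s = F.normalize(e_student, dim=1) @ F.normalize(e_student, dim=1).t()
        t = F.normalize(e_teacher, dim=1) @ F.normalize(e_teacher, dim=1).t()
        return F.mse_loss(s, t)

    # Toy usage with random tensors standing in for one FPN level (stride 8)
    # and per-instance embeddings from teacher and student.
    boxes = torch.tensor([[32., 48., 96., 120.], [200., 60., 260., 140.]])
    mask = box_mask(boxes, h=64, w=64, stride=8)
    f_t, f_s = torch.randn(256, 64, 64), torch.randn(256, 64, 64)
    e_t, e_s = torch.randn(2, 128), torch.randn(2, 128)
    loss = feature_imitation_loss(f_s, f_t, mask) + relation_distillation_loss(e_s, e_t)

In the actual method, such distillation terms would be added to the detector's standard training losses across all feature-pyramid levels; the sketch keeps a single level and equal weighting purely for brevity.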