Pair-Wise Similarity Knowledge Distillation for RSI Scene Classification

Cited by: 10
Authors
Zhao, Haoran [1 ,2 ]
Sun, Xin [1 ,2 ,3 ]
Gao, Feng [1 ,2 ]
Dong, Junyu [1 ,2 ]
Affiliations
[1] Ocean Univ China, Coll Informat Sci & Engn, Haide Coll, Qingdao 266100, Peoples R China
[2] Ocean Univ China, Inst Adv Ocean Study, Qingdao 266100, Peoples R China
[3] Tech Univ Munich, Dept Aerosp & Geodesy, D-80333 Munich, Germany
Funding
National Natural Science Foundation of China;
Keywords
knowledge distillation; imaging science; scene classification; geosciences; convolutional neural network; network;
DOI
10.3390/rs14102483
CLC Number
X [Environmental Science, Safety Science];
Discipline Code
08; 0830;
Abstract
Remote sensing image (RSI) scene classification aims to identify the semantic categories of remote sensing images from their contents. Owing to the strong learning capability of deep convolutional neural networks (CNNs), CNN-based RSI scene classification methods have drawn much attention and achieved remarkable performance. However, such deep networks are usually computationally expensive and time-consuming, making them impractical to deploy on resource-constrained edge devices, such as the embedded systems used on drones. To tackle this problem, we introduce a novel pair-wise similarity knowledge distillation method that reduces model complexity while maintaining satisfactory accuracy, yielding a compact and efficient deep neural network for RSI scene classification. Unlike existing knowledge distillation methods, we design a novel distillation loss that transfers valuable discriminative information, reducing within-class variation and suppressing between-class similarity, from the cumbersome model to the compact model. The resulting compact student model achieves higher performance than those produced by existing knowledge distillation methods for RSI scene classification. Specifically, we distill the probability outputs between sample pairs with the same label and match the probability outputs between the teacher and student models. Experiments on three public benchmark datasets for RSI scene classification, i.e., the AID, UCMerced, and NWPU-RESISC datasets, verify that the proposed method effectively distills knowledge and yields higher performance.
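The abstract's core idea, distilling probability outputs between same-label sample pairs and matching them across teacher and student, can be sketched as follows. This is an illustrative NumPy sketch, not the authors' exact formulation: the function name, the squared-difference matching term, and the softening temperature (a common knowledge-distillation convention) are all assumptions for the sake of the example.

```python
import numpy as np

def softmax(logits, temperature=4.0):
    """Temperature-softened softmax over the class axis."""
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def pairwise_similarity_kd_loss(teacher_logits, student_logits, labels,
                                temperature=4.0):
    """Hedged sketch of a pair-wise similarity distillation loss.

    For every pair of samples sharing a label, compare the gap between
    their probability outputs under the teacher with the same gap under
    the student, and penalize the mismatch. Averaging over same-label
    pairs encourages the student to reproduce the teacher's within-class
    structure.
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    n = len(labels)
    loss, count = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            if labels[i] == labels[j]:
                diff_t = p_t[i] - p_t[j]  # teacher's same-class gap
                diff_s = p_s[i] - p_s[j]  # student's same-class gap
                loss += np.sum((diff_s - diff_t) ** 2)
                count += 1
    return loss / max(count, 1)
```

In practice such a term would be added to the usual cross-entropy and logit-matching distillation losses with a weighting coefficient; the loss is zero when the student reproduces the teacher's outputs exactly.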
Pages: 17