Unified Partial Configuration Model Framework for Fast Partially Occluded Object Detection in High-Resolution Remote Sensing Images

Times Cited: 4
Authors
Qiu, Shaohua [1 ]
Wen, Gongjian [1 ]
Liu, Jia [2 ]
Deng, Zhipeng [3 ]
Fan, Yaxiang [1 ]
Affiliations
[1] Natl Univ Def Technol, Sci & Technol Automat Target Recognit Lab, Changsha 410073, Hunan, Peoples R China
[2] Natl Univ Def Technol, Coll Meteorol & Oceanol, Changsha 410073, Hunan, Peoples R China
[3] Natl Univ Def Technol, Coll Elect Sci, Changsha 410073, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
high-resolution remote sensing images; partially occluded object detection; partial configuration model; unified detection framework; part sharing; deformable part-based model; VEHICLE DETECTION; CLASSIFICATION;
DOI
10.3390/rs10030464
CLC Classification Code
X [Environmental Science, Safety Science];
Subject Classification Code
08 ; 0830 ;
Abstract
Partially occluded object detection (POOD) is an important task for both civil and military applications that use high-resolution remote sensing images (HR-RSIs). It is highly challenging because only limited object evidence is available for detection. Recent partial configuration model (PCM)-based methods handle occlusion but suffer from massive manual annotation, separate parameter learning, and low training and detection efficiency. To address these problems, a unified PCM framework (UniPCM) is proposed in this paper. UniPCM adopts a part sharing mechanism that directly shares the root and part filters of a deformable part-based model (DPM) among different partial configurations, which largely reduces the convolution overhead during both training and detection. In UniPCM, a novel DPM deformation deviation method is proposed to estimate the spatial interrelationships of the PCM, and a unified weight learning method is presented to simultaneously obtain the weights of the elements within each partial configuration and the weights between partial configurations. Experiments on three HR-RSI datasets show that UniPCM achieves much higher training and detection efficiency for POOD than state-of-the-art PCM-based methods while maintaining comparable detection accuracy: it obtains training speedups of up to 10x and 2.5x for airplanes and ships, respectively, and detection speedups of up to 7.2x, 4.1x, and 2.5x on the three test sets.
Pages: 23