Remote Sensing Image Scene Classification Using Rearranged Local Features

Cited by: 109
Authors
Yuan, Yuan [1 ]
Fang, Jie [2 ,3 ]
Lu, Xiaoqiang [1 ]
Feng, Yachuang [1 ]
Affiliations
[1] Chinese Acad Sci, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning, Xian 710119, Shaanxi, Peoples R China
[2] Chinese Acad Sci, Xian Inst Opt & Precis Mech, Xian 710119, Shaanxi, Peoples R China
[3] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2019, Vol. 57, No. 3
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Feature fusion; rearranged local features; remote sensing image scene classification; representation; TEXTURE FEATURES; REPRESENTATION; DESCRIPTORS; RETRIEVAL; SCALE;
DOI
10.1109/TGRS.2018.2869101
Chinese Library Classification (CLC)
P3 [Geophysics]; P59 [Geochemistry];
Subject Classification Code
0708 ; 070902 ;
Abstract
Remote sensing image scene classification is a fundamental problem that aims to automatically label an image with a specific semantic category. Recently, deep learning methods, especially those based on convolutional neural networks (CNNs), have achieved competitive performance for remote sensing image scene classification. However, most existing CNN methods use only the feature vector of the last fully connected layer, emphasizing global information while ignoring local information in the images. It is common for images to belong to different categories even though they share similar global features, because the category of an image may be highly related to its local features rather than to the global feature. To address this problem, a method based on rearranged local features is proposed in this paper. First, the outputs of the last convolutional layer and the last fully connected layer are employed to describe the local and global information, respectively. The remote sensing images are then clustered into several collections using their global features. Within each collection, the local features of an image are rearranged according to their similarities with the local features of the cluster center. In addition, a fusion strategy is proposed to combine the global and local features and enhance the image representation. The proposed method surpasses the state of the art on four public and challenging data sets: UC-Merced, WHU-RS19, Sydney, and AID.
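The abstract outlines a concrete pipeline: global FC-layer features, clustering in the global-feature space, rearrangement of conv-layer local features against a cluster center, and fusion of the two. The sketch below illustrates one way those steps could fit together. It is a minimal illustration, not the authors' implementation: the use of cosine similarity, the greedy one-to-one matching for rearrangement, picking the image nearest the k-means centroid as the "cluster center", and concatenation as the fusion strategy are all assumptions made here for readability.

```python
# Minimal sketch of the rearranged-local-features idea from the abstract.
# The similarity measure, matching rule, and fusion strategy are assumptions;
# the paper's exact formulation may differ.
import numpy as np
from sklearn.cluster import KMeans

def rearrange_and_fuse(global_feats, local_feats, n_clusters=5):
    """global_feats: (N, Dg) FC-layer features; local_feats: (N, M, Dl)
    conv-layer features (M spatial positions per image)."""
    # 1) Cluster images in the global-feature space.
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(global_feats)

    # 2) For each cluster, take the image closest to the centroid as the center image.
    center_idx = np.array([
        np.where(km.labels_ == c)[0][
            np.argmin(np.linalg.norm(
                global_feats[km.labels_ == c] - km.cluster_centers_[c], axis=1))
        ]
        for c in range(n_clusters)
    ])

    def normalize(x):
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)

    fused = []
    for i in range(len(global_feats)):
        ref = local_feats[center_idx[km.labels_[i]]]        # (M, Dl) center's local features
        sim = normalize(ref) @ normalize(local_feats[i]).T  # (M, M) cosine similarities
        # 3) Greedy one-to-one matching: slot m of the rearranged tensor receives the
        #    image's local feature most similar to the center's m-th local feature.
        order = np.full(len(ref), -1, dtype=int)
        used = set()
        for m in np.argsort(-sim.max(axis=1)):              # most confident slots first
            for j in np.argsort(-sim[m]):
                if j not in used:
                    order[m] = j
                    used.add(j)
                    break
        rearranged = local_feats[i][order]                  # (M, Dl), now aligned to the center
        # 4) Fuse global and rearranged local features (here: simple concatenation).
        fused.append(np.concatenate([global_feats[i], rearranged.ravel()]))
    return np.stack(fused)

# Toy usage with random stand-ins for CNN features:
# 40 images, 512-D global vectors, 7x7 = 49 local 256-D vectors per image.
rng = np.random.default_rng(0)
features = rearrange_and_fuse(rng.normal(size=(40, 512)), rng.normal(size=(40, 49, 256)))
print(features.shape)  # (40, 512 + 49 * 256)
```

With real inputs, `global_feats` would come from the last fully connected layer and `local_feats` from the spatial positions of the last convolutional layer of a pretrained CNN, as described in the abstract.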
Pages: 1779-1792
Page count: 14