Statistical Loss and Analysis for Deep Learning in Hyperspectral Image Classification

Cited by: 50
Authors
Gong, Zhiqiang [1 ,2 ]
Zhong, Ping [1 ]
Hu, Weidong [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Elect Sci & Technol, Natl Key Lab Sci & Technol ATR, Changsha 410073, Peoples R China
[2] Chinese Acad Mil Sci, Natl Innovat Inst Def Technol, Beijing 100000, Peoples R China
Keywords
Hyperspectral imaging; Training; Feature extraction; Deep learning; Probabilistic logic; Data models; Convolutional neural networks (CNNs); hyperspectral image classification; diversity; statistical loss
DOI
10.1109/TNNLS.2020.2978577
CLC Classification Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Numbers
081104; 0812; 0835; 1405
Abstract
Nowadays, deep learning methods, especially convolutional neural networks (CNNs), have shown impressive performance in extracting abstract and high-level features from hyperspectral images. However, the general training process of CNNs mainly considers pixelwise information or sample correlations to formulate the penalization, while ignoring the statistical properties, especially the spectral variability, of each class in the hyperspectral image. These sample-based penalizations lead to uncertainty in the training process because the training samples are imbalanced and limited in number. To overcome this problem, this article characterizes each class in the hyperspectral image as a statistical distribution and develops a novel statistical loss defined over these distributions, rather than directly over the samples, for deep learning. Based on the Fisher discrimination criterion, the loss penalizes the sample variance of each class distribution to decrease the intraclass variance of the training samples. Moreover, an additional diversity-promoting condition is added to enlarge the interclass variance between different class distributions, which better discriminates samples from different classes in the hyperspectral image. Finally, a statistical estimation form of the statistical loss is derived from the training samples through multivariate statistical analysis. Experiments on real-world hyperspectral images show the effectiveness of the developed statistical loss for deep learning.
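The abstract describes a loss built from the Fisher discrimination criterion: an intraclass term that penalizes the sample variance of each class distribution, plus a diversity-promoting term that enlarges the distance between class distributions. The following is a minimal NumPy sketch of that idea; the hinge-with-margin form of the interclass term and the `margin` parameter are assumptions for illustration, not the paper's exact estimator.

```python
import numpy as np

def statistical_loss(features, labels, margin=1.0):
    """Fisher-style statistical loss sketch.

    Intraclass term: mean sample variance of each class distribution.
    Interclass term (assumed hinge form): penalize class means that
    lie closer than `margin`, promoting diversity between classes.
    """
    classes = np.unique(labels)
    means = []
    intra = 0.0
    for c in classes:
        fc = features[labels == c]          # samples of class c
        mu = fc.mean(axis=0)                # class distribution mean
        means.append(mu)
        # sample variance of the class distribution (intraclass term)
        intra += np.mean(np.sum((fc - mu) ** 2, axis=1))
    intra /= len(classes)

    # diversity-promoting term over pairs of class means
    inter, pairs = 0.0, 0
    for i in range(len(means)):
        for j in range(i + 1, len(means)):
            d = np.linalg.norm(means[i] - means[j])
            inter += max(0.0, margin - d)   # only close pairs are penalized
            pairs += 1
    inter = inter / pairs if pairs else 0.0

    return intra + inter
```

On well-separated, tight clusters both terms are near zero; overlapping classes raise both the variance and the hinge penalty, mirroring the intraclass/interclass trade-off the abstract describes.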
Pages: 322-333 (12 pages)