Bayesian Deep Learning for Hyperspectral Image Classification With Low Uncertainty

Times Cited: 0
Authors
He, Xin [1 ]
Chen, Yushi [1 ]
Huang, Lingbo [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Elect & Informat Engn, Harbin 150001, Peoples R China
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2023, Vol. 61
Keywords
Bayesian neural network; deep learning; hyperspectral image (HSI) classification; uncertainty estimation; FEATURE-EXTRACTION; SPATIAL CLASSIFICATION; CNN;
DOI
10.1109/TGRS.2023.3257865
CLC Classification
P3 [Geophysics]; P59 [Geochemistry]
Discipline Codes
0708; 070902
Abstract
In recent years, deep learning models have been widely used for hyperspectral image (HSI) classification, but most existing deep-learning-based methods focus solely on achieving high classification accuracy. In real applications, however, classification with low uncertainty matters as much as accurate classification, and existing methods fail to account for uncertainty. To tackle this challenge, Bayesian deep learning (BDL) is investigated for the first time to analyze model uncertainty in HSI classification. Specifically, first, at the feature extraction (FE) stage, an HSI classification framework based on BDL, which contains two Bayesian Gabor layers and a global pooling layer (i.e., BDL-G222), is proposed. In BDL-G222, the parameters of the Gabor layers are sampled from a Gaussian distribution. The proposed BDL-G222 not only provides uncertainty estimation but also strengthens the structural characteristics (i.e., texture) of the HSI. Second, to model uncertainty at the final classification stage, BDL-G222 is combined with a Bayesian fully connected layer (BFL) (i.e., BDL-G222-BFL), in which the parameter distributions are adjusted adaptively. The proposed BDL-G222-BFL captures uncertainty at both the FE and classification stages, establishing a complete uncertainty estimation framework. Experimental results on three public HSI datasets demonstrate the superiority of the proposed methods in terms of both accuracy and uncertainty. The proposed BDL-based methods pioneer a new direction and provide useful inspiration and experience for practical applications.
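The core BDL mechanism described in the abstract, sampling layer parameters from a Gaussian distribution and reporting per-pixel uncertainty alongside the class prediction, can be illustrated with a minimal sketch. This is not the authors' BDL-G222 implementation (which uses Bayesian Gabor layers on spatial patches); it is an assumed toy example showing a single Bayesian linear layer with reparameterized Gaussian weights and Monte Carlo predictive entropy as the uncertainty estimate. All names (`bayesian_linear`, `predict_with_uncertainty`) and the toy dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def bayesian_linear(x, w_mu, w_logvar, rng):
    """One stochastic forward pass: weights drawn from N(mu, sigma^2)
    via the reparameterization w = mu + sigma * eps."""
    eps = rng.standard_normal(w_mu.shape)
    w = w_mu + np.exp(0.5 * w_logvar) * eps
    return x @ w

def predict_with_uncertainty(x, w_mu, w_logvar, n_samples=100, rng=rng):
    """Monte Carlo prediction: average class probabilities over weight
    samples; predictive entropy serves as the uncertainty estimate."""
    probs = np.stack([softmax(bayesian_linear(x, w_mu, w_logvar, rng))
                      for _ in range(n_samples)])
    mean_p = probs.mean(axis=0)                                # (n_pixels, n_classes)
    entropy = -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1)  # (n_pixels,)
    return mean_p, entropy

# Toy usage: 5 "pixels" with 8 spectral features, 3 classes.
x = rng.standard_normal((5, 8))
w_mu = rng.standard_normal((8, 3)) * 0.1
w_logvar = np.full((8, 3), -2.0)  # modest weight variance
mean_p, entropy = predict_with_uncertainty(x, w_mu, w_logvar)
print(mean_p.shape, entropy.shape)  # (5, 3) (5,)
```

A pixel whose averaged class probabilities are near-uniform yields entropy close to log(3), flagging a low-confidence prediction; in the paper's setting such pixels could be deferred or flagged rather than classified blindly.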
Pages: 16