An End-To-End Bayesian Segmentation Network Based on a Generative Adversarial Network for Remote Sensing Images

Cited by: 20
Authors
Xiong, Dehui [1 ]
He, Chu [1 ,2 ]
Liu, Xinlong [1 ]
Liao, Mingsheng [2 ]
Affiliations
[1] Wuhan Univ, Elect & Informat Sch, Wuhan 430072, Peoples R China
[2] Wuhan Univ, State Key Lab Informat Engn Surveying Mapping & R, Wuhan 430079, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
image semantic segmentation; Bayesian; generative adversarial networks (GAN); fully convolutional networks (FCN); synthetic aperture radar (SAR);
DOI
10.3390/rs12020216
Chinese Library Classification
X [Environmental Science, Safety Science];
Discipline Classification Code
08 ; 0830 ;
Abstract
Owing to the development of deep convolutional neural networks (CNNs), great progress has recently been made in semantic segmentation. In this paper, we present an end-to-end Bayesian segmentation network based on generative adversarial networks (GANs) for remote sensing images. First, fully convolutional networks (FCNs) and GANs are used to derive the posterior probability from the prior probability and the likelihood, following Bayesian theory. Second, the cross-entropy loss of the FCN serves as a prior that guides the training of the GAN, so as to avoid mode collapse during the training process. Third, the generator of the GAN acts as a trainable spatial filter that models the spatial relationships between labels. Experiments were performed on two remote sensing datasets, and the results demonstrate that training the proposed method is more stable than training other GAN-based models. On the two datasets, the average accuracy and mean intersection over union (MIoU) were higher than those of the FCN by 0.0465 and 0.0821, and by 0.0772 and 0.1708, respectively.
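The abstract describes combining the FCN's cross-entropy term (acting as a prior) with the GAN's adversarial term to stabilize training. The snippet below is a minimal NumPy sketch of such a combined generator objective; the function names, the weighting factor `lam`, and the scalar `disc_score` (the discriminator's probability for the generated label map) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Pixel-wise cross-entropy: probs has shape (H, W, C) with class
    # probabilities, labels has shape (H, W) with integer class ids.
    h, w, _ = probs.shape
    picked = probs[np.arange(h)[:, None], np.arange(w)[None, :], labels]
    return float(-np.mean(np.log(picked + 1e-12)))

def combined_loss(probs, labels, disc_score, lam=0.1):
    # Hypothetical combined objective: the cross-entropy prior anchors the
    # generator, while the adversarial term rewards label maps that the
    # discriminator judges realistic (higher disc_score -> lower loss).
    adv = -np.log(disc_score + 1e-12)
    return cross_entropy(probs, labels) + lam * adv
```

With uniform two-class predictions, `cross_entropy` evaluates to about `log 2` per pixel, and the adversarial term adds a penalty that shrinks as `disc_score` approaches 1.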
Pages: 21