Cross-Domain Association Mining Based Generative Adversarial Network for Pansharpening

Cited by: 6
Authors
He, Lijun [1 ]
Zhang, Wanyue [1 ]
Shi, Jiankang [2 ]
Li, Fan [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Informat & Commun Engn, Shaanxi Key Lab Deep Space Explorat Intelligent I, Xian 710049, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Automat Sci & Engn, Xian 710049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Pansharpening; Generators; Feature extraction; Spatial resolution; Junctions; Generative adversarial networks; Superresolution; Deep learning; dual discriminators; image association; multispectral (MS) pansharpening; PAN-SHARPENING METHOD; PANCHROMATIC IMAGES; FUSION;
DOI
10.1109/JSTARS.2022.3204824
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Multispectral (MS) pansharpening improves the spatial resolution of MS images and plays an increasingly important role in agriculture and environmental monitoring. Existing neural-network-based methods tend to focus on the global features of images without considering the inherent relationships between similar substances in MS images. However, different substances at the junctions between regions are likely to mix with each other, which leads to spectral distortion in the final pansharpened image. In this article, we propose a cross-domain association mining-based generative adversarial network for pansharpening, which consists of a spectral fidelity generator and dual discriminators. In our spectral fidelity generator, a cross-region similarity attention module is designed to establish dependencies between similar substances at different positions in the image, thereby leveraging similar spectral features to generate pansharpened images with better spectral preservation. To mine the latent relationship between the MS image domain and the panchromatic image domain, we pretrain a spatial information extraction network. This network is then transferred into the dual-discriminator architecture to capture the spatial information of the pansharpened images more accurately and prevent the loss of spatial details. Experimental results show that our method outperforms several state-of-the-art pansharpening methods in both quantitative and qualitative evaluations.
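The cross-region similarity attention described in the abstract resembles non-local (self-similarity) attention: every spatial position is weighted by its feature similarity to every other position, so spectrally similar substances in distant regions can reinforce each other. The following is a minimal NumPy sketch of that general idea; the function name, shapes, and the residual connection are illustrative assumptions, not the authors' actual module.

```python
import numpy as np

def cross_region_similarity_attention(feat: np.ndarray) -> np.ndarray:
    """feat: (C, H, W) feature map. Returns a map of the same shape in which
    each position is a similarity-weighted mixture of all positions, plus a
    residual connection (an assumption for this sketch)."""
    c, h, w = feat.shape
    x = feat.reshape(c, h * w)                 # flatten spatial dims: (C, N)
    # pairwise similarity between all N positions, scaled for stability
    sim = x.T @ x / np.sqrt(c)                 # (N, N)
    # row-wise softmax -> attention weights over "source" positions
    sim -= sim.max(axis=1, keepdims=True)
    attn = np.exp(sim)
    attn /= attn.sum(axis=1, keepdims=True)
    # aggregate features from similar positions
    out = x @ attn.T + x                       # (C, N), with residual
    return out.reshape(c, h, w)

# toy usage: two identical "substances" at opposite corners attend to each other
f = np.zeros((4, 8, 8))
f[:, 0, 0] = f[:, 7, 7] = 1.0
g = cross_region_similarity_attention(f)
print(g.shape)  # (4, 8, 8)
```

Because the affinity matrix is N×N over all spatial positions, real implementations typically compute this on downsampled feature maps or within projected embeddings to keep memory manageable.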
Pages: 7770-7783
Number of pages: 14
Cited References
57 in total
[1] Aiazzi B.; Alparone L.; Baronti S.; Garzelli A.; Selva M. "MTF-tailored multiscale fusion of high-resolution MS and pan imagery." Photogrammetric Engineering and Remote Sensing, 2006, 72(5): 591-596.
[2] Alparone L.; Aiazzi B.; Baronti S.; Garzelli A.; Nencini F.; Selva M. "Multispectral and panchromatic data fusion assessment without reference." Photogrammetric Engineering and Remote Sensing, 2008, 74(2): 193-200.
[3] Alparone L.; Wald L.; Chanussot J.; Thomas C.; Gamba P.; Bruce L.M. "Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data-fusion contest." IEEE Transactions on Geoscience and Remote Sensing, 2007, 45(10): 3012-3021.
[4] Alparone L.; Baronti S.; Aiazzi B.; Garzelli A. "Spatial Methods for Multispectral Pansharpening: Multiresolution Analysis Demystified." IEEE Transactions on Geoscience and Remote Sensing, 2016, 54(5): 2563-2576.
[5] [Anonymous]. 3rd International Conference on Learning Representations (ICLR), 2015.
[6] Carper W.J. Photogrammetric Engineering and Remote Sensing, 1990, 56: 459.
[7] Choi J.; Yu K.; Kim Y. "A New Adaptive Component-Substitution-Based Satellite Image Fusion by Using Partial Replacement." IEEE Transactions on Geoscience and Remote Sensing, 2011, 49(1): 295-309.
[8] Deng L.-J.; Vivone G.; Jin C.; Chanussot J. "Detail Injection-Based Deep Convolutional Neural Networks for Pansharpening." IEEE Transactions on Geoscience and Remote Sensing, 2021, 59(8): 6995-7010.
[9] Dong C.; Loy C.C.; He K.; Tang X. "Image Super-Resolution Using Deep Convolutional Networks." IEEE Transactions on Pattern Analysis and Machine Intelligence, 2016, 38(2): 295-307.
[10] Garzelli A. IEEE International Geoscience and Remote Sensing Symposium, 2005: 2838.