Multiresolution generative adversarial networks with bidirectional adaptive-stage progressive guided fusion for remote sensing image

Cited: 1
Authors
Wu, Yuanyuan [1 ]
Li, Yuchun [1 ]
Huang, Mengxing [1 ,2 ,3 ]
Feng, Siling [1 ]
Affiliations
[1] Hainan Univ, Sch Informat & Commun Engn, Haikou, Peoples R China
[2] Hainan Univ, State Key Lab Marine Resource Utilizat South China, Haikou, Peoples R China
[3] Hainan Univ, Sch Informat & Commun Engn, State Key Lab Marine Resource Utilizat South China, Haikou 570228, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Remote sensing image fusion framework; adaptive-resolution generative adversarial networks; bidirectional adaptive-stage feature extraction; progressive guided fusion; multitask loss; SATELLITE IMAGES; QUALITY; LANDSAT; REFLECTANCE; FRAMEWORK; MS;
DOI
10.1080/17538947.2023.2241441
CLC classification
P9 [Physical Geography];
Subject classification codes
0705; 070501;
Abstract
A remote sensing image (RSI) with concurrently high spatial, temporal, and spectral resolutions cannot be produced by a single sensor. Multisource RSI fusion is a convenient technique for producing high-spatial-resolution multispectral (MS) images (spatial-spectral fusion, SSF) and MS images with high temporal and spatial resolution (spatiotemporal fusion, STF). Currently, deep learning-based fusion models implement either SSF or STF; models that perform both are lacking. To implement both SSF and STF, multiresolution generative adversarial networks with bidirectional adaptive-stage progressive guided fusion (BAPGF), termed BPF-MGAN, are proposed. A bidirectional adaptive-stage feature extraction architecture operating in fine-scale-to-coarse-scale and coarse-scale-to-fine-scale modes is introduced. The designed BAPGF adopts a cross-stage-level dual-residual attention fusion strategy guided by the previous fusion result to enhance critical information and suppress superfluous information. Adaptive-resolution U-shaped discriminators are implemented to feed multiresolution context back into the generator. A generalized multitask loss function that is not limited by the availability of reference images is developed to strengthen the model through constraints on multiscale feature, structural, and content similarities. The BPF-MGAN model is validated on both SSF and STF datasets. Compared with state-of-the-art SSF and STF models, the results demonstrate the superior performance of the proposed BPF-MGAN in both subjective and objective evaluations.
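As a rough illustration of how a multitask fusion loss can combine content, structural, and multiscale similarity constraints of the kind described in the abstract, the sketch below gives a minimal PyTorch version. The weighting coefficients, the simplified single-scale SSIM term, and the average-pooled multiscale stand-in for the paper's feature constraint are illustrative assumptions, not the formulation used in BPF-MGAN.

```python
# Hypothetical multitask loss sketch: content (L1) + structural (simplified SSIM)
# + multiscale similarity terms. All weights and terms are illustrative assumptions.
import torch
import torch.nn.functional as F


def ssim_term(x, y, c1=0.01 ** 2, c2=0.03 ** 2, win=11):
    """Simplified single-scale SSIM computed with uniform windows."""
    mu_x = F.avg_pool2d(x, win, 1, win // 2)
    mu_y = F.avg_pool2d(y, win, 1, win // 2)
    sigma_x = F.avg_pool2d(x * x, win, 1, win // 2) - mu_x ** 2
    sigma_y = F.avg_pool2d(y * y, win, 1, win // 2) - mu_y ** 2
    sigma_xy = F.avg_pool2d(x * y, win, 1, win // 2) - mu_x * mu_y
    ssim = ((2 * mu_x * mu_y + c1) * (2 * sigma_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (sigma_x + sigma_y + c2)
    )
    return ssim.mean()


def multitask_loss(fused, reference, weights=(1.0, 0.5, 0.5)):
    """Weighted sum of content, structural, and multiscale similarity penalties."""
    w_content, w_struct, w_ms = weights
    content = F.l1_loss(fused, reference)            # pixel-wise content similarity
    structural = 1.0 - ssim_term(fused, reference)   # structural similarity penalty
    # Multiscale term: L1 at progressively coarser resolutions, a stand-in for
    # the paper's multiscale feature constraint.
    ms = sum(
        F.l1_loss(F.avg_pool2d(fused, 2 ** s), F.avg_pool2d(reference, 2 ** s))
        for s in (1, 2, 3)
    ) / 3.0
    return w_content * content + w_struct * structural + w_ms * ms


if __name__ == "__main__":
    fused = torch.rand(2, 4, 128, 128)      # e.g. 4-band fused MS output
    reference = torch.rand(2, 4, 128, 128)  # reference (or pseudo-reference) MS image
    print(multitask_loss(fused, reference).item())
```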
Pages: 2962-2997
Number of pages: 36