Multi-scale RGB and NIR image Cross-fusion based on Generative Adversarial Network

Cited: 0
Authors
Xiang, Sen [1 ]
Hu, Zishan [1 ]
Deng, Huiping [1 ]
Wu, Jin [1 ]
Affiliations
[1] Wuhan Univ Sci & Technol, Sch Informat Sci & Engn, Wuhan, Peoples R China
Source
2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023
Funding
National Natural Science Foundation of China
Keywords
image fusion; generative adversarial network; full-scale feature extraction; color fusion network; cross-space attention;
DOI
10.1109/CCDC58219.2023.10327440
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Fusing RGB and NIR images can improve visibility and perceptual quality. In this task, enhancing details and preserving color fidelity are of primary importance. To this end, this paper proposes an unsupervised dual-branch GAN model. In the generator, two branches are introduced to fuse texture and color information, respectively: the upper branch uses full-scale skip connections to fuse texture details, while the lower branch learns color features within and across image channels to preserve color fidelity. The features of the two branches are merged via cross-space attention blocks. For discrimination, two discriminators are employed to fully integrate and balance the contributions of the RGB map and the NIR map. Finally, unsupervised loss functions are proposed that account for color, texture, and the adversary between the generator and the discriminator. The network is trained on a public dataset and a self-collected RGB-NIR dataset. Experimental results demonstrate that the algorithm fuses RGB and NIR images with fine details and plausible color, outperforming most existing algorithms.
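As a rough illustration of how the cross-space attention blocks described in the abstract might merge texture-branch and color-branch features, the following is a minimal NumPy sketch. The function name, tensor shapes, and the channel-affinity attention formulation are assumptions chosen for illustration, not the paper's actual implementation.

```python
import numpy as np

def cross_space_attention(tex_feat, col_feat):
    """Hypothetical sketch: texture-branch features attend to
    color-branch features via a channel-affinity attention map.
    Shapes and formulation are assumptions, not the paper's code."""
    # tex_feat, col_feat: (C, H, W) feature maps from the two branches
    C, H, W = tex_feat.shape
    q = tex_feat.reshape(C, -1)           # queries from the texture branch
    k = col_feat.reshape(C, -1)           # keys from the color branch
    v = col_feat.reshape(C, -1)           # values from the color branch
    attn = q @ k.T / np.sqrt(k.shape[1])  # (C, C) channel-affinity scores
    attn = np.exp(attn - attn.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)  # softmax over channels
    merged = (attn @ v).reshape(C, H, W)  # color features re-weighted
    return tex_feat + merged              # residual merge of both branches

tex = np.random.rand(8, 16, 16)  # toy texture-branch features
col = np.random.rand(8, 16, 16)  # toy color-branch features
out = cross_space_attention(tex, col)
print(out.shape)  # (8, 16, 16)
```

The residual form keeps the texture details intact while injecting color information; a learned implementation would replace the raw reshapes with 1x1 convolutions producing the query, key, and value maps.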
Pages: 4172-4177 (6 pages)