Infrared and Visible Image Fusion Based on Image Enhancement and Target Extraction

Cited by: 0
Authors
Zhu, Haoran [1 ]
Zhang, Wenying [2 ]
Affiliations
[1] Changchun Univ Sci & Technol, Sch Elect & Informat Engn, Changchun 130022, Peoples R China
[2] Jilin Engn Normal Univ, Jilin Engn Lab Quantum Informat Technol, Changchun 130052, Peoples R China
Keywords
Image fusion; image enhancement; target extraction; background subtraction; NETWORK;
DOI
10.1109/ACCESS.2025.3557799
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
To improve the detail visibility and target contrast of fused images, an infrared and visible image fusion method based on image enhancement and target extraction is proposed. The method consists of three stages: enhancement, target extraction, and fusion. In the enhancement stage, to decompose the visible images efficiently, a decomposition method based on an improved guided filter is proposed that uses deep features as the guidance image. Based on the characteristics of the low- and high-frequency layers, a brightness correction function and a detail adjustment function are designed to improve global and local contrast, respectively. In the target extraction stage, morphological operations and background subtraction are introduced to achieve coarse target extraction efficiently, and the feature distribution of the redundant background is optimized to obtain more accurate infrared targets. In the fusion stage, the infrared target is injected into the enhanced visible image according to a compression ratio, yielding a fused image with clear local details and a high-contrast thermal target while overexposure is suppressed. To demonstrate its effectiveness, the proposed method is evaluated on different datasets and compared with several mainstream methods using six evaluation metrics. The results show that the proposed method achieves better fusion quality than the compared methods.
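The abstract outlines a three-stage pipeline (enhancement, target extraction, fusion) without implementation detail. Below is a minimal NumPy/SciPy sketch of that pipeline under loose assumptions: an ordinary box filter stands in for the paper's improved guided filter, a morphological opening plus a mean-and-std threshold stands in for its background subtraction, and all function names, radii, and the weighting ratio are illustrative choices, not the authors' values.

```python
import numpy as np
from scipy import ndimage

def decompose(img, radius=7):
    """Split an image into a low-frequency base layer and a high-frequency
    detail layer. A box filter is used here as a placeholder for the
    paper's improved (deep-feature-guided) guided filter."""
    base = ndimage.uniform_filter(img, size=2 * radius + 1)
    detail = img - base
    return base, detail

def extract_target(ir, radius=9, k=2.0):
    """Coarse infrared target extraction: estimate the background with a
    morphological opening, subtract it, and keep pixels whose residual
    exceeds mean + k * std (an illustrative thresholding rule)."""
    se = 2 * radius + 1
    background = ndimage.grey_opening(ir, size=(se, se))
    residual = ir - background
    threshold = residual.mean() + k * residual.std()
    return residual > threshold

def fuse(vis, ir, mask, ratio=0.6):
    """Inject the infrared target into the (enhanced) visible image by
    blending the masked region with a fixed compression ratio."""
    fused = vis.copy()
    fused[mask] = (1 - ratio) * vis[mask] + ratio * ir[mask]
    return fused

# Toy example: a bright 8x8 infrared blob over a flat visible scene.
vis = np.full((64, 64), 0.3)
ir = np.zeros((64, 64))
ir[28:36, 28:36] = 1.0

# Enhancement stage: boost the detail layer (no-op on a flat image).
base, detail = decompose(vis)
vis_enh = base + 1.5 * detail

mask = extract_target(ir)
fused = fuse(vis_enh, ir, mask)
```

On this toy input the opening removes the small blob entirely, so the residual equals the blob itself and the mask recovers all 64 target pixels; the fused image keeps the visible background untouched and blends the target region toward the infrared intensity.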
Pages: 61862-61875 (14 pages)
References (32 in total)
[1] Bai, Yang; Gao, Meijing; Li, Shiyu; Wang, Ping; Guan, Ning; Yin, Haozheng; Yan, Yonghao. IBFusion: An Infrared and Visible Image Fusion Method Based on Infrared Target Mask and Bimodal Feature Extraction Strategy. IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 10610-10622.
[2] Bavirisetti, Durga Prasad; Dhuli, Ravindra. Two-scale image fusion of visible and infrared images using saliency detection. INFRARED PHYSICS & TECHNOLOGY, 2016, 76: 52-64.
[3] Bavirisetti, Durga Prasad; Dhuli, Ravindra. Fusion of Infrared and Visible Sensor Images Based on Anisotropic Diffusion and Karhunen-Loeve Transform. IEEE SENSORS JOURNAL, 2016, 16(1): 203-209.
[4] Cao, Wenzi; Zheng, Minghui; Liao, Qing. Semantic Region Adaptive Fusion of Infrared and Visible Images via Dual-DeepLab Guidance. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72.
[5] Dong, Aimei; Wang, Long; Liu, Jian; Xu, Jingyuan; Zhao, Guixin; Zhai, Yi; Lv, Guohua; Cheng, Jinyong. Co-Enhancement of Multi-Modality Image Fusion and Object Detection via Feature Adaptation. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34(12): 12624-12637.
[6] Feng, Xin; Yang, Jieming; Zhang, Hongde; Qiu, Guohang. Infrared and Visible Image Fusion Based on Dual Channel Residual Dense Network. ACTA PHOTONICA SINICA, 2023, 52(11).
[7] Gan, Wei; Wu, Xiaohong; Wu, Wei; Yang, Xiaomin; Ren, Chao; He, Xiaohai; Liu, Kai. Infrared and visible image fusion with the use of multi-scale edge-preserving decomposition and guided image filter. INFRARED PHYSICS & TECHNOLOGY, 2015, 72: 37-51.
[8] Hu, Xinyu; Liu, Yang; Yang, Feng. PFCFuse: A Poolformer and CNN Fusion Network for Infrared-Visible Image Fusion. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73.
[9] Huang, S. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74: 1.
[10] Karim, Shahid; Tong, Geng; Li, Jinyang; Yu, Xiaochang; Hao, Jia; Qadir, Akeel; Yu, Yiting. MTDFusion: A Multilayer Triple Dense Network for Infrared and Visible Image Fusion. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73: 1-17.