Infrared and visible image fusion method based on principal component analysis network and multi-scale morphological gradient

Cited: 7
Authors
Li, Shengshi [1 ]
Zou, Yonghua [1 ,2 ]
Wang, Guanjun [1 ,2 ]
Lin, Cong [1 ]
Affiliations
[1] Hainan Univ, Sch Informat & Commun Engn, Haikou 570228, Peoples R China
[2] Hainan Univ, State Key Lab Marine Resource Utilizat South China, Haikou 570228, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image fusion; Principal component analysis network; Multi-scale morphological gradient; Guided filter; Deep learning; QUALITY ASSESSMENT; ALGORITHM;
DOI
10.1016/j.infrared.2023.104810
CLC number
TH7 [Instruments and meters];
Discipline codes
0804 ; 080401 ; 081102 ;
Abstract
In infrared (IR) and visible image fusion, energy preservation and detail extraction are two key problems. We propose a novel IR and visible image fusion model based on principal component analysis network (PCANet) and multi-scale morphological gradient (MSMG), aiming to better preserve energy and extract details. First, we extract features from the IR and visible images with PCANet; these features offer stronger representation of IR targets and of visible-image details. Second, we develop a fusion strategy that combines MSMG with a guided filter to obtain the weights of the corresponding feature maps. This strategy preserves more of the IR image's energy and extracts more detail from the visible image. Finally, a weighted-averaging strategy produces the fused image. The effectiveness of the proposed method is verified on two datasets containing more than 80 pairs of source images in total. Compared with 17 representative methods, the proposed method achieves state-of-the-art performance in both visual quality and objective evaluation.
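The abstract outlines a three-step pipeline: PCANet feature extraction, MSMG-plus-guided-filter weight maps, and weighted averaging. The following Python sketch illustrates only the weighting and averaging stages and is not the authors' implementation: the PCANet feature-extraction step is replaced here by the raw grayscale images, and the number of scales, the structuring-element sizes, and the guided-filter parameters (radius, eps) are illustrative assumptions. The guided filter comes from the opencv-contrib package (cv2.ximgproc).

import cv2
import numpy as np

def msmg(img, scales=3):
    # Multi-scale morphological gradient: sum of morphological gradients
    # computed with growing elliptical structuring elements; the 1/s weighting
    # of coarser scales is an assumed scheme, not taken from the paper.
    img = img.astype(np.float32)
    grad = np.zeros_like(img)
    for s in range(1, scales + 1):
        k = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * s + 1, 2 * s + 1))
        grad += cv2.morphologyEx(img, cv2.MORPH_GRADIENT, k) / s
    return grad

def fuse(ir, vis, radius=8, eps=0.01):
    # Weighted-average fusion: MSMG saliency decides an initial weight map,
    # which is refined edge-aware by a guided filter before blending.
    ir_f = ir.astype(np.float32) / 255.0
    vis_f = vis.astype(np.float32) / 255.0
    s_ir, s_vis = msmg(ir_f), msmg(vis_f)
    # Initial weight: 1 where the IR gradient response dominates.
    w_ir = (s_ir >= s_vis).astype(np.float32)
    # Edge-aware refinement; requires opencv-contrib-python (cv2.ximgproc).
    w_ir = cv2.ximgproc.guidedFilter(ir_f, w_ir, radius, eps)
    w_ir = np.clip(w_ir, 0.0, 1.0)
    fused = w_ir * ir_f + (1.0 - w_ir) * vis_f
    return (fused * 255.0).astype(np.uint8)

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    ir = cv2.imread("ir.png", cv2.IMREAD_GRAYSCALE)
    vis = cv2.imread("vis.png", cv2.IMREAD_GRAYSCALE)
    cv2.imwrite("fused.png", fuse(ir, vis))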
Pages: 12