Multi-focus image fusion with convolutional neural network based on Dempster-Shafer theory

Cited by: 5
Authors
Li L. [1 ]
Li C. [1 ]
Lu X. [1 ]
Wang H. [1 ]
Zhou D. [1 ]
Affiliations
[1] National Key Laboratory of Aerospace Flight Dynamics, School of Astronautics, Northwestern Polytechnical University, Xi'an
Source
Optik | 2023 / Vol. 272
Funding
National Natural Science Foundation of China
Keywords
Convolutional neural network; Dempster-Shafer theory; Multi-focus image fusion;
DOI
10.1016/j.ijleo.2022.170223
Abstract
Convolutional neural networks (CNNs) have been applied in many fields, including image classification. Multi-focus image fusion can be regarded as the classification of focused and unfocused regions, so CNNs have been widely used for this task. However, most methods use only the information from the last convolutional layer to complete the fusion, which leads to suboptimal results. To address this problem, we propose a novel convolutional neural network based on Dempster-Shafer theory (DST) for multi-focus image fusion. First, DST, a theoretical framework for reasoning under uncertainty, is introduced to fuse the results from different branch layers, thereby increasing the reliability of the output. In addition, a gradient residual block is designed to improve the network's use of edge information while reducing the dimensionality of the feature maps in the branch layers, which boosts performance and reduces the number of training parameters. Compared with other state-of-the-art fusion methods, the decision map produced by the proposed method is more precise. Objectively, on 20 images from the "Lytro" and "Nature" datasets, the proposed method achieves the best average scores for information entropy, mutual information, the structural similarity metric, and the visual perception metric. © 2022 Elsevier GmbH
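The abstract does not give implementation details for the branch-fusion step, but combining evidence from multiple sources in DST is conventionally done with Dempster's rule of combination. The sketch below illustrates that standard rule over the two-class frame {focused, unfocused} implied by the task; the `dempster_combine` helper and the example mass values are illustrative assumptions, not taken from the paper.

```python
from itertools import product

# Frame of discernment for multi-focus fusion: F = focused, U = unfocused.
F, U = frozenset({"F"}), frozenset({"U"})
FU = F | U  # ignorance: mass assigned to "either focused or unfocused"

def dempster_combine(m1, m2):
    """Combine two basic mass assignments with Dempster's rule.

    m1, m2: dicts mapping frozenset hypotheses to masses that sum to 1.
    Returns the normalized combined mass assignment.
    """
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b  # mass on contradictory hypothesis pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Two hypothetical branch-layer outputs for one pixel (illustrative values).
m_branch1 = {F: 0.7, U: 0.2, FU: 0.1}
m_branch2 = {F: 0.6, U: 0.3, FU: 0.1}
fused = dempster_combine(m_branch1, m_branch2)
# Conflict K = 0.7*0.3 + 0.2*0.6 = 0.33; fused m(F) = 0.55/0.67 ≈ 0.821
```

When the branches agree, the rule sharpens the belief toward "focused" beyond either source alone, while the residual mass on the ignorance hypothesis tempers the result when they disagree; per-pixel fused masses of this kind would then be thresholded into the binary decision map.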