Multi-sensor data fusion method based on divergence measure and probability transformation belief factor

Cited: 4
Authors
Hu, Zhentao [1 ]
Su, Yujie [1 ]
Hou, Wei [1 ]
Ren, Xing [1 ]
Affiliations
[1] Henan Univ, Sch Artificial Intelligence, Zhengzhou 450046, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
Dempster-Shafer evidence theory; Divergence measure; Probability transformation belief factor; Belief entropy; Multi-sensor data fusion; DEMPSTER-SHAFER THEORY; FUZZY ROUGH SET; COMBINATION; SPECIFICITY; ENTROPY;
DOI
10.1016/j.asoc.2023.110603
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dempster-Shafer evidence theory is widely used in multi-sensor data fusion. However, how to manage the counterintuitive results generated by highly conflicting evidence remains an open question. To solve this problem, a novel multi-sensor data fusion method is proposed, which analyses the credibility of evidence both from the discrepancy between bodies of evidence and from factors intrinsic to the evidence itself. Firstly, a new Belief Kullback-Leibler divergence is put forward, which evaluates the credibility of evidence from the discrepancy between bodies of evidence. Secondly, another credibility measure, called the Probability Transformation Belief Factor, is defined, which assesses the credibility of evidence from the evidence itself. These two credibilities are combined into a comprehensive credibility of evidence. Furthermore, considering the uncertainty of evidence, a new belief entropy based on the cross-information within the evidence is presented; it is applied to quantify the information volume of evidence and to adjust the comprehensive credibility. The adjusted comprehensive credibility is regarded as the final weight to modify the body of evidence. Finally, Dempster's combination rule is applied for fusion. Experiments and applications show that the proposed method is effective and superior. © 2023 Elsevier B.V. All rights reserved.
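As an illustration of the final fusion step only (not the paper's divergence or belief-factor weighting), a minimal sketch of Dempster's combination rule for two mass functions, assuming focal elements are represented as Python `frozenset`s over a frame of discernment:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements to mass values
    that each sum to 1. Returns the normalised combined assignment.
    """
    combined = {}
    conflict = 0.0  # total mass K assigned to the empty intersection
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    # Redistribute the conflicting mass by normalising with 1 - K
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two hypothetical sensor reports over the frame {A, B, C}
m1 = {frozenset("A"): 0.6, frozenset("B"): 0.3, frozenset("ABC"): 0.1}
m2 = {frozenset("A"): 0.5, frozenset("C"): 0.4, frozenset("ABC"): 0.1}
fused = dempster_combine(m1, m2)
```

Here the conflict coefficient is K = 0.51, and after normalisation the fused mass concentrates on {A}. The proposed method intervenes before this step: it reweights each body of evidence by its adjusted comprehensive credibility, so that highly conflicting evidence no longer dominates the combination.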
Pages: 12
Related papers
50 records
  • [31] Multi-sensor Data Fusion Method Based on ARIMA-LightGBM for AGV Positioning
    Che, HongLei
    2021 5TH INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION SCIENCES (ICRAS 2021), 2021, : 272 - 276
  • [32] Research on fire detection method of complex space based on multi-sensor data fusion
    Su, Qian
    Hu, Guangzhou
    Liu, Zhenxing
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (08)
  • [33] A new divergence measure for belief functions in D-S evidence theory for multisensor data fusion
    Xiao, Fuyuan
    INFORMATION SCIENCES, 2020, 514 : 462 - 483
  • [34] MULTI-SENSOR DATA FUSION BASED ON GCN-LSTM
    Xiao, Bohuai
    Xie, Xiaolan
    Yang, Chengyong
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2022, 18 (05): : 1363 - 1381
  • [35] CPN based multi-sensor data fusion for target classification
    Niu, LH
    Ni, GQ
    Liu, MQ
    SECOND INTERNATION CONFERENCE ON IMAGE AND GRAPHICS, PTS 1 AND 2, 2002, 4875 : 671 - 676
  • [36] Self-tuning Filtering for Multi-sensor Data Fusion Based on Forget Factor Algorithms
    Zhang, Yulai
    Luo, Guiming
    Luo, Fu
    2011 6TH IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2011, : 2415 - 2420
  • [37] Application of Multi-sensor Data Fusion Method in the Ammonia Modification System
    Zhu Helei
    Meng Zhuo
    Sun Yize
    Lu Wei
    Zhu Zina
    AUTOMATIC CONTROL AND MECHATRONIC ENGINEERING III, 2014, 615 : 118 - 121
  • [38] A novel method to determine basic probability assignment in Dempster-Shafer theory and its application in multi-sensor information fusion
    Fei, Liguo
    Xia, Jun
    Feng, Yuqiang
    Liu, Luning
    INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS, 2019, 15 (07)
  • [39] An improved belief entropy-based uncertainty management approach for sensor data fusion
    Tang, Yongchuan
    Zhou, Deyun
    He, Zichang
    Xu, Shuai
    INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS, 2017, 13 (07):
  • [40] A Multi-sensor Data Fusion Method for Nondestructive Testing of Oil Pipelines
    Xi G.
    Huang C.
    Liu S.
    Instrumentation Mesure Metrologie, 2019, 18 (03): : 249 - 255