Distributed Fusion of Multiple Model Estimators Using Minimum Forward Kullback-Leibler Divergence Sum

Cited: 0
Authors
Wei, Zheng [1 ]
Duan, Zhansheng [1 ]
Hanebeck, Uwe D. [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Ctr Informat Engn Sci Res, Xian 710049, Peoples R China
[2] Karlsruhe Inst Technol, Intelligent Sensor Actuator Syst Lab, D-76131 Karlsruhe, Germany
Funding
National Key Research and Development Program of China;
Keywords
Target tracking; Probability density function; Sensor fusion; Approximation algorithms; Time measurement; Predictive models; Optimization; Distributed fusion; Kullback-Leibler (KL) divergence; maneuvering target tracking; multiple model (MM) estimation; MANEUVERING TARGET TRACKING; DYNAMIC STATE ESTIMATION; INFORMATION; ALGORITHM; CONSENSUS; SYSTEMS; AVERAGE; FILTER;
DOI
10.1109/TAES.2024.3358791
Chinese Library Classification
V [Aeronautics, Astronautics];
Subject Classification
08; 0825;
Abstract
This article addresses the problem of distributed fusion of Gaussian mixture models (GMMs) provided by local multiple model (MM) estimators. Taking GMMs, instead of combined Gaussian-assumed probability density functions (pdfs), as the output of the local MM estimators retains more detailed (or internal) information about the local estimates, but the accompanying challenge is performing the fusion of GMMs. To this end, a distributed fusion framework for GMMs under the minimum forward Kullback-Leibler (KL) divergence sum criterion is proposed first. Then, because the KL divergence between GMMs is not analytically tractable, two suboptimal distributed fusion algorithms are developed within this framework. Both fusion algorithms have closed forms. Numerical examples verify their effectiveness in terms of both computational efficiency and estimation accuracy.
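The abstract notes that the KL divergence between GMMs is not analytically tractable, which is why the paper resorts to suboptimal closed-form algorithms. As a minimal illustrative sketch (not the paper's actual method), one common workaround is to moment-match each scalar mixture to a single Gaussian, for which the forward KL divergence does have a closed form. The function names and the naive averaged fused candidate below are hypothetical, for illustration only:

```python
import numpy as np

def moment_match(weights, means, variances):
    """Collapse a 1-D Gaussian mixture into a single Gaussian by moment matching."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    m = np.asarray(means, dtype=float)
    v = np.asarray(variances, dtype=float)
    mu = np.dot(w, m)                       # mixture mean
    var = np.dot(w, v + (m - mu) ** 2)      # mixture variance (law of total variance)
    return mu, var

def gaussian_kl(mu0, var0, mu1, var1):
    """Closed-form forward KL divergence KL(N(mu0,var0) || N(mu1,var1)), scalar case."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# Two hypothetical local GMM posteriors: (weights, means, variances)
gmm_a = ([0.6, 0.4], [0.0, 1.0], [1.0, 2.0])
gmm_b = ([0.5, 0.5], [0.2, 0.8], [1.5, 1.5])

# Moment-match each local GMM to a single Gaussian
mu_a, var_a = moment_match(*gmm_a)
mu_b, var_b = moment_match(*gmm_b)

# Naive fused candidate: moment-match an equal-weight mixture of the two
# matched Gaussians (illustration only, not an optimized fused density)
mu_f, var_f = moment_match([0.5, 0.5], [mu_a, mu_b], [var_a, var_b])

# Forward KL divergence sum from the local densities to the fused candidate
kl_sum = gaussian_kl(mu_a, var_a, mu_f, var_f) + gaussian_kl(mu_b, var_b, mu_f, var_f)
```

A fusion algorithm in the spirit of the paper would choose the fused density to minimize this forward KL divergence sum, rather than fixing it by averaging as done here.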
Pages: 2934-2947
Page count: 14