Distributed Fusion of Multiple Model Estimators Using Minimum Forward Kullback-Leibler Divergence Sum

Cited by: 0
Authors
Wei, Zheng [1 ]
Duan, Zhansheng [1 ]
Hanebeck, Uwe D. [2 ]
Affiliations
[1] Xi'an Jiaotong University, Center for Information Engineering Science Research, Xi'an 710049, People's Republic of China
[2] Karlsruhe Institute of Technology, Intelligent Sensor-Actuator-Systems Laboratory, D-76131 Karlsruhe, Germany
Funding
National Key Research and Development Program of China;
Keywords
Target tracking; Probability density function; Sensor fusion; Approximation algorithms; Time measurement; Predictive models; Optimization; Distributed fusion; Kullback-Leibler (KL) divergence; maneuvering target tracking; multiple model (MM) estimation; MANEUVERING TARGET TRACKING; DYNAMIC STATE ESTIMATION; INFORMATION; ALGORITHM; CONSENSUS; SYSTEMS; AVERAGE; FILTER;
DOI
10.1109/TAES.2024.3358791
Chinese Library Classification (CLC)
V [Aeronautics, Astronautics];
Subject Classification Code
08; 0825;
Abstract
This article addresses the distributed fusion of Gaussian mixture models (GMMs) provided by local multiple model (MM) estimators. Using GMMs, rather than combined Gaussian-assumed probability density functions (pdfs), as the output of the local MM estimators retains more detailed (or internal) information about the local estimates, but the accompanying challenge is how to fuse the GMMs. To address this, a distributed fusion framework for GMMs under the minimum forward Kullback-Leibler (KL) divergence sum criterion is proposed first. Then, because the KL divergence between GMMs is not analytically tractable, two suboptimal distributed fusion algorithms are developed within this framework; both have closed forms. Numerical examples verify their effectiveness in terms of both computational efficiency and estimation accuracy.
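To make the fusion criterion concrete, the sketch below shows one simple closed-form instance of minimizing a forward KL divergence sum. It assumes the fused density is restricted to a single Gaussian; under that restriction, minimizing the sum of forward KL divergences from the local GMMs reduces to moment matching against the equally weighted average of the local mixtures. This is only an illustrative sketch under that assumption, not the two algorithms proposed in the article, and the function names (gmm_moments, fuse_gmms_forward_kl) are hypothetical.

```python
import numpy as np

def gmm_moments(weights, means, covs):
    """Overall mean and covariance of a Gaussian mixture (closed form)."""
    weights = np.asarray(weights, dtype=float)      # shape (K,)
    means = np.asarray(means, dtype=float)          # shape (K, d)
    covs = np.asarray(covs, dtype=float)            # shape (K, d, d)
    mean = weights @ means                          # mixture mean
    diffs = means - mean
    # mixture covariance = weighted component covariances + spread of component means
    cov = (np.einsum('k,kij->ij', weights, covs)
           + np.einsum('k,ki,kj->ij', weights, diffs, diffs))
    return mean, cov

def fuse_gmms_forward_kl(local_gmms):
    """Single-Gaussian fusion minimizing the forward KL divergence sum.

    For a Gaussian fused density q, sum_i D_KL(p_i || q) over local GMMs p_i
    is minimized by moment matching q to the equally weighted average mixture.
    local_gmms: list of (weights, means, covs) tuples, one per local estimator.
    """
    n = len(local_gmms)
    all_w, all_m, all_c = [], [], []
    for w, m, c in local_gmms:
        all_w.append(np.asarray(w, dtype=float) / n)  # equal weight per estimator
        all_m.append(np.asarray(m, dtype=float))
        all_c.append(np.asarray(c, dtype=float))
    weights = np.concatenate(all_w)
    means = np.vstack(all_m)
    covs = np.concatenate(all_c, axis=0)
    return gmm_moments(weights, means, covs)

# Example: two local MM estimators, each reporting a 2-component GMM in 1-D.
gmm_a = ([0.7, 0.3], [[0.0], [2.0]], [[[1.0]], [[0.5]]])
gmm_b = ([0.5, 0.5], [[0.5], [1.5]], [[[0.8]], [[0.8]]])
fused_mean, fused_cov = fuse_gmms_forward_kl([gmm_a, gmm_b])
print(fused_mean, fused_cov)
```

If the fused density is instead kept in mixture form, further approximation is needed because the forward KL divergence between two GMMs has no closed form, which is exactly the difficulty the abstract points to.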
Pages: 2934 - 2947
Number of pages: 14
Related Papers
50 records in total
  • [1] Claici, Sebastian; Yurochkin, Mikhail; Ghosh, Soumya; Solomon, Justin. Model Fusion with Kullback-Leibler Divergence. International Conference on Machine Learning, vol. 119, 2020.
  • [2] Claici, Sebastian; Yurochkin, Mikhail; Ghosh, Soumya; Solomon, Justin. Model Fusion with Kullback-Leibler Divergence. 25th Americas Conference on Information Systems (AMCIS 2019), 2019.
  • [3] Lin, Chungwei; Marks, Tim K.; Pajovic, Milutin; Watanabe, Shinji; Tung, Chih-kuan. Model parameter learning using Kullback-Leibler divergence. Physica A: Statistical Mechanics and its Applications, 2018, 491: 549-559.
  • [4] Cliff, Oliver M.; Prokopenko, Mikhail; Fitch, Robert. Minimising the Kullback-Leibler Divergence for Model Selection in Distributed Nonlinear Systems. Entropy, 2018, 20 (2).
  • [5] Mohammed, Salah A. G.; Meddeber, Lila; Zouagui, Tarik; Karoui, Moussa S. Spectral unmixing using minimum volume constrained Kullback-Leibler divergence. Journal of Applied Remote Sensing, 2020, 14 (2).
  • [6] Shen, Pengcheng; Li, Chunguang; Luo, Yiliang. Distributed Vector Quantization Based on Kullback-Leibler Divergence. Entropy, 2015, 17 (12): 7875-7887.
  • [7] Kobayashi, Taisuke. Optimistic reinforcement learning by forward Kullback-Leibler divergence optimization. Neural Networks, 2022, 152: 169-180.
  • [8] Li, Wenling; Jia, Yingmin. Kullback-Leibler divergence for interacting multiple model estimation with random matrices. IET Signal Processing, 2016, 10 (1): 12-18.
  • [9] Al-Labadi, Luai; Patel, Viskakh; Vakiloroayaei, Kasra; Wan, Clement. Kullback-Leibler divergence for Bayesian nonparametric model checking. Journal of the Korean Statistical Society, 2021, 50 (1): 272-289.
  • [10] Xu, Jingxin; Denman, Simon; Fookes, Clinton; Sridharan, Sridha. Detecting Rare Events Using Kullback-Leibler Divergence. 2015 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2015: 1305-1309.