Fault-tolerant relative navigation based on Kullback-Leibler divergence

Cited by: 3
Authors
Xiong, Jun [1 ]
Cheong, Joon Wayn [2 ]
Xiong, Zhi [1 ]
Dempster, Andrew G. [2 ]
Tian, Shiwei [1 ,3 ]
Wang, Rong [1 ]
Liu, Jianye [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Automat Engn, Nanjing, Jiangsu, Peoples R China
[2] Univ New South Wales, Sch Elect Engn & Telecommun, Sydney, NSW, Australia
[3] Army Engn Univ, Coll Commun Engn, Nanjing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Fault detection and exclusion; extended Kalman filter; Kullback-Leibler divergence; relative navigation; SYSTEM; EXCLUSION;
DOI
10.1177/1729881420979125
CLC number
TP24 [Robotics];
Subject classification
080202; 1405;
Abstract
A fault-detection method for relative navigation based on the Kullback-Leibler divergence (KLD) is proposed. Unlike traditional chi-square-based approaches, the KLD statistic of the filter follows a hybrid distribution that combines the chi-square distribution and the F-distribution. Using an extended Kalman filter (EKF) as the estimator, the divergence between the a priori and a posteriori data of the EKF is computed to detect abnormal measurements. After the fault-detection step, a fault-exclusion method removes the erroneous observations from the fusion procedure. The proposed method is suitable for Kalman filter-based multisensor relative navigation systems. Simulation and experimental results show that the proposed method detects abnormal measurements successfully, and its positioning accuracy after fault detection and exclusion outperforms that of the traditional chi-square-based method.
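A minimal sketch (in Python) of the detection step outlined in the abstract, assuming Gaussian a priori and a posteriori EKF estimates; the function names, the toy numbers, and the fixed threshold are illustrative assumptions, since the paper instead derives the test threshold from the hybrid chi-square/F distribution of the KLD statistic.

import numpy as np

def gaussian_kld(mu0, cov0, mu1, cov1):
    # KL divergence D(N0 || N1) between two multivariate Gaussian distributions.
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def detect_fault(x_prior, P_prior, x_post, P_post, threshold=10.0):
    # Flag the current measurement set as faulty when the divergence between the
    # predicted (a priori) and updated (a posteriori) estimates is too large.
    # The fixed threshold is a placeholder, not the paper's statistically derived one.
    d = gaussian_kld(x_prior, P_prior, x_post, P_post)
    return d > threshold, d

# Toy usage: an update that pulls the estimate far from the prediction is flagged,
# and the corresponding measurements would then be excluded from the fusion step.
x_prior = np.array([0.0, 0.0]); P_prior = 0.5 * np.eye(2)
x_post = np.array([5.0, -4.0]); P_post = 0.1 * np.eye(2)
faulty, kld = detect_fault(x_prior, P_prior, x_post, P_post)
print(f"KLD = {kld:.2f}, fault detected: {faulty}")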
Pages: 14