Adaptive integrated positioning algorithm of VIO/UWB based on robust estimation

Cited by: 0
Authors
Li X. [1 ,2 ]
Li J. [1 ]
Wang A. [1 ]
Luo H. [1 ]
Yang Z. [1 ]
Li K. [1 ]
Affiliations
[1] Institute of Geospatial Information, Information Engineering University, Zhengzhou
[2] Troops 61618, Beijing
Keywords
adaptive; multi-sensor fusion; robust; ultra-wideband; visual-inertial odometry;
DOI
10.13695/j.cnki.12-1222/o3.2023.11.004
Abstract
In indoor positioning environments where the global navigation satellite system (GNSS) signal is denied, the positioning error of visual-inertial odometry (VIO) inevitably accumulates under adverse conditions such as long-term movement or the absence of loop closure, and the positioning accuracy of ultra-wideband (UWB) is difficult to guarantee under non-line-of-sight (NLOS) conditions. Therefore, a UWB-aided visual-inertial adaptive integrated positioning algorithm based on robust estimation is proposed. Firstly, a joint optimization framework for UWB and VIO is constructed, and global constraints are applied to VIO using the UWB positioning results. Secondly, robust estimation is added in the back-end optimization stage to adjust the weights between sensors in real time, suppressing the influence of UWB NLOS errors. Finally, experiments are carried out on the EuRoC dataset and in real-world scenarios. Experimental results in real-world scenarios show that the positioning accuracy of the integrated algorithm is improved by 75.05% compared to VINS-Mono under non-occlusion conditions, and the positioning accuracy of the integrated algorithm with robust estimation is improved by 37.53% compared to the integrated algorithm without robust estimation under occlusion conditions. © 2023 Editorial Department of Journal of Chinese Inertial Technology. All rights reserved.
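The back-end reweighting the abstract describes can be illustrated with a standard robust-estimation technique: an iteratively reweighted least-squares (IRLS) loop using a Huber weight, which down-weights implausibly large UWB range residuals (as produced by NLOS propagation). This is a minimal sketch of the general idea, not the paper's implementation; the anchor layout and the functions `huber_weight` and `robust_position_fit` are illustrative assumptions.

```python
import numpy as np

def huber_weight(r, delta=1.0):
    """Huber IRLS weight: residuals within |r| <= delta keep full weight 1,
    larger (likely NLOS-corrupted) residuals are down-weighted as delta/|r|."""
    a = abs(r)
    return 1.0 if a <= delta else delta / a

def robust_position_fit(anchors, ranges, x0, iters=20, delta=1.0):
    """Robust 2D position fix from UWB range measurements via
    Gauss-Newton with Huber reweighting (illustrative, not the paper's solver).

    anchors: (N, 2) known anchor coordinates
    ranges:  (N,) measured distances (possibly NLOS-biased)
    x0:      initial position guess (e.g. the VIO prediction)
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                      # (N, 2) vectors anchor -> x
        d = np.linalg.norm(diff, axis=1)        # predicted ranges
        r = ranges - d                          # range residuals
        w = np.array([huber_weight(ri, delta) for ri in r])
        H = diff / d[:, None]                   # Jacobian of predicted range w.r.t. x
        A = H.T @ (w[:, None] * H)              # weighted normal-equation matrix
        b = H.T @ (w * r)                       # weighted gradient
        x = x + np.linalg.solve(A, b)           # Gauss-Newton step
    return x
```

Because the weights are recomputed at every iteration from the current residuals, an NLOS-biased range progressively loses influence on the solution, which mirrors the abstract's idea of adjusting sensor weights in real time during back-end optimization.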
Pages: 1083-1091
Page count: 8
Related papers (20 in total)
  • [1] Qin T, Li P, Shen S., VINS-Mono: A robust and versatile monocular visual-inertial state estimator, IEEE Transactions on Robotics, 34, 4, pp. 1004-1020, (2018)
  • [2] Campos C, Elvira R, Rodriguez J, et al., ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM, IEEE Transactions on Robotics, 37, 6, pp. 1874-1890, (2021)
  • [3] Yang Y, Huang G., Aided inertial navigation: Unified feature representations and observability analysis, 2019 International Conference on Robotics and Automation (ICRA), pp. 3528-3534, (2019)
  • [4] Cioffi G, Scaramuzza D., Tightly-coupled fusion of global positional measurements in optimization-based visual-inertial odometry, 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 5089-5095, (2020)
  • [5] Cao S, Lu X, Shen S., GVINS: Tightly coupled GNSS-visual-inertial fusion for smooth and consistent state estimation, IEEE Transactions on Robotics, 38, 4, pp. 2004-2021, (2022)
  • [6] Li X, Li J, Wang A, et al., A review of integrated navigation technology based on visual/inertial/UWB fusion, Science of Surveying and Mapping, 48, 6, pp. 49-58, (2023)
  • [7] Jia X, Lu W, Teng Y, et al., Improved adaptive SRCKF algorithm for GNSS/SINS integrated navigation based on measurement characteristics, Journal of Chinese Inertial Technology, 31, 4, pp. 327-334, (2023)
  • [8] Yang X, Huangfu S, Yan S., Fusion positioning method with UWB/IMU/odometer based on the improved UKF, Journal of Chinese Inertial Technology, 31, 5, pp. 462-471, (2023)
  • [9] Li Y, Bao H, Xu C., Precise localization of monocular vision inertial SLAM and UWB data fusion, Transducer and Microsystem Technologies, 41, 9, pp. 125-128, (2022)
  • [10] Shen B, Zhang Z, Shu S., UWB-VIO integrated indoor positioning algorithm for mobile robots, Journal of Computer Applications, 42, 12, pp. 3924-3930, (2022)