Design of visual inertial state estimator for autonomous systems via multi-sensor fusion approach

Cited: 2
Authors
He, Shenghuang [1]
Li, Yanzhou [2,3]
Lu, Yongkang [2,3]
Liu, Yishan [1,2,3]
Affiliations
[1] Shanghai Jiao Tong Univ, Ningbo Artificial Intelligence Inst, Dept Automat, Shanghai 200240, Peoples R China
[2] Guangdong Univ Technol, Sch Automat, Guangzhou 510006, Peoples R China
[3] Guangdong Univ Technol, Guangdong Prov Key Lab Intelligent Decis & Coopera, Guangzhou 510006, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Autonomous systems; Visual-inertial data fusion; State estimation; Inverse combination optical flow; Nonlinear optimization; KALMAN FILTER; NAVIGATION; ALGORITHM; TRACKING;
DOI
10.1016/j.mechatronics.2023.103066
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
The achievement of autonomous navigation in autonomous systems critically hinges on robust localization and reliable mapping. This paper proposes a new visual-inertial simultaneous localization and mapping (SLAM) algorithm consisting of a visual-inertial frontend, a system backend, a loop closure detection module, and an initialization module. First, by combining the inverse combination optical flow method with an image pyramid, the problem of localization failure caused by the light sensitivity of vision sensors is addressed. To meet real-time requirements, the computational complexity of the algorithm is effectively reduced by combining FAST corner detection with the Threading Building Blocks (TBB) programming library. Second, an inertial measurement unit (IMU) pre-integration model based on the fourth-order Runge-Kutta (RK) method effectively improves the estimation accuracy of autonomous systems. A nonlinear optimization backend based on the Dog-Leg method, a sliding window, and marginalization is adopted to reduce the computational complexity of backend processing. Third, to mitigate the accumulation of errors that leads to large pose drift over long periods, a loop closure detection module is introduced, and an initialization module is added to integrate visual and inertial data. Finally, the feasibility and robustness of the system are verified on the EuRoC dataset using the evo precision evaluation tool.
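To illustrate the fourth-order Runge-Kutta propagation mentioned in the abstract, the sketch below shows a minimal RK4 step of basic IMU kinematics (position, velocity, orientation quaternion). It is not the paper's pre-integration model: biases, noise terms, and covariance propagation are omitted, gravity and the state layout are assumed, the quaternion is updated additively and re-normalized, and all function names (e.g. rk4_propagate) are illustrative only.

```python
import numpy as np

def quat_mult(q, r):
    # Hamilton product of quaternions [w, x, y, z].
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_to_rot(q):
    # Rotation matrix from a unit quaternion [w, x, y, z].
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

GRAVITY = np.array([0.0, 0.0, -9.81])  # world-frame gravity (assumed)

def state_derivative(state, accel, gyro):
    # state = (p, v, q): position, velocity, orientation quaternion.
    p, v, q = state
    dp = v                                                  # position rate
    dv = quat_to_rot(q) @ accel + GRAVITY                   # velocity rate
    dq = 0.5 * quat_mult(q, np.concatenate(([0.0], gyro)))  # quaternion rate
    return dp, dv, dq

def rk4_propagate(state, accel, gyro, dt):
    # One fourth-order Runge-Kutta step, holding the IMU sample constant over dt.
    def add(s, ds, h):
        p, v, q = s
        dp, dv, dq = ds
        return (p + h*dp, v + h*dv, q + h*dq)

    k1 = state_derivative(state, accel, gyro)
    k2 = state_derivative(add(state, k1, dt/2), accel, gyro)
    k3 = state_derivative(add(state, k2, dt/2), accel, gyro)
    k4 = state_derivative(add(state, k3, dt), accel, gyro)

    p, v, q = state
    p = p + dt/6 * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
    v = v + dt/6 * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    q = q + dt/6 * (k1[2] + 2*k2[2] + 2*k3[2] + k4[2])
    q = q / np.linalg.norm(q)  # re-normalize the orientation quaternion
    return (p, v, q)

# Example with hypothetical values: one 5 ms IMU sample.
state = (np.zeros(3), np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0]))
accel = np.array([0.0, 0.0, 9.81])   # body-frame specific force
gyro = np.array([0.0, 0.0, 0.1])     # body-frame angular rate (rad/s)
state = rk4_propagate(state, accel, gyro, dt=0.005)
```

Compared with first-order (Euler) integration commonly used in pre-integration, an RK4 step reduces discretization error per IMU interval, which is the accuracy benefit the abstract attributes to the RK-based pre-integration model.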
Pages: 12