A Tightly Coupled LiDAR-Inertial SLAM for Perceptually Degraded Scenes

Cited by: 10
Authors
Yang, Lin [1 ,2 ]
Ma, Hongwei [1 ,2 ]
Wang, Yan [1 ,2 ]
Xia, Jing [1 ,2 ]
Wang, Chuanwei [1 ,2 ]
Affiliations
[1] Xian Univ Sci & Technol, Sch Mech Engn, Xian 710054, Peoples R China
[2] Shaanxi Key Lab Mine Electromech Equipment Intell, Xian 710054, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
perceptually degraded scenes; LiDAR; IMU; state estimation; SLAM; KALMAN FILTER; MOTION; NAVIGATION; ROBUST;
DOI
10.3390/s22083063
Chinese Library Classification
O65 [Analytical Chemistry]
Discipline Codes
070302; 081704
Abstract
Realizing robust six-degrees-of-freedom (6DOF) state estimation and high-performance simultaneous localization and mapping (SLAM) in perceptually degraded scenes (such as underground tunnels, corridors, and roadways) is a challenge in robotics. To address this, we propose a SLAM algorithm based on tightly coupled LiDAR-IMU fusion, which consists of two parts: front-end iterative Kalman filtering and back-end pose-graph optimization. First, on the front end, an iterated Kalman filter is established to construct a tightly coupled LiDAR-inertial odometry (LIO); fusing the IMU-propagated a priori pose prediction with LiDAR observations improves attitude accuracy and enhances system robustness. Second, on the back end, we deploy a keyframe selection strategy to meet the real-time requirements of large-scale scenes; moreover, loop detection and ground constraints are added to the tightly coupled framework, further improving the overall accuracy of the 6DOF state estimation. Finally, the performance of the algorithm is verified on a public dataset and on a dataset we collected. The experimental results show that, for perceptually degraded scenes, the proposed algorithm achieves higher accuracy, better real-time performance, and greater robustness than existing LiDAR-SLAM algorithms, effectively reducing the cumulative error of the system and ensuring the global consistency of the constructed maps.
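The front end described above fuses an IMU-propagated prior with LiDAR observations through an iterated Kalman filter, whose measurement update is equivalent to a Gauss-Newton step relinearized at each iterate. The sketch below is a minimal, generic illustration of that update, not the paper's implementation: the measurement model `h`, its Jacobian, and all dimensions are placeholder assumptions (in the actual LIO, the observation would be built from LiDAR point-to-plane residuals against the map).

```python
import numpy as np

def iterated_kf_update(x0, P, h, H_jac, z, R, n_iters=5, tol=1e-6):
    """Iterated Kalman measurement update (Gauss-Newton form).

    x0    : prior state mean (e.g. from IMU propagation)
    P     : prior covariance
    h     : measurement model, h(x) -> predicted measurement
    H_jac : function returning the Jacobian dh/dx evaluated at x
    z     : observed measurement
    R     : measurement noise covariance
    """
    x = x0.copy()
    for _ in range(n_iters):
        H = H_jac(x)                      # relinearize at the current iterate
        S = H @ P @ H.T + R               # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)    # gain at the current linearization point
        # Gauss-Newton step: residual is referenced back to the prior mean x0
        x_new = x0 + K @ (z - h(x) - H @ (x0 - x))
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    H = H_jac(x)
    P_post = (np.eye(len(x)) - K @ H) @ P  # posterior covariance at convergence
    return x, P_post
```

For a linear measurement model this reduces to the standard Kalman update in a single iteration; the iteration only pays off for nonlinear models such as scan-to-map registration, where relinearizing about the improved estimate tightens the a posteriori pose.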
Pages: 21