Versatile LiDAR-Inertial Odometry With SE(2) Constraints for Ground Vehicles

Cited by: 8
Authors
Chen, Jiaying [1]
Wang, Han [1]
Hu, Minghui [1]
Suganthan, Ponnuthurai Nagaratnam [1,2]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[2] Qatar Univ, Coll Engn, KINDI Ctr Comp Res, Doha, Qatar
Keywords
Laser radar; Simultaneous localization and mapping; Robots; Location awareness; Land vehicles; Robot sensing systems; Real-time systems; SLAM; localization; mapping; sensor fusion
DOI
10.1109/LRA.2023.3268584
CLC number
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
LiDAR SLAM has become one of the major localization solutions for ground vehicles since LiDAR Odometry And Mapping (LOAM). Most extensions of LOAM leverage a single additional constraint to improve performance, e.g., information from on-board sensors such as loop closures and the inertial state, or prior conditions such as a planar ground and known motion dynamics. In many robotic applications these conditions are only partially known, so SLAM becomes a comprehensive estimation problem involving numerous constraints, and better results can be achieved by fusing them properly. In this letter, we propose a hybrid LiDAR-inertial SLAM framework that leverages both the on-board perception system and prior information such as motion dynamics to improve localization performance. In particular, we consider ground vehicles, which are widely used in autonomous driving and warehouse logistics. We present a computationally efficient LiDAR-inertial odometry method that directly parameterizes ground-vehicle poses on SE(2). Out-of-SE(2) motion perturbations are not neglected but are absorbed into an integrated noise term of a novel SE(2)-constraint model. For odometric measurement processing, we propose a versatile, tightly coupled LiDAR-inertial odometry that achieves better pose estimation than traditional LiDAR odometry. Thorough experiments in different scenarios, including indoor and outdoor localization, show that the proposed method achieves superior accuracy and robustness.
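To make the SE(2) idea concrete, the sketch below shows one simple way to parameterize a ground-vehicle pose as (x, y, theta) and to fold the out-of-plane components of a 6-DoF odometry increment into an inflated noise covariance. This is a minimal illustration, not the authors' implementation: the function names, the quadratic inflation rule, and the diagonal noise model are all assumptions made for the example.

```python
# Minimal sketch (assumed, not the paper's code) of an SE(2) pose parameterization
# with out-of-plane motion treated as extra measurement noise.
import numpy as np

def se2_compose(pose_a, pose_b):
    """Compose two SE(2) poses given as (x, y, theta)."""
    xa, ya, tha = pose_a
    xb, yb, thb = pose_b
    c, s = np.cos(tha), np.sin(tha)
    theta = np.arctan2(np.sin(tha + thb), np.cos(tha + thb))  # wrap to (-pi, pi]
    return (xa + c * xb - s * yb, ya + s * xb + c * yb, theta)

def project_se3_to_se2(dx, dy, dz, roll, pitch, yaw, base_cov):
    """Keep the planar part (dx, dy, yaw) of a 6-DoF increment and inflate the
    covariance with the discarded components (dz, roll, pitch).
    The quadratic inflation is a stand-in, not the paper's noise model."""
    planar = (dx, dy, yaw)
    inflation = np.diag([dz**2 + pitch**2, dz**2 + roll**2, roll**2 + pitch**2])
    return planar, base_cov + inflation

if __name__ == "__main__":
    pose = (0.0, 0.0, 0.0)
    cov = 0.01 * np.eye(3)
    increment, cov = project_se3_to_se2(1.0, 0.0, 0.02, 0.01, 0.005, 0.1, cov)
    pose = se2_compose(pose, increment)
    print("updated SE(2) pose:", pose)
```

The design point this illustrates is the one stated in the abstract: the planar assumption is kept for efficiency, while the residual out-of-SE(2) motion is not discarded silently but accounted for in the noise term.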
Pages: 3486-3493
Number of pages: 8