LICFM3-SLAM: LiDAR-Inertial-Camera Fusion and Multimodal Multilevel Matching for Bionic Quadruped Inspection Robot Mapping

Cited by: 0
Authors
Zhang, Haibing [1 ,2 ]
Li, Lin [1 ,2 ]
Jiang, Andong [1 ]
Xu, Jiajun [1 ]
Shen, Huan [1 ]
Li, Youfu [3 ]
Ji, Aihong [4 ,5 ]
Wang, Zhongyuan [2 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Mech & Elect Engn, Lab Locomot Bioinspirat & Intelligent Robots, Nanjing 210016, Peoples R China
[2] Hefei CSG Smart Robot Technol Co Ltd, Hefei 230094, Peoples R China
[3] City Univ Hong Kong, Dept Mech Engn, Hong Kong, Peoples R China
[4] Nanjing Univ Aeronaut & Astronaut, Coll Mech & Elect Engn, Lab Locomot Bioinspirat & Intelligent Robots, Nanjing 210016, Peoples R China
[5] Nanjing Univ Aeronaut & Astronaut, State Key Lab Mech & Control Aerosp Struct, Nanjing 210016, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Robots; Laser radar; Simultaneous localization and mapping; Quadrupedal robots; Cameras; Accuracy; Visualization; Sensors; Odometry; Inspection; Bionic quadruped robot; mapping; sensor fusion; simultaneous localization and mapping (SLAM); REAL-TIME; ROBUST; ODOMETRY;
DOI
10.1109/TIM.2025.3555702
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Compared with wheeled robots, bionic quadruped robots move far more dynamically, so a mapping system must remain robust and accurate across complex real-world scenarios even when the robot's body shakes intensely. To address these challenges, this study proposes a simultaneous localization and mapping (SLAM) system based on LiDAR-inertial-camera fusion and a multimodal multilevel matching algorithm (LICFM3-SLAM). First, a tightly coupled strategy fuses LiDAR, inertial, and camera measurements, and a visual-inertial odometry (VIO) subsystem based on adaptive graph inference is introduced, achieving high-precision, robust robot state estimation. Second, inspired by human spatial cognition, a multimodal multilevel matching algorithm is proposed that exploits observations from both the camera and the LiDAR, achieving accurate and robust data association. Finally, incremental poses are refined by factor graph optimization to construct a globally consistent 3-D point cloud map. The proposed system is evaluated on a public benchmark dataset and deployed on a bionic quadruped inspection robot (BQIR), with experiments in various challenging indoor and outdoor large-scale scenarios. The results show that LICFM3-SLAM achieves high robustness and mapping accuracy while meeting real-time requirements.
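The abstract's final step, refining incremental poses by factor graph optimization, can be illustrated with a minimal, self-contained sketch. The 1-D poses, the factor values, and the plain linear least-squares solver below are all hypothetical simplifications chosen for clarity; they are not the paper's 6-DoF LiDAR-inertial-camera implementation.

```python
import numpy as np

def optimize_pose_graph():
    """Solve a tiny 1-D pose graph by linear least squares.

    Variables: poses x0..x3 on a line. Factors:
      - a prior anchoring x0 at 0 (removes gauge freedom),
      - odometry (between) factors x_{i+1} - x_i = 1.0,
      - a loop-closure-style factor x3 - x0 = 2.7 that conflicts
        with the 3.0 accumulated from odometry.
    Each factor contributes one row of A and one entry of b,
    and the graph is solved as min ||A x - b||^2.
    """
    rows, rhs = [], []
    prior = np.zeros(4)
    prior[0] = 1.0                      # x0 = 0
    rows.append(prior); rhs.append(0.0)
    for i in range(3):                  # x_{i+1} - x_i = 1.0
        r = np.zeros(4)
        r[i], r[i + 1] = -1.0, 1.0
        rows.append(r); rhs.append(1.0)
    loop = np.zeros(4)
    loop[0], loop[3] = -1.0, 1.0        # x3 - x0 = 2.7
    rows.append(loop); rhs.append(2.7)
    A, b = np.array(rows), np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# The solver spreads the loop-closure conflict across all factors:
# each odometry step shrinks from 1.0 to about 0.925, and the
# closure residual drops from 0.3 to about 0.075.
print(optimize_pose_graph())
```

In a full system like the one the abstract describes, the same structure holds, except each variable is a 6-DoF pose and the rows become linearized LiDAR, visual, inertial, and loop-closure residuals re-solved iteratively.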
Pages: 17