Advancing Simultaneous Localization and Mapping with Multi-Sensor Fusion and Point Cloud De-Distortion

Cited by: 1
Authors
Shao, Haiyan [1 ]
Zhao, Qingshuai [1 ]
Chen, Hongtang [1 ]
Yang, Weixin [2 ]
Chen, Bin [3 ]
Feng, Zhiquan [4 ]
Zhang, Jinkai [1 ]
Teng, Hao [1 ]
Affiliations
[1] Univ Jinan, Sch Mech Engn, Jinan 250022, Peoples R China
[2] Univ Nevada, Sch Elect & Biomed Engn, Reno, NV 89557 USA
[3] Shandong Youbaote Intelligent Robot Co Ltd, Jinan 250098, Peoples R China
[4] Univ Jinan, Sch Informat Sci & Engn, Jinan 250022, Peoples R China
Keywords
obstacle detection; motion distortion; error compensation; simultaneous localization and mapping (SLAM); LiDAR; depth camera; quadruped robot dog; robots;
DOI
10.3390/machines11060588
CLC Classification Number
TM [Electrical Technology]; TN [Electronic and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
This study addresses the incomplete or missing information that results when obstacle detection relies on a single sensor, as well as the motion distortion that affects LiDAR point cloud data during simultaneous localization and mapping (SLAM) in complex environments. The research makes two contributions. First, a novel obstacle detection method, the point-map fusion (PMF) algorithm, was proposed. This method integrates point cloud data from the LiDAR, camera, and odometer with local grid maps. The PMF algorithm consists of two components: the point-fusion (PF) algorithm, which combines LiDAR point cloud data and camera laser-like point cloud data through point cloud library (PCL) format conversion and concatenation and selects the point cloud closest to the quadruped robot dog as the valid data; and the map-fusion (MF) algorithm, which fuses the local grid maps obtained with the Gmapping and OctoMap algorithms (denoted map A and map B, respectively) using Bayesian estimation theory. This map fusion step significantly enhances the precision and reliability of the approach. Second, a motion distortion removal (MDR) method for LiDAR point cloud data based on odometer readings was proposed. The MDR method applies linear interpolation to legged-odometer data to determine the pose of the quadruped robot dog at the capture time of each distorted LiDAR point; the points are then transformed into the quadruped robot dog coordinate system, which effectively removes the motion distortion. Experimental results demonstrated that the proposed PMF algorithm achieved a 50% improvement in success rate compared with using only LiDAR or the PF algorithm in isolation, while the MDR algorithm improved mapping accuracy by 45.9% when motion distortion was taken into account. The effectiveness of the proposed methods was confirmed through rigorous experimentation.
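The PF and MDR steps described in the abstract follow common point cloud processing patterns, and a short sketch can make them concrete. The first fragment below illustrates the point-fusion idea as stated: convert both sensors' data to PCL clouds, concatenate them, and keep the return closest to the robot as the valid obstacle measurement. It assumes PCL is available; the function names and the use of the robot base frame as the origin are illustrative assumptions, not details taken from the paper.

```cpp
// Illustrative sketch of the point-fusion (PF) idea: concatenate the LiDAR
// cloud and the camera-derived laser-like cloud in PCL format, then keep the
// point nearest the robot as the valid obstacle measurement.
// Names are hypothetical; the paper's actual implementation may differ.
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <limits>

using CloudT = pcl::PointCloud<pcl::PointXYZ>;

// Fuse the two clouds by concatenation (PCL's operator+= appends points).
CloudT fuseClouds(const CloudT& lidar_cloud, const CloudT& camera_cloud)
{
    CloudT fused = lidar_cloud;   // copy, then append the camera cloud
    fused += camera_cloud;
    return fused;
}

// Return the point nearest to the origin, assumed to be the robot base frame.
pcl::PointXYZ nearestObstacle(const CloudT& fused)
{
    pcl::PointXYZ best;
    float best_d2 = std::numeric_limits<float>::max();
    for (const auto& p : fused.points) {
        const float d2 = p.x * p.x + p.y * p.y + p.z * p.z;
        if (d2 < best_d2) { best_d2 = d2; best = p; }
    }
    return best;
}
```

The second fragment sketches the MDR idea: linearly interpolate the odometer pose at each LiDAR point's capture time and re-express the point in a single reference pose of the robot. This planar (x, y, yaw) version is a simplification for illustration only; the paper works with the quadruped's legged odometry and its own coordinate conventions.

```cpp
// Illustrative planar de-skew: interpolate the odometer pose at each point's
// timestamp, lift the point into the world frame with that pose, then map it
// back into one reference robot pose (e.g., the pose at the end of the scan).
#include <cmath>

struct Pose2D { double x, y, yaw, t; };   // odometer sample with timestamp
struct ScanPoint { double x, y, t; };     // LiDAR return with capture time

// Linear interpolation between two odometer samples bracketing time t
// (yaw wrap-around is ignored for brevity).
Pose2D interpolate(const Pose2D& a, const Pose2D& b, double t)
{
    const double s = (t - a.t) / (b.t - a.t);
    return { a.x + s * (b.x - a.x),
             a.y + s * (b.y - a.y),
             a.yaw + s * (b.yaw - a.yaw),
             t };
}

// Re-express a point measured at interpolated pose p_i in the frame of the
// reference pose p_ref, removing the motion accumulated during the scan.
ScanPoint deskew(const ScanPoint& pt, const Pose2D& p_i, const Pose2D& p_ref)
{
    // point -> world, using the pose at the point's capture time
    const double wx = p_i.x + std::cos(p_i.yaw) * pt.x - std::sin(p_i.yaw) * pt.y;
    const double wy = p_i.y + std::sin(p_i.yaw) * pt.x + std::cos(p_i.yaw) * pt.y;
    // world -> reference robot frame
    const double dx = wx - p_ref.x, dy = wy - p_ref.y;
    return { std::cos(p_ref.yaw) * dx + std::sin(p_ref.yaw) * dy,
            -std::sin(p_ref.yaw) * dx + std::cos(p_ref.yaw) * dy,
             pt.t };
}
```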
Pages: 19