VoxelMap++: Mergeable Voxel Mapping Method for Online LiDAR(-Inertial) Odometry

Cited by: 6
Authors
Wu, Chang [1 ]
You, Yuan [2 ]
Yuan, Yifei [1 ]
Kong, Xiaotong [1 ]
Zhang, Ying [1 ]
Li, Qiyan [1 ]
Zhao, Kaiyong [2 ]
Affiliations
[1] Univ Elect Sci & Technol China UESTC, Inst Informat & Commun Engn, Chengdu 611731, Peoples R China
[2] XGRIDS, Shenzhen 518051, Peoples R China
Keywords
Mapping; union-find; localization; simultaneous localization and mapping (SLAM); LIDAR; LIO;
DOI
10.1109/LRA.2023.3333736
CLC Number
TP24 [Robotics];
Subject Classification Codes
080202 ; 1405 ;
Abstract
This letter presents VoxelMap++: a voxel mapping method with plane merging that effectively improves the accuracy and efficiency of LiDAR(-inertial) based simultaneous localization and mapping (SLAM). The map is a collection of voxels, each containing one plane feature with a 3-DOF representation and a corresponding covariance estimate. Because the map contains many coplanar features (kid planes), these kid planes can be regarded as measurements, with covariance, of a larger plane (father plane). We therefore design a plane merging module based on union-find. The module identifies coplanar relationships across voxels and merges the kid planes to estimate the father plane by minimizing the trace of its covariance. After merging, the father plane is more accurate than the kid planes and has lower uncertainty, which improves the accuracy of LiDAR(-inertial) odometry. Experiments in different environments demonstrate the superiority of VoxelMap++ over other state-of-the-art methods (see the attached video). Our implementation is open-sourced on GitHub and is applicable to both non-repetitive scanning LiDARs and traditional scanning LiDARs.
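As a minimal illustration of the idea described in the abstract, the Python sketch below groups coplanar voxel planes with a union-find structure and fuses each group in information form, where inverse-covariance weighting minimizes the trace of the fused (father) covariance. The 3-DOF plane parameterization, the coplanarity gate, and all names (UnionFind, coplanar, fuse) are assumptions made for illustration only; they are not taken from the authors' open-source implementation.

import numpy as np

class UnionFind:
    """Union-find with path halving, used to group coplanar voxels."""
    def __init__(self, n):
        self.parent = list(range(n))
    def find(self, i):
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]
            i = self.parent[i]
        return i
    def union(self, i, j):
        ri, rj = self.find(i), self.find(j)
        if ri != rj:
            self.parent[rj] = ri

def coplanar(p1, c1, p2, c2, gate=3.0):
    # Chi-square-style gate on the difference of two 3-DOF plane parameters
    # (hypothetical test; the paper's actual criterion may differ).
    d = p1 - p2
    return float(d @ np.linalg.solve(c1 + c2, d)) < gate

def fuse(params, covs):
    # Treat each kid plane as a measurement of the father plane and fuse in
    # information form; inverse-covariance weighting minimizes the trace of
    # the fused covariance.
    info = sum(np.linalg.inv(c) for c in covs)
    cov_f = np.linalg.inv(info)
    mean_f = cov_f @ sum(np.linalg.inv(c) @ p for p, c in zip(params, covs))
    return mean_f, cov_f

# Toy data: three voxel planes, the first two nearly coplanar.
planes = [np.array([0.01, 0.00, 1.00]),
          np.array([0.00, 0.02, 1.01]),
          np.array([0.50, 0.50, 2.00])]
covs = [0.01 * np.eye(3), 0.02 * np.eye(3), 0.01 * np.eye(3)]

uf = UnionFind(len(planes))
for i in range(len(planes)):
    for j in range(i + 1, len(planes)):
        if coplanar(planes[i], covs[i], planes[j], covs[j]):
            uf.union(i, j)

# Fuse every group of coplanar kid planes into one father plane.
groups = {}
for i in range(len(planes)):
    groups.setdefault(uf.find(i), []).append(i)
for root, members in groups.items():
    mean_f, cov_f = fuse([planes[m] for m in members], [covs[m] for m in members])
    print("father plane", root, "from kids", members, "trace:", np.trace(cov_f))

In this toy run the first two planes pass the gate and are merged; the trace of their fused covariance is smaller than either kid's, which is the effect the letter exploits to improve odometry accuracy.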
Pages: 427-434
Page count: 8
Related Papers
24 records in total
  • [1] Efficient and Probabilistic Adaptive Voxel Mapping for Accurate Online LiDAR Odometry
    Yuan, Chongjian
    Xu, Wei
    Liu, Xiyuan
    Hong, Xiaoping
    Zhang, Fu
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (03) : 8518 - 8525
  • [2] Lightweight and Fast Matching Method for LiDAR-Inertial Odometry and Mapping
    Li, Chuanjiang
    Hu, Ziwei
    Zhu, Yanfei
    Ji, Xingzhao
    Zhang, Chongming
    Qi, Ziming
    INTERNATIONAL JOURNAL OF ROBOTICS & AUTOMATION, 2024, 39 (05) : 338 - 348
  • [3] InLIOM: Tightly-Coupled Intensity LiDAR Inertial Odometry and Mapping
    Wang, Hanqi
    Liang, Huawei
    Li, Zhiyuan
    Zheng, Xiaokun
    Xu, Haitao
    Zhou, Pengfei
    Kong, Bin
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 25 (09) : 11821 - 11832
  • [4] Towards High-Performance Solid-State-LiDAR-Inertial Odometry and Mapping
    Li, Kailai
    Li, Meng
    Hanebeck, Uwe D.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (03) : 5167 - 5174
  • [5] Tightly Coupled LiDAR-Inertial Odometry and Mapping for Underground Environments
    Chen, Jianhong
    Wang, Hongwei
    Yang, Shan
    SENSORS, 2023, 23 (15)
  • [6] Evaluation of LiDAR Inertial Odometry method with 3D LiDAR-based Sensor Pack
    Ogunniyi, Samuel
    Withey, Daniel
    2021 RAPID PRODUCT DEVELOPMENT ASSOCIATION OF SOUTH AFRICA - ROBOTICS AND MECHATRONICS - PATTERN RECOGNITION ASSOCIATION OF SOUTH AFRICA (RAPDASA-ROBMECH-PRASA), 2022,
  • [7] DV-LIO: LiDAR-inertial Odometry based on dynamic merging and smoothing voxel
    Shen, Chenyu
    Lin, Wanbiao
    Sun, Siyang
    Ouyang, Wenlan
    Shi, Bohan
    Sun, Lei
    ROBOTICA, 2025,
  • [8] RSS-LIWOM: Rotating Solid-State LiDAR for Robust LiDAR-Inertial-Wheel Odometry and Mapping
    Gong, Shunjie
    Shi, Chenghao
    Zhang, Hui
    Lu, Huimin
    Zeng, Zhiwen
    Chen, Xieyuanli
    REMOTE SENSING, 2023, 15 (16)
  • [9] A Robust and Precise LiDAR-Inertial-GPS Odometry and Mapping Method for Large-Scale Environment
    Wu, Yuanqing
    Zhao, Jiajun
    IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2022, 27 (06) : 5027 - 5036
  • [10] SR-LIVO: LiDAR-Inertial-Visual Odometry and Mapping With Sweep Reconstruction
    Yuan, Zikang
    Deng, Jie
    Ming, Ruiye
    Lang, Fengtian
    Yang, Xin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (06) : 5110 - 5117