Semantic-Assisted LIDAR Tightly Coupled SLAM for Dynamic Environments

Cited by: 2
Authors
Liu, Peng [1 ]
Bi, Yuxuan [1 ]
Shi, Jialin [1 ]
Zhang, Tianyi [1 ]
Wang, Caixia [1 ]
Affiliations
[1] Changchun Univ Sci & Technol, Sch Elect Informat Engn, Changchun 130022, Peoples R China
Keywords
Semantics; Simultaneous localization and mapping; Laser radar; Robots; Point cloud compression; Vehicle dynamics; Heuristic algorithms; Odometry; LIDAR odometry; semantic SLAM; dynamic removal; SIMULTANEOUS LOCALIZATION; SEGMENTATION; ROBUST;
DOI
10.1109/ACCESS.2024.3369183
Chinese Library Classification
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Simultaneous Localization and Mapping (SLAM) is increasingly deployed in dynamic rather than purely static environments, yet traditional SLAM methods struggle to eliminate the influence of dynamic objects, leading to significant deviations in pose estimation. To address these challenges, this paper introduces a semantic-assisted, tightly coupled LIDAR SLAM method for dynamic environments. Specifically, to mitigate interference from dynamic objects, a scheme for calculating static semantic probability is proposed; it segments static from dynamic points and removes both temporarily stationary dynamic objects (e.g., parked vehicles) and moving objects that occlude the environment. Additionally, semantic constraints are incorporated into point cloud feature extraction and matching to improve matching accuracy and pose estimation precision. Furthermore, a semantic similarity constraint is added to the loop-closure factor module, which significantly improves positioning accuracy and yields maps with higher global consistency. Experimental results on the KITTI and M2DGR datasets show that the method generalizes to unseen data and effectively mitigates dynamic interference in real-world environments, achieving notable improvements in both accuracy and robustness over current state-of-the-art methods.
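As a rough illustration of the static semantic probability idea sketched in the abstract, the following Python snippet assigns each LiDAR point an assumed per-class static prior and fuses it with a point-to-map registration residual to split a scan into static and dynamic subsets. The class names, prior values, Gaussian weighting, and threshold are illustrative assumptions and are not the paper's actual formulation.

```python
# Hypothetical sketch of semantic-prior-based static/dynamic point segmentation,
# loosely inspired by the "static semantic probability" idea in the abstract.
# All class names, prior values, and the fusion rule below are assumptions.
import numpy as np

# Assumed prior probability that a point of a given semantic class is static.
STATIC_PRIOR = {
    "building": 0.99,
    "vegetation": 0.95,
    "road": 0.99,
    "car": 0.30,        # potentially dynamic even when currently parked
    "pedestrian": 0.05,
}

def update_static_probability(prior, residual, sigma=0.2):
    """Fuse the semantic prior with a geometric consistency term.

    `residual` is the point-to-map distance after scan registration (metres);
    small residuals across scans suggest the point is static. The Gaussian
    weighting and simple product fusion are illustrative choices.
    """
    consistency = np.exp(-0.5 * (residual / sigma) ** 2)
    return prior * consistency

def segment_points(points, labels, residuals, threshold=0.5):
    """Split a scan into static and dynamic subsets.

    points    : (N, 3) array of LiDAR points
    labels    : length-N list of semantic class names
    residuals : (N,) point-to-map residuals from the previous alignment
    """
    probs = np.array([
        update_static_probability(STATIC_PRIOR.get(lbl, 0.5), r)
        for lbl, r in zip(labels, residuals)
    ])
    static_mask = probs >= threshold
    return points[static_mask], points[~static_mask]

if __name__ == "__main__":
    pts = np.random.rand(5, 3) * 20.0
    lbls = ["building", "car", "pedestrian", "road", "car"]
    res = np.array([0.02, 0.40, 0.80, 0.03, 0.05])
    static_pts, dynamic_pts = segment_points(pts, lbls, res)
    print(f"kept {len(static_pts)} static points, removed {len(dynamic_pts)}")
```

In a pipeline of the kind the abstract describes, such a mask would typically be applied before feature extraction, so that edge and planar features used for matching and pose estimation are drawn only from points deemed static.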
Pages: 34042-34053
Page count: 12
Related papers (50 total)
  • [11] Intermittent VIO-Assisted LiDAR SLAM Against Degeneracy: Recognition and Mitigation
    Xu, Jiahao
    Li, Tuan
    Wang, Hongxia
    Wang, Zhipeng
    Bai, Tong
    Hou, Xiaopeng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74
  • [12] Semantic Lidar-Inertial SLAM for Dynamic Scenes
    Bu, Zean
    Sun, Changku
    Wang, Peng
    APPLIED SCIENCES-BASEL, 2022, 12 (20)
  • [13] SDF-SLAM: Semantic Depth Filter SLAM for Dynamic Environments
    Cui, Linyan
    Ma, Chaowei
    IEEE ACCESS, 2020, 8 (08): 95301-95311
  • [14] Blitz-SLAM: A semantic SLAM in dynamic environments
    Fan, Yingchun
    Zhang, Qichi
    Tang, Yuliang
    Liu, Shaofen
    Han, Hong
    PATTERN RECOGNITION, 2022, 121
  • [15] TC2LI-SLAM: A Tightly-Coupled Camera-LiDAR-Inertial SLAM System
    Tong, Yunze
    Zhang, Xuebo
    Wang, Runhua
    Song, Zhixing
    Wu, Songyang
    Zhang, Shiyong
    Wang, Youwei
    Yuan, Jing
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09): 7421-7428
  • [16] DY-LIO: Tightly Coupled LiDAR-Inertial Odometry for Dynamic Environments
    Zou, Jingliang
    Chen, Huangsong
    Shao, Liang
    Bao, Haoran
    Tang, Hesheng
    Xiang, Jiawei
    Liu, Jun
    IEEE SENSORS JOURNAL, 2024, 24 (21): 34756-34765
  • [17] RDMO-SLAM: Real-Time Visual SLAM for Dynamic Environments Using Semantic Label Prediction With Optical Flow
    Liu, Yubao
    Miura, Jun
    IEEE ACCESS, 2021, 9: 106981-106997
  • [18] A Tightly Coupled LiDAR-Inertial SLAM for Perceptually Degraded Scenes
    Yang, Lin
    Ma, Hongwei
    Wang, Yan
    Xia, Jing
    Wang, Chuanwei
    SENSORS, 2022, 22 (08)
  • [19] VPE-SLAM: Virtual Point Enhanced SLAM Using Solid-State LiDAR for Weak Feature Environments
    Chen, Jiahui
    Zi, Yindong
    Li, Nian
    Guo, Shisheng
    Hao, Xiaojian
    Cui, Guolong
    Yang, Xiaobo
    IEEE SENSORS JOURNAL, 2024, 24 (10): 16397-16407
  • [20] A Novel Real-time Semantic-Assisted Lidar Odometry and Mapping System
    Wang, Fei
    Wang, Zichen
    Yan, Fei
    Gu, Hong
    Zhuang, Yan
    2019 TENTH INTERNATIONAL CONFERENCE ON INTELLIGENT CONTROL AND INFORMATION PROCESSING (ICICIP), 2019: 44-49