A Robust Visual SLAM System in Dynamic Environment

Times Cited: 0
Authors
Ma, Huajun [1 ]
Qin, Yijun [1 ]
Duan, Shukai [1 ]
Wang, Lidan [1 ,2 ,3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Artificial Intelligence, Chongqing 400715, Peoples R China
[2] Natl & Local Joint Engn Res Ctr Intelligent Trans, Chongqing 400715, Peoples R China
[3] Chongqing Key Lab Brain Inspired Comp & Intellige, Chongqing 400715, Peoples R China
[4] Minist Educ, Key Lab Luminescence Anal & Mol Sensing, Chongqing 400715, Peoples R China
Source
ADVANCES IN NEURAL NETWORKS - ISNN 2024 | 2024, Vol. 14827
Funding
National Natural Science Foundation of China
Keywords
Visual SLAM; Dynamic scene; Semantic segmentation
DOI
10.1007/978-981-97-4399-5_23
CLC Number
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Simultaneous Localization and Mapping (SLAM) is vital for the navigation of autonomous robots in unknown environments. Current SLAM systems have made progress but still struggle to balance accuracy, robustness, and real-time processing in dynamic environments. This paper presents a visual SLAM system that, with the assistance of neural networks, significantly enhances localization accuracy in dynamic environments without compromising real-time performance. During feature extraction, it uses a static-feature-point algorithm based on semantic information to separate static from dynamic feature points for better tracking, and it integrates a fast feature-point weight calculation algorithm that assesses feature reliability based on proximity to dynamic objects, which greatly improves the initial camera pose. Moreover, a match detection algorithm removes incorrect match relationships during local map tracking, which boosts pose precision and system robustness. Experiments on the TUM datasets show our system's superior performance. Specifically, on the s/static data sequence, our system achieves over 51.9% improvement on the RPE RMSE metric, while other systems either perform worse than ORB-SLAM3 or improve by less than 26.4%. These results demonstrate the robustness of our system.
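The proximity-based feature weighting described in the abstract can be illustrated with a minimal sketch. Assuming a per-frame binary segmentation mask marking dynamic objects (e.g. people) and ORB keypoint pixel coordinates, keypoints falling on dynamic objects are discarded (weight 0), and static keypoints are down-weighted the closer they lie to a dynamic region. The function name and the exponential weighting formula below are hypothetical; the record does not give the paper's actual computation.

```python
import numpy as np

def weight_keypoints(keypoints, dynamic_mask, sigma=20.0):
    """Hypothetical sketch of proximity-based feature weighting.

    Keypoints inside dynamic regions get weight 0; static keypoints
    get a weight in (0, 1) that grows with their pixel distance to
    the nearest dynamic pixel (closer -> less reliable)."""
    dyn = np.argwhere(dynamic_mask)            # (row, col) of dynamic pixels
    weights = []
    for (u, v) in keypoints:                   # (col, row) pixel coordinates
        if dynamic_mask[v, u]:
            weights.append(0.0)                # on a dynamic object: discard
            continue
        # Euclidean distance to the nearest dynamic pixel.
        d = np.sqrt(((dyn - [v, u]) ** 2).sum(axis=1)).min()
        weights.append(1.0 - np.exp(-d / sigma))
    return np.array(weights)

# Toy frame: 100x100 image with a segmented dynamic object (a block).
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 40:60] = True                      # e.g. a segmented person
kps = [(50, 50), (45, 95), (5, 5)]             # one dynamic, one near, one far
w = weight_keypoints(kps, mask)
```

In a full system these weights would then scale each feature's contribution in the initial pose optimization, so that features near moving objects pull less on the camera-pose estimate.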
Pages: 248-257 (10 pages)
Related Papers (50 records)
[31] Zhang, Fengju; Zhu, Kai. MG-SLAM: RGB-D SLAM Based on Semantic Segmentation for Dynamic Environment in the Internet of Vehicles. CMC-Computers Materials & Continua, 2025, 82(02): 2353-2372.
[32] Xu, Wenbo; Fan, Weiwei; Li, Jingyang; Alfarraj, Osama; Tolba, Amr; Huang, Tianhong. A Robust Visual SLAM Method for Additive Manufacturing of Vehicular Parts Under Dynamic Scenes. IEEE Access, 2023, 11: 22114-22123.
[33] Tan, Wei; Liu, Haomin; Dong, Zilong; Zhang, Guofeng; Bao, Hujun. Robust Monocular SLAM in Dynamic Environments. 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) - Science and Technology, 2013: 209-218.
[34] Wang, Quanchao; Dang, Shuwen. Dynamic Visual SLAM Based on Improved ORB-SLAM3 Tracking Thread. Sixteenth International Conference on Graphics and Image Processing (ICGIP 2024), 2025, 13539.
[35] Garcea, Adrian; Zhu, Jiazhen; Van Opdenbosch, Dominik; Steinbach, Eckehard. Robust Map Alignment for Cooperative Visual SLAM. 2018 25th IEEE International Conference on Image Processing (ICIP), 2018: 4083-4087.
[36] Zhao, Lina; Wei, Baoguo; Li, Lixin; Li, Xu. A Review of Visual SLAM for Dynamic Objects. 2022 IEEE 17th Conference on Industrial Electronics and Applications (ICIEA), 2022: 1080-1085.
[37] Qin, Yong; Yu, Haidong. A Review of Visual SLAM with Dynamic Objects. Industrial Robot - The International Journal of Robotics Research and Application, 2023, 50(06): 1000-1010.
[38] Yao, Chengzhi; Ding, Lei; Lan, Yonghong. MOR-SLAM: A New Visual SLAM System for Indoor Dynamic Environments Based on Mask Restoration. Mathematics, 2023, 11(19).
[39] Singh, Gaurav; Wu, Meiqing; Do, Minh V.; Lam, Siew-Kei. Fast Semantic-Aware Motion State Detection for Visual SLAM in Dynamic Environment. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(12): 23014-23030.
[40] Yan, He; Wang, Xu; Lei, Qiuxia. Visual SLAM Method Combining Sparse Scene Flow and Weighted Features in Dynamic Environment. Zhongguo Guanxing Jishu Xuebao / Journal of Chinese Inertial Technology, 2024, 32(09): 891-897.