DMOT-SLAM: visual SLAM in dynamic environments with moving object tracking

Cited by: 1
Authors
Wang, Kesai [1 ]
Yao, Xifan [1 ]
Ma, Nanfeng [1 ]
Ran, Guangjun [1 ]
Liu, Min [2 ]
Affiliations
[1] South China Univ Technol, Sch Mech & Automot Engn, Guangzhou 510641, Peoples R China
[2] Guangxi Univ Sci & Technol, Sch Mech & Automot Engn, Liuzhou 545006, Guangxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
visual SLAM; moving object tracking; dynamic scenes; reconstruction;
DOI
10.1088/1361-6501/ad4dc7
CLC number
T [Industrial Technology];
Subject classification code
08;
Abstract
Visual simultaneous localization and mapping (SLAM) in dynamic environments has received significant attention in recent years, and accurate segmentation of real dynamic objects is key to enhancing the accuracy of pose estimation in such environments. In this study, we propose a visual SLAM approach based on ORB-SLAM3, namely dynamic multiple object tracking SLAM (DMOT-SLAM), which can accurately estimate the camera's pose in dynamic environments while tracking the trajectories of moving objects. We introduce a spatial point correlation constraint and combine it with instance segmentation and the epipolar constraint to identify dynamic objects. We integrate the proposed motion check method into DeepSort, an object tracking algorithm, to facilitate inter-frame tracking of dynamic objects. This integration not only enhances the stability of dynamic feature detection but also enables the estimation of global motion trajectories for dynamic objects and the construction of object-level semi-dense semantic maps. We evaluate our approach on the public TUM, Bonn, and KITTI datasets, and the results show that our approach achieves a significant improvement over ORB-SLAM3 in dynamic scenes and outperforms other state-of-the-art SLAM approaches. Moreover, experiments in real-world scenarios further substantiate the effectiveness of our approach.
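The abstract's use of the epipolar constraint to flag dynamic points can be illustrated with a minimal sketch: a feature match that obeys static-scene geometry should lie near the epipolar line induced by the fundamental matrix, so a large point-to-line residual suggests the point moved. This is not the authors' code; the function names and the 1-pixel threshold are illustrative assumptions, and the paper additionally combines this check with instance segmentation and a spatial point correlation constraint.

```python
import numpy as np

def epipolar_distance(F, pts1, pts2):
    """Distance (in pixels) of each point in pts2 to the epipolar line
    induced by its match in pts1 under fundamental matrix F (3x3).
    pts1, pts2: (N, 2) arrays of pixel coordinates."""
    ones = np.ones((pts1.shape[0], 1))
    p1 = np.hstack([pts1, ones])          # homogeneous coords, shape (N, 3)
    p2 = np.hstack([pts2, ones])
    lines = (F @ p1.T).T                  # epipolar lines a*x + b*y + c = 0
    num = np.abs(np.sum(lines * p2, axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    return num / den

def flag_dynamic(F, pts1, pts2, thresh=1.0):
    """Matches whose epipolar residual exceeds thresh are candidate
    dynamic points (hypothetical threshold; tuned per dataset)."""
    return epipolar_distance(F, pts1, pts2) > thresh
```

In practice F would be estimated with RANSAC from the tracked features, and the per-point residuals would then be pooled per segmented instance to decide whether a whole object is moving.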
Pages: 16
Related papers
50 records
  • [1] MOD-SLAM:Visual SLAM with Moving Object Detection in Dynamic Environments
    Hi, Jiarui
    Fang, Hao
    Yang, Qingkai
    Zha, Wenzhong
    2021 PROCEEDINGS OF THE 40TH CHINESE CONTROL CONFERENCE (CCC), 2021, : 4302 - 4307
  • [2] OTE-SLAM: An Object Tracking Enhanced Visual SLAM System for Dynamic Environments
    Chang, Yimeng
    Hu, Jun
    Xu, Shiyou
    SENSORS, 2023, 23 (18)
  • [3] Robust Stereo Visual SLAM for Dynamic Environments With Moving Object
    Li, Gang
    Liao, Xiang
    Huang, Huilan
    Song, Shaojian
    Liu, Bin
    Zeng, Yawen
    IEEE ACCESS, 2021, 9 : 32310 - 32320
  • [4] RGB-D SLAM with moving object tracking in dynamic environments
    Dai, Weichen
    Zhang, Yu
    Zheng, Yuxin
    Sun, Donglei
    Li, Ping
    IET CYBER-SYSTEMS AND ROBOTICS, 2021, 3 (04) : 281 - 291
  • [5] Dynamic Object Tracking and Masking for Visual SLAM
    Vincent, Jonathan
    Labbe, Mathieu
    Lauzon, Jean-Samuel
    Grondin, Francois
    Comtois-Rivet, Pier-Marc
    Michaud, Francois
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 4974 - 4979
  • [6] DOT: Dynamic Object Tracking for Visual SLAM
    Ballester, Irene
    Fontan, Alejandro
    Civera, Javier
    Strobl, Klaus H.
    Triebel, Rudolph
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 11705 - 11711
  • [7] DP-SLAM: A visual SLAM with moving probability towards dynamic environments
    Li, Ao
    Wang, Jikai
    Xu, Meng
    Chen, Zonghai
    INFORMATION SCIENCES, 2021, 556 : 128 - 142
  • [8] Visual SLAM in dynamic environments based on object detection
    Ai, Yong-bao
    Rui, Ting
    Yang, Xiao-qiang
    He, Jia-lin
    Fu, Lei
    Li, Jian-bin
    Lu, Ming
    DEFENCE TECHNOLOGY, 2021, 17 (05) : 1712 - 1721
  • [10] Object Mobility classification based Visual SLAM in Dynamic Environments
    Zhang, Huayan
    Zhang, Tianwei
    Li, Yang
    Zhang, Lei
    Wang, Wanpeng
    2020 17TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS (UR), 2020, : 437 - 441