TSG-SLAM: SLAM Employing Tight Coupling of Instance Segmentation and Geometric Constraints in Complex Dynamic Environments

Cited by: 4
Authors
Zhang, Yongchao [1]
Li, Yuanming [2,3]
Chen, Pengzhan [1,3]
Xie, Yuanlong
Zheng, Shiqi
Hu, Zhaozheng
Wang, Shuting
Affiliations
[1] Taizhou Univ, Sch Intelligent Mfg, Taizhou 318000, Peoples R China
[2] Ganzhou Polytech, Dept Elect Engn, Ganzhou 341000, Peoples R China
[3] East China Jiaotong Univ, Sch Elect & Automat Engn, Nanchang 330013, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
SLAM; complex dynamic environment; fundamental matrix; semantic segmentation; multi-view geometric constraint; RGB-D SLAM; MOTION REMOVAL; MONOCULAR SLAM; TRACKING; RECONSTRUCTION; ODOMETRY;
DOI
10.3390/s23249807
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline classification codes
070302; 081704;
Abstract
Although numerous effective Simultaneous Localization and Mapping (SLAM) systems have been developed, complex dynamic environments continue to pose challenges, such as handling moving objects and enabling robots to understand their surroundings. This paper presents a visual SLAM method designed specifically for complex dynamic environments. We propose a dynamic feature removal module based on the tight coupling of instance segmentation and multi-view geometric constraints (TSG), which seamlessly integrates semantic information with geometric constraints, using the fundamental matrix as the connecting element. Specifically, instance segmentation is first performed on each frame to eliminate all dynamic and potentially dynamic features, retaining only reliable static features for inter-frame feature matching and the estimation of a dependable fundamental matrix. Based on this matrix, true dynamic features are then identified and removed through multi-view geometric constraints, while reliable static features are preserved for subsequent tracking and mapping. An instance-level semantic map of the global scene is constructed to enhance the perception and understanding of complex dynamic environments. The proposed method is evaluated on the TUM datasets and in real-world scenarios, demonstrating that TSG-SLAM detects and eliminates dynamic feature points effectively and achieves good localization accuracy in dynamic environments.
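
The geometric step summarized in the abstract can be illustrated with a short sketch (not the authors' implementation): a fundamental matrix is estimated with RANSAC from matches restricted to regions that instance segmentation labels as static, and every remaining match is then tested against the epipolar constraint, with points lying far from their epipolar line flagged as truly dynamic. The function names and the pixel threshold below are illustrative assumptions, written in Python with OpenCV and NumPy.

import cv2
import numpy as np

def estimate_fundamental(static_pts1, static_pts2):
    # Estimate F with RANSAC from matches on segmentation-verified static regions.
    # static_pts1 / static_pts2: float32 arrays of shape (N, 2) with matched pixel coordinates.
    F, inlier_mask = cv2.findFundamentalMat(static_pts1, static_pts2,
                                            cv2.FM_RANSAC, 1.0, 0.999)
    return F

def epipolar_distances(F, pts1, pts2):
    # Distance of each point in the current frame (pts2) to the epipolar line
    # induced by its match in the previous frame (pts1): |x2^T F x1| / sqrt(a^2 + b^2).
    ones = np.ones((len(pts1), 1))
    lines = (F @ np.hstack([pts1, ones]).T).T          # lines a*x + b*y + c = 0 in frame 2
    num = np.abs(np.sum(lines * np.hstack([pts2, ones]), axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    return num / den

def flag_dynamic(F, pts1, pts2, thresh_px=1.5):
    # Matches whose epipolar distance exceeds the threshold (an assumed value here)
    # are treated as dynamic and excluded from tracking and mapping.
    return epipolar_distances(F, pts1, pts2) > thresh_px

Epipolar thresholds on the order of one to two pixels are a common choice for such checks; the paper's actual decision criterion and parameters may differ.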
Pages: 25