RGB-D SLAM in Dynamic Environments Using Point Correlations

Cited by: 148
Authors
Dai, Weichen [1 ]
Zhang, Yu [2 ]
Li, Ping [2 ]
Fang, Zheng [3 ]
Scherer, Sebastian [4 ]
Affiliations
[1] Zhejiang Univ, Coll Control Sci & Engn, Hangzhou 310027, Zhejiang, Peoples R China
[2] Zhejiang Univ, State Key Lab Ind Control Technol, Coll Control Sci & Engn, Hangzhou 310027, Zhejiang, Peoples R China
[3] Northeastern Univ, Fac Robot Sci & Engn, Shenyang 110032, Peoples R China
[4] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China
Keywords
SLAM; motion estimation; dynamic environments; visual odometry; tracking; motion; algorithm
DOI
10.1109/TPAMI.2020.3010942
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, a simultaneous localization and mapping (SLAM) method that eliminates the influence of moving objects in dynamic environments is proposed. This method utilizes the correlation between map points to separate points that are part of the static scene and points that are part of different moving objects into different groups. A sparse graph is first created using Delaunay triangulation from all map points. In this graph, the vertices represent map points, and each edge represents the correlation between adjacent points. If the relative position between two points remains consistent over time, there is correlation between them, and they are considered to be moving together rigidly. If not, they are considered to have no correlation and to be in separate groups. After the edges between the uncorrelated points are removed during point-correlation optimization, the remaining graph separates the map points of the moving objects from the map points of the static scene. The largest group is assumed to be the group of reliable static map points. Finally, motion estimation is performed using only these points. The proposed method was implemented for RGB-D sensors, evaluated with a public RGB-D benchmark, and tested in several additional challenging environments. The experimental results demonstrate that robust and accurate performance can be achieved by the proposed SLAM method in both slightly and highly dynamic environments. Compared with other state-of-the-art methods, the proposed method can provide competitive accuracy with good real-time performance.
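The grouping procedure described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the 2-D toy data, and the distance-consistency threshold are assumptions made for illustration. It builds a Delaunay graph over map points, keeps only edges whose length is preserved between two frames (the rigidity/correlation test), and returns the largest connected group as the presumed static scene.

```python
import numpy as np
from scipy.spatial import Delaunay


def largest_static_group(p0, p1, tol=0.1):
    """Sketch of the point-correlation idea (illustrative, not the paper's code).

    p0, p1: (N, 2) arrays of the same map points observed at two times.
    Returns indices of the largest rigidly-consistent group of points.
    """
    # Vertices are map points; Delaunay edges are candidate correlations.
    tri = Delaunay(p0)
    edges = set()
    for simplex in tri.simplices:
        for i in range(3):
            a, b = sorted((int(simplex[i]), int(simplex[(i + 1) % 3])))
            edges.add((a, b))

    # Union-find: merge points joined by an edge whose length is preserved,
    # i.e., points that appear to move together rigidly.
    parent = list(range(len(p0)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in edges:
        d0 = np.linalg.norm(p0[a] - p0[b])
        d1 = np.linalg.norm(p1[a] - p1[b])
        if abs(d0 - d1) < tol:  # correlated: relative position is consistent
            parent[find(a)] = find(b)

    # The largest remaining connected group is assumed to be the static scene.
    roots = [find(i) for i in range(len(p0))]
    largest = max(set(roots), key=roots.count)
    return [i for i, r in enumerate(roots) if r == largest]
```

As a usage example under these assumptions: eight points on a fixed grid plus a three-point cluster that translates between frames would split into two groups, and the eight-point static group would be returned as the larger one.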
Pages: 373-389
Page count: 17
Related References (60 total)
[41] Qiu K., Qin T., Gao W., Shen S. Tracking 3-D Motion of Dynamic Objects Using Monocular Visual-Inertial Sensing. IEEE Transactions on Robotics, 2019, 35(4): 799-816.
[42] Redmon J., Divvala S., Girshick R., Farhadi A. You Only Look Once: Unified, Real-Time Object Detection. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016: 779-788.
[43] Riazuelo L. 2017 European Conference on Mobile Robots (ECMR), 2017.
[44] Runz M. 2017 IEEE International Conference on Robotics and Automation (ICRA), 2017: 4471. DOI: 10.1109/ICRA.2017.7989518.
[45] Runz M., Buffier M., Agapito L. MaskFusion: Real-Time Recognition, Tracking and Reconstruction of Multiple Moving Objects. Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2018: 10-20.
[46] Saputra M. R. U., Markham A., Trigoni N. Visual SLAM and Structure from Motion in Dynamic Environments: A Survey. ACM Computing Surveys, 2018, 51(2).
[47] Scaramuzza D., Fraundorfer F. Visual Odometry Part I: The First 30 Years and Fundamentals. IEEE Robotics & Automation Magazine, 2011, 18(4): 80-92.
[48] Scona R. IEEE International Conference on Robotics and Automation (ICRA), 2018: 3849. DOI: 10.1109/ICRA.2018.8460681.
[49] Stachniss C. Springer Handbook of Robotics, 2016: 1153.
[50] Steinbrücker F. 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), 2011.