DynaVINS: A Visual-Inertial SLAM for Dynamic Environments

Cited by: 72
Authors
Song, Seungwon [1 ]
Lim, Hyungtae [1 ]
Lee, Alex Junho [2 ]
Myung, Hyun [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Elect Engn, Daejeon 34141, South Korea
[2] Korea Adv Inst Sci & Technol, Dept Civil & Environm Engn, Daejeon 34141, South Korea
Keywords
Visual-inertial SLAM; SLAM; visual tracking; robust; tracking; versatile; odometry; filter
DOI
10.1109/LRA.2022.3203231
CLC Classification Number
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
Visual-inertial odometry and SLAM algorithms are widely used in various fields, such as service robots, drones, and autonomous vehicles. Most SLAM algorithms are based on the assumption that landmarks are static. In the real world, however, various dynamic objects exist, and they degrade the pose estimation accuracy. In addition, temporarily static objects, which are static during observation but move once they are out of sight, trigger false-positive loop closures. To overcome these problems, we propose a novel visual-inertial SLAM framework, called DynaVINS, which is robust against both dynamic objects and temporarily static objects. In our framework, we first present a robust bundle adjustment that rejects features from dynamic objects by leveraging pose priors estimated by IMU preintegration. Then, keyframe grouping and multi-hypothesis-based constraint grouping methods are proposed to reduce the effect of temporarily static objects on loop closing. We evaluated our method on a public dataset that contains numerous dynamic objects. The experimental results corroborate that DynaVINS shows promising performance compared with other state-of-the-art methods by successfully rejecting the effect of dynamic and temporarily static objects.
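To make the robust bundle adjustment idea concrete, the following is a minimal Python/NumPy sketch of one common robust-weighting scheme in the spirit of what the abstract describes: each feature residual receives a weight that is regularized toward 1 and is driven toward 0 when the residual disagrees with the IMU-derived pose prior, and the weights and the pose are updated alternately. The 1-D state, the quadratic weight regularizer lam * (1 - w)^2, and all variable names here are illustrative assumptions for this toy example; the actual DynaVINS objective is a full visual-inertial bundle adjustment and differs in detail.

import numpy as np

rng = np.random.default_rng(0)

x_true = 2.0          # true 1-D camera translation (stand-in for the full pose)
x_prior = 2.05        # pose prior from IMU preintegration (slightly off)
sigma_prior = 0.2     # standard deviation of the prior term

# Static features observe x_true; features on a moving object are shifted.
z_static = x_true + 0.02 * rng.standard_normal(30)
z_dynamic = x_true + 1.5 + 0.02 * rng.standard_normal(10)
z = np.concatenate([z_static, z_dynamic])

lam = 0.1             # regularizer pulling weights toward 1
x = x_prior           # initialize at the IMU prior
w = np.ones_like(z)

for _ in range(10):
    r = z - x
    # Closed-form minimizer of w*r^2 + lam*(1 - w)^2, clamped to [0, 1].
    w = np.clip(1.0 - r**2 / (2.0 * lam), 0.0, 1.0)
    # Weighted least squares over the features plus the pose-prior term.
    x = (np.sum(w * z) + x_prior / sigma_prior**2) / (np.sum(w) + 1.0 / sigma_prior**2)

print(f"estimated pose: {x:.3f} (true: {x_true})")
print(f"mean weight, static features : {w[:30].mean():.2f}")
print(f"mean weight, dynamic features: {w[30:].mean():.2f}")

In this toy scheme, features consistent with the IMU-predicted pose keep weights near 1, while features on the moving object are driven to 0 and effectively drop out of the estimate; the paper further addresses temporarily static objects at loop closing through the keyframe grouping and multi-hypothesis constraint grouping mentioned above.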
Pages: 11523-11530
Page count: 8