RGB-D Inertial Odometry for a Resource-Restricted Robot in Dynamic Environments

Cited by: 61
Authors
Liu, Jianheng [1 ]
Li, Xuanfu [2 ]
Liu, Yueqian [1 ]
Chen, Haoyao [1 ]
Affiliations
[1] Harbin Inst Technol Shenzhen, Sch Mech Engn & Automat, Shenzhen 518055, Guangdong, Peoples R China
[2] Huawei Technol Co Ltd, Dept HiSilicon Res, Shenzhen 518129, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Localization; visual-inertial SLAM;
DOI
10.1109/LRA.2022.3191193
CLC Number
TP24 [Robotics];
Subject Classification Codes
080202 ; 1405 ;
Abstract
Current simultaneous localization and mapping (SLAM) algorithms perform well in static environments but easily fail in dynamic environments. Recent works introduce deep learning-based semantic information into SLAM systems to reduce the influence of dynamic objects. However, achieving robust localization in dynamic environments remains challenging for resource-restricted robots. This paper proposes Dynamic-VINS, a real-time RGB-D inertial odometry system for resource-restricted robots in dynamic environments. Three main threads run in parallel: object detection, feature tracking, and state optimization. Dynamic-VINS combines object detection and depth information for dynamic feature recognition and achieves performance comparable to semantic segmentation. It adopts grid-based feature detection and proposes a fast, efficient method to extract high-quality FAST feature points. The IMU is used to predict motion for feature tracking and for the moving-consistency check. The proposed method is evaluated on both public datasets and real-world applications and shows competitive localization accuracy and robustness in dynamic environments. To the best of our knowledge, it is currently the best-performing real-time RGB-D inertial odometry for resource-restricted platforms in dynamic environments. The proposed system is open source at: https://github.com/HITSZ-NRSL/Dynamic-VINS.git.
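The abstract's dynamic feature recognition, which combines object-detection bounding boxes with depth, can be illustrated with a minimal sketch. This is not the paper's implementation: the `BBox` type, the `depth_tol` threshold, and the rule that a feature is dynamic only when it lies inside a detected dynamic-object box *and* its depth matches that object's foreground depth are all assumptions made for this example.

```python
from dataclasses import dataclass

@dataclass
class BBox:
    """Hypothetical detection result: box corners plus a representative
    foreground depth (meters) of the detected dynamic object."""
    x1: float
    y1: float
    x2: float
    y2: float
    depth: float

def classify_features(features, boxes, depth_tol=0.5):
    """Flag each (u, v, depth) feature as dynamic if it falls inside a
    dynamic-object box AND its depth agrees with the object's foreground
    depth. Features inside the box but clearly behind the object are
    treated as static background and kept for pose estimation."""
    flags = []
    for (u, v, d) in features:
        dynamic = False
        for b in boxes:
            inside = b.x1 <= u <= b.x2 and b.y1 <= v <= b.y2
            if inside and abs(d - b.depth) < depth_tol:
                dynamic = True
                break
        flags.append(dynamic)
    return flags
```

The depth check is what makes this cheaper than semantic segmentation while remaining usable: a bounding box overestimates an object's silhouette, so background features that happen to fall inside the box can still be retained when their depth places them well behind the detected object.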
Pages: 9573-9580
Page count: 8