A Robust Visual-Inertial SLAM in Complex Indoor Environments

Cited by: 9
Authors
Zhong, Min [1 ]
You, Yinghui [2 ]
Zhou, Shuai [1 ]
Xu, Xiaosu [1 ]
Affiliations
[1] Southeast Univ, Sch Instrument Sci & Engn, Key Lab Microinertial Instruments & Adv Nav Techn, Minist Educ, Nanjing 210096, Peoples R China
[2] CETC LES Informat Syst Grp Co Ltd, Nanjing 211300, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Image edge detection; Optimization; Feature extraction; Simultaneous localization and mapping; Sensors; Cameras; Heuristic algorithms; Inertial measurement unit (IMU); indoor; red, green, blue and depth camera (RGBD); simultaneous localization and mapping (SLAM); visual-inertial; odometry
DOI
10.1109/JSEN.2023.3274702
CLC Number
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
This article proposes a visual-inertial simultaneous localization and mapping (SLAM) method based on edge alignment, designed for mobile robot platforms with limited payload operating in complex indoor environments, particularly those with weak texture and many corners. The proposed algorithm is a modified version of the real-time edge-based SLAM (RESLAM) framework that integrates inertial data into both the front-end and back-end processing to improve positioning robustness. In addition, the algorithm includes inertial initialization and dynamic marginalization procedures to ensure stable operation. The performance of the algorithm was evaluated in both simulation and real-world experiments. The simulation results indicate that the proposed algorithm reduces the absolute trajectory error (ATE) by 13.62% to 71.70% compared with the original edge alignment method, with the largest gains on sequences involving complex motion, such as WithMR and Fast. To further validate the effectiveness of the proposed algorithm, a prototype platform was constructed and its positioning capability was verified in indoor environments with weak texture and many corners.
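The abstract reports ATE reductions of 13.62% to 71.70% relative to the original edge alignment method. For context only, the sketch below (not the authors' evaluation code) shows how ATE is commonly computed: the estimated trajectory is rigidly aligned to the ground truth using the closed-form Umeyama/Horn solution, and the root-mean-square of the residual translational errors is reported. The function names and toy data are illustrative assumptions.

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid alignment (Umeyama/Horn): find R, t so that
    R @ est_i + t best matches gt_i. est, gt are (N, 3) position arrays."""
    mu_est, mu_gt = est.mean(axis=0), gt.mean(axis=0)
    x, y = est - mu_est, gt - mu_gt
    # 3x3 cross-covariance between the centered point sets
    cov = y.T @ x / est.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    # Reflection correction so R is a proper rotation (det = +1)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    t = mu_gt - R @ mu_est
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square translational error after rigid alignment."""
    R, t = align_rigid(est, gt)
    err = gt - (est @ R.T + t)
    return np.sqrt((err ** 2).sum(axis=1).mean())

# Toy usage: an estimate that differs from ground truth only by a rigid
# transform should yield an ATE close to zero after alignment.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.normal(size=(100, 3))
    theta = 0.3
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    est = (gt - np.array([1.0, 2.0, 0.5])) @ Rz.T  # rotated and shifted estimate
    print(f"ATE RMSE: {ate_rmse(est, gt):.6f} m")
```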
Pages: 19986-19994
Number of pages: 9