Tightly-coupled visual-inertial odometry with robust feature association in dynamic illumination environments

Cited: 0
Authors
Zhang, Jie [1 ]
Zhang, Cong [2 ]
Liu, Qingchen [2 ]
Ma, Qichao [2 ]
Qin, Jiahu [2 ]
Affiliations
[1] Univ Sci & Technol China, Inst Adv Technol, Hefei 230031, Peoples R China
[2] Univ Sci & Technol China, Dept Automat, Hefei 230027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
simultaneous localization and mapping; optical flow; dynamic illumination; visual-inertial odometry; VERSATILE;
DOI
10.1017/S0263574725000608
CLC Number
TP24 [Robotics];
Subject Classification Codes
080202; 1405;
Abstract
This paper focuses on feature-based visual-inertial odometry (VIO) in dynamic illumination environments. Since most existing feature-based VIO methods suffer degraded performance under dynamic illumination, which leads to unstable feature association, we propose a tightly-coupled VIO algorithm, termed RAFT-VINS, that integrates a Lite-RAFT tracker into a visual-inertial navigation system (VINS). The key module of this odometry algorithm is a lightweight optical flow network designed for accurate feature tracking in real time. It guarantees robust feature association in dynamic illumination environments and thereby ensures the performance of the odometry. In addition, to further improve the accuracy of pose estimation, a moving consistency check strategy is developed in RAFT-VINS to identify and remove outlier feature points. Meanwhile, a tightly-coupled optimization-based framework fuses IMU and visual measurements in a sliding window for efficient and accurate pose estimation. Through comprehensive experiments on public datasets and in real-world scenarios, the proposed RAFT-VINS is validated for its ability to provide reliable pose estimates in challenging dynamic illumination environments. Our code is open-sourced at https://github.com/USTC-AIS-Lab/RAFT-VINS.
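Note: the abstract names a moving consistency check for rejecting outlier feature tracks, but this record does not give the exact criterion. Purely as an illustrative sketch, and not the paper's implementation, the Python snippet below scores each tracked feature pair by its Sampson epipolar error against an IMU-predicted relative pose and keeps tracks below a threshold; the function name, its arguments, and the threshold value are all hypothetical.

    import numpy as np

    def moving_consistency_check(pts_prev, pts_curr, R_pred, t_pred, thresh=1e-4):
        """Flag feature tracks consistent with an IMU-predicted camera motion.

        pts_prev, pts_curr: (N, 2) arrays of normalized image coordinates.
        R_pred, t_pred: relative rotation (3x3) and translation (3,) predicted
        by IMU integration between the two frames.
        Returns a boolean mask, True for tracks kept as inliers.
        """
        # Essential matrix implied by the predicted motion: E = [t]_x R.
        t = np.asarray(t_pred, dtype=float)
        tx = np.array([[0.0, -t[2], t[1]],
                       [t[2], 0.0, -t[0]],
                       [-t[1], t[0], 0.0]])
        E = tx @ R_pred

        # Lift normalized image coordinates to homogeneous form.
        p1 = np.hstack([pts_prev, np.ones((len(pts_prev), 1))])
        p2 = np.hstack([pts_curr, np.ones((len(pts_curr), 1))])

        l2 = p1 @ E.T                   # epipolar lines in the current frame
        l1 = p2 @ E                     # epipolar lines in the previous frame
        val = np.sum(p2 * l2, axis=1)   # epipolar residual p2^T E p1 per track

        # First-order geometric (Sampson) error of each track.
        denom = l2[:, 0]**2 + l2[:, 1]**2 + l1[:, 0]**2 + l1[:, 1]**2
        sampson = val**2 / np.maximum(denom, 1e-12)
        return sampson < thresh         # True = consistent with predicted motion

In such a pipeline, a typical call would filter tracks before they enter the sliding-window optimization, e.g. mask = moving_consistency_check(prev_pts, curr_pts, R_imu, t_imu), then keeping prev_pts[mask] and curr_pts[mask]; a suitable threshold depends on the feature noise and the coordinate normalization.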
Pages: 16