VINS-MKF: A Tightly-Coupled Multi-Keyframe Visual-Inertial Odometry for Accurate and Robust State Estimation

Cited by: 5
Authors
Zhang, Chaofan [1 ,2 ]
Liu, Yong [1 ]
Wang, Fan [1 ,2 ]
Xia, Yingwei [1 ]
Zhang, Wen [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Appl Technol, Hefei Inst Phys Sci, Hefei 230031, Anhui, Peoples R China
[2] Univ Sci & Technol China, Grad Sch, Sci Isl Branch, Hefei 230026, Anhui, Peoples R China
Keywords
state estimation; visual odometry; visual inertial fusion; multiple fisheye cameras; tightly coupled; MOTION; SLAM; NAVIGATION; VERSATILE;
DOI
10.3390/s18114036
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
State estimation is crucial for robot autonomy, and visual odometry (VO) has received significant attention in the robotics field because it can provide accurate state estimation. However, the accuracy and robustness of most existing VO methods degrade in complex conditions, owing to the limited field of view (FOV) of the camera used. In this paper, we present a novel tightly-coupled multi-keyframe visual-inertial odometry (called VINS-MKF), which provides accurate and robust state estimation for robots in indoor environments. We first extend the monocular ORBSLAM (Oriented FAST and Rotated BRIEF Simultaneous Localization and Mapping) to multiple fisheye cameras combined with an inertial measurement unit (IMU) to provide large-FOV visual-inertial information. Then, a novel VO framework is proposed to ensure the efficiency of state estimation, by adopting a GPU (Graphics Processing Unit)-based feature extraction method and by separating feature extraction from the tracking thread into a dedicated thread that runs in parallel with the mapping thread. Finally, a nonlinear optimization method is formulated for accurate state estimation, characterized as multi-keyframe, tightly-coupled and visual-inertial. In addition, accurate initialization and a novel MultiCol-IMU camera model are incorporated to further improve the performance of VINS-MKF. To the best of our knowledge, this is the first tightly-coupled multi-keyframe visual-inertial odometry that fuses measurements from multiple fisheye cameras and an IMU. The performance of VINS-MKF was validated by extensive experiments on self-collected datasets, and it showed improved accuracy and robustness over the state-of-the-art VINS-Mono.
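The abstract describes, but does not state, the objective behind the tightly-coupled multi-keyframe optimization. As a minimal sketch in our own notation (not taken from the paper), assuming a window of keyframe states (poses, velocities, IMU biases) and landmark positions, such an objective typically combines IMU preintegration residuals between consecutive keyframes with fisheye reprojection residuals from every camera in the rig:

\[
\min_{\mathcal{X}} \;
\sum_{k} \big\| r_{\mathrm{IMU}}\big(\hat{z}_{k,k+1}, \mathcal{X}\big) \big\|^{2}_{\Sigma_{k,k+1}}
\;+\;
\sum_{c \in \mathcal{C}} \sum_{(k,l) \in \mathcal{O}_{c}}
\rho\Big( \big\| \pi_{c}\big(T_{cb}\, T^{k}_{bw}\, X_{l}\big) - z^{c}_{k,l} \big\|^{2}_{\Sigma_{c}} \Big)
\]

Here r_IMU is the IMU preintegration residual between keyframes k and k+1 with covariance Sigma_{k,k+1}; C is the set of fisheye cameras in the rig; pi_c is the fisheye projection of camera c and T_cb its body-to-camera extrinsic (the part handled by a multi-camera model such as MultiCol-IMU); T^k_bw is the body pose of keyframe k; z^c_{k,l} is the pixel observation of landmark l; and rho is a robust kernel. The exact residual definitions and weighting used in VINS-MKF may differ from this sketch.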
Pages: 28
Related Papers (41 in total)
  • [31] R2LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping
    Lin, Jiarong
    Zheng, Chunran
    Xu, Wei
    Zhang, Fu
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (04) : 7469 - 7476
  • [32] An Immediate Update Strategy of Multi-State Constraint Kalman Filter for Visual-Inertial Odometry
    Zhang, Qingchao
    Ouyang, Wei
    Han, Jiale
    Cai, Qi
    Zhu, Maoran
    Wu, Yuanxin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2025, 10 (04) : 4125 - 4131
  • [33] EVI-SAM: Robust, Real-Time, Tightly-Coupled Event-Visual-Inertial State Estimation and 3D Dense Mapping
    Guan, Weipeng
    Chen, Peiyu
    Zhao, Huibin
    Wang, Yu
    Lu, Peng
    ADVANCED INTELLIGENT SYSTEMS, 2024
  • [34] Visual-Inertial State Estimation with Pre-integration Correction for Robust Mobile Augmented Reality
    Yuan, Zikang
    Zhu, Dongfu
    Chi, Cheng
    Tang, Jinhui
    Liao, Chunyuan
    Yang, Xin
    PROCEEDINGS OF THE 27TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA (MM'19), 2019, : 1410 - 1418
  • [35] GVINS: Tightly Coupled GNSS-Visual-Inertial Fusion for Smooth and Consistent State Estimation
    Cao, Shaozu
    Lu, Xiuyuan
    Shen, Shaojie
    IEEE TRANSACTIONS ON ROBOTICS, 2022, 38 (04) : 2004 - 2021
  • [36] DiT-SLAM: Real-Time Dense Visual-Inertial SLAM with Implicit Depth Representation and Tightly-Coupled Graph Optimization
    Zhao, Mingle
    Zhou, Dingfu
    Song, Xibin
    Chen, Xiuwan
    Zhang, Liangjun
    SENSORS, 2022, 22 (09)
  • [37] Tightly coupled integration of monocular visual-inertial odometry and UC-PPP based on factor graph optimization in difficult urban environments
    Pan, Cheng
    Li, Fangchao
    Pan, Yuanxin
    Wang, Yonghui
    Soja, Benedikt
    Li, Zengke
    Gao, Jingxiang
    GPS SOLUTIONS, 2024, 28 (01)
  • [38] RMSC-VIO: Robust Multi-Stereoscopic Visual-Inertial Odometry for Local Visually Challenging Scenarios
    Zhang, Tong
    Xu, Jianyu
    Shen, Hao
    Yang, Rui
    Yang, Tao
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (05) : 4130 - 4137
  • [39] Visual-Inertial Cross Fusion: A Fast and Accurate State Estimation Framework for Micro Flapping Wing Rotors
    Dong, Xin
    Wang, Ziyu
    Liu, Fangyuan
    Li, Song
    Fei, Fan
    Li, Daochun
    Tu, Zhan
    DRONES, 2022, 6 (04)
  • [40] Tunable Impact and Vibration Absorbing Neck for Robust Visual-Inertial State Estimation for Dynamic Legged Robots
    Kim, Taekyun
    Kim, Sangbae
    Lee, Dongjun
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (03) : 1431 - 1438