Online Self-Calibration for Visual-Inertial Navigation: Models, Analysis, and Degeneracy

Cited by: 9
Authors
Yang, Yulin [1]
Geneva, Patrick [1]
Zuo, Xingxing [2]
Huang, Guoquan [1]
Affiliations
[1] University of Delaware, Robot Perception and Navigation Group, Newark, DE 19716, USA
[2] Technical University of Munich, Department of Informatics, D-80333 Munich, Germany
Keywords
Degenerate motions; observability analysis; sensor self-calibration; state estimation; visual-inertial systems; Kalman filter; odometry
DOI
10.1109/TRO.2023.3275878
Chinese Library Classification (CLC)
TP24 [Robotics]
Discipline Classification Codes
080202; 1405
Abstract
As sensor calibration plays an important role in visual-inertial sensor fusion, this article performs an in-depth investigation of online self-calibration for robust and accurate visual-inertial state estimation. To this end, we first conduct a complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including inertial measurement unit (IMU) and camera intrinsics, IMU-camera spatial-temporal extrinsics, and the readout time of rolling-shutter (RS) cameras (if used). We study different inertial model variants containing intrinsic parameters that encompass the most commonly used models for low-cost inertial sensors. With these models, we perform the observability analysis of linearized VINS with full sensor calibration. Our analysis theoretically proves the intuition commonly assumed in the literature: VINS with full sensor calibration has four unobservable directions, corresponding to the system's global yaw and global position, while all sensor calibration parameters are observable given fully excited motions. Moreover, we identify, for the first time, the degenerate motion primitives for IMU and camera intrinsic calibration, which, when combined, may produce complex degenerate motions. We compare the proposed online self-calibration on commonly used IMUs against the state-of-the-art offline calibration toolbox Kalibr, showing that the proposed system achieves better consistency and repeatability. Based on our analysis and experimental evaluations, we also offer practical guidelines for effectively performing online IMU-camera self-calibration.
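As a brief, informal illustration of the observability result stated in the abstract (a sketch using the standard VINS error-state ordering from the literature, with notation assumed here rather than copied from the paper), the four unobservable directions of calibration-free VINS span the null space

\[
\mathbf{N} \;=\;
\begin{bmatrix}
\mathbf{0}_{3} & {}^{I}_{G}\mathbf{R}\,{}^{G}\mathbf{g} \\
\mathbf{0}_{3} & \mathbf{0}_{3\times 1} \\
\mathbf{0}_{3} & -\lfloor {}^{G}\mathbf{v}_{I}\times \rfloor\,{}^{G}\mathbf{g} \\
\mathbf{0}_{3} & \mathbf{0}_{3\times 1} \\
\mathbf{I}_{3} & -\lfloor {}^{G}\mathbf{p}_{I}\times \rfloor\,{}^{G}\mathbf{g} \\
\mathbf{I}_{3} & -\lfloor {}^{G}\mathbf{p}_{f}\times \rfloor\,{}^{G}\mathbf{g}
\end{bmatrix},
\]

written for the error state of IMU orientation, gyroscope bias, velocity, accelerometer bias, position, and a feature position. The first three columns correspond to a global translation of the trajectory and features, and the last column to a rotation about the gravity direction {}^{G}\mathbf{g} (global yaw). The article's contribution is to show that, under fully excited motion, these remain the only unobservable directions even when IMU/camera intrinsics, IMU-camera spatial-temporal extrinsics, and RS readout time are appended to the state, and to characterize the degenerate motions under which this no longer holds.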
Pages: 3479-3498
Number of pages: 20