Tightly-coupled Fusion of Global Positional Measurements in Optimization-based Visual-Inertial Odometry

Cited by: 59
Authors
Cioffi, Giovanni [1 ,2 ,3 ]
Scaramuzza, Davide [1 ,2 ,3 ]
Affiliations
[1] Univ Zurich, Robot & Percept Grp, Dept Informat, Zurich, Switzerland
[2] Univ Zurich, Dept Neuroinformat, Zurich, Switzerland
[3] Swiss Fed Inst Technol, Zurich, Switzerland
Source
2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2020
Funding
EU Horizon 2020;
Keywords
NAVIGATION; ROBUST;
DOI
10.1109/IROS45743.2020.9341697
Chinese Library Classification
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
Motivated by the goal of achieving robust, drift-free pose estimation in long-term autonomous navigation, in this work we propose a methodology to fuse global positional information with visual and inertial measurements in a tightly-coupled nonlinear-optimization-based estimator. Unlike previous works, which are loosely coupled, the tightly-coupled approach allows exploiting the correlations among all the measurements. A sliding window of the most recent system states is estimated by minimizing a cost function that includes visual reprojection errors, relative inertial errors, and global positional residuals. We use IMU preintegration to formulate the inertial residuals and leverage the outcome of this algorithm to efficiently compute the global position residuals. The experimental results show that the proposed method achieves accurate and globally consistent estimates, with a negligible increase in the computational cost of the optimization. Our method consistently outperforms the loosely-coupled fusion approach: the mean position error is reduced by up to 50% with respect to the loosely-coupled approach in outdoor Unmanned Aerial Vehicle (UAV) flights, where the global position information is given by noisy GPS measurements. To the best of our knowledge, this is the first work in which global positional measurements are tightly fused in an optimization-based visual-inertial odometry algorithm, leveraging the IMU preintegration method to define the global positional factors.
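A minimal sketch of the sliding-window objective described in the abstract; the symbols (state window X, residual terms r, and covariances Sigma) are illustrative placeholders, not necessarily the paper's exact notation:

\[
\mathcal{X}^{*} = \arg\min_{\mathcal{X}} \;
\sum_{k}\sum_{j} \left\| \mathbf{r}_{\mathrm{vis}}^{k,j} \right\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{vis}}}
+ \sum_{k} \left\| \mathbf{r}_{\mathrm{imu}}^{k} \right\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{imu}}^{k}}
+ \sum_{k} \left\| \mathbf{r}_{\mathrm{gp}}^{k} \right\|^{2}_{\boldsymbol{\Sigma}_{\mathrm{gp}}^{k}}
\]

Here the first sum collects the visual reprojection errors of landmarks observed by the keyframes in the window, the second the relative inertial errors obtained from IMU preintegration between consecutive states, and the third the global positional residuals (e.g., from GPS) attached to the corresponding states.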
Pages: 5089-5095
Number of pages: 7