Camera Intrinsic Parameters Estimation by Visual-Inertial Odometry for a Mobile Phone With Application to Assisted Navigation

Cited by: 30
Authors
Jin, Lingqiu [1]
Zhang, He [1]
Ye, Cang [1]
Affiliations
[1] Virginia Commonwealth Univ, Comp Sci Dept, Med Coll Virginia Campus, Richmond, VA 23284 USA
Keywords
Cameras; RNA; Pose estimation; Navigation; Simultaneous localization and mapping; Computational modeling; Optimization; 6-DOF camera pose estimation; robotic navigation aid (RNA); simultaneous localization and mapping (SLAM); visual-inertial odometry (VIO); ROBUST; SYSTEM; SLAM;
DOI
10.1109/TMECH.2020.2997606
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
The increasing computing and sensing capabilities of modern mobile phones have spurred research interest in developing new visual-inertial odometry (VIO) techniques that turn a smartphone into a self-contained vision-aided inertial navigation system for various applications. Smartphone cameras now employ optical image stabilization (OIS) to reduce image blur. However, the OIS mechanism may cause the camera intrinsic parameters (CIP) to vary, which must be taken into account in the VIO computation. In this article, we first develop a linear model relating the CIP to the acceleration measured by the inertial measurement unit (IMU). Based on this model, we introduce a new VIO method, called CIP-VMobile, which treats the CIP as state variables and tightly couples them with the other state variables in a graph optimization process to estimate the optimal state. The method uses the linear model to construct the factor graph and takes the linear-model-predicted values as the initial CIP estimates, which speeds up the VIO computation and yields better pose estimation. Simulation and experimental results with an iPhone 7 validate the method's efficacy. Building on CIP-VMobile, we fabricated an iPhone-7-based robotic navigation aid (RNA) for assisted navigation. Experimental results with the RNA demonstrate CIP-VMobile's promise in real-world navigation applications.
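The abstract states that a linear model maps IMU-measured acceleration to the CIP, and that the model's output serves as the initial CIP estimate before tightly coupled graph optimization refines it. The Python sketch below illustrates that idea only in broad strokes; the affine form, the coefficient names (F0, KF, CX0, KX, CY0, KY), and the function predict_cip are illustrative assumptions, not the paper's actual model or notation.

```python
import numpy as np

# Hypothetical affine coefficients (in practice they would be identified by
# calibration against OIS-induced intrinsic changes; values here are made up).
F0, KF = 500.0, 0.8    # nominal focal length (pixels) and its sensitivity to a_z
CX0, KX = 320.0, 1.2   # nominal principal point x and its sensitivity to a_x
CY0, KY = 240.0, 1.2   # nominal principal point y and its sensitivity to a_y

def predict_cip(accel):
    """Predict intrinsics (fx, fy, cx, cy) from an acceleration reading (m/s^2).

    Assumed linear (affine) CIP-vs-acceleration model, used only to produce an
    initial guess for the CIP state variables.
    """
    ax, ay, az = accel
    f = F0 + KF * az
    cx = CX0 + KX * ax
    cy = CY0 + KY * ay
    return np.array([f, f, cx, cy])

# The prediction would seed the CIP state variables in the factor graph;
# the optimization then refines them together with the camera poses.
initial_cip = predict_cip(np.array([0.1, -0.2, 9.8]))
K = np.array([[initial_cip[0], 0.0, initial_cip[2]],
              [0.0, initial_cip[1], initial_cip[3]],
              [0.0, 0.0, 1.0]])
print(K)
```

A reasonable initial estimate like this matters because the subsequent nonlinear optimization converges faster and is less likely to settle in a poor local minimum than when starting from fixed factory intrinsics.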
Pages: 1803-1811
Number of pages: 9