Tightly-coupled stereo vision-aided inertial navigation using feature-based motion sensors

Cited by: 14
Authors
Asadi, E. [1]
Bottasso, C. L. [1,2]
Affiliations
[1] Politecn Milan, Dept Aerosp Sci & Technol, I-20156 Milan, Italy
[2] Tech Univ Munich, Wind Energy Inst, D-85748 Garching, Germany
Keywords
vision-aided inertial navigation; tight-coupling; sensor fusion; trifocal constraint; SIMULTANEOUS LOCALIZATION; ROBOT LOCALIZATION; IMAGE
DOI
10.1080/01691864.2013.870496
Chinese Library Classification (CLC)
TP24 [Robotics]
Subject Classification Codes
080202; 1405
Abstract
A tightly-coupled stereo vision-aided inertial navigation system is proposed in this work as a synergistic incorporation of vision with other sensors. To avoid the loss of information that may result from visual preprocessing, a set of feature-based motion sensors and an inertial measurement unit are fused directly to estimate the vehicle state. Two alternative feature-based observation models are considered within the proposed fusion architecture. The first model uses the trifocal tensor to propagate feature points by homography, expressing geometric constraints among three consecutive scenes. The second is derived by applying a rigid-body motion model to three-dimensional (3D) reconstructed feature points. A kinematic model accounts for the vehicle motion, and a Sigma-Point Kalman filter is used to achieve robust state estimation in the presence of non-linearities. The proposed formulation is derived for a general, platform-independent 3D problem, and it is tested and demonstrated on a real dynamic indoor data-set alongside a simulation experiment. Results show improved estimates compared with a classical visual odometry approach and with a loosely-coupled stereo vision-aided inertial navigation system, even in GPS (Global Positioning System)-denied conditions and when magnetometer measurements are unreliable.
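For context on the trifocal constraint used by the first observation model, the standard point-transfer relation from multiple-view geometry can be stated as follows (in Hartley-Zisserman notation; this is background material, not reproduced from the paper itself). Given corresponding image points $\mathbf{x} \leftrightarrow \mathbf{x}'$ in the first two views and any line $\mathbf{l}'$ passing through $\mathbf{x}'$ in the second view, the trifocal tensor $\mathcal{T}_i^{jk}$ transfers the point into the third view as

    $x''^{k} = x^{i}\, l'_{j}\, \mathcal{T}_i^{\,jk}$   (summation over repeated indices),

and the full point-point-point incidence relation among the three views reads

    $[\mathbf{x}'']_{\times} \Big( \sum_{i} x^{i}\, \mathbf{T}_i \Big) [\mathbf{x}']_{\times} = \mathbf{0}_{3\times 3}$,

where $\mathbf{T}_i$ are the three $3 \times 3$ slices of the tensor and $[\cdot]_{\times}$ denotes the skew-symmetric cross-product matrix. In a tightly-coupled filter, relations of this kind allow predicted feature positions to be compared directly with raw feature measurements.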
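The abstract also mentions a Sigma-Point Kalman filter for robust estimation under non-linearities. Below is a minimal, self-contained Python sketch of a generic sigma-point (unscented) measurement update of the kind such a filter performs; the function names, the toy measurement model, and the tuning parameters are illustrative assumptions, not the authors' implementation.

import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    # Generate the 2n+1 scaled sigma points and their weights.
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)      # L @ L.T = (n + lam) * cov
    pts = np.empty((2 * n + 1, n))
    pts[0] = mean
    for i in range(n):
        pts[1 + i] = mean + L[:, i]
        pts[1 + n + i] = mean - L[:, i]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))     # weights for the mean
    wc = wm.copy()                               # weights for the covariance
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_update(x, P, z, h, R):
    # One sigma-point measurement update: fuse an observation z, taken
    # through a non-linear model h, with the state estimate (x, P).
    pts, wm, wc = sigma_points(x, P)
    Z = np.array([h(p) for p in pts])            # propagate points through h
    z_hat = wm @ Z                               # predicted measurement
    dZ = Z - z_hat
    dX = pts - x
    S = dZ.T @ (wc[:, None] * dZ) + R            # innovation covariance
    C = dX.T @ (wc[:, None] * dZ)                # state/measurement cross-cov.
    K = C @ np.linalg.inv(S)                     # Kalman gain
    return x + K @ (z - z_hat), P - K @ S @ K.T

# Toy usage: a 2-state position estimate corrected by a single
# non-linear range measurement (hypothetical numbers).
x, P = np.array([1.0, 2.0]), 0.5 * np.eye(2)
h = lambda s: np.array([np.hypot(s[0], s[1])])
x, P = ukf_update(x, P, z=np.array([2.4]), h=h, R=0.01 * np.eye(1))

In the tightly-coupled scheme described in the abstract, h would map the vehicle state to predicted feature coordinates (via the trifocal transfer or the rigid-body model on reconstructed 3D points), so raw feature measurements correct the state directly.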
Pages: 717-729
Page count: 13