Self-Localization of an Omnidirectional Mobile Robot Based on an Optical Flow Sensor

Cited by: 1
Authors
Atsushi Sanada
Kazuo Ishii
Tetsuya Yagi
Institutions
[1] Kyushu Institute of Technology
[2] Osaka University
Source
Journal of Bionic Engineering | 2010 / Vol. 7
Keywords
omnidirectional mobile robot; silicon retina camera; optical flow; FPGA
DOI
Not available
Abstract
An omnidirectional mobile robot has the advantage that its three degrees of freedom of motion in a 2D plane can be set independently, so it can move in arbitrary directions while maintaining the same heading. As in car-like robots, dead reckoning with onboard sensors is often used for self-localization in omnidirectional robots, measuring wheel velocities from motor encoder data. However, omnidirectional mobile robots slip easily because of the free rollers on their omni-wheels, and dead reckoning fails if even one wheel loses contact with the ground. An odometry method whose data are unaffected by wheel slip must therefore be introduced to acquire high-quality self-location data for omnidirectional mobile robots. We describe a method to obtain the robot's ego-motion from camera images and optical flow calculation, i.e., the camera is used as a velocity sensor. In this paper, a silicon retina vision camera, which has a good dynamic range under various lighting conditions, is introduced as a mobile robot sensor. A Field-Programmable Gate Array (FPGA) optical flow circuit for the silicon retina is also developed to measure the ego-motion of the mobile robot. The developed optical flow calculation system is integrated into a small omnidirectional mobile robot, and evaluation experiments on the robot's ego-motion are carried out. In the experiments, the accuracy of self-localization by the dead reckoning and optical flow methods is evaluated by comparison against motion capture. The results show that more accurate positions are obtained with the optical flow sensor than with dead reckoning.
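The dead-reckoning baseline the abstract describes can be illustrated with standard three-omni-wheel kinematics: measured wheel surface speeds are mapped to a body-frame velocity and integrated over time. This is a minimal sketch with hypothetical geometry (wheels at 0°, 120°, 240°, mount radius `L`), not the paper's actual robot parameters:

```python
import numpy as np

# Hypothetical layout: three omni-wheels at 0, 120, 240 degrees, mounted
# at distance L from the robot center. Illustrative values only.
L = 0.15  # wheel mount radius [m]
ANGLES = np.deg2rad([0.0, 120.0, 240.0])

# Kinematics matrix: wheel surface speed = J @ [vx, vy, omega] (body frame)
J = np.array([[-np.sin(a), np.cos(a), L] for a in ANGLES])
J_INV = np.linalg.inv(J)

def dead_reckon(pose, wheel_speeds, dt):
    """One odometry step from encoder-derived wheel surface speeds [m/s].

    pose is (x, y, theta) in the world frame. Slip on any wheel corrupts
    this estimate, which is the motivation for replacing it with an
    optical-flow velocity sensor.
    """
    x, y, theta = pose
    vx, vy, omega = J_INV @ np.asarray(wheel_speeds)  # body-frame twist
    # Rotate the body-frame velocity into the world frame and integrate.
    x += (vx * np.cos(theta) - vy * np.sin(theta)) * dt
    y += (vx * np.sin(theta) + vy * np.cos(theta)) * dt
    theta += omega * dt
    return (x, y, theta)
```

For example, equal speeds on all three wheels produce a pure rotation in place, while any slipping wheel biases the recovered twist for that step and the error accumulates over the trajectory.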
Pages: S172 / S176