An optical flow-based composite navigation method inspired by insect vision

Cited by: 0
Authors
Pan, Chao [1 ]
Liu, Jian-Guo [2 ]
Li, Jun-Lin [1 ]
Affiliations
[1] Wuhan Digital Engineering Research Institute, Wuhan
[2] National Key Laboratory of Science and Technology on Multi-Spectral Information Processing, Huazhong University of Science and Technology, Wuhan
Source
Zidonghua Xuebao/Acta Automatica Sinica | 2015, Vol. 41, No. 6
Keywords
Cumulative error; Insect vision; Navigation; Optical flow (OF)
DOI
10.16383/j.aas.2015.c120936
Abstract
Many insects can use optical flow (OF) for various navigational tasks. Inspired by the OF navigation strategies of insects, this paper develops an OF-based composite navigation method for more efficient and precise visual localization. The composite method combines OF navigation with OF-aided navigation. The OF navigation measures motion at each step using an insect-inspired OF method, and the current position is then obtained by path integration. Because path integration accumulates position errors over time, the OF-aided navigation is employed to correct them. This aided navigation implements an OF-based Kalman filter built on the same insect-inspired OF method: it iteratively matches the actual OF against the predicted OF to provide a continuous error estimate. Since the OF navigation and the OF-aided navigation are derived from the same OF method, they can share input signals and several operations during navigation. Experiments with a mobile robot demonstrate the effectiveness of the proposed composite navigation method. Copyright © 2015 Acta Automatica Sinica. All rights reserved.
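
Note (an illustrative sketch, not the authors' code): the composite scheme described in the abstract, per-step OF odometry accumulated by path integration and periodically corrected by an OF-based Kalman filter, can be outlined in a few lines of Python. The class and variable names (PositionKF, of_step, of_position_fix) and the noise settings are assumptions made here for illustration; the paper's insect-inspired OF computation and its OF matching procedure are not reproduced, so the per-step displacement and the periodic position fix are treated as given inputs.

import numpy as np

def path_integrate(position, of_step):
    # Dead reckoning: accumulate the OF-measured step displacement.
    return position + of_step

class PositionKF:
    # Constant-position Kalman filter over a 2-D position estimate (H = I).
    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(2)        # state: [x, y]
        self.P = np.eye(2)          # state covariance
        self.Q = q * np.eye(2)      # process noise (path-integration drift)
        self.R = r * np.eye(2)      # measurement noise (OF position fix)

    def predict(self, of_step):
        # Prediction: OF odometry increment, i.e. one path-integration step.
        self.x = path_integrate(self.x, of_step)
        self.P = self.P + self.Q
        return self.x

    def update(self, of_position_fix):
        # Correction: fuse an OF-derived position measurement to bound drift.
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.x = self.x + K @ (of_position_fix - self.x)
        self.P = (np.eye(2) - K) @ self.P
        return self.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_pos = np.zeros(2)
    kf = PositionKF()
    for step in range(100):
        true_step = np.array([0.10, 0.05])
        true_pos = true_pos + true_step
        # OF odometry: noisy per-step displacement drives the prediction.
        kf.predict(true_step + rng.normal(0.0, 0.02, 2))
        # Every 10th step, an OF-based fix stands in for the aided navigation.
        if step % 10 == 9:
            kf.update(true_pos + rng.normal(0.0, 0.05, 2))
    print("estimated:", kf.x, "true:", true_pos)

In this toy loop the predict() call plays the role of OF path integration, while the periodic update() mirrors the error-correcting role of the OF-aided navigation; in the paper both stages share the same OF computation, which is not modelled here.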
Pages: 1102-1112
Number of pages: 10