FlyView: a bio-informed optical flow truth dataset for visual navigation using panoramic stereo vision

Cited by: 0
Authors
Leroy, Alix [1]
Taylor, Graham K. [1]
Affiliations
[1] Univ Oxford, Dept Biol, Oxford Flight Grp, Oxford OX1 3SZ, England
Source
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022) | 2022
Funding
European Research Council
Keywords
CALIBRATION
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Flying at speed through complex environments is a difficult task that has been performed successfully by insects since the Carboniferous [1], but remains a challenge for robotic and autonomous systems. Insects navigate the world using optical flow sensed by their compound eyes, which they process using a deep neural network implemented on hardware weighing just a few milligrams. Deploying an insect-inspired network architecture in computer vision could therefore enable more efficient and effective ways of estimating structure and self-motion using optical flow. Training a bio-informed deep network to implement these tasks requires biologically relevant training, test, and validation data. To this end, we introduce FlyView, a novel bio-informed truth dataset for visual navigation. This simulated dataset is rendered using open source 3D scenes in which the agent's position is known at every frame, and is accompanied by truth data on depth, self-motion, and motion flow. The dataset, comprising 42,475 frames, has several key features that are missing from existing optical flow datasets, including: (i) panoramic camera images, with a monocular and binocular field of view matched to that of a fly's compound eyes; (ii) dynamically meaningful self-motion, modelled on motion primitives or the 3D trajectories of drones and flies; and (iii) complex natural and indoor environments, including reflective surfaces, fog, and clouds.
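The abstract notes that the rendered frames are accompanied by truth data on depth, self-motion, and motion flow. These three quantities are geometrically linked: given a per-pixel depth map and the known inter-frame camera pose, dense ground-truth flow can be computed in closed form. The sketch below illustrates this for a simple pinhole camera; FlyView itself uses panoramic projections, and all names here (`K`, `R`, `t`, the function itself) are illustrative assumptions, not the FlyView toolchain.

```python
import numpy as np

def flow_from_depth_and_motion(depth, K, R, t):
    """Forward optical flow for a pinhole camera (illustrative sketch).

    depth : (H, W) depth map at frame i, in metres
    K     : (3, 3) camera intrinsics
    R, t  : rotation (3, 3) and translation (3,) mapping frame-i camera
            coordinates to frame-(i+1) camera coordinates
    Returns an (H, W, 2) array of per-pixel flow in pixels.
    """
    H, W = depth.shape
    # Pixel grid in homogeneous coordinates (u along columns, v along rows).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).astype(np.float64)
    # Back-project each pixel to a 3D point in frame-i camera coordinates.
    rays = pix @ np.linalg.inv(K).T          # (H, W, 3) unit-depth rays
    X = rays * depth[..., None]              # scale rays by depth
    # Rigidly transform into frame i+1 and re-project through K.
    X2 = X @ R.T + t
    pix2 = X2 @ K.T
    pix2 = pix2[..., :2] / pix2[..., 2:3]
    return pix2 - pix[..., :2]

# Usage: the camera translates 1 m forward along its optical axis, so scene
# points move toward it (t = (0, 0, -1) in camera coordinates) and the flow
# field expands radially from the image centre.
K = np.array([[100.0, 0.0, 32.0], [0.0, 100.0, 32.0], [0.0, 0.0, 1.0]])
depth = np.full((64, 64), 5.0)               # fronto-parallel wall at 5 m
flow = flow_from_depth_and_motion(depth, K, np.eye(3), np.array([0.0, 0.0, -1.0]))
```

For a fronto-parallel plane, the flow vanishes at the principal point and grows toward the image border, which is the classic focus-of-expansion pattern that wide-field insect-style motion detectors exploit.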
Pages: 15
References
44 in total
  • [1] Abu Alhaija, H., Mustikovela, S. K., Mescheder, L., Geiger, A., Rother, C. Augmented Reality Meets Computer Vision: Efficient Data Generation for Urban Driving Scenes. International Journal of Computer Vision, 2018, 126(9): 961-972.
  • [2] [Anonymous]. Cell, 2018, 174: 730.
  • [3] Artizzu, C.-O. 25th International Conference on Pattern Recognition (ICPR), 2021.
  • [4] Baker, S., Scharstein, D., Lewis, J. P., Roth, S., Black, M. J., Szeliski, R. A Database and Evaluation Methodology for Optical Flow. 2007 IEEE 11th International Conference on Computer Vision, 2007: 588-595.
  • [5] Bayanlou, M. R. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2021: 1.
  • [6] Burkhardt, D. Vision of Insects. Journal of Comparative Physiology, 1977, 120(1): 33-50.
  • [7] Butler, D. J., Wulff, J., Stanley, G. B., Black, M. J. A Naturalistic Open Source Movie for Optical Flow Evaluation. Computer Vision - ECCV 2012, Pt VI, 2012, LNCS 7577: 611-625.
  • [8] Cabon, Y. Virtual KITTI 2, 2020.
  • [9] Conroy, J., Gremillion, G., Ranganathan, B., Humbert, J. S. Implementation of Wide-Field Integration of Optic Flow for Autonomous Quadrotor Navigation. Autonomous Robots, 2009, 27(3): 189-198.