Event-Based, 6-DOF Camera Tracking from Photometric Depth Maps

Cited by: 129
Authors
Gallego, Guillermo [1 ,2 ,3 ]
Lund, Jon E. A. [1 ,2 ,3 ]
Mueggler, Elias [1 ,2 ,3 ]
Rebecq, Henri [1 ,2 ,3 ]
Delbruck, Tobi [1 ,2 ,3 ]
Scaramuzza, Davide [1 ,2 ,3 ]
Affiliations
[1] Univ Zurich, Robot & Percept Grp, Dept Informat, CH-8092 Zurich, Switzerland
[2] Univ Zurich, Dept Neuroinformat, CH-8092 Zurich, Switzerland
[3] Swiss Fed Inst Technol, CH-8092 Zurich, Switzerland
Keywords
Event-based vision; pose tracking; dynamic vision sensor; Bayes filter; asynchronous processing; conjugate priors; low latency; high speed; AR/VR; VISUAL ODOMETRY; VISION; PIXEL;
DOI
10.1109/TPAMI.2017.2769655
Chinese Library Classification: TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Event cameras are bio-inspired vision sensors that output pixel-level brightness changes instead of standard intensity frames. These cameras do not suffer from motion blur and have a very high dynamic range, which enables them to provide reliable visual information during high-speed motions or in scenes characterized by high dynamic range. These features, along with a very low power consumption, make event cameras an ideal complement to standard cameras for VR/AR and video game applications. With these applications in mind, this paper tackles the problem of accurate, low-latency tracking of an event camera from an existing photometric depth map (i.e., intensity plus depth information) built via classic dense reconstruction pipelines. Our approach tracks the 6-DOF pose of the event camera upon the arrival of each event, thus virtually eliminating latency. We successfully evaluate the method in both indoor and outdoor scenes and show that, because of the technological advantages of the event camera, our pipeline works in scenes characterized by high-speed motion, which are still inaccessible to standard cameras.
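The abstract's central idea is that the pose filter is updated once per incoming event rather than once per frame, which is what removes the latency. The following toy sketch illustrates that per-event Bayesian update pattern only; it is not the paper's actual method (which tracks the full 6-DOF pose against a photometric depth map with a Bayes filter and conjugate priors). Here the pose is reduced to a single scalar and each event is assumed to yield a noisy pose observation, purely for illustration; the class name and variances are hypothetical.

```python
# Hypothetical sketch, NOT the paper's implementation: a scalar stand-in for
# the 6-DOF pose, updated by a Kalman-style Bayes filter once per event.

class PerEventTracker:
    def __init__(self, pose, var):
        self.pose = pose    # current pose estimate (scalar stand-in for 6-DOF)
        self.var = var      # variance (uncertainty) of the estimate

    def update(self, event_pose, meas_var, process_var=1e-4):
        # Predict: motion between two events is unknown, so inflate uncertainty.
        self.var += process_var
        # Correct: fuse the pose observation implied by this single event.
        k = self.var / (self.var + meas_var)      # Kalman gain
        self.pose += k * (event_pose - self.pose)
        self.var *= (1.0 - k)
        return self.pose

tracker = PerEventTracker(pose=0.0, var=1.0)
# Each incoming event triggers an immediate filter update, so the estimate
# is refreshed at event rate instead of frame rate (no frame-rate latency).
for obs in [0.9, 1.1, 1.0, 0.95, 1.05]:
    est = tracker.update(obs, meas_var=0.5)
```

After the five events the estimate has converged near the true pose (1.0) and its variance has shrunk, mirroring how per-event updates accumulate information asynchronously.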
Pages: 2402-2412 (11 pages)