IDOL: A Framework for IMU-DVS Odometry using Lines

Cited by: 25
Authors
Le Gentil, Cedric [1 ,2 ]
Tschopp, Florian [1 ]
Alzugaray, Ignacio [3 ]
Vidal-Calleja, Teresa [2 ]
Siegwart, Roland [1 ]
Nieto, Juan [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Autonomous Syst Lab, Zurich, Switzerland
[2] Univ Technol Sydney, Sch Mech & Mechatron Engn, Ctr Autonomous Syst, Sydney, NSW, Australia
[3] Swiss Fed Inst Technol, Vis Robot Lab, Zurich, Switzerland
Source
2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS) | 2020
Keywords
ROBUST; VERSATILE; MOTION; SLAM
DOI
10.1109/IROS45743.2020.9341208
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
In this paper, we introduce IDOL, an optimization-based framework for IMU-DVS Odometry using Lines. Event cameras, also called Dynamic Vision Sensors (DVSs), generate highly asynchronous streams of events triggered upon illumination changes at each individual pixel. This novel paradigm presents advantages in low-illumination conditions and high-speed motions. Nonetheless, this unconventional sensing modality brings new challenges for scene reconstruction and motion estimation. The proposed method leverages a continuous-time representation of the inertial readings to associate each event with timely accurate inertial data. The method's front-end extracts event clusters that belong to line segments in the environment, whereas the back-end estimates the system's trajectory alongside the lines' 3D position by minimizing point-to-line distances between individual events and the lines' projections in the image space. A novel attraction/repulsion mechanism is presented to accurately estimate the lines' extremities, avoiding their explicit detection in the event data. The proposed method is benchmarked against a state-of-the-art frame-based visual-inertial odometry framework on public datasets. The results show that IDOL performs within the same order of magnitude on most datasets and even yields better orientation estimates. These findings can have a great impact on new algorithms for DVSs.
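The back-end residual described in the abstract, the point-to-line distance between an event's pixel coordinates and the projection of a 3D line into the image, can be sketched as follows. This is only an illustration of that geometric residual under simple assumptions (a 2D line given by two projected endpoints, signed perpendicular distance); the function name and toy data are not from the paper.

```python
import numpy as np

def point_to_line_residuals(events_xy, p0, p1):
    """Signed distance from each event pixel to the 2D line through p0 and p1.

    events_xy: (N, 2) array of event coordinates in the image plane.
    p0, p1: 2D projections of the 3D line segment's endpoints (illustrative).
    """
    d = p1 - p0                                      # line direction
    n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal to the line
    return (events_xy - p0) @ n                      # signed point-to-line distances

# Toy usage: three events near the horizontal line y = 2.
events = np.array([[0.0, 2.1], [1.0, 1.9], [2.0, 2.0]])
res = point_to_line_residuals(events, np.array([0.0, 2.0]), np.array([4.0, 2.0]))
# res ≈ [0.1, -0.1, 0.0]
```

In the paper's framework such residuals would be minimized jointly over the trajectory and the lines' 3D parameters; here they are shown in isolation for clarity.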
Pages: 5863-5870
Number of pages: 8