VECtor: A Versatile Event-Centric Benchmark for Multi-Sensor SLAM

Cited: 48
Authors
Gao, Ling [1 ,2 ,3 ]
Liang, Yuxuan [1 ,4 ]
Yang, Jiaqi [1 ]
Wu, Shaoxun [1 ]
Wang, Chenyu [1 ]
Chen, Jiaben [1 ]
Kneip, Laurent [1 ,5 ]
Affiliations
[1] ShanghaiTech Univ, Sch Informat Sci & Technol, Mobile Percept Lab, Shanghai 201210, Peoples R China
[2] Chinese Acad Sci, Shanghai Inst Microsyst & Informat Technol, Shanghai 200050, Peoples R China
[3] Univ Chinese Acad Sci, Beijing 100864, Peoples R China
[4] Northwestern Univ, Evanston, IL 60201 USA
[5] Shanghai Engn Res Ctr Intelligent Vis & Imaging S, Shanghai 201210, Peoples R China
Funding
Natural Science Foundation of Shanghai;
Keywords
Data sets for SLAM; data sets for robotic vision; data sets for robot learning; sensor fusion; CAMERA DATASET; ROBUST;
DOI
10.1109/LRA.2022.3186770
CLC number
TP24 [Robotics];
Discipline codes
080202; 1405;
Abstract
Event cameras have recently gained popularity because they hold strong potential to complement regular cameras under high dynamics or challenging illumination. Simultaneous Localization and Mapping (SLAM) is an important problem that stands to benefit from the addition of an event camera. However, progress on event-inclusive multi-sensor SLAM requires novel benchmark sequences. Our contribution is the first complete set of benchmark datasets captured with a multi-sensor setup containing an event-based stereo camera, a regular stereo camera, multiple depth sensors, and an inertial measurement unit. The setup is fully hardware-synchronized and underwent accurate extrinsic calibration. All sequences come with ground-truth data captured by highly accurate external reference devices such as a motion capture system. The sequences cover both small- and large-scale environments as well as the specific challenges targeted by dynamic vision sensors.
Pages: 8217-8224
Page count: 8