End-to-End Multimodal Sensor Dataset Collection Framework for Autonomous Vehicles

Cited by: 3
Authors
Gu, Junyi [1 ]
Lind, Artjom [2 ]
Chhetri, Tek Raj [3 ,4 ]
Bellone, Mauro [5 ]
Sell, Raivo [1 ]
Affiliations
[1] Tallinn Univ Technol, Dept Mech & Ind Engn, EE-12616 Tallinn, Estonia
[2] Univ Tartu, Inst Comp Sci, Intelligent Transportat Syst Lab, EE-51009 Tartu, Estonia
[3] Univ Innsbruck, Semant Technol Inst STI Innsbruck, Dept Comp Sci, A-6020 Innsbruck, Austria
[4] Ctr Artificial Intelligence AI Res Nepal, Sundarharaincha 56604, Nepal
[5] Tallinn Univ Technol, FinEst Ctr Smart Cities, EE-19086 Tallinn, Estonia
Keywords
multimodal sensors; autonomous driving; dataset collection framework; sensor calibration and synchronization; sensor fusion; CALIBRATION; CAMERA; ROAD; VISION; FUSION; RADAR;
DOI
10.3390/s23156783
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
Autonomous driving vehicles rely on sensors for robust perception of their surroundings. Such vehicles are equipped with multiple perceptive sensors with a high level of redundancy to ensure safety and reliability in any driving condition. However, multi-sensor systems such as camera, LiDAR, and radar raise requirements related to sensor calibration and synchronization, which are the fundamental building blocks of any autonomous system. At the same time, sensor fusion and integration have become important aspects of autonomous driving research and directly determine the efficiency and accuracy of advanced functions such as object detection and path planning. Classical model-based estimation and data-driven models are the two mainstream approaches to achieving such integration. Most recent research is shifting to the latter, which shows high robustness in real-world applications but requires large quantities of data to be collected, synchronized, and properly categorized. However, existing works exhibit two major research gaps: (i) they lack fusion (and synchronization) of multiple sensors, namely camera, LiDAR, and radar; and (ii) they lack a generic, scalable, and user-friendly end-to-end implementation. To generalize the implementation of the multi-sensor perceptive system, we introduce an end-to-end generic sensor dataset collection framework that includes both hardware deployment solutions and sensor fusion algorithms. The framework prototype integrates a diverse set of sensors: camera, LiDAR, and radar. Furthermore, we present a universal toolbox to calibrate and synchronize the three types of sensors based on their characteristics. The framework also includes fusion algorithms that exploit the respective merits of the camera, LiDAR, and radar and fuse their sensory information in a manner that supports object detection and tracking research. The generality of this framework makes it applicable to any robotic or autonomous application and suitable for quick and large-scale practical deployment.
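The abstract singles out synchronization of camera, LiDAR, and radar streams as a fundamental building block of the framework. As a purely illustrative sketch (not the authors' implementation), the snippet below shows one common way such software synchronization is done in a ROS 1 setup using approximate time matching; the topic names and the use of PointCloud2 for radar output are assumptions for illustration only.

```python
# Minimal sketch (assumption): ROS 1 + message_filters approximate-time
# synchronization of camera, LiDAR, and radar streams. Topic names and the
# radar message type are hypothetical and not taken from the paper.
import rospy
import message_filters
from sensor_msgs.msg import Image, PointCloud2


def synced_callback(image_msg, lidar_msg, radar_msg):
    # All three messages have header stamps within the configured slop,
    # so they can be fused or written out together as one dataset sample.
    rospy.loginfo("synced sample at t=%.3f", image_msg.header.stamp.to_sec())


def main():
    rospy.init_node("multimodal_sync_sketch")

    cam = message_filters.Subscriber("/camera/image_raw", Image)      # hypothetical topic
    lidar = message_filters.Subscriber("/lidar/points", PointCloud2)  # hypothetical topic
    radar = message_filters.Subscriber("/radar/points", PointCloud2)  # hypothetical topic

    # Match messages whose timestamps differ by at most 50 ms.
    sync = message_filters.ApproximateTimeSynchronizer(
        [cam, lidar, radar], queue_size=10, slop=0.05)
    sync.registerCallback(synced_callback)

    rospy.spin()


if __name__ == "__main__":
    main()
```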
Pages: 25
Related Papers
50 records in total
  • [1] A Survey on Imitation Learning Techniques for End-to-End Autonomous Vehicles
    Le Mero, Luc
    Yi, Dewei
    Dianati, Mehrdad
    Mouzakitis, Alexandros
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (09) : 14128 - 14147
  • [2] End-to-End Autonomous Driving: Challenges and Frontiers
    Chen, Li
    Wu, Penghao
    Chitta, Kashyap
    Jaeger, Bernhard
    Geiger, Andreas
    Li, Hongyang
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (12) : 10164 - 10183
  • [3] End-to-End Lidar-Camera Self-Calibration for Autonomous Vehicles
    Rachman, Arya
    Seiler, Jurgen
    Kaup, Andre
    2023 IEEE INTELLIGENT VEHICLES SYMPOSIUM, IV, 2023,
  • [4] An End-to-End Motion Planner Using Sensor Fusion for Autonomous Driving
    Thu, Nguyen Thi Hoai
    Han, Dong Seog
    2023 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE IN INFORMATION AND COMMUNICATION, ICAIIC, 2023, : 678 - 683
  • [5] nuScenesComplex: A More Rigorous Evaluation Framework for End-to-End Autonomous Driving Planning
    Nguyen, Dung
    Zhang, Gang
    Pan, Hujie
    Hu, Xiaolin
    ADVANCES IN NEURAL NETWORKS-ISNN 2024, 2024, 14827 : 482 - 491
  • [6] End-to-end Autonomous Driving: Advancements and Challenges
    Chu, Duan-Feng
    Wang, Ru-Kang
    Wang, Jing-Yi
    Hua, Qiao-Zhi
    Lu, Li-Ping
    Wu, Chao-Zhong
    Zhongguo Gonglu Xuebao/China Journal of Highway and Transport, 2024, 37 (10): 209 - 232
  • [7] End-to-End Velocity Estimation for Autonomous Racing
    Srinivasan, Sirish
    Sa, Inkyu
    Zyner, Alex
    Reijgwart, Victor
    Valls, Miguel I.
    Siegwart, Roland
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (04) : 6869 - 6875
  • [8] End-to-End Autonomous Driving in CARLA: A Survey
    Al Ozaibi, Youssef
    Hina, Manolo Dulva
    Ramdane-Cherif, Amar
    IEEE ACCESS, 2024, 12 : 146866 - 146900
  • [9] End-to-End Target Liveness Detection via mmWave Radar and Vision Fusion for Autonomous Vehicles
    Wang, Shuai
    Mei, Luoyu
    Yin, Zhimeng
    Li, Hao
    Liu, Ruofeng
    Jiang, Wenchao
    Lu, Chris Xiaoxuan
    ACM TRANSACTIONS ON SENSOR NETWORKS, 2024, 20 (04)
  • [10] An efficient end-to-end EKF-SLAM architecture based on LiDAR, GNSS, and IMU data sensor fusion for autonomous ground vehicles
    Mailka, Hamza
    Abouzahir, Mohamed
    Ramzi, Mustapha
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 : 56183 - 56206