End-to-End Velocity Estimation for Autonomous Racing

Cited by: 21
Authors
Srinivasan, Sirish [1 ]
Sa, Inkyu [2 ]
Zyner, Alex [1 ]
Reijgwart, Victor [1 ]
Valls, Miguel I. [3 ]
Siegwart, Roland [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Autonomous Syst Lab, CH-8092 Zurich, Switzerland
[2] CSIRO, Robot & Autonomous Syst Grp, Pullenvale, Qld 4069, Australia
[3] Sevensense Robot AG, CH-8006 Zurich, Switzerland
Keywords
Field robots; autonomous vehicle navigation; sensor fusion; state estimation
DOI
10.1109/LRA.2020.3016929
CLC classification
TP24 [Robotics]
Subject classification codes
080202; 1405
Abstract
Velocity estimation plays a central role in driverless vehicles, but standard, affordable methods struggle to cope with extreme scenarios such as aggressive maneuvers, where sideslip is high. To compensate, autonomous race cars are usually equipped with expensive external velocity sensors. In this letter, we present an end-to-end recurrent neural network that takes the available raw sensor data as input (IMU, wheel odometry, and motor currents) and outputs velocity estimates. The results are compared to two state-of-the-art Kalman filters, one including and one excluding the expensive velocity sensors. All methods were extensively tested on a Formula Student driverless race car with very high sideslip (10 degrees at the rear axle) and slip ratio (approximately 20%), operating close to the limits of handling. The proposed network estimates lateral velocity up to 15x better than the Kalman filter with the equivalent sensor input, and matches (0.06 m/s RMSE) the Kalman filter with the expensive velocity sensor setup.
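The abstract describes a recurrent network that maps raw sensor channels directly to velocity estimates. The following is a minimal plain-Python sketch of that idea only: a single Elman RNN cell whose hidden state accumulates sensor history across time steps. The 12-channel input layout (6 IMU, 4 wheel-speed, 2 motor-current), the layer sizes, and the untrained random weights are all illustrative assumptions, not the paper's actual architecture.

```python
# Sketch of an end-to-end recurrent velocity estimator: raw IMU,
# wheel-odometry, and motor-current channels in; (vx, vy) out.
import math
import random

class ElmanVelocityEstimator:
    """Single-layer Elman RNN: h_t = tanh(W_x x_t + W_h h_{t-1}),
    v_t = W_o h_t. Input x_t is the concatenated raw sensor frame."""

    def __init__(self, n_in=12, n_hidden=16, n_out=2, seed=0):
        rng = random.Random(seed)
        s = 1.0 / math.sqrt(n_hidden)
        self.W_x = [[rng.uniform(-s, s) for _ in range(n_in)] for _ in range(n_hidden)]
        self.W_h = [[rng.uniform(-s, s) for _ in range(n_hidden)] for _ in range(n_hidden)]
        self.W_o = [[rng.uniform(-s, s) for _ in range(n_hidden)] for _ in range(n_out)]
        self.h = [0.0] * n_hidden  # hidden state persists between calls

    def step(self, x):
        """One time step: x = [imu(6), wheel_speeds(4), motor_currents(2)]."""
        self.h = [math.tanh(sum(wx * xi for wx, xi in zip(row_x, x)) +
                            sum(wh * hi for wh, hi in zip(row_h, self.h)))
                  for row_x, row_h in zip(self.W_x, self.W_h)]
        return [sum(wo * hi for wo, hi in zip(row_o, self.h)) for row_o in self.W_o]

est = ElmanVelocityEstimator()
# Feed a short sequence of synthetic sensor frames; the recurrence lets
# the estimate depend on slip history, not just the current frame.
for t in range(5):
    frame = [0.1 * t] * 6 + [2.0 + 0.1 * t] * 4 + [5.0] * 2
    vx, vy = est.step(frame)
print(round(vx, 3), round(vy, 3))
```

In the actual system such a network would be trained against ground-truth velocities (e.g., from the expensive optical sensor) so that the cheap-sensor inputs alone suffice at inference time.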
Pages: 6869-6875
Page count: 7
Related papers (50 total)
  • [1] End-to-End Multimodal Sensor Dataset Collection Framework for Autonomous Vehicles
    Gu, Junyi
    Lind, Artjom
    Chhetri, Tek Raj
    Bellone, Mauro
    Sell, Raivo
    SENSORS, 2023, 23 (15)
  • [2] An End-to-End Motion Planner Using Sensor Fusion for Autonomous Driving
    Thu, Nguyen Thi Hoai
    Han, Dong Seog
    2023 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE IN INFORMATION AND COMMUNICATION, ICAIIC, 2023, : 678 - 683
  • [3] ICOP: Image-based Cooperative Perception for End-to-End Autonomous Driving
    Li, Lantao
    Cheng, Yujie
    Sun, Chen
    Zhang, Wenqi
    2024 35TH IEEE INTELLIGENT VEHICLES SYMPOSIUM, IEEE IV 2024, 2024, : 2367 - 2374
  • [4] Bayesian Imitation Learning for End-to-End Mobile Manipulation
    Du, Yuqing
    Ho, Daniel
    Alemi, Alexander A.
    Jang, Eric
    Khansari, Mohi
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [5] Flying in Highly Dynamic Environments With End-to-End Learning Approach
    Fan, Xiyu
    Lu, Minghao
    Xu, Bowen
    Lu, Peng
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2025, 10 (04): : 3851 - 3858
  • [6] A Review on End-to-End High-Definition Map Generation
    Kwag, Jiyong
    Toth, Charles
    MID-TERM SYMPOSIUM THE ROLE OF PHOTOGRAMMETRY FOR A SUSTAINABLE WORLD, VOL. 48-2, 2024, : 187 - 194
  • [7] An efficient end-to-end EKF-SLAM architecture based on LiDAR, GNSS, and IMU data sensor fusion for autonomous ground vehicles
    Mailka, Hamza
    Abouzahir, Mohamed
    Ramzi, Mustapha
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 : 56183 - 56206
  • [8] An efficient end-to-end EKF-SLAM architecture based on LiDAR, GNSS, and IMU data sensor fusion for autonomous ground vehicles
    Mailka, Hamza
    Abouzahir, Mohamed
    Ramzi, Mustapha
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 83 (18) : 56183 - 56206
  • [9] Generalizable end-to-end deep learning frameworks for real-time attitude estimation using 6DoF inertial measurement units
    Golroudbari, Arman Asgharpoor
    Sabour, Mohammad Hossein
    MEASUREMENT, 2023, 217
  • [10] Probabilistic End-to-End Vehicle Navigation in Complex Dynamic Environments With Multimodal Sensor Fusion
    Cai, Peide
    Wang, Sukai
    Sun, Yuxiang
    Liu, Ming
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (03) : 4218 - 4224