Environment recognition based on multi-sensor fusion for autonomous driving vehicles

Cited by: 12
Authors
Weon I.-S. [1 ]
Lee S.-G. [2 ]
Affiliations
[1] Department of Mechanical Engineering, Graduate School, Kyung Hee University
[2] Department of Mechanical Engineering, Kyung Hee University
Source
Journal of Institute of Control, Robotics and Systems, 2019, Vol. 25, No. 02
Keywords
Autonomous driving; Deep learning; Environment recognition; Sensor fusion; Unmanned vehicle;
DOI
10.5302/J.ICROS.2019.18.0128
Abstract
Unmanned driving of an autonomous vehicle requires high reliability and excellent recognition performance of the road environment and driving situation. Since a single sensor cannot recognize various driving conditions precisely, a recognition system using only a single sensor is not suitable for autonomous driving due to the uncertainty of recognition. In this study, we developed an autonomous vehicle using sensor fusion of radar, LIDAR, and vision data that are coordinate-corrected by GPS and IMU. Deep learning combined with sensor fusion improves the recognition rate of stationary objects in the driving environment, such as lanes, signs, and crosswalks, and accurately recognizes dynamic objects such as vehicles and pedestrians. In a real road test, the unmanned autonomous driving technology developed in this research was verified to meet the reliability and stability requirements of the NHTSA Level 3 autonomy standard. © ICROS 2019.
Pages: 125-131
Page count: 6
Related Papers (50 total)
  • [1] An Open Multi-Sensor Fusion Toolbox for Autonomous Vehicles
    Cano, Abraham Monrroy
    Takeuchi, Eijiro
    Kato, Shinpei
    Edahiro, Masato
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2020, E103A (01) : 252 - 264
  • [2] Multi-sensor Fusion and Cooperative Perception for Autonomous Driving A Review
    Xiang, Chao
    Feng, Chen
    Xie, Xiaopo
    Shi, Botian
    Lu, Hao
    Lv, Yisheng
    Yang, Mingchuan
    Niu, Zhendong
    IEEE INTELLIGENT TRANSPORTATION SYSTEMS MAGAZINE, 2023, 15 (05) : 36 - 58
  • [3] Malicious Attacks against Multi-Sensor Fusion in Autonomous Driving
    Zhu, Yi
    Miao, Chenglin
    Xue, Hongfei
    Yu, Yunnan
    Su, Lu
    Qiao, Chunming
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL CONFERENCE ON MOBILE COMPUTING AND NETWORKING, ACM MOBICOM 2024, 2024, : 436 - 451
  • [4] A Review of Environmental Perception Technology Based on Multi-Sensor Information Fusion in Autonomous Driving
    Yang, Boquan
    Li, Jixiong
    Zeng, Ting
    WORLD ELECTRIC VEHICLE JOURNAL, 2025, 16 (01)
  • [5] Feature Map Transformation for Multi-sensor Fusion in Object Detection Networks for Autonomous Driving
    Schroder, Enrico
    Braun, Sascha
    Mahlisch, Mirko
    Vitay, Julien
    Hamker, Fred
    ADVANCES IN COMPUTER VISION, VOL 2, 2020, 944 : 118 - 131
  • [6] Intelligent Car Autonomous Driving Tracking Technology Based on Fuzzy Information and Multi-sensor Fusion
    Gao, Dongxuan
    Wang, Jing
    Chai, Rui
    Informatica (Slovenia), 2024, 48 (21) : 37 - 50
  • [7] SaLsA Streams: Dynamic Context Models for Autonomous Transport Vehicles based on Multi-Sensor Fusion
    Kuka, Christian
    Bolles, Andre
    Funk, Alexander
    Eilers, Soenke
    Schweigert, Soeren
    Gerwinn, Sebastian
    Nicklas, Daniela
    2013 IEEE 14TH INTERNATIONAL CONFERENCE ON MOBILE DATA MANAGEMENT (MDM 2013), VOL 1, 2013, : 263 - 266
  • [8] Towards Compact Autonomous Driving Perception With Balanced Learning and Multi-Sensor Fusion
    Natan, Oskar
    Miura, Jun
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (09) : 16249 - 16266
  • [9] Multi-Sensor Fusion in Automated Driving: A Survey
    Wang, Zhangjing
    Wu, Yu
    Niu, Qingqing
    IEEE ACCESS, 2020, 8 : 2847 - 2868
  • [10] Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles
    Jahromi, Babak Shahian
    Tulabandhula, Theja
    Cetin, Sabri
    SENSORS, 2019, 19 (20)