A Survey of the Multi-Sensor Fusion Object Detection Task in Autonomous Driving

Cited by: 1
Authors
Wang, Hai [1 ]
Liu, Junhao [1 ]
Dong, Haoran [1 ]
Shao, Zheng [1 ]
Affiliations
[1] Jiangsu Univ, Sch Automot & Traff Engn, Zhenjiang 212013, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
multi-sensor fusion; object detection; LiDAR; cameras; environmental perception;
DOI
10.3390/s25092794
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Code
070302 ; 081704 ;
Abstract
Multi-sensor fusion object detection improves object recognition and tracking accuracy by integrating data from different types of sensors. Because it can overcome the limitations of any single sensor in complex environments, the method has been widely applied in fields such as autonomous driving, intelligent surveillance, robot navigation, and drone flight. In autonomous driving in particular, multi-sensor fusion object detection has become a major research topic. To explore its future development trends, we introduce the Transformer model, the mainstream framework for multi-sensor fusion object detection algorithms, and provide a comprehensive summary of the feature fusion algorithms used in multi-sensor fusion object detection, focusing specifically on the fusion of camera and LiDAR data. This article traces the development of feature fusion into feature-level fusion and proposal-level fusion and reviews multiple representative algorithms in each category. We also discuss the applications of current multi-sensor object detection algorithms. With the continuing advancement of sensor technology and artificial intelligence algorithms, multi-sensor fusion object detection will show great potential in more fields.
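To make the feature-level fusion concept from the abstract concrete, below is a minimal sketch in PyTorch that fuses camera and LiDAR feature maps by concatenation followed by a convolutional mixing block. It assumes both modalities have already been encoded into spatially aligned bird's-eye-view (BEV) feature maps; the FeatureLevelFusion module name, channel counts, and grid size are illustrative assumptions, not an implementation of any specific algorithm reviewed in the survey.

# Minimal sketch of feature-level camera-LiDAR fusion (illustrative only).
# Assumes both modalities are already encoded into aligned BEV feature maps.
import torch
import torch.nn as nn


class FeatureLevelFusion(nn.Module):
    """Concatenate aligned camera and LiDAR BEV features, then mix channels."""

    def __init__(self, cam_channels: int, lidar_channels: int, out_channels: int):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(cam_channels + lidar_channels, out_channels,
                      kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, cam_feat: torch.Tensor, lidar_feat: torch.Tensor) -> torch.Tensor:
        # Both inputs are (B, C, H, W) and assumed to share the same BEV grid.
        fused = torch.cat([cam_feat, lidar_feat], dim=1)
        return self.fuse(fused)


if __name__ == "__main__":
    fusion = FeatureLevelFusion(cam_channels=64, lidar_channels=128, out_channels=256)
    cam = torch.randn(2, 64, 200, 200)     # camera features lifted to BEV (assumed)
    lidar = torch.randn(2, 128, 200, 200)  # LiDAR BEV features (assumed)
    print(fusion(cam, lidar).shape)        # torch.Size([2, 256, 200, 200])

Proposal-level fusion, by contrast, would first generate detection proposals per modality and merge them afterwards; the fused feature map above would instead feed a shared detection head.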
Pages: 25