Robust Target Recognition and Tracking of Self-Driving Cars With Radar and Camera Information Fusion Under Severe Weather Conditions

Cited by: 205
Authors
Liu, Ze [1 ]
Cai, Yingfeng [1 ]
Wang, Hai [2 ,3 ]
Chen, Long [1 ]
Gao, Hongbo [4 ]
Jia, Yunyi [5 ]
Li, Yicheng [1 ]
Affiliations
[1] Jiangsu Univ, Automot Engn Res Inst, Zhenjiang 212013, Jiangsu, Peoples R China
[2] Jiangsu Univ, Sch Automot & Traff Engn, Zhenjiang 212013, Jiangsu, Peoples R China
[3] Jiangsu Univ Engn Technol, Res Inst, Zhenjiang 212013, Jiangsu, Peoples R China
[4] Univ Sci & Technol China, Dept Automat, Hefei 230026, Peoples R China
[5] Clemson Univ, Dept Automot Engn, Int Ctr Automot Res CU ICAR, Greenville, SC 29607 USA
Funding
National Natural Science Foundation of China;
Keywords
Multi-sensor fusion; radar camera fusion; severe weather conditions; self-driving cars; VEHICLE; VISION; LIDAR;
DOI
10.1109/TITS.2021.3059674
Chinese Library Classification
TU [Building Science];
Discipline code
0813;
Abstract
Radar and camera information fusion sensing methods are used to overcome the inherent shortcomings of single-sensor perception in severe weather. Our fusion scheme uses radar as the primary sensor and the camera as the auxiliary sensor. The Mahalanobis distance is used to match observations across the target sequences, and data fusion is performed using the joint probability function method. The algorithm was tested with actual sensor data collected from a vehicle performing real-time environment perception. The test results show that the radar-camera fusion algorithm outperforms single-sensor environmental perception in severe weather and can effectively reduce the missed detection rate of autonomous-vehicle environment perception under such conditions. The fusion algorithm improves the robustness of the environment perception system and provides accurate perception information to the decision-making and control systems of autonomous vehicles.
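The abstract describes matching radar and camera observations of the same target by Mahalanobis distance before fusing them. As an illustration only (the paper's exact gating threshold and state model are not given here), a minimal sketch of Mahalanobis-distance association between a radar prediction and a camera observation, with a standard chi-square gate, might look like this; all numeric values below are hypothetical:

```python
import numpy as np

def mahalanobis_distance(z, z_pred, S):
    """Mahalanobis distance between an observation z and a predicted
    measurement z_pred, given the innovation covariance S."""
    d = z - z_pred
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

# Hypothetical 2-D position measurements (x, y) in metres.
radar_pred = np.array([12.0, 3.0])   # predicted position from the radar track
camera_obs = np.array([12.4, 3.3])   # detection from the camera
S = np.array([[0.5, 0.0],            # assumed innovation covariance
              [0.0, 0.5]])

dist = mahalanobis_distance(camera_obs, radar_pred, S)
# 95% chi-square gate for 2 degrees of freedom: sqrt(5.991) ≈ 2.45
matched = dist < np.sqrt(5.991)
```

Observations falling inside the gate would then be passed to the joint-probability data-fusion step; those outside it are treated as belonging to different targets or as clutter.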
Pages: 6640-6653
Number of pages: 14
References
37 in total
[1]   Vehicle and guard rail detection using radar and vision data fusion [J].
Alessandretti, Giancarlo ;
Broggi, Alberto ;
Cerri, Pietro .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2007, 8 (01) :95-105
[2]  
[Anonymous], 2018, IEEE C COMPUTER VISI
[3]  
[Anonymous], 2007, On the definition of information fusion as a field of research
[4]  
Bar-Shalom Y., 1998, J ACOUST SOC AM, V87, P918
[5]   TRACKING IN A CLUTTERED ENVIRONMENT WITH PROBABILISTIC DATA ASSOCIATION [J].
BAR-SHALOM, Y ;
TSE, E .
AUTOMATICA, 1975, 11 (05) :451-460
[6]   Pedestrian Motion Trajectory Prediction in Intelligent Driving from Far Shot First-Person Perspective Video [J].
Cai, Yingfeng ;
Dai, Lei ;
Wang, Hai ;
Chen, Long ;
Li, Yicheng ;
Sotelo, Miguel Angel ;
Li, Zhixiong .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (06) :5298-5313
[7]   A Novel Saliency Detection Algorithm Based on Adversarial Learning Model [J].
Cai, Yingfeng ;
Dai, Lei ;
Wang, Hai ;
Chen, Long ;
Li, Yicheng .
IEEE TRANSACTIONS ON IMAGE PROCESSING, 2020, 29 :4489-4504
[8]   Vehicle Detection by Fusing Part Model Learning and Semantic Scene Information for Complex Urban Surveillance [J].
Cai, Yingfeng ;
Liu, Ze ;
Wang, Hai ;
Chen, Xiaobo ;
Chen, Long .
SENSORS, 2018, 18 (10)
[9]   Vehicle Detection Based on Deep Dual-Vehicle Deformable Part Models [J].
Cai, Yingfeng ;
Liu, Ze ;
Sun, Xiaoqiang ;
Chen, Long ;
Wang, Hai ;
Zhang, Yong .
JOURNAL OF SENSORS, 2017, 2017
[10]   Real-Time Object Tracking on a Drone With Multi-Inertial Sensing Data [J].
Chen, Peng ;
Dang, Yuanjie ;
Liang, Ronghua ;
Zhu, Wei ;
He, Xiaofei .
IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2018, 19 (01) :131-139