Traffic participants classification based on 3D radio detection and ranging point clouds

Cited by: 11
Authors
Bai, Jie [1,2]
Li, Sen [1]
Tan, Bin [1]
Zheng, Lianqing [1]
Huang, Libo [1]
Dong, Lianfei [1]
Affiliations
[1] Tongji Univ, Sch Automot Studies, Shanghai, Peoples R China
[2] Zhejiang Univ City Coll, Sch Informat & Elect, Hangzhou, Zhejiang, Peoples R China
Keywords
automotive radar; learning (artificial intelligence); signal classification; RADAR;
DOI
10.1049/rsn2.12182
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Traffic participant classification is critical in autonomous driving perception. Millimetre-wave radio detection and ranging (RADAR) is a cost-effective and robust means of performing this task in adverse traffic scenarios such as inclement weather (e.g. fog, snow, and rain) and poor lighting conditions. Compared to commercial two-dimensional RADAR, the new generation of three-dimensional (3D) RADAR can obtain height information about targets as well as dense point clouds of them, greatly improving target classification capabilities. This study proposes a multi-object classification method for traffic participants based on 3D RADAR point clouds. First, a 22-dimensional feature vector was extracted from the 3D RADAR point-cloud distribution to describe the shape, discreteness, Doppler, and reflection-intensity features of the targets. Then, dynamic and static datasets containing five classes of targets were produced, comprising roughly 10,000 frames. Extensive experiments were conducted to build machine learning classifiers. The experimental results show that the trained classifiers achieve over 92% classification accuracy when the targets are divided into five groups and over 95% when they are divided into four groups. The proposed method can guide the design of safer and more efficient intelligent driving systems.
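The pipeline the abstract describes can be illustrated with a minimal Python sketch: hand-crafted statistics are computed over a clustered 3D radar point cloud (bounding-box extent for shape, spatial standard deviations for discreteness, Doppler and reflection-intensity statistics) and fed to a generic machine-learning classifier. The array layout, the specific statistics, and the choice of RandomForestClassifier below are illustrative assumptions; this is not the paper's 22-dimensional descriptor or its trained models.

```python
# Minimal sketch (assumptions, not the authors' exact pipeline): simple statistics
# over one radar point-cloud cluster, then a generic scikit-learn classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(cluster: np.ndarray) -> np.ndarray:
    """cluster: (N, 5) array of [x, y, z, doppler, rcs] per radar detection (assumed layout)."""
    xyz, doppler, rcs = cluster[:, :3], cluster[:, 3], cluster[:, 4]
    extent = xyz.max(axis=0) - xyz.min(axis=0)        # shape: bounding-box size
    spread = xyz.std(axis=0)                          # discreteness: spatial std dev
    dop_stats = [doppler.mean(), doppler.std(),       # Doppler statistics
                 doppler.max() - doppler.min()]
    rcs_stats = [rcs.mean(), rcs.std(), rcs.max()]    # reflection-intensity statistics
    return np.concatenate([extent, spread, dop_stats, rcs_stats,
                           [len(cluster)]])           # plus the point count

# Toy usage with two hypothetical labelled clusters (0 = pedestrian, 1 = car).
rng = np.random.default_rng(0)
clusters = [rng.normal(size=(30, 5)), rng.normal(scale=3.0, size=(120, 5))]
X = np.stack([extract_features(c) for c in clusters])
y = np.array([0, 1])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X))   # predicted class label for each cluster's feature vector
```

In practice each cluster would come from a detection-level clustering step over a radar frame, and the model would be evaluated on held-out frames; the random data above only demonstrates the interface.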
Pages: 278-290
Number of pages: 13