Sparsity-Robust Feature Fusion for Vulnerable Road-User Detection with 4D Radar

Cited by: 1
Authors
Ruddat, Leon [1 ]
Reichardt, Laurenz [1 ]
Ebert, Nikolas [1 ,2 ]
Wasenmueller, Oliver [1 ]
Affiliations
[1] Mannheim Univ Appl Sci, Res & Transfer Ctr CeMOS, D-68163 Mannheim, Germany
[2] RPTU Kaiserslautern Landau, Dept Comp Sci, D-67663 Kaiserslautern, Germany
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 7
Keywords
4D radar; 3D object detection; attention; network
DOI
10.3390/app14072781
Chinese Library Classification
O6 [Chemistry]
Subject Classification Code
0703
Abstract
Detecting vulnerable road users is a major challenge for autonomous vehicles due to their small size. Various sensor modalities have been investigated, including mono and stereo cameras and 3D LiDAR sensors, which are limited by environmental conditions and hardware costs. Radar sensors are a low-cost and robust alternative, with high-resolution 4D radar sensors being suitable for advanced detection tasks. However, they pose challenges such as few, irregularly distributed measurement points and disturbing artifacts. Learning-based approaches utilizing pillar-based networks show potential in overcoming these challenges. However, the severe sparsity of radar data makes it difficult to detect small objects represented by only a few points. We extend a pillar network with our novel Sparsity-Robust Feature Fusion (SRFF) neck, which combines high- and low-level multi-resolution features through a lightweight attention mechanism. While low-level features aid localization, high-level features allow for better classification. As sparse input data propagate through the network, the growing effective receptive field produces feature maps of different sparsities. Combining features of different sparsities improves the robustness of the network for classes with few points.
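The abstract describes fusing high- and low-level multi-resolution feature maps with a lightweight attention mechanism; the paper's exact SRFF design is not given in this record. A minimal numpy sketch of the general idea — upsampling a low-resolution high-level map, then blending it with the high-resolution low-level map via per-channel attention gates (the `w_att` projection and the gating form are illustrative assumptions, not the authors' architecture) — might look like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def upsample2x(fmap):
    """Nearest-neighbor upsampling of a (C, H, W) feature map by factor 2."""
    return fmap.repeat(2, axis=1).repeat(2, axis=2)

def fuse_multires(low_level, high_level, w_att):
    """Fuse a high-res low-level map (C, H, W) with a low-res high-level
    map (C, H/2, W/2) using a lightweight channel-attention gate.

    w_att: hypothetical (C, 2C) projection producing per-channel gates.
    """
    high_up = upsample2x(high_level)                  # (C, H, W)
    stacked = np.concatenate([low_level, high_up])    # (2C, H, W)
    # Global average pooling gives a cheap per-channel descriptor.
    desc = stacked.mean(axis=(1, 2))                  # (2C,)
    gates = sigmoid(w_att @ desc)                     # (C,) in (0, 1)
    # The gates decide the low-/high-level mix per channel.
    return gates[:, None, None] * low_level + (1 - gates)[:, None, None] * high_up

rng = np.random.default_rng(0)
C, H, W = 4, 8, 8
low = rng.standard_normal((C, H, W))        # high-res, low-level (localization)
high = rng.standard_normal((C, H // 2, W // 2))  # low-res, high-level (semantics)
w = 0.1 * rng.standard_normal((C, 2 * C))
fused = fuse_multires(low, high, w)
print(fused.shape)  # (4, 8, 8)
```

Because the gate is computed from globally pooled statistics rather than a dense attention map, the cost stays small — in line with the "lightweight" attention the abstract claims, and with the goal of letting denser low-level maps compensate where high-level maps have gone sparse.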
Pages: 11