An Extensible Multi-Sensor Fusion Framework for 3D Imaging

Cited by: 7
Authors
Siddiqui, Talha Ahmad [1 ]
Madhok, Rishi [1 ]
O'Toole, Matthew [1 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Source
2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020) | 2020
Keywords
LIDAR;
DOI
10.1109/CVPRW50498.2020.00512
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many autonomous vehicles rely on an array of sensors for safe navigation, where each sensor captures different visual attributes from the surrounding environment. For example, a single conventional camera captures high-resolution images but no 3D information; a LiDAR provides excellent range information but poor spatial resolution; and a prototype single-photon LiDAR (SP-LiDAR) can provide a dense but noisy representation of the 3D scene. Although the outputs of these sensors vary dramatically (e.g., 2D images, point clouds, 3D volumes), they all derive from the same 3D scene. We propose an extensible sensor fusion framework that (1) lifts the sensor output to volumetric representations of the 3D scene, (2) fuses these volumes together, and (3) processes the resulting volume with a deep neural network to generate a depth (or disparity) map. Although our framework can potentially extend to many types of sensors, we focus on fusing combinations of three imaging systems: monocular/stereo cameras, regular LiDARs, and SP-LiDARs. To train our neural network, we generate a synthetic dataset through CARLA that contains the individual measurements. We also conduct various fusion ablation experiments and evaluate the results of different sensor combinations.
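The three-stage pipeline described in the abstract (lift each sensor's output to a 3D volume, fuse the volumes, then decode a depth map) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the Gaussian lifting, averaged fusion, and soft-argmax decoding below stand in for the learned components, and all function names and parameters are illustrative assumptions.

```python
import numpy as np

def lift_depth_to_volume(depth, d_bins, sigma=1.0):
    """Lift a 2D depth map (H, W) to a soft occupancy volume (D, H, W).

    Each pixel's depth is splatted as a Gaussian over the depth bins,
    a simple stand-in for the paper's learned lifting step.
    """
    bins = d_bins.reshape(-1, 1, 1)                      # (D, 1, 1)
    vol = np.exp(-0.5 * ((bins - depth[None]) / sigma) ** 2)
    return vol / (vol.sum(axis=0, keepdims=True) + 1e-8)

def fuse_volumes(volumes, weights=None):
    """Fuse per-sensor volumes by a weighted average (one simple choice;
    the paper fuses volumes with a deep network instead)."""
    vols = np.stack(volumes)                             # (N, D, H, W)
    if weights is None:
        weights = np.full(len(volumes), 1.0 / len(volumes))
    return np.tensordot(weights, vols, axes=1)           # (D, H, W)

def volume_to_depth(volume, d_bins):
    """Decode a depth map via soft-argmax over the depth bins,
    in place of the paper's CNN regression head."""
    probs = volume / (volume.sum(axis=0, keepdims=True) + 1e-8)
    return np.tensordot(d_bins, probs, axes=1)           # (H, W)

# Toy example: two "sensors" observing a flat scene at depth 5 m.
d_bins = np.linspace(0.0, 10.0, 21)
depth = np.full((4, 4), 5.0)
v_cam = lift_depth_to_volume(depth, d_bins, sigma=1.0)   # sharp sensor
v_lidar = lift_depth_to_volume(depth, d_bins, sigma=2.0) # noisier sensor
fused = fuse_volumes([v_cam, v_lidar])
recovered = volume_to_depth(fused, d_bins)
```

Because both volumes peak at the same bin, the fused volume decodes back to the original depth; with real sensors the fusion step would reconcile conflicting, noisy evidence, which is what the paper's neural network learns to do.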
Pages: 4344-4353
Page count: 10