Hard-to-Detect Obstacle Mapping by Fusing LIDAR and Depth Camera

Cited by: 4
Authors
Jeyabal, Sidharth [1 ,2 ]
Sachinthana, W. K. R. [1 ]
Samarakoon, S. M. P. Bhagya [1 ]
Elara, Mohan Rajesh [1 ]
Sheu, Bing J. [2 ]
Affiliations
[1] Singapore Univ Technol & Design, Engn Prod Dev Pillar, Singapore 487372, Singapore
[2] Chang Gung Univ, Coll Engn, Dept Elect Engn, Taoyuan 330, Taiwan
Keywords
Robots; Laser radar; Sensors; Navigation; Cameras; Glass; Sensor fusion; Coverage path planning (CPP); Mapping; Obstacle detection; Robot safety; Vision
DOI
10.1109/JSEN.2024.3409623
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
In the era of autonomy, intelligent systems capable of navigating and perceiving their surroundings have become ubiquitous. Many sensors have been developed for environmental perception, with LIDAR emerging as a preeminent technology for precise obstacle detection. However, LIDAR has inherent limitations: it cannot detect obstacles that lie below its scanning height or that its rays pass through. Environments where robots are typically deployed often contain such obstacles, which can cause collisions and entanglements and degrade performance. This research addresses these limitations by recognizing obstacles that traditionally challenge LIDAR's detection capabilities. Objects such as glass, carpets, wires, and ramps are identified as hard-to-detect objects for LIDAR (HDOL). YOLOv8 is used with a depth camera to detect HDOL, and the detected objects are incorporated into the environmental map, circumventing the constraints posed by LIDAR. Furthermore, an HDOL-aware coverage path planning (CPP) method is proposed that combines boustrophedon motion with the A* algorithm to navigate the robot safely through the environment. Real-world experiments validate the applicability of the proposed method for ensuring robot safety.
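As a rough illustration of the detection-to-mapping step described in the abstract (not the authors' implementation), the following Python sketch uses the Ultralytics YOLOv8 API to detect assumed HDOL classes (glass, carpet, wire, ramp) in an RGB frame, reads each detection's range from the aligned depth image, and marks the corresponding cell of a 2-D occupancy grid as occupied. The camera intrinsics, grid resolution, class names, robot-pose convention, and the weights file `hdol_yolov8.pt` are illustrative assumptions.

```python
import numpy as np
from ultralytics import YOLO  # pip install ultralytics

# Assumed HDOL label set; the actual class list depends on the trained model.
HDOL_CLASSES = {"glass", "carpet", "wire", "ramp"}
# Assumed pinhole intrinsics of the depth-aligned RGB camera (pixels).
FX, CX = 600.0, 320.0
GRID_RES = 0.05  # metres per occupancy-grid cell (assumed)


def mark_hdol_on_grid(grid, rgb, depth_m, model, robot_pose):
    """Detect HDOL objects in `rgb`, look up their range in the aligned depth
    image `depth_m` (metres), and mark the matching occupancy-grid cells.
    `robot_pose` is (x, y, yaw) in the map frame; the camera is assumed to be
    level and aligned with the robot's heading, and the map origin is assumed
    to coincide with grid cell (0, 0)."""
    x_r, y_r, yaw = robot_pose
    result = model(rgb, verbose=False)[0]  # single-image YOLOv8 inference
    for box, cls_id in zip(result.boxes.xyxy.cpu().numpy(),
                           result.boxes.cls.cpu().numpy()):
        label = model.names[int(cls_id)]
        if label not in HDOL_CLASSES:
            continue
        u = int((box[0] + box[2]) / 2)          # bounding-box centre (pixels)
        v = int((box[1] + box[3]) / 2)
        z = float(depth_m[v, u])                # range along the optical axis
        if not np.isfinite(z) or z <= 0.0:
            continue                            # skip invalid depth readings
        x_cam = (u - CX) * z / FX               # lateral offset in camera frame
        # Rotate/translate the detection into the map frame, mark it occupied.
        x_map = x_r + z * np.cos(yaw) - x_cam * np.sin(yaw)
        y_map = y_r + z * np.sin(yaw) + x_cam * np.cos(yaw)
        grid[int(y_map / GRID_RES), int(x_map / GRID_RES)] = 100


# Example usage (hypothetical weights file and sensor frames):
# model = YOLO("hdol_yolov8.pt")
# grid = np.zeros((400, 400), dtype=np.int8)
# mark_hdol_on_grid(grid, rgb_frame, depth_frame_m, model, (2.0, 3.0, 0.0))
```

Cells marked in this way could then be treated as obstacles by the HDOL-aware boustrophedon/A* coverage planner described in the paper.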
Pages: 24690-24698
Page count: 9