Hard-to-Detect Obstacle Mapping by Fusing LIDAR and Depth Camera

Cited by: 2
Authors
Jeyabal, Sidharth [1 ,2 ]
Sachinthana, W. K. R. [1 ]
Samarakoon, S. M. P. Bhagya [1 ]
Elara, Mohan Rajesh [1 ]
Sheu, Bing J. [2 ]
Affiliations
[1] Singapore Univ Technol & Design, Engn Prod Dev Pillar, Singapore 487372, Singapore
[2] Chang Gung Univ, Coll Engn, Dept Elect Engn, Taoyuan 330, Taiwan
Keywords
Robots; Laser radar; Sensors; Navigation; Cameras; Glass; Sensor fusion; Coverage path planning (CPP); Mapping; Obstacle detection; Robot safety; Vision
DOI
10.1109/JSEN.2024.3409623
CLC Classification Code
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Code
0808; 0809
Abstract
In the era of autonomy, intelligent systems capable of navigating and perceiving their surroundings have become ubiquitous. Many sensors have been developed for environmental perception, with LIDAR emerging as a preeminent technology for precise obstacle detection. However, LIDAR has inherent limitations: it cannot detect obstacles that lie below its scanning plane or that its rays pass through. Environments where robots are typically deployed often contain such obstacles, which can cause collisions and entanglements during operation and degrade performance. This research addresses these limitations by recognizing obstacles that traditionally challenge LIDAR's detection capabilities. Objects such as glass, carpets, wires, and ramps have been identified as hard-to-detect objects by LIDAR (HDOL). YOLOv8 has been used to detect HDOL with a depth camera, and the detected HDOL are incorporated into the environmental map, circumventing the constraints posed by LIDAR. Furthermore, HDOL-aware coverage path planning (CPP) has been proposed, combining boustrophedon motion with the A* algorithm to navigate the robot safely through the environment. Real-world experiments have validated the applicability of the proposed method for ensuring robot safety.
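The abstract outlines the pipeline but provides no implementation details here. The following is a minimal sketch, under stated assumptions, of how YOLOv8 detections from a depth camera could be projected into a 2D occupancy grid so a planner can treat HDOL as obstacles. The weights file hdol_yolov8.pt, the class names, the camera intrinsics, and the grid parameters are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch (not the authors' released code): run YOLOv8 on the RGB frame of a
# depth camera, look up the depth of each HDOL detection, back-project it with the
# pinhole model, and mark the corresponding cell in a 2D occupancy grid.
import numpy as np
from ultralytics import YOLO

FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0         # assumed pinhole intrinsics (pixels)
GRID_RES = 0.05                                      # occupancy-grid resolution (m / cell)
HDOL_CLASSES = {"glass", "carpet", "wire", "ramp"}   # HDOL categories named in the paper

model = YOLO("hdol_yolov8.pt")                       # hypothetical fine-tuned HDOL weights


def mark_hdol(rgb, depth_m, grid):
    """Detect HDOL in `rgb`, localize them with the aligned depth image `depth_m`
    (metres), and set the matching cells of `grid` (H x W int array) to occupied.
    For simplicity the camera frame is assumed aligned with the grid frame; a real
    system would first transform each point by the robot's pose in the map."""
    result = model(rgb, verbose=False)[0]
    boxes = result.boxes.xyxy.cpu().numpy()
    classes = result.boxes.cls.cpu().numpy().astype(int)
    for (x1, y1, x2, y2), cls_id in zip(boxes, classes):
        if model.names[cls_id] not in HDOL_CLASSES:
            continue
        u, v = int((x1 + x2) / 2), int((y1 + y2) / 2)    # bounding-box centre (pixels)
        z = float(depth_m[v, u])                         # depth at the centre pixel (m)
        if not np.isfinite(z) or z <= 0.0:
            continue
        x_cam = (u - CX) * z / FX                        # lateral offset in metres
        gx = int(z / GRID_RES)                           # forward axis -> grid column
        gy = int(x_cam / GRID_RES) + grid.shape[0] // 2  # lateral axis -> grid row
        if 0 <= gy < grid.shape[0] and 0 <= gx < grid.shape[1]:
            grid[gy, gx] = 100                           # mark cell as HDOL obstacle
    return grid
```

An HDOL-aware planner could then sweep the free cells of the augmented grid with boustrophedon motion and invoke A* when a marked cell blocks the sweep, in line with the CPP approach the abstract describes.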
Pages: 24690-24698
Number of pages: 9