Autonomous Navigation System for Indoor Mobile Robots Based on a Multi-sensor Fusion Technology

Cited: 0
Authors
Wang, Hongcheng [1 ]
Chen, Niansheng [1 ]
Yang, Dingyu [2 ]
Fan, Guangyu [1 ]
Affiliations
[1] Shanghai Dianji Univ, Sch Elect Informat Engn, Shanghai, Peoples R China
[2] Alibaba Grp, Shanghai, Peoples R China
Source
COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT I | 2022 / Vol. 1491
Funding
National Natural Science Foundation of China
Keywords
Mobile robot; RTABMAP; Sensor fusion; Path planning;
DOI
10.1007/978-981-19-4546-5_39
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Map construction and path planning are two critical problems for an autonomous navigation system. One traditional approach is to construct a 2D grid map from LiDAR data, but this method has limitations: it ignores 3D information, which reduces navigation accuracy. Another approach is visual SLAM, such as the ORB-SLAM2 and S-PTAM algorithms, which can recognize 3D objects; however, visual methods perform poorly under lighting changes. Conventional local path planning algorithms, such as TEB and DWA, have been proposed for autonomous navigation, but they are prone to stalling in local optima or to collisions caused by sudden speed changes in constrained environments. To address these issues, this paper proposes a multi-sensor fusion method for map construction and autonomous navigation. First, the fusion model combines an RGB-D camera, a LiDAR, and an inertial measurement unit (IMU) to construct 2D grid maps and 3D color point cloud maps in real time. Next, we present an improved local planning algorithm (Opt_TEB) that solves the velocity mutation problem, enabling the robot to follow a collision-free path. We implemented the whole system on the ROS framework, a widely used open-source robot operating system. The map construction and path planning algorithms run on the robot, while the visualization and control modules are deployed on a back-end server. The experimental results show that the multi-sensor fusion map conforms to the original environment better than the 2D grid map does. Furthermore, our improved algorithm Opt_TEB runs smoothly and caused no collisions with obstacles in 30 trials. Navigation speed is improved by 4.2% and 11.5% compared to TEB and DWA, respectively.
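The record does not detail how Opt_TEB suppresses velocity mutation. As a generic, hypothetical illustration of the underlying idea (not the paper's actual algorithm), one common remedy is to clamp the change in commanded velocity per control cycle to an acceleration limit; the function name and parameters below are illustrative assumptions:

```python
def smooth_velocity(v_prev, v_target, a_max, dt):
    """Limit the commanded velocity change per control cycle.

    v_prev:   velocity commanded in the previous cycle (m/s)
    v_target: velocity requested by the local planner (m/s)
    a_max:    maximum allowed acceleration magnitude (m/s^2)
    dt:       control period (s)
    """
    dv = v_target - v_prev
    limit = a_max * dt  # largest velocity step allowed this cycle
    if dv > limit:
        dv = limit
    elif dv < -limit:
        dv = -limit
    return v_prev + dv
```

For example, with `a_max = 0.5` m/s² and a 10 Hz control loop, a sudden request to jump from 0 to 1 m/s is spread over many cycles, each advancing the command by at most 0.05 m/s.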
Pages: 502-517 (16 pages)