FADM-SLAM: a fast and accurate dynamic intelligent motion SLAM for autonomous robot exploration involving movable objects

Cited by: 20
Authors
Ul Islam, Qamar [1 ]
Ibrahim, Haidi [1 ]
Chin, Pan Kok [2 ]
Lim, Kevin [2 ]
Abdullah, Mohd Zaid [1 ]
Affiliations
[1] Univ Sains Malaysia, Sch Elect & Elect Engn, Engn Campus, Nibong Tebal, Malaysia
[2] PixArt Imaging (Penang) Sdn Bhd, Kompleks Eureka, Gelugor, Malaysia
Source
ROBOTIC INTELLIGENCE AND AUTOMATION | 2023, Vol. 43, No. 3
Keywords
SLAM; Optical flow; Intelligent motion detection; Multi-view geometry; Dynamic environment; Ego motion estimation; Feature-based extraction; Moving objects;
DOI
10.1108/RIA-11-2022-0269
CLC Number
TP [Automation and computer technology];
Discipline Code
0812 ;
Abstract
Purpose
Many popular simultaneous localization and mapping (SLAM) techniques have low accuracy, especially when localizing in environments containing dynamically moving objects, since the presence of such objects can cause inaccurate data associations. To address this issue, the proposed FADM-SLAM system aims to improve the accuracy of SLAM in environments containing dynamically moving objects. It achieves this with a pipeline of feature-based approaches constrained by sparse optical flow and multi-view geometry.
Design/methodology/approach
FADM-SLAM, which works with monocular, stereo and RGB-D sensors, combines an instance segmentation network incorporating an intelligent motion detection strategy (iM) with an optical flow technique to improve localization accuracy. The system comprises four principal modules: the optical flow mask and iM, ego-motion estimation, dynamic point detection and the feature-based extraction framework.
Findings
Experimental results on the publicly available RGBD-Bonn data set indicate that FADM-SLAM outperforms established visual SLAM systems in highly dynamic conditions.
Originality/value
The first module flags dynamic objects by using optical flow and iM with geometry-aware segmentation; this indication is then used by the second module to compute an initial pose estimate. The third module first searches for dynamic feature points in the environment and then eliminates them from further processing, using an algorithm based on epipolar constraints. In this way, only the static feature points are retained, which are then fed to the fourth module for extracting important features.
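The epipolar-constraint check described in the Originality/value paragraph can be sketched as follows. This is a minimal illustration, not the authors' implementation: a correspondence between two frames is treated as static when its point-to-epipolar-line residual is small, and as dynamic otherwise. The fundamental matrix `F`, the toy point sets and the 1-pixel threshold are all assumed for the example.

```python
import numpy as np

def epipolar_distances(F, pts1, pts2):
    """Point-to-epipolar-line distance for each correspondence.
    pts1, pts2: (N, 2) pixel coordinates in the two frames."""
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])   # homogeneous points, (N, 3)
    x2 = np.hstack([pts2, ones])
    lines = x1 @ F.T               # epipolar lines l_i = F x1_i, (N, 3)
    num = np.abs(np.sum(x2 * lines, axis=1))        # |x2^T F x1|
    den = np.sqrt(lines[:, 0]**2 + lines[:, 1]**2)  # line normalization
    return num / den

def split_static_dynamic(F, pts1, pts2, thresh=1.0):
    """Classify correspondences: static if the epipolar residual
    is below `thresh` pixels, dynamic otherwise."""
    d = epipolar_distances(F, pts1, pts2)
    static = d < thresh
    return static, ~static

# Toy setup: camera translating along x with identity intrinsics,
# so F = [t]_x for t = (1, 0, 0) and epipolar lines are horizontal
# (a static point keeps the same y-coordinate between frames).
F = np.array([[0., 0., 0.],
              [0., 0., -1.],
              [0., 1., 0.]])
pts1 = np.array([[100., 50.], [200., 80.], [150., 120.]])
pts2 = np.array([[110., 50.], [210., 80.], [160., 145.]])  # last point left its line
static, dynamic = split_static_dynamic(F, pts1, pts2)
print(static)  # → [ True  True False]
```

In a real pipeline `F` would be estimated from the tracked features themselves (e.g. with RANSAC), and the dynamic mask from the segmentation module would be fused with this geometric test before feature points reach the extraction stage.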
Pages: 254-266 (13 pages)