Visual Localization and Mapping in Dynamic and Changing Environments

Cited by: 2
Authors
Soares, Joao Carlos Virgolino [1 ,3 ]
Medeiros, Vivian Suzano [2 ]
Abati, Gabriel Fischer [3 ]
Becker, Marcelo [2 ]
Caurin, Glauco [4 ]
Gattass, Marcelo [5 ]
Meggiolaro, Marco Antonio [3 ]
Affiliations
[1] Ist Italiano Tecnol IIT, Dynam Legged Syst lab, Via S Quirico 19d, I-16163 Genoa, GE, Italy
[2] Univ Sao Paulo, Dept Mech Engn, Ave Trabalhador Sao Carlense, BR-13566590 Sao Carlos, SP, Brazil
[3] Pontifical Catholic Univ Rio De Janeiro, Dept Mech Engn, Marques Sao Vicente, BR-22451040 Rio De Janeiro, RJ, Brazil
[4] Univ Sao Paulo, Dept Aeronaut, Ave Joao Dagnone, BR-13563120 Sao Carlos, SP, Brazil
[5] Pontifical Catholic Univ Rio De Janeiro, Dept Informat, BR-22451900 Rio De Janeiro, RJ, Brazil
Funding
São Paulo Research Foundation (FAPESP), Brazil;
Keywords
SLAM; Object detection; Segmentation and categorization; Localization; RGB-D perception; POSE GRAPH; SLAM; TRACKING;
DOI
10.1007/s10846-023-02019-6
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The real-world deployment of fully autonomous mobile robots depends on a robust simultaneous localization and mapping (SLAM) system capable of handling both dynamic environments, where objects move in front of the robot, and changing environments, where objects are moved or replaced after the robot has already mapped the scene. This paper proposes Changing-SLAM, a method for robust visual SLAM in both dynamic and changing environments. Robustness is achieved with a Bayesian filter combined with a long-term data association algorithm. In addition, an efficient dynamic-keypoint filtering algorithm based on object detection correctly identifies features inside a detection's bounding box that are not actually dynamic, preventing the feature depletion that could cause tracking loss. Furthermore, a new RGB-D dataset, the PUC-USP dataset, was developed specifically to evaluate changing environments at the object level. Its six sequences, recorded with a mobile robot, an RGB-D camera, and a motion capture system, were designed to capture scenarios that could lead to tracking failure or map corruption. Changing-SLAM assumes neither a given camera pose nor a known map, and operates in real time. The proposed method was evaluated on benchmark datasets and compared with other state-of-the-art methods, demonstrating high accuracy.
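The dynamic-keypoint filtering idea described in the abstract can be sketched as follows: discard features that fall inside a detector's bounding box unless they are judged static. This is a hypothetical illustration, not the authors' implementation; in particular, the `is_static` predicate here is a stand-in for the paper's combination of a Bayesian filter and long-term data association.

```python
def filter_keypoints(keypoints, boxes, is_static):
    """Keep keypoints outside all detection boxes, plus in-box points
    judged static (so static features on or behind a movable object are
    not needlessly discarded, avoiding feature depletion).

    keypoints: list of (x, y) pixel coordinates
    boxes: list of (x_min, y_min, x_max, y_max) detections of movable objects
    is_static: predicate (x, y) -> bool, e.g. a motion-consistency test
    """
    kept = []
    for (x, y) in keypoints:
        in_box = any(x0 <= x <= x1 and y0 <= y <= y1
                     for (x0, y0, x1, y1) in boxes)
        if not in_box or is_static(x, y):
            kept.append((x, y))
    return kept
```

In a full pipeline, the boxes would come from an object detector such as YOLOv4 (cited below as reference-style metadata), and only the surviving keypoints would be passed to tracking and mapping.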
Pages: 20