Real-time large-scale dense RGB-D SLAM with volumetric fusion

Cited by: 304
Authors
Whelan, Thomas [1 ]
Kaess, Michael [2 ]
Johannsson, Hordur [3 ]
Fallon, Maurice [3 ]
Leonard, John J. [3 ]
McDonald, John [1 ]
Affiliations
[1] Natl Univ Ireland, Dept Comp Sci, Maynooth, Kildare, Ireland
[2] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
[3] MIT, Cambridge, MA 02139 USA
Source
Funding
Science Foundation Ireland; US National Science Foundation;
Keywords
Volumetric fusion; camera pose estimation; dense methods; large scale; real time; RGB-D; SLAM; GPU;
DOI
10.1177/0278364914551008
Chinese Library Classification
TP24 [Robotics];
Subject Classification Codes
080202; 1405;
Abstract
We present a new simultaneous localization and mapping (SLAM) system capable of producing high-quality globally consistent surface reconstructions over hundreds of meters in real time with only a low-cost commodity RGB-D sensor. By using a fused volumetric surface reconstruction we achieve a much higher quality map than would be achieved using raw RGB-D point clouds. In this paper we highlight three key techniques associated with applying a volumetric fusion-based mapping system to the SLAM problem in real time. First, the use of a GPU-based 3D cyclical buffer trick to efficiently extend dense every-frame volumetric fusion of depth maps to function over an unbounded spatial region. Second, overcoming camera pose estimation limitations in a wide variety of environments by combining both dense geometric and photometric camera pose constraints. Third, efficiently updating the dense map according to place recognition and subsequent loop closure constraints by the use of an 'as-rigid-as-possible' space deformation. We present results on a wide variety of aspects of the system and show through evaluation on de facto standard RGB-D benchmarks that our system performs strongly in terms of trajectory estimation, map quality and computational performance in comparison to other state-of-the-art systems.
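To make the first of the three techniques above more concrete, the sketch below illustrates the core idea behind a cyclical (rolling) voxel buffer: a fixed-size TSDF volume is addressed modulo its resolution, so the active reconstruction region can follow the camera without copying any voxel data. This is a minimal, CPU-side illustrative sketch only; the class and method names (CyclicalTSDFBuffer, integrate, shift) and parameter defaults are assumptions made for this example, not the authors' GPU implementation, which additionally streams the evicted surface slices out into the growing global map.

```python
# Minimal sketch (not the paper's implementation) of a cyclical 3D TSDF buffer:
# voxels are addressed modulo the buffer resolution, so "shifting" the active
# region as the camera moves only resets the slabs that leave it.

import numpy as np


class CyclicalTSDFBuffer:
    def __init__(self, resolution=128, voxel_size=0.05):
        self.res = resolution                 # voxels per axis
        self.voxel_size = voxel_size          # metres per voxel (illustrative default)
        self.origin = np.zeros(3, dtype=int)  # world-grid corner of the active region
        # truncated signed distances and integration weights
        self.tsdf = np.ones((resolution,) * 3, dtype=np.float32)
        self.weight = np.zeros((resolution,) * 3, dtype=np.float32)

    def _index(self, world_voxel):
        """Map a world-grid voxel coordinate to its slot in the fixed buffer (modulo wrap)."""
        return tuple(np.asarray(world_voxel, dtype=int) % self.res)

    def integrate(self, world_voxel, sdf, w=1.0):
        """Fuse one truncated signed-distance observation with a running weighted average."""
        i = self._index(world_voxel)
        old_w = self.weight[i]
        self.tsdf[i] = (self.tsdf[i] * old_w + sdf * w) / (old_w + w)
        self.weight[i] = old_w + w

    def shift(self, delta_voxels):
        """Virtually translate the active region by `delta_voxels` along each axis.

        Only the slabs leaving the active region are touched: they are reset and
        immediately become the storage for the newly revealed region on the far
        side. No other voxel data moves, which is what makes the shift cheap.
        """
        delta = np.asarray(delta_voxels, dtype=int)
        for axis, d in enumerate(delta):
            for step in range(abs(d)):
                # world-grid index of the plane that is leaving the active region
                leaving = self.origin[axis] + (step if d > 0 else self.res - 1 - step)
                sl = [slice(None)] * 3
                sl[axis] = int(leaving) % self.res
                self.tsdf[tuple(sl)] = 1.0    # reset evicted slab for reuse
                self.weight[tuple(sl)] = 0.0
        self.origin += delta                  # move the virtual window; nothing is copied


if __name__ == "__main__":
    buf = CyclicalTSDFBuffer(resolution=8)
    buf.integrate((3, 3, 3), sdf=0.2)         # fuse a measurement near the surface
    buf.shift((2, 0, 0))                      # camera moved two voxels along +x
    print(buf.tsdf[buf._index((3, 3, 3))])    # still ~0.2: the fused voxel was untouched
```

In the system described by the paper the same modulo addressing is applied per voxel on the GPU, and the slices that leave the volume are first extracted as surface data and passed to the global map rather than simply discarded as in this sketch.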
Pages: 598-626
Number of pages: 29