Rapid-Mapping: LiDAR-Visual Implicit Neural Representations for Real-Time Dense Mapping

Cited by: 0
Authors
Zhang, Hanwen [1 ]
Zou, Yujie [1 ]
Yan, Zhewen [1 ]
Cheng, Hui [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
Source
IEEE ROBOTICS AND AUTOMATION LETTERS | 2024, Vol. 9, No. 9
Funding
National Natural Science Foundation of China;
Keywords
Deep learning for visual perception; mapping; visual learning;
DOI
10.1109/LRA.2024.3440729
CLC Number
TP24 [Robotics];
Discipline Classification Code
080202; 1405;
Abstract
Real-time dense mapping with high-fidelity textures in large-scale environments remains a major challenge for robotics, digital twins, and AR/VR applications. Neural Radiance Fields (NeRF) have demonstrated remarkable capabilities in capturing intricate details while saving memory, offering significant advantages for the fine-grained reconstruction of large-scale scenes. However, existing LiDAR-based mapping methods have not harnessed NeRF's ability to capture textures. In this letter, we propose Rapid-Mapping, the first real-time LiDAR-visual mapping method for large-scale indoor and outdoor environments that utilizes implicit neural representations while preserving high-fidelity textures. First, to align camera images with LiDAR depth, we propose an extrinsic refinement method that mitigates the texture blurring caused by long-range measurements over extended time scales. Second, to alleviate the effects of varying lighting conditions and camera hardware interference, we utilize prior hue information to constrain an inverse affine color transformation. Extensive experiments validate that Rapid-Mapping enables real-time dense mapping in large-scale, complex indoor and outdoor scenes, exhibiting more detailed, realistic textures and more accurate geometry than existing methods.
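The abstract names two mechanisms without detailing them: refining the LiDAR-camera extrinsics so that image texture and LiDAR depth stay aligned, and using prior hue information to constrain a per-image inverse affine color transformation against lighting and camera hardware variation. The Python/NumPy sketch below illustrates the generic building blocks behind both ideas under stated assumptions (a pinhole camera model, a per-frame affine color model corrected = a * rendered + b, and an L2 hue penalty); the function names, the weight w_hue, and all modeling choices are illustrative assumptions, not Rapid-Mapping's actual implementation.

import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    # Sketch only: project LiDAR points through assumed pinhole intrinsics K.
    # T_cam_lidar (4x4, LiDAR -> camera) is the quantity an extrinsic
    # refinement step would optimize to keep texture aligned with depth.
    pts_h = np.hstack([points_lidar, np.ones((points_lidar.shape[0], 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    in_front = pts_cam[:, 2] > 1e-6              # drop points behind the camera
    pts_cam = pts_cam[in_front]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective division
    return uv, pts_cam[:, 2]

def rgb_to_hue(rgb):
    # Hue channel (in [0, 1)) of RGB values given in [0, 1].
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    d = np.where(mx > mn, mx - mn, 1.0)          # avoid division by zero on gray
    h = np.where(mx == r, ((g - b) / d) % 6.0,
        np.where(mx == g, (b - r) / d + 2.0, (r - g) / d + 4.0))
    return h / 6.0

def hue_constrained_color_loss(rendered, observed, a, b, hue_prior, w_hue=0.1):
    # Hypothetical loss: (a, b) form a per-frame affine color model absorbing
    # exposure/white-balance changes; the hue prior keeps the correction from
    # drifting the chromaticity of the underlying radiance field.
    corrected = a * rendered + b
    photometric = np.mean((corrected - observed) ** 2)
    hue_term = np.mean((rgb_to_hue(np.clip(corrected, 0.0, 1.0)) - hue_prior) ** 2)
    return photometric + w_hue * hue_term

For instance, project_lidar_to_image(points, np.eye(4), K) yields the pixel locations at which rendered color can be compared against LiDAR depth; perturbing and re-optimizing T_cam_lidar against such a photometric residual is one plausible form of the extrinsic refinement the abstract describes.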
Pages: 8154-8161
Number of pages: 8
Related Papers
50 records in total
  • [1] An asymmetric real-time dense visual localisation and mapping system
    Comport, Andrew I.
    Meilland, Maxime
    Rives, Patrick
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCV WORKSHOPS), 2011
  • [2] Dense visual mapping of large scale environments for real-time localisation
    Meilland, Maxime
    Comport, Andrew Ian
    Rives, Patrick
    2011 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2011: 4242-4248
  • [3] Real-time Omnidirectional Visual SLAM with Semi-Dense Mapping
    Wang, Senbo
    Yue, Jiguang
    Dong, Yanchao
    Shen, Runjie
    Zhang, Xinyu
    2018 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2018: 695-700
  • [4] iMAP: Implicit Mapping and Positioning in Real-Time
    Sucar, Edgar
    Liu, Shikun
    Ortiz, Joseph
    Davison, Andrew J.
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 6209-6218
  • [5] DTAM: Dense Tracking and Mapping in Real-Time
    Newcombe, Richard A.
    Lovegrove, Steven J.
    Davison, Andrew J.
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011: 2320-2327
  • [6] Real-time Scalable Dense Surfel Mapping
    Wang, Kaixuan
    Gao, Fei
    Shen, Shaojie
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019: 6919-6925
  • [7] HI-SLAM: Monocular Real-Time Dense Mapping With Hybrid Implicit Fields
    Zhang, Wei
    Sun, Tiecheng
    Wang, Sen
    Cheng, Qing
    Haala, Norbert
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (02): 1548-1555
  • [8] Robust Real-Time Visual Odometry for Dense RGB-D Mapping
    Whelan, Thomas
    Johannsson, Hordur
    Kaess, Michael
    Leonard, John J.
    McDonald, John
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2013: 5724-5731
  • [9] CodeMapping: Real-Time Dense Mapping for Sparse SLAM using Compact Scene Representations
    Matsuki, Hidenobu
    Scona, Raluca
    Czarnowski, Jan
    Davison, Andrew J.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2021, 6 (04): 7105-7112
  • [10] iMODE: Real-Time Incremental Monocular Dense Mapping Using Neural Field
    Matsuki, Hidenobu
    Sucar, Edgar
    Laidlow, Tristan
    Wada, Kentaro
    Scona, Raluca
    Davison, Andrew J.
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2023: 4171-4177