Uncertainty-Aware Depth Network for Visual Inertial Odometry of Mobile Robots

Cited by: 1
Authors
Song, Jimin [1 ]
Jo, Hyunggi [1 ]
Jin, Yongsik [2 ]
Lee, Sang Jun [1 ]
Affiliations
[1] Jeonbuk Natl Univ, Div Elect Engn, 567 Baekje Daero, Jeonju 54896, South Korea
[2] Elect & Telecommun Res Inst ETRI, Daegu Gyeongbuk Res Ctr, Daegu 42994, South Korea
Funding
National Research Foundation, Singapore;
Keywords
simultaneous localization and mapping; visual-inertial odometry; depth estimation; uncertainty estimation; parking lot dataset; SLAM; VERSATILE; ROBUST; MONO;
DOI
10.3390/s24206665
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Code
070302; 081704;
Abstract
Simultaneous localization and mapping, a critical technology for enabling the autonomous driving of vehicles and mobile robots, increasingly incorporates multi-sensor configurations. Inertial measurement units (IMUs), known for their ability to measure acceleration and angular velocity, are widely utilized for motion estimation due to their cost efficiency. However, the inherent noise in IMU measurements necessitates the integration of additional sensors to facilitate spatial understanding for mapping. Visual-inertial odometry (VIO) is a prominent approach that combines cameras with IMUs, offering high spatial resolution while maintaining cost-effectiveness. In this paper, we introduce our uncertainty-aware depth network (UD-Net), which is designed to estimate both depth and uncertainty maps. We propose a novel loss function for training UD-Net, and unreliable depth values are filtered out based on the uncertainty maps to improve VIO performance. Experiments were conducted on the KITTI dataset and our custom dataset acquired from various driving scenarios. Experimental results demonstrated that the proposed VIO algorithm based on UD-Net outperforms previous methods by a significant margin.
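The record describes the approach only at a high level: UD-Net predicts a depth map together with a per-pixel uncertainty map, a dedicated loss trains both outputs jointly, and high-uncertainty depths are discarded before being used by the VIO pipeline. The sketch below illustrates that general recipe rather than the paper's actual formulation; it assumes a generic heteroscedastic (log-variance weighted) regression loss and a simple quantile threshold for filtering, and the names uncertainty_loss, filter_depth, and keep_ratio are hypothetical.

import torch

def uncertainty_loss(pred_depth, log_var, gt_depth, valid_mask):
    # Generic heteroscedastic regression loss: the absolute depth error is
    # down-weighted where the network predicts high uncertainty (large
    # log-variance), while the +log_var term keeps the network from
    # inflating uncertainty everywhere. Stand-in only, not the UD-Net loss.
    residual = torch.abs(pred_depth - gt_depth)
    per_pixel = residual * torch.exp(-log_var) + log_var
    return per_pixel[valid_mask].mean()

def filter_depth(pred_depth, log_var, keep_ratio=0.7):
    # Keep the most confident fraction of pixels (lowest predicted
    # uncertainty) and zero out the rest; keep_ratio is an arbitrary
    # illustrative choice, not a value taken from the paper.
    threshold = torch.quantile(log_var.flatten(), keep_ratio)
    mask = log_var <= threshold
    return torch.where(mask, pred_depth, torch.zeros_like(pred_depth)), mask

In such a setup, only the retained depths (where mask is True) would be passed on as depth priors for the visual-inertial odometry back end.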
Pages: 14
Related Papers
50 records in total
  • [21] Monocular Visual Inertial Navigation for Mobile Robots using Uncertainty based Triangulation
    Heo, Sejong
    Cha, Jaehyuck
    Park, Chan Gook
    IFAC PAPERSONLINE, 2017, 50 (01): 2217-2222
  • [22] Stereo Visual Inertial Odometry for Robots with Limited Computational Resources
    Bahnam, Stavrow
    Pfeiffer, Sven
    de Croon, Guido C. H. E.
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021: 9154-9159
  • [23] UTLNet: Uncertainty-Aware Transformer Localization Network for RGB-Depth Mirror Segmentation
    Zhou, Wujie
    Cai, Yuqi
    Zhang, Liting
    Yan, Weiqing
    Yu, Lu
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 4564-4574
  • [24] Uncertainty-Aware Knowledge Distillation for Collision Identification of Collaborative Robots
    Kwon, Wookyong
    Jin, Yongsik
    Lee, Sang Jun
    SENSORS, 2021, 21 (19)
  • [25] Uncertainty-Aware Resource Provisioning for Network Slicing
    Luu, Quang-Trung
    Kerboeuf, Sylvaine
    Kieffer, Michel
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2021, 18 (01): 79-93
  • [26] Uncertainty-Aware Optimization for Network Provisioning and Routing
    Bi, Yingjie
    Tang, Ao
    2019 53RD ANNUAL CONFERENCE ON INFORMATION SCIENCES AND SYSTEMS (CISS), 2019
  • [27] Survey of Research on Visual Odometry Technology for Mobile Robots
    Chen M.
    Huang L.
    Wang S.
    Zhang Y.
    Chen Z.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2024, 55 (03): 1-20
  • [28] Unsupervised visual odometry method for greenhouse mobile robots
    Wu X.
    Zhou Y.
    Liu J.
    Liu Z.
    Wang C.
    Nongye Gongcheng Xuebao/Transactions of the Chinese Society of Agricultural Engineering, 2023, 39 (10): 163-174
  • [29] Uncertainty-aware visual analytics: scope, opportunities, and challenges
    Robin G. C. Maack
    Gerik Scheuermann
    Hans Hagen
    Jose Tiberio Hernández Peñaloza
    Christina Gillmann
    The Visual Computer, 2023, 39: 6345-6366
  • [30] Stereo Visual Odometry for Mobile Robots on Uneven Terrain
    Ericson, Stefan
    Astrand, Bjorn
    WCECS 2008: ADVANCES IN ELECTRICAL AND ELECTRONICS ENGINEERING - IAENG SPECIAL EDITION OF THE WORLD CONGRESS ON ENGINEERING AND COMPUTER SCIENCE, PROCEEDINGS, 2009: 150+