Robust Depth-Aided Visual-Inertial-Wheel Odometry for Mobile Robots

Cited by: 2
Authors
Zhao, Xinyang [1 ]
Li, Qinghua [1 ]
Wang, Changhong [1 ]
Dou, Hexuan [1 ]
Liu, Bo [1 ]
Affiliations
[1] Harbin Inst Technol, Space Control & Inertial Technol Res Ctr, Harbin 150001, Peoples R China
Keywords
Cameras; Estimation; Uncertainty; Odometers; Measurement uncertainty; Wheels; Odometry; Sensor fusion; simultaneous localization and mapping (SLAM); visual-inertial-odometer odometry; wheeled robots
DOI
10.1109/TIE.2023.3323731
Chinese Library Classification
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
This article introduces visual-depth-inertial-wheel odometry (VDIWO), a robust approach for real-time localization of mobile robots in indoor and outdoor scenarios. Notably, VDIWO achieves accurate localization without relying on prior information. The approach integrates RGB-D camera, inertial measurement unit (IMU), and wheel-odometer measurements in a tightly coupled optimization framework. First, we introduce a depth measurement model based on a Gaussian mixture model to predict the depth uncertainty of feature points. Then, we propose a hybrid depth estimation method that combines depth measurement fusion with multiview triangulation to estimate the depth of landmarks while simultaneously identifying high-quality landmarks. Furthermore, we integrate visual reprojection constraints, depth measurement constraints, and odometer preintegration constraints into the tightly coupled optimization framework to further enhance pose estimation accuracy. We evaluate VDIWO on the OpenLORIS datasets and in real-world experiments. The results demonstrate the high accuracy and robustness of VDIWO for state estimation of mobile robots.
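The abstract describes fusing multiple depth measurements of a landmark, each with a predicted uncertainty. A standard way to combine independent noisy measurements is inverse-variance (maximum-likelihood) weighting; the sketch below illustrates only that generic fusion step, under the assumption of independent Gaussian depth errors. The function name and the numbers are illustrative, not from the paper, and the paper's actual hybrid method additionally blends in multiview triangulation.

```python
import numpy as np

def fuse_depth_measurements(depths, variances):
    """Inverse-variance fusion of independent depth measurements of
    one landmark. Returns the fused depth and its fused variance."""
    depths = np.asarray(depths, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances              # information of each measurement
    fused_var = 1.0 / weights.sum()        # information is additive
    fused_depth = fused_var * (weights * depths).sum()
    return fused_depth, fused_var

# Three RGB-D readings of the same landmark; the last one is noisier
# (e.g., the point lies farther from the camera), so it gets less weight.
d, v = fuse_depth_measurements([2.02, 1.98, 2.10], [0.01, 0.01, 0.04])
```

In a tightly coupled optimizer such as the one described, a fused depth like `d` would enter the cost function as a depth-measurement constraint alongside the visual reprojection and odometer preintegration terms, with `v` setting its weight.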
Pages: 9161-9171
Page count: 11
Related Papers
50 records in total
  • [21] Wheel Odometry aided Visual-Inertial Odometry for Land Vehicle Navigation in Winter Urban Environments
    Huang, Cheng
    Jiang, Yang
    O'Keefe, Kyle
    PROCEEDINGS OF THE 33RD INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2020), 2020, : 2237 - 2251
  • [22] A Dynamic Visual-Inertial-Wheel Odometry With Semantic Constraints and Denoised IMU-Odometer Prior for Autonomous Driving
    Zhi, Meixia
    Deng, Chen
    Li, Bijun
    Zhang, Hongjuan
    Hong, Chengzhi
    IEEE SENSORS JOURNAL, 2024, 24 (17) : 27966 - 27980
  • [23] VIDO: A Robust and Consistent Monocular Visual-Inertial-Depth Odometry
    Gao, Yuanxi
    Yuan, Jing
    Jiang, Jingqi
    Sun, Qinxuan
    Zhang, Xuebo
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023, 24 (03) : 2976 - 2992
  • [24] Visual-inertial-wheel SLAM with high-accuracy localization measurement for wheeled robots on complex terrain
    Zheng, Jiyuan
    Zhou, Kang
    Li, Jinling
    MEASUREMENT, 2025, 243
  • [25] Temporal delay estimation of sparse direct visual inertial odometry for mobile robots
    Cen, Ruping
    Zhang, Xinyue
    Tao, Yulin
    Xue, Fangzheng
    Zhang, Yuxin
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2020, 357 (07): : 3893 - 3906
  • [26] Robust Monocular Visual Odometry using Optical Flows for Mobile Robots
    Li Haifeng
    Hu Zunhe
    Chen Xinwei
    PROCEEDINGS OF THE 35TH CHINESE CONTROL CONFERENCE 2016, 2016, : 6003 - 6007
  • [27] Compass aided visual-inertial odometry
    Wang, Yandong
    Zhang, Tao
    Wang, Yuanchao
    Ma, Jingwei
    Li, Yanhui
    Han, Jingzhuang
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2019, 60 : 101 - 115
  • [28] GPS-aided Visual Wheel Odometry
    Song, Junlin
    Sanchez-Cuevas, Pedro J.
    Richard, Antoine
    Olivares-Mendez, Miguel
    2023 IEEE 26TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, ITSC, 2023, : 375 - 382
  • [29] Review on Visual Odometry for Mobile Robots
    Ding W.-D.
    Xu D.
    Liu X.-L.
    Zhang D.-P.
    Chen T.
    Zidonghua Xuebao/Acta Automatica Sinica, 2018, 44 (03): : 385 - 400
  • [30] Inertial Aided Dense & Semi-Dense Methods for Robust Direct Visual Odometry
    Falquez, Juan M.
    Kasper, Michael
    Sibley, Gabe
    2016 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2016), 2016, : 3601 - 3607