A Fast and Accurate Visual Inertial Odometry Using Hybrid Point-Line Features

Cited: 0
Authors
Chen, Zhenhang [1 ,2 ]
Miao, Zhiqiang [1 ,2 ]
Liu, Min [1 ,2 ]
Wu, Chengzhong [3 ]
Wang, Yaonan [1 ,2 ]
Affiliations
[1] Hunan Univ, Sch Elect & Informat Engn, Changsha 410082, Peoples R China
[2] Hunan Univ, Natl Engn Lab Robot Visual Percept & Control, Changsha 410082, Peoples R China
[3] Jiangxi Commun Terminal Ind Technol Res Inst Co Lt, Jian 343100, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feature extraction; Simultaneous localization and mapping; Accuracy; Visualization; Odometry; Real-time systems; Location awareness; Optical flow; Optical filters; Jacobian matrices; Visual-inertial SLAM; localization; visual inertial odometry (VIO); hybrid msckf; line features; VERSATILE; ROBUST;
DOI
10.1109/LRA.2024.3490406
Chinese Library Classification
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Mainstream visual-inertial SLAM systems use point features for motion estimation and localization. However, point features perform poorly in scenes with weak texture or motion blur, so the introduction of line features has attracted considerable attention. In this letter, we propose a point-line-based real-time monocular visual inertial odometry. Because most current works do not fully exploit the properties of line features, we derive the point-line-based hybrid Multi-State Constraint Kalman Filter (hybrid MSCKF) in detail. To further improve line feature initialization accuracy, we propose a two-step line triangulation method. Since filter-based methods are susceptible to visual outliers, we also propose a redundant line feature removal strategy suited to the filtering framework. Experimental results on the EuRoC dataset and in real environments show that the proposed algorithm outperforms other state-of-the-art algorithms in both accuracy and real-time performance.
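The abstract does not detail the proposed two-step triangulation, but a common starting point in line-based VIO is to triangulate a 3D line as the intersection of the two planes obtained by back-projecting a line observation from two camera views. The sketch below illustrates only that standard geometric construction, not the paper's method; the function names (`line_to_plane`, `triangulate_line`) and the camera convention (R, t mapping camera frame to world frame) are assumptions made for illustration.

```python
import numpy as np

def line_to_plane(K, R, t, p1, p2):
    """Back-project a 2D line segment (pixel endpoints p1, p2) into the
    3D plane it spans with the camera center, expressed in world frame.
    R, t are assumed camera-to-world (X_w = R @ X_c + t)."""
    Kinv = np.linalg.inv(K)
    r1 = Kinv @ np.array([p1[0], p1[1], 1.0])   # viewing ray through p1
    r2 = Kinv @ np.array([p2[0], p2[1], 1.0])   # viewing ray through p2
    n_c = np.cross(r1, r2)                      # plane normal, camera frame
    n_w = R @ n_c                               # rotate normal to world frame
    d = -n_w @ t                                # plane passes through camera center
    return np.append(n_w, d)                    # plane as [n_x, n_y, n_z, d]

def triangulate_line(plane1, plane2):
    """Intersect two planes [n, d] (n.X + d = 0) into a 3D line,
    returned as (point on line, unit direction)."""
    n1, d1 = plane1[:3], plane1[3]
    n2, d2 = plane2[:3], plane2[3]
    direction = np.cross(n1, n2)                # line direction lies in both planes
    direction = direction / np.linalg.norm(direction)
    # A point on the line satisfies both plane equations; take the
    # minimum-norm solution of the underdetermined 2x3 system.
    A = np.vstack([n1, n2])
    b = -np.array([d1, d2])
    point = np.linalg.lstsq(A, b, rcond=None)[0]
    return point, direction
```

In a two-step scheme such as the one the letter names, a closed-form intersection like this would typically serve as the initial estimate, followed by a refinement step over all observing views; that second step is not shown here.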
Pages: 11345-11352
Page count: 8
Related papers (50 in total)
  • [42] Leveraging Planar Regularities for Point Line Visual-Inertial Odometry
    Li, Xin
    He, Yijia
    Lin, Jinlong
    Liu, Xiao
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 5120 - 5127
  • [43] Robust RGB-D Visual Odometry Using Point and Line Features
    Sun, Chao
    Qiao, Nianzu
    Ge, Wei
    Sun, Jia
    2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 3826 - 3831
  • [44] PLE-SLAM: A Visual-Inertial SLAM Based on Point-Line Features and Efficient IMU Initialization
    He, Jiaming
    Li, Mingrui
    Wang, Yangyang
    Wang, Hongyu
    IEEE SENSORS JOURNAL, 2025, 25 (04) : 6801 - 6811
  • [45] Accurate Line Reconstruction for Point and Line-Based Stereo Visual Odometry
    Luo, Xiaohua
    Tan, Zhitao
    Ding, Yong
    IEEE ACCESS, 2019, 7 : 185108 - 185120
  • [46] PLI-VIO: Real-time Monocular Visual-inertial Odometry Using Point and Line Interrelated Features
    Zhang, Jiahui
    Yang, Jinfu
    Shang, Qingzhen
    Li, Mingai
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2023, 21 (06) : 2004 - 2019
  • [49] UPL-SLAM: Unconstrained RGB-D SLAM With Accurate Point-Line Features for Visual Perception
    Sun, Xianshuai
    Zhao, Yuming
    Wang, Yabiao
    Li, Zhigang
    He, Zhen
    Wang, Xiaohui
    IEEE ACCESS, 2025, 13 : 8676 - 8690
  • [50] PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features
    Guan, Weipeng
    Chen, Peiyu
    Xie, Yuhan
    Lu, Peng
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2024, 21 (04) : 6277 - 6293