Fast and Robust Monocular Visual-Inertial Odometry Using Points and Lines

Cited by: 5
Authors
Zhang, Ning [1 ]
Zhao, Yongjia [1 ]
Affiliations
[1] Beihang Univ, State Key Lab Virtual Real Technol & Syst, Sch Automat Sci & Elect Engn, Beijing 100191, Peoples R China
Keywords
line feature; point-line feature fusion; semi-direct method; SIMULTANEOUS LOCALIZATION; SLAM; VERSATILE; VEHICLE;
DOI
10.3390/s19204545
CLC Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302; 081704
Abstract
When the camera moves quickly and the image is blurred, or when the scene lacks texture, a Simultaneous Localization and Mapping (SLAM) algorithm based on point features has difficulty tracking enough effective feature points; its positioning accuracy and robustness degrade, and it may even fail entirely. To address this problem, we propose a monocular visual odometry algorithm that fuses point and line features and incorporates IMU measurement data. An environmental feature map with geometric information is constructed, and the IMU measurements provide prior and scale information for the visual localization algorithm. An initial pose estimate is then obtained from motion estimation based on sparse image alignment, and feature alignment is performed to obtain sub-pixel feature correspondences. Finally, more accurate poses and 3D landmarks are obtained by minimizing the reprojection errors of local map points and lines. Experimental results on the EuRoC public datasets show that the proposed algorithm outperforms the Open Keyframe-based Visual-Inertial SLAM (OKVIS-mono) algorithm and the Oriented FAST and Rotated BRIEF SLAM (ORB-SLAM) algorithm, demonstrating the accuracy and speed of the algorithm.
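
The refinement step the abstract describes, jointly minimizing the reprojection errors of local map points and lines over the camera pose, can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the intrinsics K, the axis-angle pose parameterization, the synthetic scene, and the function names (project, residuals) are all hypothetical, and the line residual used here is the common point-to-line distance of the two projected 3D line endpoints against the detected 2D line.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Assumed pinhole intrinsics (illustrative values only).
K = np.array([[458.0,   0.0, 367.0],
              [  0.0, 457.0, 248.0],
              [  0.0,   0.0,   1.0]])

def project(pose, X):
    """Project Nx3 world points into the image; pose = (rotvec[3], t[3])."""
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    Xc = X @ R.T + pose[3:]            # world frame -> camera frame
    uv = Xc @ K.T
    return uv[:, :2] / uv[:, 2:3]      # perspective division

def residuals(pose, pts3d, pts2d, ends3d, lines2d):
    """Stack point reprojection errors with point-to-line distances of the
    two projected 3D line endpoints against each detected 2D line."""
    r_pt = (project(pose, pts3d) - pts2d).ravel()
    e1 = project(pose, ends3d[:, 0])
    e2 = project(pose, ends3d[:, 1])
    # lines2d rows are (a, b, c) with a^2 + b^2 = 1, so a*u + b*v + c
    # is the signed pixel distance of (u, v) to the line.
    d1 = np.sum(lines2d[:, :2] * e1, axis=1) + lines2d[:, 2]
    d2 = np.sum(lines2d[:, :2] * e2, axis=1) + lines2d[:, 2]
    return np.concatenate([r_pt, d1, d2])

# Synthetic scene: random 3D points and line segments in front of the camera.
rng = np.random.default_rng(0)
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], (20, 3))
ends3d = rng.uniform([-1, -1, 4], [1, 1, 8], (10, 2, 3))

true_pose = np.array([0.02, -0.01, 0.03, 0.10, -0.05, 0.20])
pts2d = project(true_pose, pts3d)

# "Detected" 2D lines as normalized homogeneous coefficients (a, b, c),
# built here from the projected endpoints under the true pose.
p = np.hstack([project(true_pose, ends3d[:, 0]), np.ones((10, 1))])
q = np.hstack([project(true_pose, ends3d[:, 1]), np.ones((10, 1))])
lines2d = np.cross(p, q)
lines2d /= np.linalg.norm(lines2d[:, :2], axis=1, keepdims=True)

init = np.zeros(6)  # coarse pose, e.g. from sparse image alignment
sol = least_squares(residuals, init, loss='huber',
                    args=(pts3d, pts2d, ends3d, lines2d))
print("refined pose:", sol.x)  # converges toward true_pose
```

Starting from a coarse pose such as the one the sparse-image-alignment stage would supply, the robust Huber loss keeps individual bad point or line matches from dominating the refinement.
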
Pages: 21
Related Papers (50 records total)
  • [21] Robust Visual Inertial Monocular using nonlinear optimization
    Li, Qingfeng
    Gu, Hao
    Han, Cuihong
    Gong, Weimeng
    Song, Shuang
    Meng, Max Q. -H.
    2017 IEEE INTERNATIONAL CONFERENCE ON INFORMATION AND AUTOMATION (IEEE ICIA 2017), 2017, : 483 - 488
  • [22] Robust Monocular Visual Odometry using Optical Flows for Mobile Robots
    Li Haifeng
    Hu Zunhe
    Chen Xinwei
    PROCEEDINGS OF THE 35TH CHINESE CONTROL CONFERENCE 2016, 2016, : 6003 - 6007
  • [23] SLC-VIO: a stereo visual-inertial odometry based on structural lines and points belonging to lines
    Wei, Chenchen
    Tang, Yanfeng
    Yang, Lingfang
    Huang, Zhi
    ROBOTICA, 2022, 40 (08) : 2765 - 2785
  • [24] Visual-Inertial Odometry of Structured and Unstructured Lines Based on Vanishing Points in Indoor Environments
    He, Xiaojing
    Li, Baoquan
    Qiu, Shulei
    Liu, Kexin
    APPLIED SCIENCES-BASEL, 2024, 14 (05):
  • [25] Real-time motion state estimation of feature points based on optical flow field for robust monocular visual-inertial odometry in dynamic scenes
    Cao, Long
    Liu, Jingbin
    Lei, Jietao
    Zhang, Wei
    Chen, Yongsen
    Hyyppa, Juha
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 274
  • [26] Robust Stereo Visual-Inertial Odometry Using Nonlinear Optimization
    Ma, Shujun
    Bai, Xinhui
    Wang, Yinglei
    Fang, Rui
    SENSORS, 2019, 19 (17)
  • [27] Robust monocular visual odometry for road vehicles using uncertain perspective projection
    Van Hamme, David
    Goeman, Werner
    Veelaert, Peter
    Philips, Wilfried
    EURASIP JOURNAL ON IMAGE AND VIDEO PROCESSING, 2015,
  • [29] PL-EVIO: Robust Monocular Event-Based Visual Inertial Odometry With Point and Line Features
    Guan, Weipeng
    Chen, Peiyu
    Xie, Yuhan
    Lu, Peng
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2024, 21 (04) : 6277 - 6293
  • [30] Fast bi-monocular Visual Odometry using Factor Graph Sparsification
    Debeunne, Cesar
    Vallve, Joan
    Torres, Alex
    Vivet, Damien
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 10716 - 10722