UPLP-SLAM: Unified point-line-plane feature fusion for RGB-D visual SLAM

Cited by: 20
Authors
Yang, Haozhi [1 ,2 ]
Yuan, Jing [1 ,2 ]
Gao, Yuanxi [1 ,2 ]
Sun, Xingyu [1 ,2 ]
Zhang, Xuebo [1 ,2 ]
Affiliations
[1] Nankai Univ, Coll Artificial Intelligence, Tianjin 300353, Peoples R China
[2] Tianjin Key Lab Intelligent Robot, Tianjin 300353, Peoples R China
Keywords
Simultaneous localization and mapping (SLAM); Point-line-plane fusion; Uniform representation; Mutual association; Joint optimization; SIMULTANEOUS LOCALIZATION; FILTER; ROBOT
DOI
10.1016/j.inffus.2023.03.006
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Most existing RGB-D simultaneous localization and mapping (SLAM) systems are based on point features, point-line features, or point-plane features. However, existing multi-feature fusion SLAM methods based on the filter framework are not sufficiently accurate or robust, while fusion methods based on the optimization framework process different kinds of features separately and integrate them only loosely. Within the optimization-based framework, tightly fusing various kinds of features to achieve more accurate and robust pose estimation has rarely been considered. In this paper, we propose a unified point-line-plane fusion RGB-D visual SLAM method for the navigation of mobile robots in structured environments, making full use of the information of the three kinds of geometric features. Specifically, the method extracts point, line, and plane features from images captured by the RGB-D camera and expresses them in a uniform way. Then, a mutual association scheme is designed for data association of point, line, and plane features, which not only considers the correspondence of homogeneous features, i.e., point-point, line-line, and plane-plane pairs, but also includes the association of heterogeneous features, i.e., point-line, point-plane, and line-plane pairs. Afterwards, the matching errors between homogeneous features and the association errors between heterogeneous features are uniformly represented and jointly optimized to estimate the camera pose and feature parameters for accurate and consistent localization and map building. It is worth pointing out that the proposed unified framework operates on two levels. From the system framework perspective, all the main components of the SLAM system, i.e., feature representation, feature association, and the error function, are handled in a unified manner, which increases the accuracy and compactness of the multi-feature SLAM system. From the feature processing perspective, both homogeneous and heterogeneous features are used uniformly, which provides more spatial constraints for pose estimation. Finally, the accuracy and robustness of the proposed method are verified by experimental comparisons with state-of-the-art visual SLAM systems on public datasets and in real-world environments.
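The following Python sketch is not the authors' implementation; it only illustrates, on synthetic data and with deliberately simple parameterizations (points as 3-D vectors, lines as anchor-point/direction pairs, planes as normal/offset pairs), how homogeneous residuals (point-point, line-line, plane-plane) and a heterogeneous residual (point-on-plane) can be stacked into one error vector and jointly minimized for a 6-DoF camera pose, which is the core idea the abstract describes. All landmark values, observations, and helper names below are made up for illustration.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def pose_matrices(pose):
    # pose = [rx, ry, rz, tx, ty, tz]; rotation encoded as a rotation vector.
    return Rotation.from_rotvec(pose[:3]).as_matrix(), pose[3:]

def residuals(pose, world_pts, cam_pts, world_lines, cam_lines,
              world_planes, cam_planes, pts_on_planes):
    R, t = pose_matrices(pose)
    res = []
    # Point-point: transformed world points should coincide with observed points.
    res.append((world_pts @ R.T + t - cam_pts).ravel())
    # Line-line: a line is (anchor point p0, unit direction d).
    for (p0_w, d_w), (p0_c, d_c) in zip(world_lines, cam_lines):
        d_pred = R @ d_w
        diff = R @ p0_w + t - p0_c
        res.append(np.cross(d_pred, d_c))          # direction misalignment
        res.append(diff - (diff @ d_c) * d_c)      # offset perpendicular to the line
    # Plane-plane: a plane is (unit normal n, offset d) with n.x + d = 0.
    for (n_w, d_w), (n_c, d_c) in zip(world_planes, cam_planes):
        n_pred = R @ n_w
        res.append(n_pred - n_c)
        res.append([d_w - n_pred @ t - d_c])
    # Point-on-plane (heterogeneous): a transformed point must lie on its plane.
    for p_w, (n_c, d_c) in pts_on_planes:
        res.append([n_c @ (R @ p_w + t) + d_c])
    return np.concatenate(res)

# Synthetic scene: a ground-truth pose and observations consistent with it.
rng = np.random.default_rng(0)
true_pose = np.array([0.05, -0.02, 0.03, 0.20, -0.10, 0.15])
R_true, t_true = pose_matrices(true_pose)

world_pts = rng.uniform(-2.0, 2.0, size=(8, 3))
cam_pts = world_pts @ R_true.T + t_true

world_lines = [(np.array([1.0, 0.0, 0.5]), np.array([0.0, 1.0, 0.0])),
               (np.array([-0.5, 1.0, 2.0]), np.array([1.0, 0.0, 0.0]))]
cam_lines = [(R_true @ p + t_true, R_true @ d) for p, d in world_lines]

world_planes = [(np.array([0.0, 0.0, 1.0]), -1.0),
                (np.array([1.0, 0.0, 0.0]), 0.5)]
cam_planes = [(R_true @ n, d - (R_true @ n) @ t_true) for n, d in world_planes]

# Points known to lie on the world planes, associated with the observed planes.
pts_on_planes = [(-d * n, obs) for (n, d), obs in zip(world_planes, cam_planes)]

sol = least_squares(residuals, np.zeros(6),
                    args=(world_pts, cam_pts, world_lines, cam_lines,
                          world_planes, cam_planes, pts_on_planes))
print("estimated pose:", np.round(sol.x, 4))  # converges to true_pose

For brevity the sketch estimates only the camera pose from noise-free observations; in the actual system the feature parameters are refined together with the pose, the residuals are weighted, and point-line and line-plane associations contribute additional terms.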
Pages: 51-65
Page count: 15
Related papers
50 records in total
  • [1] PLP-SLAM: A Visual SLAM Method Based on Point-Line-Plane Feature Fusion
    Li H.
    Hu Z.
    Chen X.
    Chen, Xinwei, 2017, Chinese Academy of Sciences (39): 214-220, 229
  • [2] SLAM Algorithm with Point-Line Feature Fusion Based on RGB-D Camera
    Ma, Li
    Xu, Mengcong
    Zhou, Lei
    Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2022, 50 (02): 76-83
  • [3] PLPD-SLAM: Point-Line-Plane-based RGB-D SLAM for Dynamic Environments
    Dong, Juan
    Lu, Maobin
    Xu, Yong
    Deng, Fang
    Chen, Jie
    2024 IEEE 18TH INTERNATIONAL CONFERENCE ON CONTROL & AUTOMATION, ICCA 2024, 2024: 719-724
  • [4] Point-line feature fusion based field real-time RGB-D SLAM
    Li, Qingyu
    Wang, Xin
    Wu, Tian
    Yang, Huijun
    COMPUTERS & GRAPHICS-UK, 2022, 107: 10-19
  • [5] PLPF-VSLAM: An indoor visual SLAM with adaptive fusion of point-line-plane features
    Yan, Jinjin
    Zheng, Youbing
    Yang, Jinquan
    Mihaylova, Lyudmila
    Yuan, Weijie
    Gu, Fuqiang
    JOURNAL OF FIELD ROBOTICS, 2024, 41 (01): 50-67
  • [6] Visual SLAM with RGB-D Cameras
    Jin, Qiongyao
    Liu, Yungang
    Man, Yongchao
    Li, Fengzhong
    PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019: 4072-4077
  • [7] UPL-SLAM: Unconstrained RGB-D SLAM With Accurate Point-Line Features for Visual Perception
    Sun, Xianshuai
    Zhao, Yuming
    Wang, Yabiao
    Li, Zhigang
    He, Zhen
    Wang, Xiaohui
    IEEE ACCESS, 2025, 13: 8676-8690
  • [8] A Robust Fusion Method For RGB-D SLAM
    Liu, Tong
    Mang, Xiaowei
    Wei, Ziang
    Yuan, Zejian
    2013 CHINESE AUTOMATION CONGRESS (CAC), 2013: 474-481
  • [9] Dense Visual SLAM for RGB-D Cameras
    Kerl, Christian
    Sturm, Juergen
    Cremers, Daniel
    2013 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2013: 2100-2106