Visual Odometry Implementation and Accuracy Evaluation Based on Real-time Appearance-based Mapping

Cited by: 2
Authors
Hu, Bo [1]
Huang, He [1,2]
Affiliations
[1] Beijing Univ Civil Engn & Architecture, Sch Geomat & Urban Spatial Informat, 15 Yongyuan Rd, Beijing 102616, Peoples R China
[2] Beijing Univ Civil Engn & Architecture, Beijing Adv Innovat Ctr Future Urban Design, 1 Exhibit Hall Rd, Beijing 100044, Peoples R China
Keywords
visual odometry; RTAB-MAP; feature detection algorithms; RANSAC; accuracy evaluation;
DOI
10.18494/SAM.2020.2870
CLC number
TH7 [Instruments and meters]
Subject classification codes
0804; 080401; 081102
Abstract
With the rapid development of artificial intelligence and machine learning, robots that serve humans in various roles have emerged, and robot positioning and navigation technology has become a research hotspot. Robots relying on visual odometry (VO) are favored for their low cost and wide range of applications. In this paper, we focus on the algorithm and implementation of VO based on the feature point method. First, the current mainstream feature detection algorithms are compared and analyzed in terms of real-time performance and time efficiency, and the random sample consensus (RANSAC) algorithm is used to eliminate mismatches in the image feature matching process. Second, an RGB-D simultaneous localization and mapping (SLAM) experimental platform is built using the Kinect vision sensor and the TurtleBot mobile robot system, and VO based on real-time appearance-based mapping (RTAB-Map) is implemented. Finally, to verify the actual running performance of the VO, a series of motion experiments is performed in an indoor environment and the accuracy of the VO pose estimation is evaluated, providing a useful reference for the development of mobile robot positioning technology.
Pages: 2261-2275
Number of pages: 15
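As an illustration of the mismatch-elimination step described in the abstract, the following is a minimal Python/OpenCV sketch of a feature-based VO front end: ORB detection and matching between two consecutive frames, RANSAC-based rejection of mismatched feature pairs during essential-matrix estimation, and recovery of the relative pose. The frame paths and camera intrinsics are placeholders, ORB is only one of the detectors the paper compares, and the paper itself does not specify this exact pipeline.

```python
# Sketch: feature matching with RANSAC-based mismatch elimination and
# relative pose recovery between two consecutive frames (placeholder inputs).
import cv2
import numpy as np

# Hypothetical consecutive grayscale frames and pinhole intrinsics.
img1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
K = np.array([[525.0, 0.0, 319.5],
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])

# Detect and describe features (ORB shown; other detectors such as SIFT or
# BRISK can be substituted when comparing speed and robustness).
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matching for binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# RANSAC rejects mismatched pairs while estimating the essential matrix.
E, inlier_mask = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)

# Recover the relative rotation R and unit-scale translation t from inliers.
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inlier_mask)

print("inliers:", int(np.count_nonzero(inlier_mask)), "/", len(matches))
print("R =\n", R)
print("t =\n", t.ravel())
```

In an evaluation like the one described in the abstract, the poses accumulated from such frame-to-frame estimates would then be compared against known trajectories (for example, measured distances or turns in the indoor test runs) to quantify pose estimation error.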