Pose Estimation of a Mobile Robot Based on Fusion of IMU Data and Vision Data Using an Extended Kalman Filter

Cited by: 126
Authors
Alatise, Mary B. [1 ]
Hancke, Gerhard P. [1 ,2 ]
Affiliations
[1] Univ Pretoria, Dept Elect Elect & Comp Engn, ZA-0028 Pretoria, South Africa
[2] City Univ Hong Kong, Dept Comp Sci, Hong Kong, Hong Kong, Peoples R China
Keywords
pose estimation; mobile robot; inertial sensors; vision; object; extended Kalman filter; AUGMENTED REALITY; INERTIAL SENSORS; TRACKING; FEATURES; SCALE; CALIBRATION; CONSENSUS; STANDARD; OBJECTS;
DOI
10.3390/s17102164
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline classification codes
070302 ; 081704 ;
Abstract
Using a single sensor to estimate the pose of a device cannot give accurate results. This paper presents the fusion of a six-degrees-of-freedom (6-DoF) inertial sensor, comprising a 3-axis accelerometer and a 3-axis gyroscope, with vision data to obtain low-cost, accurate positioning for an autonomous mobile robot. For vision, a monocular object-detection pipeline combining the speeded-up robust features (SURF) and random sample consensus (RANSAC) algorithms was used to recognize a sample object across several captured images. Unlike conventional methods that depend on point tracking, RANSAC iteratively estimates the parameters of a mathematical model from a set of captured data that contains outliers. SURF and RANSAC improve accuracy because of SURF's ability to find interest points (features) under different viewing conditions using the Hessian matrix. This approach is proposed for its simple implementation, low cost, and improved accuracy. With an extended Kalman filter (EKF), data from the inertial sensors and a camera were fused to estimate the position and orientation of the mobile robot. All sensors were mounted on the mobile robot to obtain accurate localization. An indoor experiment was carried out to validate and evaluate the performance. Experimental results show that the proposed method is computationally fast, reliable, and robust, and can be considered for practical applications. The results were verified against ground truth data and root mean square errors (RMSEs).
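The IMU/camera fusion described in the abstract follows the standard EKF predict/update cycle: the inertial data drive a nonlinear motion model in the predict step, and the camera-derived pose corrects it in the update step. The sketch below is illustrative only and is not taken from the paper: the planar pose state [x, y, θ], the unicycle motion model, and the identity measurement model (camera observing the full pose directly) are all simplifying assumptions.

```python
import numpy as np

def ekf_predict(x, P, u, Q, dt):
    """Propagate pose [x, y, theta] using IMU/odometry input u = [v, omega].

    Assumes a planar unicycle motion model (an illustrative choice,
    not the paper's exact formulation).
    """
    v, w = u
    px, py, th = x
    # Nonlinear motion model
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model w.r.t. the state (the "E" in EKF)
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, R):
    """Correct the state with a camera pose measurement z = [x, y, theta].

    Assumes the vision pipeline yields the full pose, so H = I.
    """
    H = np.eye(3)
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

In a loop, `ekf_predict` would run at the IMU rate and `ekf_update` whenever the SURF/RANSAC stage produces a pose fix; the update both pulls the estimate toward the camera measurement and shrinks the covariance `P`.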
Pages: 22
References
73 in total
  • [1] A standard testing and calibration procedure for low cost MEMS inertial sensors and units
    Aggarwal, P.
    Syed, Z.
    Niu, X.
    El-Sheimy, N.
    [J]. JOURNAL OF NAVIGATION, 2008, 61 (02) : 323 - 336
  • [2] Ahmad N., 2013, International Journal of Signal Processing Systems, V1, P256, DOI 10.12720/ijsps.1.2.256-262
  • [3] Alatise M., 2017, Proceedings of IEEE AFRICON 2017
  • [4] Alomari A., 2017, SENSORS, V17, P1904, DOI 10.3390/s17081904
  • [5] [Anonymous], 2006, Matrix, DOI 10.1093/jxb/erm298
  • [6] A survey of augmented reality
    Azuma, RT
    [J]. PRESENCE-VIRTUAL AND AUGMENTED REALITY, 1997, 6 (04) : 355 - 385
  • [7] Speeded-Up Robust Features (SURF)
    Bay, Herbert
    Ess, Andreas
    Tuytelaars, Tinne
    Van Gool, Luc
    [J]. COMPUTER VISION AND IMAGE UNDERSTANDING, 2008, 110 (03) : 346 - 359
  • [8] Review and classification of vision-based localisation techniques in unknown environments
    Ben-Afia, Amani
    Deambrogio, Lina
    Salos, Daniel
    Escher, Anne-Christine
    Macabiau, Christophe
    Soulier, Laurent
    Gay-Bellile, Vincent
    [J]. IET RADAR SONAR AND NAVIGATION, 2014, 8 (09) : 1059 - 1072
  • [9] INDOOR NAVIGATION WITH FOOT-MOUNTED STRAPDOWN INERTIAL NAVIGATION AND MAGNETIC SENSORS
    Bird, Jeff
    Arden, Dale
    [J]. IEEE WIRELESS COMMUNICATIONS, 2011, 18 (02) : 28 - 35
  • [10] Advanced tracking through efficient image processing and visual-inertial sensor fusion
    Bleser, Gabriele
    Stricker, Didier
    [J]. COMPUTERS & GRAPHICS-UK, 2009, 33 (01) : 59 - 72