Communicationless Navigation Through Robust Visual Odometry

Cited by: 0
Authors
Van Hamme, David [1 ]
Veelaert, Peter [1 ]
Philips, Wilfried [2 ]
Affiliations
[1] Univ Coll Ghent, Vis Syst Res Grp, Ghent, Belgium
[2] Univ Ghent, Image Proc & Interpretat IPI Res Grp, Dept Telecommun & Informat Proc, Ghent, Belgium
Source
2012 15TH INTERNATIONAL IEEE CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS (ITSC) | 2012
Keywords
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
GPS navigation is often unreliable in urban situations where tall structures occlude large parts of the sky. To maintain an accurate position estimate in these situations, an alternative method is needed. We propose a novel visual odometry method that is shown to provide reliable relative motion estimation in typical urban road driving using a single camera. While the short-term accuracy is good, relative motion estimation by itself is susceptible to drift and is therefore insufficient to provide good long-term absolute position estimates. To overcome this drift, we propose to warp the visual odometry output to a stored map. This warping must be able to cope with temporary discrepancies between visual odometry and map data. The proposed mapping does not make a hard decision about road position, but instead entertains all plausible hypotheses about the followed trajectory and their associated warping costs. Evaluation on real test sequences shows that the method successfully eliminates drift and, on average, stays within 7 metres of simultaneously recorded GPS data. This demonstrates that the combined visual odometry and mapping provide positioning with accuracy comparable to GPS in situations where GPS is unavailable.
Pages
1555 - 1560 (6 pages)
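The core idea described in the abstract, a soft map-matching step that keeps every plausible trajectory hypothesis alive with an accumulated warping cost rather than snapping the odometry track to a single road, can be illustrated with a minimal sketch. Everything below (the RoadSegment class, the point-to-segment warping cost, and the keep_best beam-style pruning) is an illustrative assumption, not the paper's actual formulation.

```python
# Hypothetical sketch of soft map matching for visual odometry drift correction.
# The cost model and pruning strategy are assumptions made for illustration only.
import math
from dataclasses import dataclass


@dataclass
class RoadSegment:
    """A straight road piece from (x0, y0) to (x1, y1), in metres."""
    x0: float
    y0: float
    x1: float
    y1: float

    def warp_cost(self, x: float, y: float) -> float:
        # Distance from an odometry position to this segment,
        # used as the per-step warping cost.
        dx, dy = self.x1 - self.x0, self.y1 - self.y0
        length_sq = dx * dx + dy * dy
        if length_sq == 0.0:
            return math.hypot(x - self.x0, y - self.y0)
        t = ((x - self.x0) * dx + (y - self.y0) * dy) / length_sq
        t = max(0.0, min(1.0, t))
        return math.hypot(x - (self.x0 + t * dx), y - (self.y0 + t * dy))


def match_trajectory(odometry_track, road_map, keep_best=5):
    """Match a drifting odometry track to a road map without hard decisions.

    odometry_track: list of (x, y) positions integrated from visual odometry.
    road_map: list of RoadSegment.
    Returns (total_cost, segment_index_sequence) of the cheapest hypothesis.
    """
    hypotheses = [(0.0, [])]  # (accumulated warping cost, segment sequence)
    for x, y in odometry_track:
        expanded = [
            (cost + seg.warp_cost(x, y), path + [idx])
            for cost, path in hypotheses
            for idx, seg in enumerate(road_map)
        ]
        # Soft decision: keep the cheapest few hypotheses instead of
        # committing to a single road position at each step.
        expanded.sort(key=lambda h: h[0])
        hypotheses = expanded[:keep_best]
    return hypotheses[0]


if __name__ == "__main__":
    road_map = [RoadSegment(0, 0, 100, 0), RoadSegment(100, 0, 100, 100)]
    # A drifting track: roughly along the first road, then turning north.
    track = [(10, 1), (40, 2), (80, 4), (101, 30), (103, 70)]
    cost, path = match_trajectory(track, road_map)
    print(f"total warping cost: {cost:.1f} m, segment sequence: {path}")
```

The pruning to keep_best hypotheses is only a stand-in for however the paper bounds its hypothesis set; the point is that the association between the odometry track and the map stays open until the accumulated warping costs disambiguate it.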
Related Papers
50 records in total
  • [1] Stereo-based visual odometry for robust rover navigation
    Cumani, Aldo
    Guiducci, Antonio
    WSEAS Transactions on Circuits and Systems, 2006, 5 (10): 1556 - 1562
  • [2] Robust Multispectral Visual-Inertial Navigation With Visual Odometry Failure Recovery
    Beauvisage, Axel
    Ahiska, Kenan
    Aouf, Nabil
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (07) : 9089 - 9101
  • [3] Radar Visual Inertial Odometry and Radar Thermal Inertial Odometry: Robust Navigation even in Challenging Visual Conditions
    Doer, Christopher
    Trommer, Gert F.
    2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 331 - 338
  • [4] Visual Multimodal Odometry: Robust Visual Odometry in Harsh Environments
    Kleinschmidt, Sebastian P.
    Wagner, Bernardo
    2018 IEEE INTERNATIONAL SYMPOSIUM ON SAFETY, SECURITY, AND RESCUE ROBOTICS (SSRR), 2018
  • [5] PIVO: Probabilistic Inertial-Visual Odometry for Occlusion-Robust Navigation
    Solin, Arno
    Cortes, Santiago
    Rahtu, Esa
    Kannala, Juho
    2018 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2018), 2018, : 616 - 625
  • [6] Visual Odometry for Autonomous Robot Navigation Through Efficient Outlier Rejection
    Kostavelis, Ioannis
    Boukas, Evangelos
    Nalpantidis, Lazaros
    Gasteratos, Antonios
    2013 IEEE INTERNATIONAL CONFERENCE ON IMAGING SYSTEMS AND TECHNIQUES (IST 2013), 2013, : 45 - 50
  • [7] Enabling Continuous Planetary Rover Navigation through FPGA Stereo and Visual Odometry
    Howard, Thomas M.
    Morfopoulos, Arin
    Morrison, Jack
    Kuwata, Yoshiaki
    Villalpando, Carlos
    Matthies, Larry
    McHenry, Michael
    2012 IEEE AEROSPACE CONFERENCE, 2012
  • [8] Robust and Accurate Deterministic Visual Odometry
    Benet, Pierre
    Guinamard, Alexis
    PROCEEDINGS OF THE 33RD INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2020), 2020, : 2260 - 2271
  • [9] A Framework for Fast and Robust Visual Odometry
    Wu, Meiqing
    Lam, Siew-Kei
    Srikanthan, Thambipillai
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2017, 18 (12) : 3433 - 3448
  • [10] Robust Visual Odometry in Underwater Environment
    Zhang, Jun
    Ila, Viorela
    Kneip, Laurent
    2018 OCEANS - MTS/IEEE KOBE TECHNO-OCEANS (OTO), 2018