Robust Visual Odometry for Complex Urban Environments

Cited by: 0
Authors
Parra, Ignacio [1 ]
Angel Sotelo, Miguel [1 ]
Vlacic, Ljubo [2 ]
Affiliations
[1] Univ Alcala de Henares, Escuela Politecn Super, Dept Elect, Madrid, Spain
[2] Griffith Univ, ICSL, Brisbane, Qld, Australia
Source
2008 IEEE INTELLIGENT VEHICLES SYMPOSIUM, VOLS 1-3 | 2008
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline Classification Codes
0808; 0809
Abstract
This paper describes a new approach for estimating the vehicle motion trajectory in complex urban environments by means of visual odometry. A new strategy for robust feature extraction and data post-processing is developed and tested on-road. Scale-Invariant Feature Transform (SIFT) features are used to cope with the complexity of urban environments. The obtained results are discussed and compared to previous works. In the prototype system, the ego-motion of the vehicle is computed using a stereo-vision system mounted next to the rear-view mirror of the car. Feature points are matched between pairs of frames and linked into 3D trajectories. The distance between estimations is dynamically adapted based on reprojection and estimation errors. Vehicle motion is estimated using a non-linear, photogrammetric approach based on RANSAC (RAndom SAmple Consensus). Natural applications of the method are on-board driver assistance in navigation tasks and autonomous vehicle navigation. The method has been tested in real traffic conditions without using prior knowledge about the scene or the vehicle motion. An example of how to estimate a vehicle's trajectory is provided, along with suggestions for possible further improvements to the proposed odometry algorithm.
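
The abstract describes the frame-to-frame pipeline only at a high level (SIFT feature matching followed by RANSAC-based motion estimation). The following Python sketch illustrates that general idea, not the authors' implementation: it pairs OpenCV's SIFT detector and matcher with a RANSAC loop that fits a closed-form rigid transform (Kabsch/SVD) to matched 3D points, whereas the paper uses a non-linear photogrammetric minimization. Function names, thresholds, and the hypothetical inputs pts3d_prev / pts3d_curr (triangulated from the stereo rig) are assumptions made for illustration only.

# Illustrative sketch only (not the authors' code): SIFT matching between
# consecutive frames and RANSAC-based rigid-motion estimation from matched
# 3D points. Assumes OpenCV >= 4.4 (cv2.SIFT_create) and NumPy; the 3D points
# themselves (pts3d_prev / pts3d_curr) are hypothetical inputs triangulated
# from the stereo rig.
import cv2
import numpy as np

def match_sift(img_prev, img_curr, ratio=0.8):
    # Detect SIFT keypoints in two grayscale images and keep matches
    # passing Lowe's ratio test.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_prev, None)
    kp2, des2 = sift.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < ratio * n.distance]
    return kp1, kp2, good

def rigid_transform(P, Q):
    # Closed-form (Kabsch/SVD) rotation R and translation t with Q ~ R @ P + t,
    # for two Nx3 point sets P (previous frame) and Q (current frame).
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cQ - R @ cP
    return R, t

def ransac_motion(pts3d_prev, pts3d_curr, iters=200, thresh=0.05):
    # Estimate frame-to-frame ego-motion from matched 3D points with RANSAC:
    # sample minimal sets, fit a rigid transform, keep the largest inlier set,
    # then refine on all inliers.
    n = len(pts3d_prev)
    rng = np.random.default_rng(0)
    best_inliers = np.array([], dtype=int)
    best_R, best_t = np.eye(3), np.zeros(3)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)
        R, t = rigid_transform(pts3d_prev[idx], pts3d_curr[idx])
        err = np.linalg.norm(pts3d_curr - (pts3d_prev @ R.T + t), axis=1)
        inliers = np.flatnonzero(err < thresh)
        if len(inliers) > len(best_inliers):
            best_inliers, best_R, best_t = inliers, R, t
    if len(best_inliers) >= 3:
        best_R, best_t = rigid_transform(pts3d_prev[best_inliers],
                                         pts3d_curr[best_inliers])
    return best_R, best_t, best_inliers

In a full pipeline, the (R, t) returned for each frame pair would be chained to accumulate the vehicle trajectory, and, as the abstract suggests, the residuals and inlier counts could be used to adapt the spacing between estimations.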
Pages: 916+
Number of pages: 2
Related Papers
50 records in total
  • [1] Robust visual odometry for vehicle localization in urban environments
    Parra, I.
    Sotelo, M. A.
    Llorca, D. F.
    Ocana, M.
    ROBOTICA, 2010, 28 : 441 - 452
  • [2] Visual Multimodal Odometry: Robust Visual Odometry in Harsh Environments
    Kleinschmidt, Sebastian P.
    Wagner, Bernardo
    2018 IEEE INTERNATIONAL SYMPOSIUM ON SAFETY, SECURITY, AND RESCUE ROBOTICS (SSRR), 2018
  • [3] Stereo camera visual odometry for moving urban environments
    Delmas, Patrice
    Gee, Trevor
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2019, 26 (03) : 243 - 256
  • [4] Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments
    Barnes, Dan
    Maddern, Will
    Pascoe, Geoffrey
    Posner, Ingmar
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2018, : 1894 - 1900
  • [5] Monocular Visual Odometry in Urban Environments Using an Omnidirectional Camera
    Tardif, Jean-Philippe
    Pavlidis, Yanis
    Daniilidis, Kostas
    2008 IEEE/RSJ INTERNATIONAL CONFERENCE ON ROBOTS AND INTELLIGENT SYSTEMS, VOLS 1-3, CONFERENCE PROCEEDINGS, 2008, : 2531 - 2538
  • [6] Robust Visual Odometry Leveraging Mixture of Manhattan Frames in Indoor Environments
    Yuan, Huayu
    Wu, Chengfeng
    Deng, Zhongliang
    Yin, Jiahui
    SENSORS, 2022, 22 (22)
  • [7] Stereo visual odometry in urban environments based on detecting ground features
    de la Escalera, Arturo
    Izquierdo, Ebroul
    Martin, David
    Musleh, Basam
    Garcia, Fernando
    Maria Armingol, Jose
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2016, 80 : 1 - 10
  • [8] Visual Odometry in Challenging Environments: An Urban Underground Railway Scenario Case
    Etxeberria-Garcia, Mikel
    Zamalloa, Maider
    Arana-Arexolaleiba, Nestor
    Labayen, Mikel
    IEEE ACCESS, 2022, 10 : 69200 - 69215
  • [9] A Stereo Visual Odometry Framework with Augmented Perception for Dynamic Urban Environments
    Contreras, Marcelo
    Bhatt, Neel P.
    Hashemi, Ehsan
    2023 IEEE 26TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, ITSC, 2023, : 4094 - 4099
  • [10] Wheel Odometry aided Visual-Inertial Odometry for Land Vehicle Navigation in Winter Urban Environments
    Huang, Cheng
    Jiang, Yang
    O'Keefe, Kyle
    PROCEEDINGS OF THE 33RD INTERNATIONAL TECHNICAL MEETING OF THE SATELLITE DIVISION OF THE INSTITUTE OF NAVIGATION (ION GNSS+ 2020), 2020, : 2237 - 2251