Visual odometry on the Mars Exploration Rovers

Cited by: 0
Authors
Cheng, Y [1]
Maimone, M [1]
Matthies, L [1]
Affiliation
[1] CALTECH, Jet Prop Lab, Pasadena, CA 91109 USA
Source
INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOL 1-4, PROCEEDINGS | 2005
Keywords
MER; Mars Exploration Rover; visual odometry; motion estimation; egomotion
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
NASA's Mars Exploration Rovers (MER) were designed to traverse Viking Lander I-style terrains: mostly flat, with many small non-obstacle rocks and occasional obstacles. During actual operations in such terrains, onboard position estimates derived solely from the onboard Inertial Measurement Unit and wheel encoder-based odometry remained well within the design goal of at most 10% error. However, the MER vehicles were also driven along slippery slopes tilted as high as 31 degrees. In such conditions, an additional capability was employed to maintain a sufficiently accurate onboard position estimate: Visual Odometry. The MER Visual Odometry system comprises onboard software for comparing stereo pairs taken by the pointable, mast-mounted, 45-degree-FOV Navigation cameras (NAV-CAMs). The system computes an update to the 6-DOF rover pose (x, y, z, roll, pitch, yaw) by tracking the motion of autonomously selected "interesting" terrain features between two pairs of stereo images, in both 2D pixel and 3D world coordinates. A maximum-likelihood estimator is applied to the computed 3D offsets to produce a final, corrected estimate of the vehicle motion between the two pairs. In this paper we describe the Visual Odometry algorithm used on the Mars Exploration Rovers and summarize its results from the first year of operations on Mars.
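The core estimation step described in the abstract, aligning terrain features triangulated from the stereo pairs taken before and after a move, can be illustrated with a minimal sketch. The snippet below is not the flight code: it assumes matched 3D feature positions are already available, the function name and arguments are illustrative, and it uses the standard SVD-based least-squares rigid alignment (Arun/Horn), which coincides with the paper's maximum-likelihood estimate only in the simplified case of equal, isotropic noise on every tracked point.

```python
import numpy as np

def estimate_rigid_motion(p_before, p_after):
    """Least-squares rigid transform (R, t) such that p_after ~= R @ p_before + t.

    p_before, p_after: (N, 3) arrays of matched 3D feature positions,
    triangulated from the stereo pairs taken before and after the move.
    (Illustrative sketch, not the MER flight implementation.)
    """
    c0 = p_before.mean(axis=0)                 # centroid before the move
    c1 = p_after.mean(axis=0)                  # centroid after the move
    H = (p_before - c0).T @ (p_after - c1)     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                         # best-fit rotation
    t = c1 - R @ c0                            # best-fit translation
    return R, t
```

The 6-DOF pose update (x, y, z, roll, pitch, yaw) follows from R and t. In the paper's formulation, the maximum-likelihood estimator additionally weights each tracked point by its 3D error covariance, which grows with stereo range; the unweighted SVD solution above is the special case where all points are trusted equally.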
Pages: 903-910
Page count: 8