Visual odometry on the Mars Exploration Rovers

Cited: 0
Authors
Cheng, Y [1 ]
Maimone, M [1 ]
Matthies, L [1 ]
Affiliation
[1] CALTECH, Jet Prop Lab, Pasadena, CA 91109 USA
Source
INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS, VOL 1-4, PROCEEDINGS | 2005
Keywords
MER; mars exploration rover; visual odometry; motion estimation; egomotion;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
NASA's Mars Exploration Rovers (MER) were designed to traverse Viking Lander-I style terrains: mostly flat, with many small non-obstacle rocks and occasional obstacles. During actual operations in such terrains, onboard position estimates derived solely from the Inertial Measurement Unit and wheel encoder-based odometry achieved well within the design goal of at most 10% error. However, MER vehicles were also driven along slippery slopes tilted as high as 31 degrees. In such conditions, an additional capability was employed to maintain a sufficiently accurate onboard position estimate: Visual Odometry. The MER Visual Odometry system comprises onboard software for comparing stereo pairs taken by the pointable, mast-mounted 45-degree FOV Navigation cameras (NAVCAMs). The system computes an update to the 6-DOF rover pose (x, y, z, roll, pitch, yaw) by tracking the motion of autonomously selected "interesting" terrain features between two pairs of stereo images, in both 2D pixel and 3D world coordinates. A maximum likelihood estimator is applied to the computed 3D offsets to produce a final, corrected estimate of vehicle motion between the two pairs. In this paper we describe the Visual Odometry algorithm used on the Mars Exploration Rovers and summarize its results from the first year of operations on Mars.
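The core geometric step summarized in the abstract, recovering the 6-DOF motion update from matched 3D feature positions, can be illustrated with a short sketch. The snippet below is not the MER flight software: the paper's estimator is a maximum-likelihood fit that weights each tracked feature by its stereo-range uncertainty, whereas this sketch uses the simpler unweighted least-squares (SVD-based) solution for the rotation R and translation t that best map the features' triangulated positions before the move onto their positions after it. All function and variable names are illustrative.

```python
# Minimal sketch (assumption: unweighted least-squares stands in for the
# paper's covariance-weighted maximum-likelihood estimator).
import numpy as np

def estimate_rigid_motion(p_before, p_after):
    """Estimate R, t such that p_after ~= R @ p_before + t.

    p_before, p_after: (N, 3) arrays of matched 3D feature positions
    triangulated from the first and second stereo pairs (N >= 3).
    """
    c_before = p_before.mean(axis=0)                     # centroids
    c_after = p_after.mean(axis=0)
    H = (p_before - c_before).T @ (p_after - c_after)    # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_after - R @ c_before
    return R, t

if __name__ == "__main__":
    # Synthetic check: rotate and translate a handful of "terrain features".
    rng = np.random.default_rng(0)
    pts = rng.uniform(-5.0, 5.0, size=(20, 3))
    yaw = np.radians(10.0)
    R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                       [np.sin(yaw),  np.cos(yaw), 0.0],
                       [0.0,          0.0,         1.0]])
    t_true = np.array([0.4, -0.1, 0.02])
    moved = pts @ R_true.T + t_true + rng.normal(scale=0.01, size=pts.shape)
    R_est, t_est = estimate_rigid_motion(pts, moved)
    print(np.round(R_est, 3))
    print(np.round(t_est, 3))
```

A weighted (maximum-likelihood) variant of this fit down-weights distant features, whose stereo range error grows rapidly with distance, which is why the flight algorithm propagates per-feature covariances rather than treating all matches equally.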
Pages: 903-910
Number of pages: 8
Related Papers
50 records in total
  • [1] Visual odometry on the Mars Exploration Rovers - A tool to ensure accurate driving and science imaging
    Cheng, Yang
    Maimone, Mark W.
    Matthies, Larry
    IEEE ROBOTICS & AUTOMATION MAGAZINE, 2006, 13 (02) : 54 - 62
  • [2] Visual Odometry for Planetary Exploration Rovers in Sandy Terrains
    Li, Linhui
    Lian, Jing
    Guo, Lie
    Wang, Rongben
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2013, 10
  • [3] Improvements in Visual Odometry Algorithm for Planetary Exploration Rovers
    Dinesh, Sudin
    Rao, K. Koteswara
    Unnikrishnan, Manju
    Brinda, V.
    Lalithambika, V. R.
    Dhekane, M. V.
    2013 INTERNATIONAL CONFERENCE ON EMERGING TRENDS IN COMMUNICATION, CONTROL, SIGNAL PROCESSING AND COMPUTING APPLICATIONS (IEEE-C2SPCA-2013), 2013,
  • [4] Mars exploration rovers navigation results
    Louis A. D’Amario
    The Journal of the Astronautical Sciences, 2006, 54 : 129 - 173
  • [5] Mars exploration rovers navigation results
    D'Amario, Louis A.
    JOURNAL OF THE ASTRONAUTICAL SCIENCES, 2006, 54 (02) : 129 - 173
  • [6] 3D information retrieval for Visual Odometry System of Planetary Exploration Rovers
    Chitra, K.
    Dinesh, Sudin
    Mishra, Deepak
    Brinda, V.
    Lalithambika, V. R.
    Kumar, B. Manoj
    2013 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2013, : 354 - 360
  • [7] Global path planning on board the mars exploration rovers
    Carsten, Joseph
    Rankin, Arturo
    Ferguson, Dave
    Stentz, Anthony
    2007 IEEE AEROSPACE CONFERENCE, VOLS 1-9, 2007, : 9 - +
  • [8] Mars exploration rovers orbit determination system modeling
    Wawrzyniak, Geoffrey
    Baird, Darren
    Graat, Eric
    McElrath, Tim
    Portock, Brian
    Watkins, Michael
    JOURNAL OF THE ASTRONAUTICAL SCIENCES, 2006, 54 (02) : 175 - 197
  • [9] Mars exploration rovers orbit determination system modeling
    Geoffrey Wawrzyniak
    Darren Baird
    Eric Graat
    Tim McElrath
    Brian Portock
    Michael Watkins
    The Journal of the Astronautical Sciences, 2006, 54 : 175 - 197
  • [10] Improving the Robustness of a Direct Visual Odometry Algorithm for Planetary Rovers
    Martinez, Geovanni
    2018 15TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, COMPUTING SCIENCE AND AUTOMATIC CONTROL (CCE), 2018,