The MADMAX data set for visual-inertial rover navigation on Mars

Cited by: 24
Authors
Meyer, Lukas [1 ]
Smisek, Michal [1 ]
Villacampa, Alejandro Fontan [1 ]
Maza, Laura Oliva [1 ]
Medina, Daniel [2 ]
Schuster, Martin J. [1 ]
Steidle, Florian [1 ]
Vayugundla, Mallikarjuna [1 ]
Mueller, Marcus G. [1 ]
Rebele, Bernhard [3 ]
Wedler, Armin [1 ]
Triebel, Rudolph [1 ]
Affiliations
[1] German Aerosp Ctr DLR, Inst Robot & Mechatron, Dept Percept & Cognit, Muenchener Str 20, D-82234 Wessling, Germany
[2] German Aerosp Ctr DLR, Inst Commun & Nav, Dept Naut Syst, Neustrelitz, Germany
[3] German Aerosp Ctr DLR, Inst Robot & Mechatron, Anal & Control Adv Robot Syst, Wessling, Germany
Funding
European Union's Horizon 2020;
Keywords
exploration; extreme environments; navigation; planetary robotics; SLAM;
DOI
10.1002/rob.22016
Chinese Library Classification
TP24 [Robotics];
Discipline Codes
080202; 1405;
Abstract
Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, representative data are scarce, and field testing at planetary analog sites requires substantial financial investment and logistical overhead while risking damage to complex robotic systems. To address these issues, we use our compact human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource-efficient field testing, and we make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 navigation experiments captured at eight Mars analog sites with widely varying environmental conditions. Its longest trajectory covers 1.5 km, and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide ground-truth position and orientation together with the associated uncertainties, obtained by a real-time kinematic-based algorithm that fuses the global navigation satellite system data of two body antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-Mono, on our data to evaluate their accuracy and to provide a baseline that can serve as a performance reference of accuracy and robustness for other navigation algorithms. The data set can be accessed at .
Pages: 833-853 (21 pages)