Autonomous Visual Mapping and Exploration With a Micro Aerial Vehicle

Cited by: 65
Authors
Heng, Lionel [1]
Honegger, Dominik [1]
Lee, Gim Hee [1]
Meier, Lorenz [1]
Tanskanen, Petri [1]
Fraundorfer, Friedrich [2]
Pollefeys, Marc [1]
Affiliations
[1] ETH, Comp Vis & Geometry Lab, CH-8092 Zurich, Switzerland
[2] Tech Univ Munich, Fac Civil Engn & Surveying, D-80333 Munich, Germany
Keywords
NAVIGATION
DOI
10.1002/rob.21520
Chinese Library Classification
TP24 [Robotics];
Discipline Classification Code
080202; 1405;
Abstract
Cameras are a natural fit for micro aerial vehicles (MAVs) due to their low weight, low power consumption, and two-dimensional field of view. However, computationally intensive algorithms are required to infer the 3D structure of the environment from 2D image data. This requirement is made more difficult by the MAV's limited payload, which only allows for a single CPU board. Hence, we have to design efficient algorithms for state estimation, mapping, planning, and exploration. We implement a set of algorithms on two different vision-based MAV systems such that these algorithms enable the MAVs to map and explore unknown environments. By using both self-built and off-the-shelf systems, we show that our algorithms can be used on different platforms. All algorithms necessary for autonomous mapping and exploration run on-board the MAV. Using a front-looking stereo camera as the main sensor, we maintain a tiled octree-based 3D occupancy map. The MAV uses this map for local navigation and frontier-based exploration. In addition, we use a wall-following algorithm as an alternative exploration algorithm in open areas where frontier-based exploration underperforms. During the exploration, data are transmitted to the ground station, which runs large-scale visual SLAM. We estimate the MAV's state with inertial data from an IMU together with metric velocity measurements from a custom-built optical flow sensor and pose estimates from visual odometry. We verify our approaches with experimental results which, to the best of our knowledge, demonstrate that our MAVs are the first vision-based MAVs to autonomously explore both indoor and outdoor environments. (C) 2014 Wiley Periodicals, Inc.
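The exploration strategy summarized in the abstract is frontier-based: the MAV repeatedly flies toward boundaries between observed free space and still-unknown space. The sketch below is a minimal illustration of the frontier-detection step on a simplified 2D occupancy grid, not the paper's implementation (the paper maintains a tiled octree-based 3D occupancy map); all names and the grid encoding are hypothetical and chosen only for illustration.

```python
import numpy as np

# Cell states in a simplified 2D occupancy grid slice.
# (The paper uses a tiled octree-based 3D map; this flat grid is
# only an illustrative stand-in.)
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontier_cells(grid: np.ndarray) -> list[tuple[int, int]]:
    """Return FREE cells that border at least one UNKNOWN cell.

    These 'frontier' cells separate mapped free space from unexplored
    space; a frontier-based explorer picks one of them as the next
    navigation goal.
    """
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] != FREE:
                continue
            # 4-connected neighborhood check for adjacent unknown space.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

def nearest_frontier(grid: np.ndarray, robot_rc: tuple[int, int]):
    """Pick the frontier cell closest (Euclidean) to the robot's cell."""
    frontiers = find_frontier_cells(grid)
    if not frontiers:
        return None  # no frontiers left: exploration of this map is complete
    pts = np.array(frontiers, dtype=float)
    dists = np.linalg.norm(pts - np.array(robot_rc, dtype=float), axis=1)
    return frontiers[int(np.argmin(dists))]

if __name__ == "__main__":
    grid = np.full((20, 20), UNKNOWN)
    grid[5:15, 5:15] = FREE          # a mapped free region
    grid[9:11, 9:11] = OCCUPIED      # an obstacle inside it
    print(nearest_frontier(grid, (10, 7)))
```

When no reachable frontier remains (or frontier goals stop yielding new information, as in large open areas), the paper falls back to a wall-following behavior; that switching logic is not shown in this sketch.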
Pages: 654-675
Page count: 22