Vision based obstacle avoidance and motion tracking for autonomous behaviors in underwater vehicles

Cited by: 0
Authors
Leonardi, Marco [1]
Stahl, Annette [1]
Gazzea, Michele [1]
Ludvigsen, Martin [2]
Rist-Christensen, Ida [2]
Nornes, Stein M. [3]
Affiliations
[1] NTNU AMOS, Dept Engn Cybernet, Ctr Autonomous Marine Operat & Syst, Trondheim, Norway
[2] NTNU, Dept Marine Technol, Appl Underwater Robot Lab, Trondheim, Norway
[3] NTNU AMOS, Dept Marine Technol, Ctr Autonomous Marine Operat & Syst, Trondheim, Norway
Source
OCEANS 2017 - ABERDEEN | 2017
Keywords
underwater stereo vision; obstacle avoidance; ROV; AUV; motion tracking; position estimation;
DOI
Not available
Chinese Library Classification (CLC)
O42 [Acoustics]
Discipline Classification Codes
070206; 082403
Abstract
Performing reliable underwater localization and maneuvering of Remotely Operated Vehicles (ROVs) and Autonomous Underwater Vehicles (AUVs) near nature protection areas, historical sites, or other man-made structures is a difficult task. Traditionally, different sensing techniques are employed, with sonar being the most commonly used to extract depth information and avoid obstacles. However, little has been published on complete control systems that utilize robotic vision for such underwater applications. This paper provides a proof of concept through a series of experiments investigating the use of stereo vision for underwater obstacle avoidance and position estimation. The test platform was an ROV equipped with two industrial cameras and external light sources. Methods for underwater camera calibration, disparity-map computation, and 3D point cloud processing were used to obtain more reliable information about obstacles in front of the ROV. Results from laboratory work and field experiments demonstrate that underwater obstacle avoidance with stereo cameras is feasible and can increase the autonomous capabilities of ROVs by providing appropriate information for navigation, path planning, safer missions, and environmental awareness.
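The abstract describes a pipeline of underwater calibration, disparity-map computation, and 3D point cloud processing feeding an obstacle-avoidance behavior. The sketch below is a minimal, hypothetical illustration of such a step using OpenCV: it computes a semi-global-matching disparity map on an already rectified stereo pair, reprojects it to metric 3D points, and flags an obstacle when enough points fall within a stopping distance. The function name, parameter values, file names, and the reprojection matrix Q are assumptions for illustration only and do not represent the authors' implementation.

```python
# Hypothetical stereo obstacle-detection sketch, assuming a rectified
# underwater image pair and a reprojection matrix Q from a prior stereo
# calibration (e.g., produced by cv2.stereoRectify). All parameters are
# placeholders that would need tuning for turbidity and lighting.
import cv2
import numpy as np


def detect_obstacle(left_rect, right_rect, Q, stop_distance_m=2.0,
                    min_points=500):
    """Return True if enough reconstructed points lie closer than
    stop_distance_m in front of the stereo rig."""
    # Semi-global block matching on grayscale rectified images.
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,          # must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,
        P2=32 * 5 * 5,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # OpenCV returns fixed-point disparities scaled by 16.
    disparity = sgbm.compute(left_rect, right_rect).astype(np.float32) / 16.0

    # Reproject valid disparities to 3D points in the camera frame.
    points_3d = cv2.reprojectImageTo3D(disparity, Q)
    valid = disparity > 0
    depths = points_3d[valid][:, 2]

    # Simple proximity test: count points closer than the stopping distance.
    near = np.count_nonzero((depths > 0) & (depths < stop_distance_m))
    return near >= min_points


if __name__ == "__main__":
    # Hypothetical input files; in practice these come from the rectified
    # left/right camera streams and the calibration output.
    left = cv2.imread("left_rect.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right_rect.png", cv2.IMREAD_GRAYSCALE)
    Q = np.load("Q.npy")
    print("obstacle ahead:", detect_obstacle(left, right, Q))
```

In a complete system, the boolean output of such a check would feed the vehicle's guidance layer (e.g., triggering a stop or a re-planned path) rather than being printed, and the point cloud would typically be filtered for speckle and backscatter before the proximity test.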
Pages: 10