Autonomous Underwater Vehicle Control for Fishnet Inspection in Turbid Water Environments

Cited by: 16
Authors
Lee, Hoosang [1 ]
Jeong, Daehyeon [1 ]
Yu, Hongje [1 ]
Ryu, Jeha [2 ]
Affiliations
[1] Gwangju Inst Sci & Technol GIST, Sch Integrated Technol, 123 Cheomdangwagi Ro, Gwangju, South Korea
[2] Gwangju Inst Sci & Technol GIST, Sch Integrated Technol & Artificial Intelligence, Grad Sch Program, 123 Cheomdangwagi Ro, Gwangju, South Korea
Keywords
autonomous underwater vehicle; convolutional neural network; underwater inspection; vision-based control; ship hull inspection; navigation
DOI
10.1007/s12555-021-0357-9
CLC Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
Fisheries are essential to the economical supply of protein. Detecting damaged fishnets with autonomous underwater vehicles (AUVs) can be an efficient and safe alternative that avoids the dangers faced by human divers. In turbid underwater environments, however, visibility is significantly degraded by floating particles that attenuate light, which is one of the main obstacles to accurate underwater inspection with optical cameras. To obtain clear images for net inspection, we propose an AUV pose control strategy for fish-farm net inspection in turbid water, based on the mean gradient feature computed over part or all of the image. To relieve the laborious human process of setting the desired set-point for distance control, a convolutional neural network (CNN) is trained offline by supervised learning and combined with the controller. The proposed method maintains a nearly constant relative pose with respect to the fishnet, which is sufficient to acquire clear net images in turbid water and to determine whether any part of the net is damaged. Experimental results in both swimming pools and real fish-farm environments demonstrate the effectiveness of the proposed methods.
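The "mean gradient feature" named in the abstract is not defined in this record; one common reading is the mean Sobel gradient magnitude of a grayscale frame, used as an image-clarity score (sharper net imagery yields a higher mean gradient, which can serve as a feedback signal for distance control). A minimal sketch under that assumption (the `mean_gradient` helper and its NumPy-only filtering are illustrative, not the authors' implementation):

```python
import numpy as np

def mean_gradient(image: np.ndarray) -> float:
    """Mean Sobel gradient magnitude of a grayscale image.

    Higher values indicate sharper (clearer) imagery; in turbid water
    the score drops as the camera moves away from the net.
    """
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)  # horizontal Sobel kernel
    ky = kx.T                                 # vertical Sobel kernel
    img = image.astype(float)

    def filt(img, k):
        # 'valid' 3x3 sliding-window filtering (no SciPy dependency)
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for i in range(3):
            for j in range(3):
                out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
        return out

    gx = filt(img, kx)
    gy = filt(img, ky)
    return float(np.mean(np.hypot(gx, gy)))
```

A clarity-based controller could then compare this score (over the whole frame or a sub-window) against a set-point to command the AUV closer to or farther from the net.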
Pages: 3383-3392
Page count: 10
Related Papers (33 in total)
[21] Lee, Donghwa; Kim, Gonyop; Kim, Donghoon; Myung, Hyun; Choi, Hyun-Taek. Vision-based object detection and tracking for autonomous navigation of underwater robots. Ocean Engineering, 2012, 48: 59-68.
[22] Li, Ye; Jiang, Yanqing; Cao, Jian; Wang, Bo; Li, Yiming. AUV docking experiments based on vision positioning using two cameras. Ocean Engineering, 2015, 110: 163-173.
[23] Lin, T. X. Proc. IEEE Int. Conf. Syst., 2020.
[24] Livanos, G. IEEE Conf. Imaging Syst., 2018: 191.
[25] Otsu, N. A threshold selection method from gray-level histograms. IEEE Transactions on Systems, Man, and Cybernetics, 1979, 9(1): 62-66.
[26] Paspalakis, Stavros; Moirogiorgou, Konstantia; Papandroulakis, Nikos; Giakos, George; Zervakis, Michalis. Automated fish cage net inspection using image processing techniques. IET Image Processing, 2020, 14(10): 2028-2034.
[27] Rundtop, Per; Frank, Kevin. Experimental evaluation of hydroacoustic instruments for ROV navigation along aquaculture net pens. Aquacultural Engineering, 2016, 74: 143-156.
[28] Sandler, Mark; Howard, Andrew; Zhu, Menglong; Zhmoginov, Andrey; Chen, Liang-Chieh. MobileNetV2: Inverted residuals and linear bottlenecks. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2018: 4510-4520.
[29] Shen, C. J. Inform. Computation., 2015, 12: 4137. DOI: 10.12733/jics20106168.
[30] Sobel, I. Stanford Artificial Intelligence Project, 1968: 271.