Fast Underwater Optical Beacon Finding and High Accuracy Visual Ranging Method Based on Deep Learning

Cited by: 5
Authors
Zhang, Bo [1 ]
Zhong, Ping [1 ]
Yang, Fu [1 ]
Zhou, Tianhua [2 ]
Shen, Lingfei [2 ]
Affiliations
[1] Donghua Univ, Coll Sci, Shanghai 201620, Peoples R China
[2] Chinese Acad Sci, Shanghai Inst Opt & Fine Mech, Key Lab Space Laser Commun & Detect Technol, Shanghai 201800, Peoples R China
Funding
National Natural Science Foundation of China; Natural Science Foundation of Shanghai;
Keywords
autonomous underwater vehicles; target detection; monocular vision; deep learning; POSE ESTIMATION; NAVIGATION; AUV;
DOI
10.3390/s22207940
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
Visual recognition and localization of underwater optical beacons is an important step in autonomous underwater vehicle (AUV) docking. The main issues that restrict underwater monocular vision ranging are the attenuation of light in water, the mirror image between the water surface and the light source, and the small size of the optical beacon. In this study, a fast monocular camera localization method for small 4-light beacons is proposed. A YOLO V5 (You Only Look Once) model with a coordinate attention (CA) mechanism is constructed. Compared with the original model and a model with the convolutional block attention module (CBAM), our model improves the prediction accuracy to 96.1% and the recall to 95.1%. A sub-pixel light-source centroid localization method combining super-resolution generative adversarial network (SRGAN) image enhancement and Zernike moments is proposed, which increases the detection range of small optical beacons from 7 m to 10 m. In experiments in a laboratory self-made pool and an anechoic pool, the average relative distance error of our method is 1.04%, and the average detection speed is 0.088 s per frame (11.36 FPS). With its fast recognition, accurate ranging, and wide detection range, this study offers a solution for the long-distance, fast, and accurate positioning of small underwater optical beacons.
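The sub-pixel centroid idea in the abstract can be illustrated with a minimal sketch. Note that this uses a plain intensity-weighted centroid on a synthetic Gaussian spot, not the paper's actual SRGAN-plus-Zernike-moment pipeline; the function name, spot parameters, and image size below are illustrative assumptions, not from the paper.

```python
import numpy as np

def subpixel_centroid(img):
    """Estimate a light spot's center at sub-pixel precision via
    intensity weighting (a simple stand-in for moment-based methods)."""
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic "beacon" spot: a Gaussian centered between pixel grid points.
h, w = 32, 32
true_cx, true_cy = 12.7, 10.3          # ground-truth sub-pixel center
sigma = 2.0
ys, xs = np.mgrid[0:h, 0:w]
spot = np.exp(-((xs - true_cx) ** 2 + (ys - true_cy) ** 2) / (2 * sigma ** 2))

cx_est, cy_est = subpixel_centroid(spot)   # recovers ~(12.7, 10.3)
```

Because the spot straddles pixel boundaries, the weighted centroid recovers the center far more precisely than the brightest pixel's integer coordinates; the paper's Zernike-moment approach pursues the same sub-pixel goal with greater robustness to noise and defocus.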
Pages: 21
Cited References
32 records in total
[1]   Vision-based Deep Learning algorithm for Underwater Object Detection and Tracking [J].
Alla, Durga Nooka Venkatesh ;
Jyothi, V. Bala Naga ;
Venkataraman, H. ;
Ramadass, G. A. .
OCEANS 2022, 2022,
[2]   Paving the way for a future underwater omni-directional wireless optical communication systems [J].
Baiden, Greg ;
Bissiri, Yassiah ;
Masoti, Andrew .
OCEAN ENGINEERING, 2009, 36 (9-10) :633-640
[3]  
Bochkovskiy A., 2020, arXiv:2004.10934
[4]   Inertial Sensor Self-Calibration in a Visually-Aided Navigation Approach for a Micro-AUV [J].
Bonin-Font, Francisco ;
Massot-Campos, Miquel ;
Lluis Negre-Carrasco, Pep ;
Oliver-Codina, Gabriel ;
Beltran, Joan P. .
SENSORS, 2015, 15 (01) :1825-1860
[5]   Close-Range Tracking of Underwater Vehicles Using Light Beacons [J].
Bosch, Josep ;
Gracias, Nuno ;
Ridao, Pere ;
Istenic, Klemen ;
Ribas, David .
SENSORS, 2016, 16 (04) :1-26
[6]   Practical Tracking of Permanent Magnet Linear Motor Via Logarithmic Sliding Mode Control [J].
Dong, Hanlin ;
Yang, Xuebo ;
Basin, Michael V. .
IEEE-ASME TRANSACTIONS ON MECHATRONICS, 2022, 27 (05) :4112-4121
[7]  
Guo Y., 2021, P INT C AUTONOMOUS U, P2658
[8]   Coordinate Attention for Efficient Mobile Network Design [J].
Hou, Qibin ;
Zhou, Daquan ;
Feng, Jiashi .
2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, :13708-13717
[9]   Stereo-vision-based AUV navigation system for resetting the inertial navigation system error [J].
Hsu, Horng Yi ;
Toda, Yuichiro ;
Yamashita, Kohei ;
Watanabe, Keigo ;
Sasano, Masahiko ;
Okamoto, Akihiro ;
Inaba, Shogo ;
Minami, Mamoru .
ARTIFICIAL LIFE AND ROBOTICS, 2022, 27 (01) :165-178
[10]   Autonomous inspection of underwater structures [J].
Jacobi, Marco .
ROBOTICS AND AUTONOMOUS SYSTEMS, 2015, 67 :80-86