Noise Robust Underwater Fishing Net Recognition Based on Range Gated Imaging

Cited: 0
Authors
Xu, Zhensong [1 ,2 ]
Wang, Xinwei [1 ,2 ,3 ]
Sun, Liang [1 ]
Song, Bo [1 ,2 ]
Zhang, Yue [1 ]
Lei, Pingshun [1 ]
Chen, Jianan [1 ]
He, Jun [1 ]
Zhou, Yan [1 ,2 ,3 ]
Liu, Yuliang [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Optoelect Syst Lab, Inst Semicond, Beijing 100083, Peoples R China
[2] Univ Chinese Acad Sci, Coll Mat Sci & Optoelect Technol, Inst Semicond, Beijing 100049, Peoples R China
[3] Univ Chinese Acad Sci, Sch Elect Elect & Commun Engn, Beijing 100049, Peoples R China
Source
IEEE ACCESS | 2024, Vol. 12
Funding
National Natural Science Foundation of China;
Keywords
Logic gates; Image recognition; Semantics; Noise; Target recognition; Semantic segmentation; Feature extraction; Optical imaging; Noise measurement; Sonar; Underwater image recognition; semantic segmentation; range gated imaging; deep learning; generative adversarial network; SYSTEM;
DOI
10.1109/ACCESS.2024.3510335
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification
0812;
Abstract
Underwater fishing net recognition plays an indispensable role in applications such as the safe navigation of unmanned underwater vehicles, marine ecology protection, and marine ranching. However, recognition performance usually degrades severely due to noise interference in underwater environments. In this paper, we adopt range gated imaging as the detection technique and propose a semantic fishing net recognition network (SFNR-Net) for underwater fishing net recognition at long distances. SFNR-Net incorporates an auxiliary semantic segmentation module (ASSM) that injects extra semantic information and enhances feature representation under noisy conditions. In addition, to address the problem of unbalanced training data, we employ a semantically regulated cycle-consistent generative adversarial network (CycleGAN) for data augmentation. To improve the quality of the generated data, we propose a semantic loss that regulates the training of CycleGAN. Comprehensive experiments on the test data show that SFNR-Net effectively suppresses noise interference and achieves a recognition accuracy of 96.28%, the best among the compared methods. Field experiments in underwater environments of different turbidity further validate the advantages of our method.
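As a rough illustration of the semantically regulated CycleGAN described in the abstract, the sketch below adds a semantic consistency term to the standard generator-side CycleGAN losses. This is a minimal sketch under assumptions: the specific form of the semantic loss, the frozen segmentation network `seg_net`, and the weights `lambda_cyc` and `lambda_sem` are hypothetical, not the authors' implementation.

```python
# Minimal sketch of a semantically regulated CycleGAN generator loss.
# Assumption: the semantic loss penalizes segmentation disagreement
# between a real image and its translated counterpart; all names and
# weights here are illustrative, not taken from the paper.
import torch
import torch.nn.functional as F

def cyclegan_semantic_loss(real_a, fake_b, rec_a, disc_b_out, seg_net,
                           lambda_cyc=10.0, lambda_sem=1.0):
    """Generator-side loss for one translation direction (A -> B)."""
    # Adversarial loss (LSGAN form): push D(fake_b) toward the "real" label 1.
    adv = F.mse_loss(disc_b_out, torch.ones_like(disc_b_out))
    # Cycle-consistency loss: the A -> B -> A reconstruction should match A.
    cyc = F.l1_loss(rec_a, real_a)
    # Semantic loss (assumed form): a frozen segmentation network should
    # assign the same per-pixel labels to the real image and its translation.
    with torch.no_grad():
        target = seg_net(real_a).argmax(dim=1)  # pseudo ground-truth labels
    sem = F.cross_entropy(seg_net(fake_b), target)
    return adv + lambda_cyc * cyc + lambda_sem * sem
```

A full training loop would symmetrically add the reverse direction (B -> A) and, as in standard CycleGAN practice, possibly an identity term; those are omitted here for brevity.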
Pages: 185492-185510
Page count: 19