AquaYOLO: Advanced YOLO-based fish detection for optimized aquaculture pond monitoring

Citations: 1
Authors
Vijayalakshmi, M. [1 ]
Sasithradevi, A. [2 ]
Affiliations
[1] Vellore Inst Technol, Sch Elect Engn, Chennai 600127, India
[2] Vellore Inst Technol, Ctr Adv Data Sci, Chennai 600127, India
Source
SCIENTIFIC REPORTS | 2025, Vol. 15, Issue 1
Keywords
Fish detection; Hierarchical features; Aquaculture monitoring; Deep learning; YOLO
DOI
10.1038/s41598-025-89611-y
CLC Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Codes
07; 0710; 09
Abstract
Aquaculture plays an important role in ensuring global food security, supporting economic growth, and protecting natural resources. However, traditional methods of monitoring aquatic environments are time-consuming and labor-intensive, so there is growing interest in using computer vision for more efficient aquaculture monitoring. Fish detection is a key yet challenging step in these vision-based systems, complicated by changing light conditions, varying water clarity, diverse vegetation, and dynamic backgrounds. To overcome these challenges, we introduce AquaYOLO, a YOLO model optimized specifically for aquaculture applications. The backbone of AquaYOLO employs CSP layers and enhanced convolutional operations to extract hierarchical features. The head enhances feature representation through upsampling, concatenation, and multi-scale fusion, while the detection head performs box regression at a single 40 x 40 scale and drops the final C2f layer to ensure accurate localization. To evaluate AquaYOLO, we use the DePondFi (Detection of Pond Fish) dataset, collected from aquaculture ponds in South India, which contains around 50k bounding-box annotations across 8,150 images. AquaYOLO performs well, achieving a precision of 0.889, a recall of 0.848, and an mAP@50 of 0.909. Our model enables efficient and affordable fish detection for small-scale aquaculture.
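The architectural ingredients named in the abstract (CSP backbone blocks, upsample-and-concatenate fusion, a single 40 x 40 detection scale with the final C2f layer dropped) can be illustrated with a small model sketch. The following is a minimal, hypothetical PyTorch sketch of those ideas only; all channel widths, block depths, the 640 x 640 input size, and the 4-value box-regression output format are illustrative assumptions, not the authors' published AquaYOLO architecture.

```python
# Minimal sketch of the ideas described in the abstract: CSP-style backbone
# blocks, upsample-and-concatenate fusion, and one detection head at the
# 40 x 40 scale. Widths, depths, and output format are assumptions, not the
# authors' exact design.
import torch
import torch.nn as nn


class ConvBNSiLU(nn.Module):
    """Standard YOLO-style convolution block: Conv2d -> BatchNorm -> SiLU."""
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))


class CSPBlock(nn.Module):
    """Cross-Stage-Partial block: split channels, process one branch, re-merge."""
    def __init__(self, c, n=1):
        super().__init__()
        self.split = ConvBNSiLU(c, c // 2, k=1)
        self.bypass = ConvBNSiLU(c, c // 2, k=1)
        self.blocks = nn.Sequential(*[ConvBNSiLU(c // 2, c // 2) for _ in range(n)])
        self.merge = ConvBNSiLU(c, c, k=1)

    def forward(self, x):
        return self.merge(torch.cat([self.blocks(self.split(x)), self.bypass(x)], dim=1))


class AquaYOLOSketch(nn.Module):
    """Illustrative backbone + fusion head with one 40 x 40 detection scale."""
    def __init__(self, num_classes=1):
        super().__init__()
        # Backbone: stride-2 convs interleaved with CSP blocks (hierarchical features).
        self.stem = ConvBNSiLU(3, 32, s=2)                                     # 640 -> 320
        self.stage1 = nn.Sequential(ConvBNSiLU(32, 64, s=2), CSPBlock(64))     # -> 160
        self.stage2 = nn.Sequential(ConvBNSiLU(64, 128, s=2), CSPBlock(128))   # -> 80
        self.stage3 = nn.Sequential(ConvBNSiLU(128, 256, s=2), CSPBlock(256))  # -> 40
        self.stage4 = nn.Sequential(ConvBNSiLU(256, 512, s=2), CSPBlock(512))  # -> 20
        # Head: upsample the deep 20x20 map and concatenate with the 40x40 map.
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        self.fuse = ConvBNSiLU(512 + 256, 256, k=1)
        # Detection head at 40x40: 4 box-regression values + class scores per cell
        # (the final C2f refinement stage is omitted, mirroring the abstract).
        self.detect = nn.Conv2d(256, 4 + num_classes, kernel_size=1)

    def forward(self, x):
        x = self.stem(x)
        x = self.stage1(x)
        x = self.stage2(x)
        p3 = self.stage3(x)           # 40 x 40 feature map
        p4 = self.stage4(p3)          # 20 x 20 feature map
        fused = self.fuse(torch.cat([self.up(p4), p3], dim=1))
        return self.detect(fused)     # (batch, 4 + num_classes, 40, 40)


if __name__ == "__main__":
    model = AquaYOLOSketch(num_classes=1)
    out = model(torch.randn(1, 3, 640, 640))
    print(out.shape)  # torch.Size([1, 5, 40, 40])
```

Fusing the upsampled 20 x 20 map into the 40 x 40 map keeps deep semantic context while preserving the spatial resolution at which small pond fish remain distinguishable, which is consistent with the paper's choice of a single 40 x 40 regression scale.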
Pages: 12
Related Papers (50 total)
  • [21] CEH-YOLO: A composite enhanced YOLO-based model for underwater object detection
    Feng, Jiangfan
    Jin, Tao
    ECOLOGICAL INFORMATICS, 2024, 82
  • [22] Development of YOLO-based Model for Fall Detection in IoT Smart Home Applications
    Gao, Pengcheng
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (10) : 1118 - 1125
  • [23] A new YOLO-based method for real-time crowd detection from video and performance analysis of YOLO models
    Gunduz, Mehmet Sirin
    Isik, Gultekin
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2023, 20 (01)
  • [24] A Yolo-based object monitoring approach for smart shops surveillance system
    Xu, Wei
    Zhai, Yujin
JOURNAL OF OPTICS-INDIA, 2024, 53 (04) : 3163 - 3170
  • [26] YOLO-Based Deep Learning Model for Pressure Ulcer Detection and Classification
    Aldughayfiq, Bader
    Ashfaq, Farzeen
    Jhanjhi, N. Z.
    Humayun, Mamoona
    HEALTHCARE, 2023, 11 (09)
  • [27] YOLO-Based Object Detection in Industry 4.0 Fischertechnik Model Environment
    Schneidereit, Slavomira
    Yarahmadi, Ashkan Mansouri
    Schneidereit, Toni
    Breuss, Michael
    Gebauer, Marc
    INTELLIGENT SYSTEMS AND APPLICATIONS, VOL 2, INTELLISYS 2023, 2024, 823 : 1 - 20
  • [28] Light-YOLO: A Lightweight and Efficient YOLO-Based Deep Learning Model for Mango Detection
    Zhong, Zhengyang
    Yun, Lijun
    Cheng, Feiyan
    Chen, Zaiqing
    Zhang, Chunjie
AGRICULTURE-BASEL, 2024, 14 (01)
  • [29] A Yolo-based Approach for Fire and Smoke Detection in IoT Surveillance Systems
    Zhang, Dawei
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (01) : 87 - 94
  • [30] Interactive YOLO-based Object Detection using a Polygonal Region of Interest for Airborne Surveillance Applications
    Vinod, Jithin
    Dhipu, T. M.
    Rajesh, R.
    2024 IEEE SPACE, AEROSPACE AND DEFENCE CONFERENCE, SPACE 2024, 2024, : 901 - 905