Automatic Shrimp Fry Counting Method Using Multi-Scale Attention Fusion

Cited by: 1
Authors
Peng, Xiaohong [1 ]
Zhou, Tianyu [1 ]
Zhang, Ying [1 ,2 ]
Zhao, Xiaopeng [1 ]
Affiliations
[1] Guangdong Ocean Univ, Fac Math & Comp Sci, Zhanjiang 524088, Peoples R China
[2] Zhanjiang Bay Lab, Southern Marine Sci & Engn Guangdong Lab, Zhanjiang 524000, Peoples R China
Keywords
smart aquaculture; deep learning; shrimp fry counting; SFCNet; multi-scale attention fusion;
DOI
10.3390/s24092916
Chinese Library Classification (CLC)
O65 [Analytical Chemistry];
Discipline Classification Codes
070302; 081704;
Abstract
Shrimp fry counting is an important task for biomass estimation in aquaculture. Accurately counting the shrimp fry in breeding tanks not only supports yield estimation for mature shrimp but also reveals stocking density, which informs subsequent growth monitoring, transportation management, and yield assessment. However, traditional manual counting is inefficient and prone to errors, so a more efficient and accurate counting method is urgently needed. In this paper, we first collected and labeled images of shrimp fry in breeding tanks under a purpose-built experimental environment and generated the corresponding density maps using a Gaussian kernel function. We then proposed a multi-scale attention fusion-based shrimp fry counting network, the SFCNet. Experiments showed that the SFCNet achieved the best counting performance among CNN-based baseline counting models, with an MAE of 3.96 and an RMSE of 4.682. The approach effectively counts shrimp fry and provides a better solution for accurate shrimp fry enumeration.
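The density-map ground truth described in the abstract is commonly built by placing one normalized 2-D Gaussian at each annotated point, so that the map integrates to the object count. A minimal sketch of that step, assuming NumPy and a fixed kernel width `sigma` (the paper's exact kernel settings are not given here):

```python
import numpy as np

def gaussian_density_map(points, shape, sigma=4.0):
    """Place a normalized 2-D Gaussian at each annotated (x, y) point.

    Each kernel sums to 1, so for points away from the image border the
    density map's total mass equals the number of annotated shrimp fry.
    """
    h, w = shape
    density = np.zeros((h, w), dtype=np.float64)

    # Precompute one truncated Gaussian kernel window (3-sigma radius).
    radius = int(3 * sigma)
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    kernel = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    kernel /= kernel.sum()  # normalize so each point contributes exactly 1

    for px, py in points:
        px, py = int(round(px)), int(round(py))
        # Clip the kernel window to the image bounds.
        x0, x1 = max(px - radius, 0), min(px + radius + 1, w)
        y0, y1 = max(py - radius, 0), min(py + radius + 1, h)
        kx0 = x0 - (px - radius)
        kx1 = (2 * radius + 1) - ((px + radius + 1) - x1)
        ky0 = y0 - (py - radius)
        ky1 = (2 * radius + 1) - ((py + radius + 1) - y1)
        density[y0:y1, x0:x1] += kernel[ky0:ky1, kx0:kx1]
    return density
```

The counting network is then trained to regress this map, and the predicted count is simply the sum over the predicted density map.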
Pages: 12
Related Papers
50 items in total
  • [21] Lightweight Blueberry Fruit Recognition Based on Multi-Scale and Attention Fusion NCBAM
    Yang, Wenji
    Ma, Xinxin
    Hu, Wenchao
    Tang, Pengjie
    AGRONOMY-BASEL, 2022, 12 (10)
  • [22] A Multi-Scale Progressive Collaborative Attention Network for Remote Sensing Fusion Classification
    Ma, Wenping
    Li, Yating
    Zhu, Hao
    Ma, Haoxiang
    Jiao, Licheng
    Shen, Jianchao
    Hou, Biao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (08) : 3897 - 3911
  • [23] Binocular Depth Estimation Algorithm Based on Multi-Scale Attention Feature Fusion
    Yang Huitong
    Lei Lang
    Lin Yongchun
    LASER & OPTOELECTRONICS PROGRESS, 2022, 59 (18)
  • [25] L-Unet: A Landslide Extraction Model Using Multi-Scale Feature Fusion and Attention Mechanism
    Dong, Zhangyu
    An, Sen
    Zhang, Jin
    Yu, Jinqiu
    Li, Jinhui
    Xu, Daoli
    REMOTE SENSING, 2022, 14 (11)
  • [26] Residual attention mechanism and weighted feature fusion for multi-scale object detection
    Zhang, Jie
    Qi, Qiye
    Zhang, Huanlong
    Du, Qifan
    Wang, Fengxian
    Shi, Xiaoping
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (26) : 40873 - 40889
  • [27] A Multi-Scale Natural Scene Text Detection Method Based on Attention Feature Extraction and Cascade Feature Fusion
    Li, Nianfeng
    Wang, Zhenyan
    Huang, Yongyuan
    Tian, Jia
    Li, Xinyuan
    Xiao, Zhiguo
    SENSORS, 2024, 24 (12)
  • [28] MSANet: an improved semantic segmentation method using multi-scale attention for remote sensing images
    Zhang, Xiaolu
    Wang, Zhaoshun
    Zhang, Jianheng
    Wei, Anlei
    REMOTE SENSING LETTERS, 2022, 13 (12) : 1249 - 1259
  • [29] RASNet: Renal automatic segmentation using an improved U-Net with multi-scale perception and attention unit
    Cao, Gaoyu
    Sun, Zhanquan
    Wang, Chaoli
    Geng, Hongquan
    Fu, Hongliang
    Yin, Zhong
    Pan, Minlan
    PATTERN RECOGNITION, 2024, 150
  • [30] Multi-Scale Feature Extraction Method of Hyperspectral Image with Attention Mechanism
    Xu Zhangchi
    Guo Baofeng
    Wu Wenhao
    You Jingyun
    Su Xiaotong
    LASER & OPTOELECTRONICS PROGRESS, 2024, 61 (04)