Compressing recognition network of cotton disease with spot-adaptive knowledge distillation

Times Cited: 0
Authors
Zhang, Xinwen [1 ]
Feng, Quan [1 ]
Zhu, Dongqin [1 ]
Liang, Xue [1 ]
Zhang, Jianhua [2 ,3 ]
Affiliations
[1] Gansu Agr Univ, Sch Mech & Elect Engn, Lanzhou, Peoples R China
[2] Chinese Acad Agr Sci, Agr Informat Inst, Beijing, Peoples R China
[3] Chinese Acad Agr Sci, Natl Nanfan Res Inst, Sanya, Peoples R China
Source
FRONTIERS IN PLANT SCIENCE | 2024, Vol. 15
Funding
National Natural Science Foundation of China;
Keywords
cotton diseases; deep learning; model compression; knowledge distillation; spot-adaptive;
DOI
10.3389/fpls.2024.1433543
Chinese Library Classification (CLC)
Q94 [Botany];
Discipline Code
071001;
Abstract
Deep networks play a crucial role in the recognition of agricultural diseases. However, these networks often have numerous parameters and large model sizes, which makes direct deployment on the resource-limited edge computing devices of plant protection robots challenging. To recognize cotton diseases on such edge devices, we adopt knowledge distillation to compress large networks, reducing both their parameter counts and their computational complexity. To obtain strong performance, we conduct combined comparison experiments along three axes: teacher network, student network, and distillation algorithm. The teacher networks comprise three classical convolutional neural networks, while the student networks include six lightweight networks divided into two categories, homogeneous and heterogeneous structures. In addition, we investigate nine distillation algorithms under the spot-adaptive strategy. The results demonstrate that the combination of DenseNet40 as the teacher and ShuffleNetV2 as the student performs best with the NST algorithm, yielding a recognition accuracy of 90.59% and reducing FLOPs from 0.29 G to 0.045 G. The proposed method makes the cotton disease recognition model substantially lighter while maintaining high recognition accuracy, and it offers a practical solution for deploying deep models on edge computing devices.
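To make the teacher-student compression idea concrete, the sketch below shows a minimal, generic knowledge-distillation loss and training step in PyTorch (Hinton-style soft targets plus cross-entropy). This is only an illustrative assumption, not the paper's spot-adaptive NST formulation; the temperature T, weight alpha, and the names kd_loss and distill_step are hypothetical.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Generic knowledge-distillation loss (illustrative; not the paper's NST/spot-adaptive loss)."""
    # Soft-target term: KL divergence between temperature-softened teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the ground-truth disease labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def distill_step(teacher, student, optimizer, images, labels):
    """One distillation step: a frozen teacher guides a lightweight student on a single batch.
    `teacher` and `student` are placeholder models (e.g., a DenseNet-40-like teacher and a
    ShuffleNetV2-like student); `images` and `labels` come from a cotton-disease data loader."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = kd_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```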
Pages: 14