Compressing recognition network of cotton disease with spot-adaptive knowledge distillation

Cited by: 0
Authors
Zhang, Xinwen [1 ]
Feng, Quan [1 ]
Zhu, Dongqin [1 ]
Liang, Xue [1 ]
Zhang, Jianhua [2 ,3 ]
Affiliations
[1] Gansu Agr Univ, Sch Mech & Elect Engn, Lanzhou, Peoples R China
[2] Chinese Acad Agr Sci, Agr Informat Inst, Beijing, Peoples R China
[3] Chinese Acad Agr Sci, Natl Nanfan Res Inst, Sanya, Peoples R China
Source
FRONTIERS IN PLANT SCIENCE | 2024, Vol. 15
Funding
National Natural Science Foundation of China
Keywords
cotton diseases; deep learning; model compression; knowledge distillation; spot-adaptive;
DOI
10.3389/fpls.2024.1433543
Chinese Library Classification
Q94 [Botany]
Discipline classification code
071001
Abstract
Deep networks play a crucial role in the recognition of agricultural diseases. However, these networks often come with numerous parameters and large model sizes, which hinders direct deployment on the resource-limited edge computing devices of plant protection robots. To tackle this challenge for recognizing cotton diseases on edge devices, we adopt knowledge distillation to compress large networks, aiming to reduce their parameter counts and computational complexity. To obtain the best performance, we conduct combined comparison experiments along three axes: teacher network, student network, and distillation algorithm. The teacher networks comprise three classical convolutional neural networks, while the student networks include six lightweight networks spanning two categories, homogeneous and heterogeneous structures. In addition, we investigate nine distillation algorithms using the spot-adaptive strategy. The results demonstrate that the combination of DenseNet40 as the teacher and ShuffleNetV2 as the student performs best with the NST algorithm, yielding a recognition accuracy of 90.59% and reducing FLOPs from 0.29 G to 0.045 G. The proposed method facilitates lightweighting of cotton disease recognition models while maintaining high recognition accuracy, offering a practical solution for deploying deep models on edge computing devices.
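The teacher-student compression described in the abstract follows the general knowledge-distillation recipe: the small student is trained against both the ground-truth labels and the teacher's softened outputs. The sketch below illustrates only the classic soft-target loss (Hinton et al.); the paper itself evaluates nine distillation algorithms (such as NST) under a spot-adaptive strategy, which this minimal NumPy example does not reproduce. The function name and the `T`/`alpha` defaults are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Row-wise softmax at temperature T, shifted for numerical stability."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Classic soft-target KD loss: alpha * KL(teacher || student) at
    temperature T (scaled by T^2) + (1 - alpha) * cross-entropy on labels."""
    eps = 1e-12
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T) + eps)
    soft = np.mean(np.sum(p_t * (np.log(p_t + eps) - log_p_s), axis=1)) * T * T
    log_p = np.log(softmax(student_logits) + eps)
    hard = -np.mean(log_p[np.arange(len(labels)), labels])
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the student's logits come from the lightweight network (e.g. ShuffleNetV2) and the teacher's from the frozen large network (e.g. DenseNet40); only the student's weights are updated against this combined loss.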
Pages: 14