Research and Validation of Potato Late Blight Detection Method Based on Deep Learning

Cited: 7
Authors
Feng, Junzhe [1 ]
Hou, Bingru [2 ]
Yu, Chenhao [1 ]
Yang, Huanbo [2 ]
Wang, Chao [2 ]
Shi, Xiaoyi [1 ]
Hu, Yaohua [1 ,3 ]
Affiliations
[1] Zhejiang A&F Univ, Coll Opt Mech & Elect Engn, Hangzhou 311300, Peoples R China
[2] Northwest A&F Univ, Coll Mech & Elect Engn, Xianyang 712100, Peoples R China
[3] Minist Agr & Rural Affairs, Coconstruct Minist & Prov, Key Lab Agr Equipment Hilly & Mountainous Areas So, Hangzhou 311300, Peoples R China
Source
AGRONOMY-BASEL | 2023, Vol. 13, Issue 06
Funding
National Natural Science Foundation of China
Keywords
potato late blight; deep learning; lightweight; ShuffleNetV2; inference speed;
DOI
10.3390/agronomy13061659
Chinese Library Classification (CLC): S3 [Agronomy]
Discipline code: 0901
Abstract
Late blight, caused by Phytophthora infestans, is a devastating disease in potato production and in severe cases can lead to total crop failure. To detect potato late blight rapidly, this study developed a deep learning model that discriminates the degree of potato leaf disease with high recognition accuracy and a fast inference speed. A dataset covering seven categories of potato leaf disease in both single and complex backgrounds was constructed and augmented with data enhancement methods, increasing the number of images to 7039. The performance of pre-trained models for fine-grained classification of potato leaf diseases was comprehensively evaluated in terms of accuracy, inference speed, and number of parameters. The ShuffleNetV2 2x model, which showed better generalization ability and a faster inference speed, was selected and improved. Three improvement strategies were proposed: introducing an attention module, reducing the depth of the network, and reducing the number of 1 x 1 convolutions. Their effects on the performance of the base model were explored experimentally, and the best form of improvement was determined. The loss function of the improved model converged to 0.36, a 34.5% reduction compared with the base model. The improved model also reduced the number of parameters, FLOPs, and model size by approximately 23%, increased classification accuracy by 0.85%, and improved CPU inference speed by 25%. When deployed on an embedded device, the improved model achieved an overall classification precision of 94% and an average detection time of 3.27 s per image. The method provides critical technical support for the automatic identification of potato late blight.
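One of the improvement strategies described above is introducing an attention module into the lightweight ShuffleNetV2 backbone. The abstract does not specify which module or where it is inserted, so the following is only a minimal NumPy sketch of a generic Squeeze-and-Excitation-style channel attention; the function name, weight shapes, and reduction ratio are all illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_attention(feature_map, w1, w2):
    """Squeeze-and-Excitation-style channel attention (illustrative sketch).

    feature_map: (C, H, W) array.
    w1: (C // r, C) bottleneck weight (r = reduction ratio).
    w2: (C, C // r) expansion weight.
    """
    # Squeeze: global average pooling over the spatial dimensions -> (C,)
    z = feature_map.mean(axis=(1, 2))
    # Excitation: bottleneck FC with ReLU, then expansion FC with sigmoid
    s = np.maximum(w1 @ z, 0.0)
    s = sigmoid(w2 @ s)
    # Scale: reweight each channel of the feature map by its attention score
    return feature_map * s[:, None, None]
```

Because the attention scores pass through a sigmoid, each channel is rescaled by a factor in (0, 1), letting the network emphasize disease-relevant channels at a small parameter cost, which is consistent with the paper's lightweight design goal.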
Pages: 22