Citrus Pests and Diseases Recognition Model Using Weakly Dense Connected Convolution Network

Cited by: 59
Authors
Xing, Shuli [1 ]
Lee, Marely [1 ]
Lee, Keun-kwang [2 ]
Affiliations
[1] Chon Buk Natl Univ, Ctr Adv Image & Informat Technol, Sch Elect & Informat Engn, Jeonju 54896, Chon Buk, South Korea
[2] Koguryeo Coll, Dept Beauty Arts, Naju 520930, South Korea
Funding
National Research Foundation of Singapore;
Keywords
citrus; pests and diseases identification; convolutional neural network; parameter efficiency; DEEP; IDENTIFICATION;
DOI
10.3390/s19143195
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Codes
070302 ; 081704 ;
Abstract
Pests and diseases can cause severe damage to citrus fruits. Farmers used to rely on experienced experts to recognize them, a time-consuming and costly process. With the spread of image sensors and advances in computer vision, using convolutional neural network (CNN) models to identify pests and diseases has become a recent trend in agriculture. However, many researchers apply models pre-trained on ImageNet to different recognition tasks without considering the scale of their own datasets, which wastes computational resources. In this paper, a simple but effective CNN model was developed based on our image dataset. The proposed network was designed for parameter efficiency: the complexity of the cross-channel operation was increased, and the frequency of feature reuse was adapted to network depth. Experimental results showed that Weakly DenseNet-16 achieved the highest classification accuracy with fewer parameters. Because the network is lightweight, it can be deployed on mobile devices.
Pages: 18
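The abstract contrasts standard dense connectivity with a "weakly dense" design in which feature reuse is adapted to network depth. The sketch below is a minimal, hypothetical illustration of why capping feature reuse reduces parameter count; the connectivity rules and the `max_reuse` cap are illustrative assumptions, not the paper's exact architecture.

```python
# Hypothetical sketch: DenseNet-style full feature reuse vs. a "weakly dense"
# pattern where each layer concatenates only a bounded number of earlier
# outputs. The specific rules here are assumptions for illustration only.

def dense_inputs(layer):
    """Standard dense block: layer i receives the outputs of all i predecessors."""
    return list(range(layer))

def weakly_dense_inputs(layer, max_reuse=3):
    """Assumed variant: reuse is capped at `max_reuse` recent predecessors,
    so the number of concatenated inputs stops growing with depth."""
    return list(range(max(0, layer - max_reuse), layer))

def concat_channels(inputs, growth=16):
    """Channels entering a layer: one growth-rate chunk per reused output."""
    return growth * len(inputs)

# The 1x1 cross-channel convolution feeding each layer scales with the number
# of concatenated input channels, so summing them proxies parameter cost.
dense_cost = sum(concat_channels(dense_inputs(i)) for i in range(1, 16))
weak_cost = sum(concat_channels(weakly_dense_inputs(i)) for i in range(1, 16))
print(dense_cost, weak_cost)  # the capped scheme accumulates fewer channels
```

Under these assumptions the fully dense scheme's input-channel count grows linearly with layer index (quadratic total), while the capped scheme's total grows only linearly with depth, matching the abstract's goal of parameter efficiency.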