Deep learning-based multi-task prediction system for plant disease and species detection

Cited by: 36
Authors
Keceli, Ali Seydi [1 ]
Kaya, Aydin [1 ]
Catal, Cagatay [2 ]
Tekinerdogan, Bedir [3 ]
Affiliations
[1] Hacettepe Univ, Dept Comp Engn, Ankara, Turkey
[2] Qatar Univ, Dept Comp Sci & Engn, Doha, Qatar
[3] Wageningen Univ & Res, Informat Technol Grp, Wageningen, Netherlands
Keywords
Plant classification; Multi-task learning; Transfer learning; Deep neural networks; Convolutional neural networks
DOI
10.1016/j.ecoinf.2022.101679
Chinese Library Classification
Q14 [Ecology (Bioecology)]
Subject Classification Codes
071012; 0713
Abstract
The manual prediction of plant species and plant diseases is expensive, time-consuming, and requires expertise that is not always available. Automated approaches, including machine learning and deep learning, are increasingly applied to overcome these challenges, and accurate models are needed to provide reliable predictions and guide decision-making. So far, the two problems have been addressed separately, with a distinct model developed for each; however, since plant species and plant disease prediction are closely related tasks, they can be addressed jointly. We therefore propose and validate a novel approach based on multi-task learning, which uses shared representations between related tasks and often outperforms individual single-task models. We apply a multi-input network that uses both raw images and deep features transferred from a pre-trained model to predict each plant's species and disease. We develop an end-to-end multi-task model that carries out more than one learning task at a time, combining Convolutional Neural Network (CNN) features with the transferred features, and evaluate it on public datasets. Our experiments demonstrate that this Multi-Input Multi-Task Neural Network increases efficiency and yields faster learning for similar detection tasks.
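The architecture described in the abstract — one branch learning CNN features from raw images, a second branch taking pre-extracted transfer-learning features, and two task heads sharing the fused representation — can be sketched as below. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; all layer sizes, class counts, and the unweighted sum of per-task losses are assumptions.

```python
import torch
import torch.nn as nn

class MultiInputMultiTaskNet(nn.Module):
    """Hypothetical sketch: raw-image CNN branch + transferred-feature
    branch, fused into a shared representation that feeds two heads
    (plant species and plant disease)."""

    def __init__(self, n_species=10, n_diseases=5, transfer_dim=512):
        super().__init__()
        # Branch 1: small CNN over raw RGB images (illustrative sizes).
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (batch, 32)
        )
        # Branch 2: dense layer over features from a pre-trained model.
        self.transfer = nn.Sequential(nn.Linear(transfer_dim, 64), nn.ReLU())
        # One head per task on the shared fused representation.
        self.species_head = nn.Linear(32 + 64, n_species)
        self.disease_head = nn.Linear(32 + 64, n_diseases)

    def forward(self, image, transfer_feats):
        fused = torch.cat([self.cnn(image), self.transfer(transfer_feats)], dim=1)
        return self.species_head(fused), self.disease_head(fused)

model = MultiInputMultiTaskNet()
imgs = torch.randn(4, 3, 64, 64)        # batch of raw images
feats = torch.randn(4, 512)             # pre-extracted deep features
species_logits, disease_logits = model(imgs, feats)

# Joint training objective: sum of per-task cross-entropies, a common
# multi-task choice (the paper's exact loss weighting may differ).
loss = (nn.functional.cross_entropy(species_logits, torch.randint(0, 10, (4,)))
        + nn.functional.cross_entropy(disease_logits, torch.randint(0, 5, (4,))))
```

Sharing the trunk lets gradients from both heads shape one representation, which is the mechanism behind the faster learning the abstract reports.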
Pages: 14