SIMILAR PLANT SPECIES CLASSIFICATION WITH DUAL INPUT TRANSFER LEARNING MODELS

Cited by: 1
Authors
Sharma, Parul [1]
Abrol, Pawanesh [1]
Affiliations
[1] Univ Jammu, Dept Comp Sci & Informat Technol, Baba Saheb Ambedkar Rd, Jammu 180006, Jammu & Kashmir, India
Keywords
Citrus species classification; dual-input CNN; Grad-CAM; similar species classification; transfer learning;
DOI
10.5958/0974-4517.2023.00040.X
CLC Number
Q81 [Bioengineering (Biotechnology)]; Q93 [Microbiology];
Subject Classification Codes
071005 ; 0836 ; 090102 ; 100705 ;
Abstract
The classification of plant species is crucial for preserving biodiversity, identifying endangered species, preventing diseases, and controlling weeds. However, given the vast number of species and their similar physical appearances, identifying and classifying them correctly can be difficult even for human experts. Artificial Intelligence techniques such as Convolutional Neural Networks (CNNs) can aid in automatic plant species recognition, especially for distinguishing similar species, which requires learning subtle differences. In the present investigation, transfer learning models, namely MobileNet, AlexNet, and GoogLeNet, were used to classify ten visually similar citrus species. A dataset of fruit and leaf images of these citrus plants was compiled, and all images were segmented and their backgrounds subtracted to create a new segmented dataset. In addition, instead of using only one organ as input to the CNN, dual inputs of both fruits and leaves were also used. A classification accuracy of 78.25% was achieved when the MobileNet model was applied to the original fruit dataset, and it rose to 97.25% when the GoogLeNet model was applied to the segmented dataset with both organs used as input. Evidently, the present study provides innovative methods and techniques to accurately distinguish and classify visually similar plant species.
Pages: 361-371
Number of pages: 11
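
The abstract describes a dual-input transfer-learning setup in which a fruit image and a leaf image of the same plant are passed through pretrained CNN backbones and their features are fused before classification into ten citrus species. The sketch below is a minimal illustration of that idea, not the authors' exact architecture: the MobileNetV2 backbones from torchvision, the 224x224 input size, and the concatenation-plus-dense fusion head are all assumptions made for illustration.

```python
# Illustrative sketch only: a dual-input transfer-learning classifier that takes
# a fruit image and a leaf image of the same specimen and fuses backbone features.
# Backbone choice (MobileNetV2), feature sizes, and the fusion head are assumptions,
# not the configuration reported in the paper.
import torch
import torch.nn as nn
from torchvision import models


class DualInputClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Two independent ImageNet-pretrained backbones, one per organ
        # (weights are downloaded on first use).
        fruit_backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)
        leaf_backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)
        # Keep only the convolutional feature extractors (output: 1280 channels each).
        self.fruit_features = fruit_backbone.features
        self.leaf_features = leaf_backbone.features
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fusion head: concatenate the two 1280-d feature vectors, then classify.
        self.classifier = nn.Sequential(
            nn.Dropout(0.2),
            nn.Linear(1280 * 2, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, num_classes),
        )

    def forward(self, fruit_img: torch.Tensor, leaf_img: torch.Tensor) -> torch.Tensor:
        f = self.pool(self.fruit_features(fruit_img)).flatten(1)
        l = self.pool(self.leaf_features(leaf_img)).flatten(1)
        return self.classifier(torch.cat([f, l], dim=1))


if __name__ == "__main__":
    model = DualInputClassifier(num_classes=10)
    fruit = torch.randn(2, 3, 224, 224)  # batch of fruit images
    leaf = torch.randn(2, 3, 224, 224)   # batch of matching leaf images
    logits = model(fruit, leaf)
    print(logits.shape)  # torch.Size([2, 10])
```

In a training run under this sketch, each branch would typically be fine-tuned on the corresponding organ images (original or background-subtracted), and Grad-CAM could be applied to either branch to visualize which image regions drive a prediction.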